1. Plugin Store
You can search directly for 302.AI in the plugin store to find our official plugin.
After adding it to your workflow, enter the API KEY generated in the 302.AI backend under Use API - API Keys to use it.
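Before wiring the key into a workflow, it can help to confirm it works on its own. The sketch below builds a request against 302.AI's chat completions endpoint, assuming the endpoint follows the OpenAI convention (a `Bearer` prefix on the `Authorization` header and a `messages` array); the key and model name are placeholders, so check the 302.AI API documentation for specifics:

```python
import json
import urllib.request

API_KEY = "sk-xxx"  # placeholder: paste the key from the 302.AI backend - Use API - API Keys

def build_chat_request(model: str, content: str) -> urllib.request.Request:
    """Build a POST request to 302.AI's chat completions endpoint (OpenAI-style)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.302.ai/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

If the key is valid, sending the request should return a normal chat completion; an invalid key produces an authentication error.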
2. Create a Plugin
You can also choose not to use the plugin from the store and create one yourself.
In Personal Space - Plugins, select Import in the upper right corner.
Choose: URL and Raw Data
Copy the following content:
info:
  description: 302_AI
  title: 302_AI
  version: v1
openapi: 3.0.1
paths:
  /v1/chat/completions:
    post:
      operationId: Chat
      parameters:
        - description: AI Management Backend - Use API - API Keys, copy the generated API Key
          in: header
          name: Authorization
          required: true
          schema:
            default: sk-xxx
            type: string
      requestBody:
        content:
          application/json:
            schema:
              properties:
                message:
                  default: Hello
                  description: message sent to the LLM
                  type: string
                model:
                  default: gpt-4o
                  description: LLM model name
                  type: string
              required:
                - message
                - model
              type: object
      responses:
        "200":
          content:
            application/json:
              schema:
                properties:
                  output:
                    description: the output of the LLM
                    type: string
                type: object
          description: Successful response
        default:
          description: ""
      summary: |-
        Engage in single-round conversations with various LLMs. Please check the 302.AI official website API documentation for the model list; both domestic and international open-source models are fully supported.
        You only need to input the model and message to obtain the output. It is the simplest plugin for calling various large models.
        It is particularly suitable for workflows with high requirements for text capabilities, serving as a pure input-and-output step.
servers:
  - url: https://api.302.ai
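Per the spec above, the plugin accepts exactly two string fields, model and message, and returns a single output string. A minimal sketch of checking an input body against that request schema (the helper name is illustrative, not part of the plugin):

```python
# Required fields per the plugin's request schema above.
REQUIRED_FIELDS = {"message", "model"}

def validate_plugin_input(body: dict) -> list[str]:
    """Return a list of schema problems; an empty list means the body is valid."""
    problems = [f"missing required field: {name}"
                for name in sorted(REQUIRED_FIELDS - body.keys())]
    problems += [f"field {name!r} must be a string"
                 for name in sorted(REQUIRED_FIELDS & body.keys())
                 if not isinstance(body[name], str)]
    return problems

print(validate_plugin_input({"model": "gpt-4o", "message": "Hello"}))  # []
print(validate_plugin_input({"model": "gpt-4o"}))  # ['missing required field: message']
```

Keeping the schema this small is what makes the plugin easy to drop into workflows: any node that can produce a string can feed it.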
Click Confirm:
Enable the plugin, then click Debug:
Enter the API KEY generated from the 302.AI backend - Use API - API Keys, click Run, then click Done.
After completion, click Publish in the upper right corner, then click No; the plugin can then be used in your workflows.