POST /api/chat

Stream chat responses with AI models
curl --request POST \
  --url http://localhost:5173/api/chat \
  --header 'Content-Type: application/json' \
  --cookie 'apiKeys=<string>' \
  --data '
{
  "messages": [
    {
      "id": "<string>",
      "role": "user",
      "content": "<string>"
    }
  ],
  "files": {},
  "promptId": "<string>",
  "contextOptimization": false,
  "designScheme": {},
  "supabase": {},
  "enableMCPTools": false
}
'
"<string>"

Authorizations

apiKeys (string, cookie, required)
Cookie-based authentication storing API keys and provider settings
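
The exact serialization of the apiKeys cookie value is not specified in this section. The sketch below assumes it is a URL-encoded JSON object mapping provider names to API keys; verify that format against your deployment before relying on it.

// Sketch: build a Cookie header for the apiKeys cookie.
// ASSUMPTION: the cookie value is a URL-encoded JSON object mapping
// provider names to API keys; the real format may differ.
const apiKeys: Record<string, string> = {
  OpenAI: "<your-openai-key>", // placeholder, not a real key
};

const cookieHeader = `apiKeys=${encodeURIComponent(JSON.stringify(apiKeys))}`;
// Send cookieHeader as the Cookie header on requests to /api/chat.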

Body (application/json)

messages (object[], required)
Array of chat messages; each message carries an id, role, and content, as in the request example above

files (object)

promptId (string)
Optional prompt template ID

contextOptimization (boolean, default: false)
Enable smart context optimization

designScheme (object)
UI design scheme configuration

supabase (object)
Supabase integration configuration

enableMCPTools (boolean, default: false)
Enable Model Context Protocol tools
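
Taken together, the documented fields correspond to a request body along the following lines. This is a TypeScript sketch: the field names, types, and requiredness come from the list above, while the inner shapes of files, designScheme, and supabase are left open because they are not specified here.

// Request body for POST /api/chat, derived from the documented fields.
interface ChatMessage {
  id: string;
  role: string;    // "user" in the example; allowed values are not enumerated here
  content: string;
}

interface ChatRequestBody {
  messages: ChatMessage[];               // required
  files?: Record<string, unknown>;       // inner shape not documented in this section
  promptId?: string;                     // optional prompt template ID
  contextOptimization?: boolean;         // default: false
  designScheme?: Record<string, unknown>;
  supabase?: Record<string, unknown>;
  enableMCPTools?: boolean;              // default: false
}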

Response

Streaming response

The response is of type string.
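
Because the endpoint streams its response, a client can consume the body incrementally instead of waiting for the full string. A minimal TypeScript sketch (Node 18+ or any Fetch implementation with ReadableStream support), assuming a cookieHeader built as in the Authorizations section:

// Sketch: call POST /api/chat and print the streamed text as it arrives.
async function streamChat(cookieHeader: string): Promise<void> {
  const response = await fetch("http://localhost:5173/api/chat", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Cookie: cookieHeader,
    },
    body: JSON.stringify({
      messages: [{ id: "1", role: "user", content: "Hello" }],
      contextOptimization: false,
      enableMCPTools: false,
    }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  // Decode the streamed bytes chunk by chunk.
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}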