POST /api/llmcall

Direct LLM invocation.

Example request:
curl --request POST \
  --url http://localhost:5173/api/llmcall \
  --header 'Content-Type: application/json' \
  --cookie apiKeys= \
  --data '
{
  "messages": [
    {
      "id": "<string>",
      "role": "user",
      "content": "<string>"
    }
  ],
  "provider": "<string>",
  "model": "<string>",
  "stream": false
}
'
Example response:

{
  "response": "<string>",
  "usage": {
    "promptTokens": 123,
    "completionTokens": 123,
    "totalTokens": 123
  }
}
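The curl request above can also be issued from TypeScript. The sketch below is a minimal typed client: the endpoint URL, field names, and cookie-based auth come from this page, while the helper names (`buildLlmCallBody`, `callLlm`) and the `role` union are our assumptions.

```typescript
// Message and body shapes, mirroring the request example on this page.
interface LlmMessage {
  id: string;
  role: "user" | "assistant" | "system"; // union is an assumption
  content: string;
}

interface LlmCallBody {
  messages: LlmMessage[];
  provider: string;
  model: string;
  stream: boolean;
}

interface LlmCallResponse {
  response: string;
  usage?: {
    promptTokens: number;
    completionTokens: number;
    totalTokens: number;
  };
}

// Assemble the JSON body documented below (stream defaults to false).
function buildLlmCallBody(
  messages: LlmMessage[],
  provider: string,
  model: string,
  stream = false,
): LlmCallBody {
  return { messages, provider, model, stream };
}

// POST the body; credentials: "include" sends the apiKeys cookie.
async function callLlm(body: LlmCallBody): Promise<LlmCallResponse> {
  const res = await fetch("http://localhost:5173/api/llmcall", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`llmcall failed: HTTP ${res.status}`);
  return (await res.json()) as LlmCallResponse;
}
```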

Authorizations

apiKeys (string, cookie, required)
Cookie-based authentication storing API keys and provider settings.

Body

Content type: application/json

messages (object[], required)
Chat messages to send; each entry carries an id, a role (e.g. "user"), and content, as in the request example above.

provider (string, required)
LLM provider name.

model (string, required)
Model identifier.

stream (boolean, default: false)
Enable streaming response.
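When stream is true, the response is presumably delivered incrementally rather than as the single JSON object shown above; this page does not specify the wire format, so the sketch below assumes plain text chunks. The helper names are ours.

```typescript
// Decode a sequence of byte chunks into one string. Multi-byte characters
// can be split across chunk boundaries, hence the streaming decoder.
function concatChunks(chunks: Uint8Array[]): string {
  const decoder = new TextDecoder();
  let out = "";
  for (const chunk of chunks) out += decoder.decode(chunk, { stream: true });
  return out + decoder.decode(); // flush any trailing partial character
}

// Read the streamed body to completion, collecting chunks as they arrive.
// Assumes stream: true was set in the JSON body passed in.
async function streamLlm(bodyJson: string): Promise<string> {
  const res = await fetch("http://localhost:5173/api/llmcall", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",
    body: bodyJson,
  });
  if (!res.ok || !res.body) {
    throw new Error(`llmcall stream failed: HTTP ${res.status}`);
  }
  const reader = res.body.getReader();
  const chunks: Uint8Array[] = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return concatChunks(chunks);
}
```

A UI would typically append each decoded chunk as it arrives instead of waiting for the full string; the collect-then-join form here just keeps the sketch short.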

Response

LLM response.

response (string)
The generated text returned by the model.

usage (object)
Token counts for the call: promptTokens, completionTokens, and totalTokens, as in the response example above.
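A small sanity check on the usage object can catch malformed responses. The helper name is ours, and the invariant that totalTokens equals promptTokens plus completionTokens is an assumption about how providers report usage, not something this page guarantees.

```typescript
// Shape of the usage object from the response example above.
interface Usage {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
}

// Hypothetical check: providers commonly report totalTokens as the sum of
// prompt and completion tokens, but that invariant is our assumption.
function usageIsConsistent(u: Usage): boolean {
  return u.totalTokens === u.promptTokens + u.completionTokens;
}
```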