cURL
curl --request POST \
  --url http://localhost:5173/api/llmcall \
  --header 'Content-Type: application/json' \
  --cookie apiKeys= \
  --data '{
    "messages": [
      {
        "id": "<string>",
        "role": "user",
        "content": "<string>"
      }
    ],
    "provider": "<string>",
    "model": "<string>",
    "stream": false
  }'
{ "response": "<string>", "usage": { "promptTokens": 123, "completionTokens": 123, "totalTokens": 123 } }
Execute a direct call to a language model with optional streaming support.
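For reference, a minimal TypeScript sketch of the same non-streaming call using fetch and the request/response shapes shown above. The LlmCallResponse type and callModel helper are illustrative names, not part of the documented API.

// Minimal sketch of a non-streaming call to /api/llmcall, mirroring the
// request and response examples above. Type and function names are assumptions.
interface LlmCallResponse {
  response: string;
  usage: {
    promptTokens: number;
    completionTokens: number;
    totalTokens: number;
  };
}

async function callModel(prompt: string): Promise<LlmCallResponse> {
  const res = await fetch("http://localhost:5173/api/llmcall", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ id: "msg-1", role: "user", content: prompt }],
      provider: "<provider>", // LLM provider name, as configured in your setup
      model: "<model>",       // model identifier for that provider
      stream: false,
    }),
  });
  if (!res.ok) {
    throw new Error(`llmcall failed: ${res.status}`);
  }
  return (await res.json()) as LlmCallResponse;
}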
Cookies
apiKeys
Cookie-based authentication storing API keys and provider settings.
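One way to attach that cookie from a script, assuming the apiKeys value is a JSON object mapping provider names to keys; that format is an assumption, not something this page specifies, so check what your deployment actually stores.

// Hypothetical helper: build a Cookie header carrying API keys.
// The JSON-map format of the apiKeys value is an assumption.
function apiKeysCookie(keys: Record<string, string>): string {
  return `apiKeys=${encodeURIComponent(JSON.stringify(keys))}`;
}

// Usage: send alongside the JSON body when calling the endpoint.
const headers = {
  "Content-Type": "application/json",
  Cookie: apiKeysCookie({ "<provider>": "<your-api-key>" }),
};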
Body
messages (array of objects)
Conversation messages to send to the model; each message carries an id, role, and content, as in the request example above.
provider (string)
LLM provider name.
model (string)
Model identifier.
stream (boolean, default false)
Enable streaming response; see the streaming sketch after the response fields.
Response
response (string)
LLM response.
usage (object)
Token usage for the call: promptTokens, completionTokens, totalTokens.
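When stream is true, the response arrives incrementally rather than as the single JSON object above. A hedged sketch of reading it with the Fetch streams API follows; the on-the-wire chunk format is not specified on this page, so the example simply decodes raw text as it arrives.

// Sketch: consume a streamed response chunk by chunk.
// The chunk format is not documented here; this decodes whatever text the server sends.
async function streamModel(
  prompt: string,
  onChunk: (text: string) => void,
): Promise<void> {
  const res = await fetch("http://localhost:5173/api/llmcall", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ id: "msg-1", role: "user", content: prompt }],
      provider: "<provider>",
      model: "<model>",
      stream: true,
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`llmcall failed: ${res.status}`);
  }
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}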