POST /v1/chat/completions
Use this endpoint for OpenAI-style chat completions with optional streaming.
Auth: Authorization: Bearer bos_live_sk_...
Request body
Request fields
| Field | Type | Required | Notes |
|---|---|---|---|
| model | string | Yes | Chat alias from /v1/models |
| messages | array | Yes | OpenAI-style chat messages |
| stream | boolean | No | When true, the endpoint returns SSE |
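A minimal sketch of assembling a non-streaming request with these fields. The base URL, key, and model alias below are placeholders, not real values; list actual aliases via GET /v1/models:

```python
import json

API_BASE = "https://api.example.com"  # placeholder; use your deployment's base URL
API_KEY = "bos_live_sk_..."           # placeholder bearer token

def build_chat_request(model, messages, stream=False):
    """Assemble the URL, headers, and JSON body for POST /v1/chat/completions."""
    url = f"{API_BASE}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages, "stream": stream})
    return url, headers, body

url, headers, body = build_chat_request(
    "basics-chat-smart",  # substitute a chat alias from GET /v1/models
    [{"role": "user", "content": "Hello!"}],
)
```

Send the resulting `body` with any HTTP client; set `stream=True` to receive SSE instead of a single JSON response.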
Response behavior
- Streaming: returns OpenAI-compatible SSE chunks
- Non-streaming: returns the provider JSON from /v1/chat/completions
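In streaming mode, assuming the standard OpenAI SSE framing (`data: {...}` lines terminated by `data: [DONE]`), the chunks can be consumed with a parser along these lines:

```python
import json

def iter_chunks(lines):
    """Yield parsed JSON chunks from OpenAI-style SSE event lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip comments, blank keep-alive lines, etc.
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # end-of-stream sentinel
        yield json.loads(data)

# Example with hand-written sample events, not a live response:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(c["choices"][0]["delta"]["content"] for c in iter_chunks(sample))
print(text)  # -> Hello
```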
Typical errors
- `404` with `code: "model_not_found"` when the alias is unknown or inactive
- `422` with `code: "invalid_payload"` for malformed JSON or invalid request structure
POST /v1/messages
Use this endpoint when your client already speaks the Anthropic/basicOS-style message format. Internally, it calls /v1/chat/completions.
Auth: required
Request body
Request fields
| Field | Type | Required | Notes |
|---|---|---|---|
| model | string | No | Defaults to basics-chat-smart |
| messages | array | Yes | Each item includes role and content |
| max_tokens | number | No | Maximum tokens for the response |
| stream | boolean | No | Returns SSE when true |
Notes
- Allowed roles are `user`, `assistant`, and `system`
- `content` can be a string or richer JSON content
- Internally, `system` messages are mapped to `role: "user"` to fit provider shape
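As a sketch, a /v1/messages payload can be built as below, with the documented system-to-user role mapping applied locally to illustrate what is forwarded to /v1/chat/completions (the prompt text is illustrative only):

```python
# Anthropic/basicOS-style request body for POST /v1/messages.
payload = {
    "model": "basics-chat-smart",  # optional; this is the documented default
    "max_tokens": 256,
    "messages": [
        {"role": "system", "content": "Answer briefly."},
        {"role": "user", "content": "What is an embedding?"},
    ],
}

# The documented internal mapping: system messages take role "user"
# so the payload fits the provider shape.
mapped = [
    {**m, "role": "user"} if m["role"] == "system" else m
    for m in payload["messages"]
]
print(mapped[0]["role"])  # -> user
```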
POST /v1/embeddings
Use this endpoint to create embeddings with an OpenAI-style request.
Auth: required
Request body
Request fields
| Field | Type | Required | Notes |
|---|---|---|---|
| model | string | Yes | Embeddings alias from /v1/models |
| input | string or string[] | Yes | Text to embed |
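An OpenAI-style embeddings body can be sketched as follows; `basics-embed` is a hypothetical alias used only for illustration, so substitute one listed by GET /v1/models:

```python
import json

# Request body for POST /v1/embeddings.
payload = {
    "model": "basics-embed",  # hypothetical embeddings alias
    "input": ["first text to embed", "second text to embed"],  # or a single string
}
body = json.dumps(payload)
```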
Response
BYOK support
These endpoints also support BYOK on the routes below:
- Chat and messages support BYOK with `openai`, `anthropic`, or `gemini`
- Embeddings support BYOK with `openai` or `gemini`

