# LLMCapabilities Model

Documentation for the LLMCapabilities model in the JavaScript SDK.

## Properties
| Name | Type | Description | Notes |
|---|---|---|---|
| supportsChat | Boolean | Supports conversational/chat completion format with message roles | [optional] |
| supportsCompletion | Boolean | Supports raw text completion with prompt continuation | [optional] |
| supportsFunctionCalling | Boolean | Supports function/tool calling with structured responses | [optional] |
| supportsSystemMessages | Boolean | Supports system prompts to define model behavior and context | [optional] |
| supportsStreaming | Boolean | Supports real-time token streaming during generation | [optional] |
| supportsSamplingParameters | Boolean | Supports sampling parameters like temperature, top_p, and top_k for generation control | [optional] |
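
Below is a minimal sketch of how an LLMCapabilities object might be consumed in JavaScript. The payload shape follows the property table above; the `supportsFeature` helper and the example values are assumptions for illustration, not part of the SDK. Because every property is optional, the sketch treats a missing field as "not supported".

```javascript
// Hypothetical capabilities payload, e.g. returned by a model-listing endpoint.
// All fields are optional booleans, matching the property table above.
const capabilities = {
  supportsChat: true,
  supportsCompletion: true,
  supportsFunctionCalling: false,
  supportsSystemMessages: true,
  supportsStreaming: true,
  supportsSamplingParameters: true,
};

// Illustrative helper (not part of the SDK): treat absent or non-true
// values as "not supported", since every capability flag is optional.
function supportsFeature(caps, feature) {
  return caps?.[feature] === true;
}

if (supportsFeature(capabilities, 'supportsStreaming')) {
  // Safe to request a streamed response from this model.
  console.log('Streaming is available for this model.');
}
```

Checking flags this way keeps client code defensive: a model description that omits a capability is handled the same as one that explicitly reports `false`.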