# LLMCapabilities Model

LLMCapabilities model documentation for the .NET SDK.

Capabilities and features supported by an LLM service.

## Properties
| Name | Type | Description | Notes |
|---|---|---|---|
| SupportsChat | bool? | Supports conversational/chat completion format with message roles | [optional] |
| SupportsCompletion | bool? | Supports raw text completion with prompt continuation | [optional] |
| SupportsFunctionCalling | bool? | Supports function/tool calling with structured responses | [optional] |
| SupportsSystemMessages | bool? | Supports system prompts to define model behavior and context | [optional] |
| SupportsStreaming | bool? | Supports real-time token streaming during generation | [optional] |
| SupportsSamplingParameters | bool? | Supports sampling parameters like temperature, top_p, and top_k for generation control | [optional] |
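Because every property is a nullable `bool?` marked `[optional]`, a service may simply not report a capability; callers should treat `null` as "unknown" rather than "supported". As a minimal sketch, assuming the generated C# model follows the usual pattern of public nullable properties (the class shape below is inferred from the table, not taken from the actual generated code):

```csharp
using System;

// Hypothetical shape of the generated model, inferred from the properties table above.
public class LLMCapabilities
{
    public bool? SupportsChat { get; set; }
    public bool? SupportsCompletion { get; set; }
    public bool? SupportsFunctionCalling { get; set; }
    public bool? SupportsSystemMessages { get; set; }
    public bool? SupportsStreaming { get; set; }
    public bool? SupportsSamplingParameters { get; set; }
}

public static class CapabilityCheck
{
    public static void Main()
    {
        var caps = new LLMCapabilities
        {
            SupportsChat = true,
            SupportsStreaming = null // optional field: the service did not report it
        };

        // Each flag is bool?, so compare against true explicitly:
        // null (unreported) and false are both treated as "do not rely on it".
        bool canStream = caps.SupportsStreaming == true;
        Console.WriteLine(canStream ? "Use streaming responses" : "Fall back to non-streaming");
    }
}
```

The `caps.SupportsStreaming == true` pattern is the idiomatic way to collapse a `bool?` into a conservative `bool` without a `NullReferenceException` or an explicit `HasValue` check.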