# LLMSamplingParams

LLMSamplingParams model documentation for the .NET SDK.

Sampling and generation parameters for controlling LLM text output.

## Properties
| Name | Type | Description | Notes |
|---|---|---|---|
| MaxTokens | `int?` | Maximum number of tokens to generate (>0 if set; provider-dependent limits apply) | [optional] |
| Temperature | `float?` | Sampling temperature, 0.0-2.0 (0.0 = deterministic, 2.0 = highly random) | [optional] |
| TopP | `float?` | Nucleus sampling threshold, 0.0-1.0 (lower values restrict sampling to higher-probability tokens) | [optional] |
| TopK | `int?` | Top-k sampling limit (>0 if set; primarily for local/open-source models) | [optional] |
| FrequencyPenalty | `float?` | Frequency penalty, -2.0 to 2.0 (positive values reduce repetition in proportion to token frequency) | [optional] |
| PresencePenalty | `float?` | Presence penalty, -2.0 to 2.0 (positive values encourage topic diversity) | [optional] |
| StopSequences | `List<string>` | Generation stop sequences (≤10 sequences; each ≤100 chars; generation halts on exact match) | [optional] |
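A minimal usage sketch is shown below. The property names and types follow the table above; the class definition here is an illustrative mirror, and the real SDK type's constructor or initializer shape may differ.

```csharp
using System.Collections.Generic;

// Illustrative mirror of the model (property names/types from the table above);
// the actual SDK class may differ in namespace and members.
public class LLMSamplingParams
{
    public int? MaxTokens { get; set; }
    public float? Temperature { get; set; }
    public float? TopP { get; set; }
    public int? TopK { get; set; }
    public float? FrequencyPenalty { get; set; }
    public float? PresencePenalty { get; set; }
    public List<string> StopSequences { get; set; }
}

public static class Example
{
    public static LLMSamplingParams Build()
    {
        return new LLMSamplingParams
        {
            MaxTokens = 512,          // > 0 when set; provider-dependent limits apply
            Temperature = 0.7f,       // 0.0 (deterministic) to 2.0 (highly random)
            TopP = 0.9f,              // sample from the top 90% of probability mass
            FrequencyPenalty = 0.5f,  // positive values reduce repetition
            StopSequences = new List<string> { "\n\n", "END" }  // ≤10 sequences, ≤100 chars each
        };
    }
}
```

Unset (`null`) properties are typically omitted from the request, letting the provider apply its own defaults; in practice you would adjust either Temperature or TopP, not both.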