# LLMSamplingParams

LLMSamplingParams model documentation for the Java SDK.

Sampling and generation parameters for controlling LLM text output.

## Properties
| Name | Type | Description | Notes |
|---|---|---|---|
| maxTokens | Integer | Maximum tokens to generate (>0 if set; provider-dependent limits apply) | [optional] |
| temperature | Float | Sampling temperature 0.0-2.0 (0.0=deterministic, 2.0=highly random) | [optional] |
| topP | Float | Nucleus sampling threshold 0.0-1.0 (smaller values focus on higher probability tokens) | [optional] |
| topK | Integer | Top-k sampling limit (>0 if set; primarily for local/open-source models) | [optional] |
| frequencyPenalty | Float | Frequency penalty -2.0 to 2.0 (positive values reduce repetition based on frequency) | [optional] |
| presencePenalty | Float | Presence penalty -2.0 to 2.0 (positive values encourage topic diversity) | [optional] |
| stopSequences | List&lt;String&gt; | Generation stop sequences (≤10 sequences; each ≤100 chars; generation halts on exact match) | [optional] |
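The properties above can be sketched as a small model class. This is a minimal sketch only, assuming fluent setters and client-side range checks; the generated SDK class may instead use a builder or perform validation server-side, and the method names here are illustrative.

```java
import java.util.List;

// Sketch of the LLMSamplingParams model (not the generated SDK class):
// fields mirror the Properties table, with the documented range checks.
public class LLMSamplingParams {
    private Integer maxTokens;          // > 0 if set
    private Float temperature;          // 0.0 - 2.0
    private Float topP;                 // 0.0 - 1.0
    private List<String> stopSequences; // <= 10 entries, each <= 100 chars

    public LLMSamplingParams maxTokens(Integer v) {
        if (v != null && v <= 0)
            throw new IllegalArgumentException("maxTokens must be > 0");
        this.maxTokens = v; return this;
    }
    public LLMSamplingParams temperature(Float v) {
        if (v != null && (v < 0.0f || v > 2.0f))
            throw new IllegalArgumentException("temperature must be in [0.0, 2.0]");
        this.temperature = v; return this;
    }
    public LLMSamplingParams topP(Float v) {
        if (v != null && (v < 0.0f || v > 1.0f))
            throw new IllegalArgumentException("topP must be in [0.0, 1.0]");
        this.topP = v; return this;
    }
    public LLMSamplingParams stopSequences(List<String> v) {
        if (v != null && (v.size() > 10 || v.stream().anyMatch(s -> s.length() > 100)))
            throw new IllegalArgumentException("at most 10 stop sequences of <= 100 chars each");
        this.stopSequences = v; return this;
    }

    public Integer getMaxTokens() { return maxTokens; }
    public Float getTemperature() { return temperature; }
    public Float getTopP() { return topP; }
    public List<String> getStopSequences() { return stopSequences; }

    public static void main(String[] args) {
        // All fields are optional; set only the ones you need.
        LLMSamplingParams params = new LLMSamplingParams()
                .maxTokens(256)
                .temperature(0.7f)
                .topP(0.9f)
                .stopSequences(List.of("\n\n", "END"));
        System.out.println("maxTokens=" + params.getMaxTokens()
                + " temperature=" + params.getTemperature());
    }
}
```

Because every field is `[optional]`, leaving a parameter unset defers to the provider's default rather than sending an explicit value.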