GoodMem CLI Reference

goodmem llm update

Update an LLM

Synopsis

Update an existing LLM's configuration.

goodmem llm update <llm-id> [flags]

Examples

  # Update LLM display name and description
  goodmem llm update abc123... \
    --display-name "Updated GPT-4" \
    --description "Updated description"

  # Update capabilities
  goodmem llm update abc123... \
    --supports-chat \
    --no-supports-completion \
    --supports-function-calling

  # Update sampling parameters
  goodmem llm update abc123... \
    --sampling-max-tokens 2048 \
    --sampling-temperature 0.5

  # Rotate inline API key credentials
  goodmem llm update abc123... \
    --cred-api-key "sk-new"

  # Replace all labels
  goodmem llm update abc123... \
    --label-strategy replace \
    --label env=production \
    --label version=2.0

  # Merge labels (add or update specific labels)
  goodmem llm update abc123... \
    --label-strategy merge \
    --label updated=true
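The two label examples above correspond to the `--label-strategy` option documented below: `replace` discards every existing label, while `merge` keeps existing labels and only adds or overwrites the keys supplied. A minimal sketch of those semantics (an illustration only, not GoodMem's actual implementation):

```python
def update_labels(existing: dict, new: dict, strategy: str = "replace") -> dict:
    """Apply new labels to an LLM's existing labels.

    'replace' discards all existing labels; 'merge' keeps existing
    labels, adding or overwriting only the keys supplied.
    """
    if strategy == "replace":
        return dict(new)
    if strategy == "merge":
        return {**existing, **new}
    raise ValueError(f"unknown label strategy: {strategy!r}")

# Replace: only the supplied labels survive
print(update_labels({"env": "staging", "team": "ml"}, {"env": "production"}))
# -> {'env': 'production'}

# Merge: existing labels are kept; supplied keys win
print(update_labels({"env": "staging", "team": "ml"}, {"env": "production"}, "merge"))
# -> {'env': 'production', 'team': 'ml'}
```

Note that with `replace`, any label not repeated on the command line is removed, so pass the full desired label set in one invocation.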

Options

      --client-config string                 New provider-specific client configuration as JSON string
      --cred-api-key string                  Inline API key stored by GoodMem (sends Authorization: Bearer <key>)
      --cred-gcp                             Use Google Application Default Credentials
      --cred-gcp-quota string                Quota project for Google ADC requests
      --cred-gcp-scope strings               Additional Google ADC OAuth scope (can be specified multiple times)
      --description string                   New description for the LLM
      --display-name string                  New display name for the LLM
  -h, --help                                 help for update
  -l, --label strings                        Labels in key=value format (can be specified multiple times)
      --label-strategy string                Label update strategy: 'replace' overwrites all existing labels; 'merge' adds to or updates existing labels (default "replace")
      --max-context-length int32             New maximum context length in tokens
      --monitoring-endpoint string           New monitoring endpoint for the LLM
      --no-supports-chat                     LLM does not support chat/conversation mode
      --no-supports-completion               LLM does not support text completion mode
      --no-supports-function-calling         LLM does not support function calling
      --no-supports-sampling-parameters      LLM does not support sampling parameters
      --no-supports-streaming                LLM does not support streaming responses
      --no-supports-system-messages          LLM does not support system messages
      --sampling-frequency-penalty float32   Frequency penalty (-2.0 to 2.0)
      --sampling-max-tokens int32            Maximum number of tokens to generate
      --sampling-presence-penalty float32    Presence penalty (-2.0 to 2.0)
      --sampling-stop-sequences strings      Stop sequences (can be specified multiple times)
      --sampling-temperature float32         Sampling temperature (0.0-2.0)
      --sampling-top-k int32                 Top-k sampling parameter
      --sampling-top-p float32               Top-p sampling parameter (0.0-1.0)
      --supports-chat                        LLM supports chat/conversation mode
      --supports-completion                  LLM supports text completion mode
      --supports-function-calling            LLM supports function calling
      --supports-sampling-parameters         LLM supports sampling parameters (temperature, top_p, etc.)
      --supports-streaming                   LLM supports streaming responses
      --supports-system-messages             LLM supports system messages
      --version string                       New version for the LLM
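The sampling options above document fixed valid ranges (temperature 0.0-2.0, top-p 0.0-1.0, both penalties -2.0 to 2.0). The sketch below is a hypothetical client-side pre-check of those documented ranges, for illustration only; the flag names are reused as dictionary keys, and the server presumably performs its own validation:

```python
# Documented ranges for the sampling flags above; the dictionary keys
# mirror the CLI flag names and are illustrative, not API field names.
SAMPLING_RANGES = {
    "sampling-temperature": (0.0, 2.0),
    "sampling-top-p": (0.0, 1.0),
    "sampling-frequency-penalty": (-2.0, 2.0),
    "sampling-presence-penalty": (-2.0, 2.0),
}

def check_sampling(flags: dict) -> list:
    """Return error strings for any out-of-range sampling values."""
    errors = []
    for name, value in flags.items():
        if name in SAMPLING_RANGES:
            lo, hi = SAMPLING_RANGES[name]
            if not (lo <= value <= hi):
                errors.append(f"--{name}={value} outside [{lo}, {hi}]")
    return errors

print(check_sampling({"sampling-temperature": 0.5, "sampling-top-p": 1.5}))
# -> ['--sampling-top-p=1.5 outside [0.0, 1.0]']
```

Values without a documented range here (e.g. `--sampling-top-k`, `--sampling-max-tokens`) pass through unchecked.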

Options inherited from parent commands

      --api-key string   API key for authentication (can also be set via GOODMEM_API_KEY environment variable)
      --server string    GoodMem server address (gRPC API) (default "https://localhost:9090")

SEE ALSO