
LLMCapabilities Model

LLMCapabilities Model documentation for Python SDK

Capabilities and features supported by an LLM service

Properties

| Name | Type | Description | Notes |
|------|------|-------------|-------|
| supports_chat | bool | Supports conversational/chat completion format with message roles | [optional] |
| supports_completion | bool | Supports raw text completion with prompt continuation | [optional] |
| supports_function_calling | bool | Supports function/tool calling with structured responses | [optional] |
| supports_system_messages | bool | Supports system prompts to define model behavior and context | [optional] |
| supports_streaming | bool | Supports real-time token streaming during generation | [optional] |
| supports_sampling_parameters | bool | Supports sampling parameters such as temperature, top_p, and top_k for generation control | [optional] |
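
All properties are optional booleans describing what the backing LLM service can do. As a rough sketch, assuming the generated model accepts these fields as keyword arguments (typical of OpenAPI-generated Pydantic models, but worth verifying against your installed goodmem_client), an instance can be built and inspected directly:

from goodmem_client.models.llm_capabilities import LLMCapabilities

# Describe a service that supports streaming chat but not function calling.
# Field names come from the Properties table above.
capabilities = LLMCapabilities(
    supports_chat=True,
    supports_streaming=True,
    supports_function_calling=False,
)

print(capabilities.supports_chat)        # True
print(capabilities.supports_completion)  # None (field was not set)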

Example

from goodmem_client.models.llm_capabilities import LLMCapabilities

# A JSON string using the field names documented above
json_str = '{"supports_chat": true, "supports_streaming": true, "supports_function_calling": false}'

# create an instance of LLMCapabilities from a JSON string
llm_capabilities_instance = LLMCapabilities.from_json(json_str)
# print the JSON string representation of the object
print(llm_capabilities_instance.to_json())

# convert the object into a dict
llm_capabilities_dict = llm_capabilities_instance.to_dict()
# create an instance of LLMCapabilities from a dict
llm_capabilities_from_dict = LLMCapabilities.from_dict(llm_capabilities_dict)
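
In client code, these flags are typically used to feature-gate behavior. A minimal, hypothetical sketch (choose_mode is not part of the SDK; in practice the capabilities object would come from the service's model metadata rather than being constructed locally):

def choose_mode(caps: LLMCapabilities) -> str:
    # Prefer streaming chat when the service advertises it; otherwise fall
    # back to plain chat or raw completion.
    if caps.supports_chat and caps.supports_streaming:
        return "streaming-chat"
    if caps.supports_chat:
        return "chat"
    return "completion"

print(choose_mode(llm_capabilities_instance))  # "streaming-chat" for the JSON above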

↑ Back to Python SDK
