
LLMResponse Model

LLMResponse Model documentation for Python SDK

LLM configuration information

Properties

| Name | Type | Description | Notes |
| ---- | ---- | ----------- | ----- |
| llm_id | str | Unique identifier of the LLM | |
| display_name | str | User-facing name of the LLM | |
| description | str | Description of the LLM | [optional] |
| provider_type | LLMProviderType | | |
| endpoint_url | str | API endpoint URL | |
| api_path | str | API path for chat/completions request | |
| model_identifier | str | Model identifier | |
| supported_modalities | List[Modality] | Supported content modalities | [optional] |
| labels | Dict[str, str] | User-defined labels for categorization | [optional] |
| version | str | Version information | [optional] |
| monitoring_endpoint | str | Monitoring endpoint URL | [optional] |
| capabilities | LLMCapabilities | | |
| default_sampling_params | LLMSamplingParams | | [optional] |
| max_context_length | int | Maximum context window size in tokens | [optional] |
| client_config | Dict[str, object] | Provider-specific client configuration | [optional] |
| owner_id | str | Owner ID of the LLM | |
| created_at | int | Creation timestamp (milliseconds since epoch) | |
| updated_at | int | Last update timestamp (milliseconds since epoch) | |
| created_by_id | str | ID of the user who created the LLM | |
| updated_by_id | str | ID of the user who last updated the LLM | |
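
As a concrete illustration of the fields above, the helper below summarizes an LLMResponse instance (for example, one returned by an API call). It is a sketch only: the field names come from the property table, while the helper itself and its formatting choices are not part of the SDK.

```python
from datetime import datetime, timezone

from goodmem_client.models.llm_response import LLMResponse

def describe_llm(llm: LLMResponse) -> str:
    """Return a one-line summary of an LLMResponse.

    Field names follow the property table above; this helper is
    illustrative and not part of the SDK.
    """
    # created_at is documented as milliseconds since the epoch.
    created = datetime.fromtimestamp(llm.created_at / 1000, tz=timezone.utc)
    labels = ", ".join(f"{k}={v}" for k, v in (llm.labels or {}).items())
    return (
        f"{llm.display_name} ({llm.llm_id}): model={llm.model_identifier}, "
        f"created={created:%Y-%m-%d}, labels=[{labels}]"
    )
```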

Example

```python
from goodmem_client.models.llm_response import LLMResponse

# TODO: replace the empty JSON string below with a real LLMResponse payload
json_str = "{}"
# Create an instance of LLMResponse from a JSON string
llm_response_instance = LLMResponse.from_json(json_str)
# Print the JSON string representation of the object
print(llm_response_instance.to_json())

# Convert the object into a dict
llm_response_dict = llm_response_instance.to_dict()
# Create an instance of LLMResponse from a dict
llm_response_from_dict = LLMResponse.from_dict(llm_response_dict)
```
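
The same round trip can be used to cache a response locally. The sketch below only relies on the `to_json()` and `from_dict()` methods shown above; the file handling and function name are illustrative, not part of the SDK.

```python
import json
from pathlib import Path

from goodmem_client.models.llm_response import LLMResponse

def cache_llm_response(llm: LLMResponse, path: Path) -> LLMResponse:
    """Write an LLMResponse to disk as JSON and read it back.

    Round-trips through to_json()/from_dict(); the path handling here
    is illustrative and not part of the SDK.
    """
    path.write_text(llm.to_json())
    restored = LLMResponse.from_dict(json.loads(path.read_text()))
    return restored
```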

