LLMs API
LLMs API documentation for the JavaScript SDK
All URIs are relative to http://localhost:8080
| Method | HTTP request | Description |
|---|---|---|
| createLLM | POST /v1/llms | Create a new LLM |
| deleteLLM | DELETE /v1/llms/{id} | Delete an LLM |
| getLLM | GET /v1/llms/{id} | Get an LLM by ID |
| listLLMs | GET /v1/llms | List LLMs |
| updateLLM | PUT /v1/llms/{id} | Update an LLM |
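The method examples below call a server at the default base URI. If your deployment runs elsewhere, the base path can usually be set on the SDK's shared client before constructing the API object. This is a minimal sketch assuming the conventional OpenAPI-generated `ApiClient.instance`/`basePath` pattern; confirm the exact names against your installed SDK version.

```javascript
import GoodMemClient from '@pairsystems/goodmem-client';

// Assumption: the SDK exposes a shared ApiClient singleton with a basePath
// property, as OpenAPI-generated JavaScript clients typically do.
const apiClient = GoodMemClient.ApiClient.instance;
apiClient.basePath = 'http://localhost:8080'; // default shown in this document

// All LLM operations in this document go through LLMsApi.
const llmsApi = new GoodMemClient.LLMsApi();
```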
createLLM
CreateLLMResponse createLLM(lLMCreationRequest)
Create a new LLM
Creates a new LLM configuration for text generation services. LLMs represent connections to different language model API services (like OpenAI, vLLM, etc.) and include all the necessary configuration to use them for text generation. DUPLICATE DETECTION: Returns ALREADY_EXISTS if another LLM exists with identical {endpoint_url, api_path, model_identifier} after URL canonicalization. The api_path field defaults to '/v1/chat/completions' if omitted. Requires CREATE_LLM_OWN permission (or CREATE_LLM_ANY for admin users).
Example
import GoodMemClient from '@pairsystems/goodmem-client';
let apiInstance = new GoodMemClient.LLMsApi();
let lLMCreationRequest = {
  "displayName": "GPT-4 Turbo",
  "description": "OpenAI's GPT-4 Turbo model for chat completions",
  "providerType": "OPENAI",
  "endpointUrl": "https://api.openai.com",
  "apiPath": "/v1/chat/completions",
  "modelIdentifier": "gpt-4-turbo-preview",
  "supportedModalities": ["TEXT"],
  "credentials": {
    "kind": "CREDENTIAL_KIND_API_KEY",
    "apiKey": {"inlineSecret": "sk-your-api-key-here"}
  },
  "capabilities": {
    "supportsChat": "true",
    "supportsCompletion": "true",
    "supportsFunctionCalling": "true",
    "supportsSystemMessages": "true",
    "supportsStreaming": "true",
    "supportsSamplingParameters": "true"
  },
  "defaultSamplingParams": {
    "maxTokens": "2048",
    "temperature": "0.7",
    "topP": "0.9"
  },
  "maxContextLength": "32768",
  "labels": {"environment": "production", "team": "ai"}
}; // LLMCreationRequest | LLM configuration details
apiInstance.createLLM(lLMCreationRequest).then((data) => {
console.log('API called successfully. Returned data: ' + data);
}, (error) => {
console.error(error);
});
Parameters
| Name | Type | Description | Notes |
|---|---|---|---|
| lLMCreationRequest | LLMCreationRequest | LLM configuration details | |
Return type
CreateLLMResponse
Authorization
No authorization required
HTTP request headers
- Content-Type: application/json
- Accept: application/json
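Because duplicate configurations are reported as ALREADY_EXISTS rather than silently merged, callers often want to treat that case separately from other failures. The sketch below assumes the rejection carries an HTTP status and that ALREADY_EXISTS surfaces as 409; the exact error shape depends on the SDK, so adjust the check accordingly. Note also that apiPath may be omitted from the request, in which case it defaults to '/v1/chat/completions'.

```javascript
// Hypothetical helper around createLLM. The `error.status === 409` check is an
// assumption about how ALREADY_EXISTS is surfaced; verify against your SDK.
async function createLLMIfAbsent(apiInstance, request) {
  try {
    return await apiInstance.createLLM(request);
  } catch (error) {
    if (error && error.status === 409) {
      // An LLM with the same {endpoint_url, api_path, model_identifier}
      // (after URL canonicalization) already exists.
      console.warn('LLM already exists; skipping creation.');
      return null;
    }
    throw error; // anything else is a real failure
  }
}
```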
deleteLLM
deleteLLM(id)
Delete an LLM
Permanently deletes an LLM configuration. This operation cannot be undone and removes the LLM record and securely deletes stored credentials. IMPORTANT: This does NOT invalidate or delete any previously generated content using this LLM - existing generations remain accessible. Requires DELETE_LLM_OWN permission for LLMs you own (or DELETE_LLM_ANY for admin users).
Example
import GoodMemClient from '@pairsystems/goodmem-client';
let apiInstance = new GoodMemClient.LLMsApi();
let id = "550e8400-e29b-41d4-a716-446655440000"; // String | The unique identifier of the LLM to delete
apiInstance.deleteLLM(id).then(() => {
console.log('API called successfully.');
}, (error) => {
console.error(error);
});
Parameters
| Name | Type | Description | Notes |
|---|---|---|---|
| id | String | The unique identifier of the LLM to delete | |
Return type
null (empty response body)
Authorization
No authorization required
HTTP request headers
- Content-Type: Not defined
- Accept: Not defined
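Since deletion is permanent and also removes stored credentials, a common pattern is to look the LLM up first and only then delete it. This is a minimal sketch built from getLLM and deleteLLM as documented here; the logged output is illustrative.

```javascript
// Fetch-then-delete sketch. deleteLLM is irreversible: the record and its stored
// credentials are removed, but content previously generated with this LLM
// remains accessible.
async function deleteLLMById(apiInstance, id) {
  const llm = await apiInstance.getLLM(id); // rejects if the LLM does not exist
  console.log('Deleting LLM configuration:', llm);
  await apiInstance.deleteLLM(id);          // resolves with an empty body
}
```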
getLLM
LLMResponse getLLM(id)
Get an LLM by ID
Retrieves the details of a specific LLM configuration by its unique identifier. Requires READ_LLM_OWN permission for LLMs you own (or READ_LLM_ANY for admin users to view any user's LLMs). This is a read-only operation with no side effects.
Example
import GoodMemClient from '@pairsystems/goodmem-client';
let apiInstance = new GoodMemClient.LLMsApi();
let id = "550e8400-e29b-41d4-a716-446655440000"; // String | The unique identifier of the LLM to retrieve
apiInstance.getLLM(id).then((data) => {
console.log('API called successfully. Returned data: ' + data);
}, (error) => {
console.error(error);
});
Parameters
| Name | Type | Description | Notes |
|---|---|---|---|
| id | String | The unique identifier of the LLM to retrieve | |
Return type
LLMResponse
Authorization
No authorization required
HTTP request headers
- Content-Type: Not defined
- Accept: application/json
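The same call reads naturally with async/await when the caller wants to distinguish a missing LLM from other errors. The 404 check below is an assumption about how a not-found LLM is reported; adapt it to the SDK's actual error shape.

```javascript
// Returns the LLM if it exists, or null if the lookup fails with a not-found error.
async function findLLM(apiInstance, id) {
  try {
    return await apiInstance.getLLM(id);
  } catch (error) {
    if (error && error.status === 404) { // assumption: missing IDs surface as HTTP 404
      return null;
    }
    throw error;
  }
}
```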
listLLMs
ListLLMsResponse listLLMs(opts)
List LLMs
Retrieves a list of LLM configurations accessible to the caller, with optional filtering. PERMISSION-BASED FILTERING: With LIST_LLM_OWN permission, you can only see your own LLMs (owner_id filter is ignored if set to another user). With LIST_LLM_ANY permission, you can see all LLMs or filter by any owner_id. This is a read-only operation with no side effects.
Example
import GoodMemClient from '@pairsystems/goodmem-client';
let apiInstance = new GoodMemClient.LLMsApi();
let opts = {
'ownerId': "550e8400-e29b-41d4-a716-446655440000", // String | Filter LLMs by owner ID. With LIST_LLM_ANY permission, omitting this shows all accessible LLMs; providing it filters by that owner. With LIST_LLM_OWN permission, only your own LLMs are shown regardless of this parameter.
'providerType': "OPENAI", // String | Filter LLMs by provider type (e.g., OPENAI, VLLM, OLLAMA, etc.)
'label': "production" // String | Filter by label value. Multiple label filters can be specified on the REST call (e.g., ?label.environment=production&label.team=ai)
};
apiInstance.listLLMs(opts).then((data) => {
console.log('API called successfully. Returned data: ' + data);
}, (error) => {
console.error(error);
});
Parameters
| Name | Type | Description | Notes |
|---|---|---|---|
| ownerId | String | Filter LLMs by owner ID. With LIST_LLM_ANY permission, omitting this shows all accessible LLMs; providing it filters by that owner. With LIST_LLM_OWN permission, only your own LLMs are shown regardless of this parameter. | [optional] |
| providerType | String | Filter LLMs by provider type (e.g., OPENAI, VLLM, OLLAMA, etc.) | [optional] |
| label | String | Filter by label value. Multiple label filters can be specified (e.g., ?label.environment=production&label.team=ai) | [optional] |
Return type
ListLLMsResponse
Authorization
No authorization required
HTTP request headers
- Content-Type: Not defined
- Accept: application/json
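When more than one label filter is needed, the underlying REST call takes label.<key>=<value> query parameters, while this SDK method exposes a single label option. One workable approach is to filter by provider server-side and narrow by labels on the client. The sketch below assumes the response exposes the returned LLMs under a `llms` array and that each item echoes the labels map set at creation; both names are assumptions, so check them against the ListLLMsResponse model.

```javascript
// Sketch: server-side provider filter, client-side label filter.
// Assumptions: the list response has a `llms` array and each LLM carries a `labels` map.
async function listProductionOpenAILLMs(apiInstance) {
  const response = await apiInstance.listLLMs({ providerType: 'OPENAI' });
  const llms = response.llms || [];
  return llms.filter((llm) => llm.labels && llm.labels.environment === 'production');
}
```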
updateLLM
LLMResponse updateLLM(id, lLMUpdateRequest)
Update an LLM
Updates an existing LLM configuration including display information, endpoint configuration, model parameters, credentials, and labels. All fields are optional - only specified fields will be updated. IMPORTANT: provider_type is IMMUTABLE after creation and cannot be changed. Requires UPDATE_LLM_OWN permission for LLMs you own (or UPDATE_LLM_ANY for admin users).
Example
import GoodMemClient from '@pairsystems/goodmem-client';
let apiInstance = new GoodMemClient.LLMsApi();
let id = "550e8400-e29b-41d4-a716-446655440000"; // String | The unique identifier of the LLM to update
let lLMUpdateRequest = {
  "displayName": "Updated GPT-4 Turbo",
  "description": "Updated OpenAI GPT-4 Turbo with enhanced configuration for production use",
  "endpointUrl": "https://api.openai.com",
  "apiPath": "/v1/chat/completions",
  "modelIdentifier": "gpt-4-turbo-preview",
  "supportedModalities": ["TEXT"],
  "credentials": {
    "kind": "CREDENTIAL_KIND_API_KEY",
    "apiKey": {"inlineSecret": "sk-updated-api-key-here"}
  },
  "capabilities": {
    "supportsChat": "true",
    "supportsCompletion": "true",
    "supportsFunctionCalling": "true",
    "supportsSystemMessages": "true",
    "supportsStreaming": "true",
    "supportsSamplingParameters": "true"
  },
  "version": "2.0.1",
  "monitoringEndpoint": "https://monitoring.company.com/llms/status",
  "replaceLabels": {
    "environment": "production",
    "team": "ml-platform",
    "cost-center": "ai-infrastructure"
  }
}; // LLMUpdateRequest | LLM update details
apiInstance.updateLLM(id, lLMUpdateRequest).then((data) => {
console.log('API called successfully. Returned data: ' + data);
}, (error) => {
console.error(error);
});
Parameters
| Name | Type | Description | Notes |
|---|---|---|---|
| id | String | The unique identifier of the LLM to update | |
| lLMUpdateRequest | LLMUpdateRequest | LLM update details | |
Return type
LLMResponse
Authorization
No authorization required
HTTP request headers
- Content-Type: application/json
- Accept: application/json
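Because every field in LLMUpdateRequest is optional and only the supplied fields change, an update can be as small as a single attribute. Below is a minimal sketch that renames an LLM and replaces its labels while leaving the endpoint, credentials, and the immutable provider type untouched; the field names follow the request example above, and the specific values are illustrative.

```javascript
// Partial update: only displayName and replaceLabels are sent; all other
// configuration, including the immutable provider type, is left unchanged.
async function relabelLLM(apiInstance, id) {
  const update = {
    displayName: 'GPT-4 Turbo (staging)',
    replaceLabels: { environment: 'staging', team: 'ai' },
  };
  return apiInstance.updateLLM(id, update);
}
```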