Getting Started
Getting started with GoodMem configuration and testing
By now you should have installed GoodMem, either manually or through the devcontainer. If you have not yet done so, please complete the installation before continuing.
Devcontainer Setup (Skip if you installed GoodMem manually)
1. Click below to open a Codespace using the GoodMem template repository:

2. After your Codespace launches, check the bottom-left corner of VS Code. Click on the Codespaces: [name] badge and choose View Creation Logs.
Configuration Steps
1. In the logs, locate output similar to the following:

   ```
   Connecting to gRPC API at https://localhost:9090
   Using TLS with certificate verification disabled (insecure mode)
   System initialized successfully
   Root API key: gm_xxxxxxxxxxxxxxxxxxxxxxxx
   User ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   ```

   Tip: gRPC calls default to https://localhost:9090, while the REST API listens on http://localhost:8080. Use the REST URL for HTTP clients and the gRPC URL for the CLI.
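
   For example, an HTTP client would point at the REST port rather than the gRPC one. The route and header in this sketch are assumptions for illustration, not confirmed endpoints; check the GoodMem API reference for the actual paths and authentication header:

   ```bash
   # Hypothetical REST call: the /v1/spaces path and the x-api-key header are
   # assumptions for illustration; consult the API reference for the real
   # routes and auth scheme.
   curl -s -H "x-api-key: gm_xxxxxxxxxxxxxxxxxxxxxxxx" \
     http://localhost:8080/v1/spaces
   ```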

2. Save Your Root API Key
   The Root API Key (gm_xxxxxxxxxxxxxxxxxxxxxxxx) is shown only once. It’s best to copy and store it now.
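
   For the rest of this walkthrough it can help to keep the key in your shell session. The variable name below is only an illustration, not something the GoodMem CLI is documented to read; use your normal secrets workflow for long-term storage:

   ```bash
   # Illustrative only: keep the root key available in this shell session so it
   # is easy to paste into later commands. Store it in a proper secrets manager
   # for anything longer-lived.
   export GOODMEM_ROOT_API_KEY="gm_xxxxxxxxxxxxxxxxxxxxxxxx"
   ```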

3. Obtain your OpenAI API Key from the OpenAI dashboard and keep it ready for the next step.

4. Create an embedder (must be created before a space):

   ```bash
   goodmem embedder create \
     --display-name "OpenAI Small Embedder" \
     --provider-type OPENAI \
     --endpoint-url "https://api.openai.com/v1" \
     --model-identifier "text-embedding-3-small" \
     --dimensionality 1536 \
     --credentials "YOUR_OPENAI_API_KEY_FROM_STEP_3"
   ```

   The command should output:

   ```
   Embedder created successfully!
   ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Display Name: OpenAI Small Embedder
   Owner: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Provider Type: OPENAI
   Distribution: DENSE
   Endpoint URL: https://api.openai.com/v1
   API Path: /embeddings
   Model: text-embedding-3-small
   ```

   SAVE THE ID
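
   If you are scripting the setup from scratch, you can also capture the ID directly from the command output instead of copying it by hand. This sketch assumes the output format shown above (an `ID:` line) and creates a new embedder when run:

   ```bash
   # Illustrative scripted variant: run the same create command and pull the ID
   # out of the "ID:" line of its output.
   EMBEDDER_ID=$(goodmem embedder create \
     --display-name "OpenAI Small Embedder" \
     --provider-type OPENAI \
     --endpoint-url "https://api.openai.com/v1" \
     --model-identifier "text-embedding-3-small" \
     --dimensionality 1536 \
     --credentials "YOUR_OPENAI_API_KEY_FROM_STEP_3" \
     | awk '/^ID:/ {print $2}')
   echo "Embedder ID: $EMBEDDER_ID"
   ```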

5. Create a space linked to that embedder:

   ```bash
   goodmem space create \
     --name "My OpenAI Small Space" \
     --embedder-id <YOUR_EMBEDDER_ID_FROM_STEP_4>
   ```

   The command should output:

   ```
   Space created successfully!
   ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Name: My OpenAI Small Space
   Owner: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Created by: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Created at: 2025-08-20T21:08:20Z
   Public: false
   Embedder: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx (weight: 1.0)
   ```

   SAVE THE ID
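
   The testing section below refers to this space ID repeatedly, so it is convenient to keep it in a shell variable as well. The name $SPACE_ID is just an illustration and is reused in the sketches later in this guide:

   ```bash
   # Illustrative only: paste the ID printed by `goodmem space create`.
   SPACE_ID="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
   ```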

6. Create an LLM:

   ```bash
   goodmem llm create \
     --display-name "My GPT-4" \
     --provider-type OPENAI \
     --endpoint-url "https://api.openai.com/v1" \
     --model-identifier "gpt-4o"
   ```

   The command should output:

   ```
   LLM created successfully!
   ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Name: My GPT-4
   Provider: LLM_PROVIDER_TYPE_OPENAI
   Model: gpt-4o
   ```

   SAVE THE ID
Testing the Queries
The next major step is to upload content into memory so it can be queried. We will first upload a PDF and store it as a memory, then run some queries against it. Follow the directions below:

1. Begin by creating a memory. In this case, I will be using this PDF, which I recommend you use as well for testing:

   Then run this command:

   ```bash
   goodmem memory create \
     --space-id <YOUR_SPACE_ID_FROM_STEP_5> \
     --file "path to where you downloaded the pdf" \
     --content-type "application/pdf"
   ```

   It should output:

   ```
   Memory created successfully!
   ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Space ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Content Type: application/pdf
   Status: PENDING
   Created by: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
   Created at: 2025-08-20T21:20:00Z
   ```

   SAVE THE ID (not the space ID, since you already have that)
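
   If you prefer to script this step, the sketch below downloads the PDF with curl and then runs the same create command. The URL and file name are placeholders (use the link to the sample PDF above), and $SPACE_ID is the illustrative variable from step 5:

   ```bash
   # Placeholder URL: substitute the link to the sample PDF referenced above.
   curl -L -o sample.pdf "<PDF_URL>"

   # Same command as above, using the illustrative $SPACE_ID variable.
   goodmem memory create \
     --space-id "$SPACE_ID" \
     --file "./sample.pdf" \
     --content-type "application/pdf"
   ```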

2. To run a query, you have two options: non-interactive mode and interactive mode.

   Non-interactive mode:

   ```bash
   goodmem memory retrieve \
     "what are the top three negative effects of social media?" \
     --space-id <YOUR_SPACE_ID_FROM_STEP_5>
   ```

   Interactive mode (easier for reviewing results):

   ```bash
   goodmem memory retrieve \
     --space-id <YOUR_SPACE_ID_FROM_STEP_5> \
     --post-processor-interactive "what are the top three negative effects of social media?"
   ```
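
   If you want to keep the non-interactive results around for comparison, you can redirect the output with ordinary shell tools; nothing here is GoodMem-specific beyond the retrieve command shown above:

   ```bash
   # Run the documented non-interactive retrieval and save a copy of the output.
   goodmem memory retrieve \
     "what are the top three negative effects of social media?" \
     --space-id "$SPACE_ID" | tee retrieval-results.txt
   ```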