
Memory Contexts

Memory contexts represent an entire chat history of messages and can be used anywhere an LLM is used.

Creating a Memory Context

Memory contexts are created via the createContext API by optionally supplying a Persona/Scenario or a list of ChatContextMessages.

ChatContextMessages follow the same role/content structure you are used to from OpenAI-compatible APIs.
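As a rough sketch of creating a context from an existing message history (the base URL, endpoint path, and payload field names below are assumptions, not the documented API; see the createContext reference for the exact shape):

```python
import os

import requests

GABBER_API_KEY = os.environ["GABBER_API_KEY"]

# Hypothetical sketch: URL, header, and payload field names are assumptions.
response = requests.post(
    "https://api.gabber.dev/v1/contexts",
    headers={"x-api-key": GABBER_API_KEY},
    json={
        # ChatContextMessages use the familiar OpenAI-style role/content shape.
        "messages": [
            {"role": "system", "content": "You are a friendly assistant."},
            {"role": "user", "content": "Hi, my name is Sam."},
            {"role": "assistant", "content": "Nice to meet you, Sam!"},
        ]
    },
)
context = response.json()
print(context)  # keep the context id around to reuse it in sessions or chat/completions
```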

Using with Realtime Sessions

Realtime sessions always use a memory context. A previously created context can be supplied when the session is created; if none is supplied, one is created automatically.
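For example, a session-start request might pass the context id along when the session is created (again a hedged sketch; the actual endpoint and field names are covered in Generating Connection Details):

```python
import os

import requests

GABBER_API_KEY = os.environ["GABBER_API_KEY"]

# Hypothetical sketch: endpoint path and field names are assumptions; see
# "Generating Connection Details" for the real request shape.
session = requests.post(
    "https://api.gabber.dev/v1/realtime/start",
    headers={"x-api-key": GABBER_API_KEY},
    json={
        # Reuse a previously created memory context. Omitting this field
        # lets Gabber create a fresh context for the session.
        "context": "ctx_123",
    },
).json()
```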

During realtime sessions, both the human and AI messages for a turn are automatically saved to the context as soon as the AI produces audio. When this happens, the realtime_session.message_committed event is triggered.
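If you want to react when a turn is committed, a minimal handler might look like the following (the payload shape and how the event is delivered are assumptions; only the event name comes from this page):

```python
# Hypothetical sketch of reacting to the committed-message event. The payload
# shape and delivery mechanism (SDK callback, webhook, etc.) are assumptions.
def on_event(event: dict) -> None:
    if event.get("type") == "realtime_session.message_committed":
        message = event.get("message", {})  # assumed field name
        print(f"committed {message.get('role')}: {message.get('content')}")
```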

See Generating Connection Details.

Using with chat/completions

Memory contexts can also be used with the chat/completions endpoint. When using a memory context, the messages field should contain only the latest new message; Gabber automatically fills in the context's previous message history.

Memory contexts are supplied in the gabber section of the request body. See Chat Completions for reference.
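Putting the two points above together, a raw request might look roughly like this (the field names inside gabber are assumptions; the Chat Completions reference has the exact shape):

```python
import os

import requests

GABBER_API_KEY = os.environ["GABBER_API_KEY"]

# Hypothetical sketch: the shape of the "gabber" section is an assumption.
completion = requests.post(
    "https://api.gabber.dev/v1/chat/completions",
    headers={"x-api-key": GABBER_API_KEY},
    json={
        "model": "...",  # whichever model the Chat Completions guide specifies
        # Only the newest message; Gabber prepends the context's prior history.
        "messages": [{"role": "user", "content": "What did I say my name was?"}],
        "gabber": {"context": "ctx_123"},
    },
).json()
```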

To maintain maximum flexibility, new messages are not automatically saved into memory contexts. Upon receiving a response, it is up to you to save both the request message (typically a user message) and the response assistant message to the context using the createContextMessage API.
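A sketch of saving both sides of the exchange afterwards (the endpoint path and payload shape are assumptions; see the createContextMessage reference for the exact call):

```python
import os

import requests

GABBER_API_KEY = os.environ["GABBER_API_KEY"]

# Hypothetical sketch: endpoint path and payload shape are assumptions.
def save_turn(context_id: str, user_message: dict, assistant_message: dict) -> None:
    for message in (user_message, assistant_message):
        requests.post(
            f"https://api.gabber.dev/v1/contexts/{context_id}/messages",
            headers={"x-api-key": GABBER_API_KEY},
            json=message,  # plain role/content ChatContextMessage
        )

# e.g. after a chat/completions call:
save_turn(
    "ctx_123",
    {"role": "user", "content": "What did I say my name was?"},
    {"role": "assistant", "content": "You said your name was Sam."},
)
```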

OpenAI SDKs can be used by including a gabber section in the request body as an extra field. See the more detailed guide for Gabber's /chat/completions endpoint.
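With the official OpenAI Python SDK, that extra field can be passed via extra_body (the base URL and the shape of the gabber object below are assumptions; the /chat/completions guide has the details):

```python
import os

from openai import OpenAI

# Base URL is an assumption; point the SDK at Gabber's OpenAI-compatible
# endpoint per the /chat/completions guide.
client = OpenAI(
    api_key=os.environ["GABBER_API_KEY"],
    base_url="https://api.gabber.dev/v1",
)

completion = client.chat.completions.create(
    model="...",  # whichever model the guide specifies
    # Only the newest message; the context supplies the rest of the history.
    messages=[{"role": "user", "content": "What did I say my name was?"}],
    # The gabber section rides along as an extra body field; the "context"
    # key inside it is an assumption, so check the Chat Completions reference.
    extra_body={"gabber": {"context": "ctx_123"}},
)
print(completion.choices[0].message.content)
```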