Memory
Memory lets your agent save information and recall it later. Without memory, every conversation starts fresh. With memory, your agent builds context over time, recalls previous interactions, and adapts to the user.

Three Approaches
You can add memory to your agent with the AI SDK in three ways, each with different tradeoffs:

| Approach | Effort | Flexibility | Provider Lock-in |
|---|---|---|---|
| Provider-Defined Tools | Low | Medium | Yes |
| Memory Providers | Low | Low | Depends on memory provider |
| Custom Tool | High | High | No |
Provider-Defined Tools
Provider-defined tools are tools where the provider specifies the tool's `inputSchema` and description, but you provide the `execute` function. The model has been trained to use these tools, which can result in better performance compared to custom tools.
Anthropic Memory Tool
The Anthropic Memory Tool gives Claude a structured interface for managing a `/memories` directory. Claude reads its memory before starting tasks, creates and updates files as it works, and references them in future conversations.
The tool defines a set of commands (`view`, `create`, `str_replace`, `insert`, `delete`, `rename`), each with a path scoped to `/memories`. Your `execute` function maps these to your storage backend (the filesystem, a database, or any other persistence layer).
When to use this: you want memory with minimal implementation effort and are already using Anthropic models. The tradeoff is provider lock-in, since this tool only works with Claude.
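An `execute` backend for these commands can be sketched as follows. This is a minimal in-memory example, not the real wiring into the AI SDK; the exact input shape of each command is an assumption here, so check the Anthropic and AI SDK documentation for the actual schema before relying on it.

```typescript
// Assumed command shapes for the six memory operations (hypothetical).
type MemoryCommand =
  | { command: "view"; path: string }
  | { command: "create"; path: string; file_text: string }
  | { command: "str_replace"; path: string; old_str: string; new_str: string }
  | { command: "insert"; path: string; insert_line: number; insert_text: string }
  | { command: "delete"; path: string }
  | { command: "rename"; path: string; new_path: string };

// In-memory stand-in for a storage backend (filesystem, database, etc.).
const files = new Map<string, string>();

function assertScoped(path: string): void {
  // Reject anything outside /memories, including ../ traversal.
  if (!path.startsWith("/memories") || path.includes("..")) {
    throw new Error(`path out of scope: ${path}`);
  }
}

function executeMemoryCommand(input: MemoryCommand): string {
  assertScoped(input.path);
  switch (input.command) {
    case "view":
      return files.get(input.path) ?? "";
    case "create":
      files.set(input.path, input.file_text);
      return "created";
    case "str_replace": {
      const text = files.get(input.path) ?? "";
      files.set(input.path, text.replace(input.old_str, input.new_str));
      return "replaced";
    }
    case "insert": {
      const lines = (files.get(input.path) ?? "").split("\n");
      lines.splice(input.insert_line, 0, input.insert_text);
      files.set(input.path, lines.join("\n"));
      return "inserted";
    }
    case "delete":
      files.delete(input.path);
      return "deleted";
    case "rename": {
      assertScoped(input.new_path);
      const text = files.get(input.path) ?? "";
      files.delete(input.path);
      files.set(input.new_path, text);
      return "renamed";
    }
  }
}
```

Whatever backend you choose, the path scoping check matters: the model supplies the paths, so the backend must refuse anything outside `/memories`.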
Memory Providers
Another approach is to use a provider that has memory built in. These providers wrap an external memory service and expose it through the AI SDK's standard interface. Memory storage, retrieval, and injection happen transparently, and you do not define any tools yourself.

Letta
Letta provides agents with persistent long-term memory. You create an agent on Letta's platform (cloud or self-hosted), configure its memory there, and use the AI SDK provider to interact with it. Letta's agent runtime handles memory management (core memory, archival memory, recall).

Mem0
Mem0 adds a memory layer on top of any supported LLM provider. It automatically extracts memories from conversations, stores them, and retrieves relevant ones for future prompts.

Supermemory
Supermemory is a long-term memory platform that adds persistent, self-growing memory to your AI applications. It provides tools that handle saving and retrieving memories automatically through semantic search, exposing `addMemory` and `searchMemories` operations for storage and retrieval.
See the Supermemory provider documentation for full setup and configuration.
Hindsight
Hindsight provides agents with persistent memory through five tools: `retain`, `recall`, `reflect`, `getMentalModel`, and `getDocument`. It can be self-hosted with Docker or used as a cloud service.
The `bankId` identifies the memory store and is typically a user ID. In multi-user apps, call `createHindsightTools` inside your request handler so each request gets the right bank. Hindsight works with any AI SDK provider.
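The per-request pattern can be sketched like this. Note that `createHindsightTools` is stubbed below so the snippet is self-contained; its real signature and return value may differ, so treat this only as an illustration of scoping the bank to the current user.

```typescript
// Stub standing in for the real Hindsight factory (assumption: it takes a
// bankId option). See the Hindsight provider docs for the actual API.
function createHindsightTools(options: { bankId: string }) {
  return {
    bankId: options.bankId,
    tools: ["retain", "recall", "reflect", "getMentalModel", "getDocument"],
  };
}

// Hypothetical request handler: derive bankId from the authenticated user
// so each request reads and writes its own memory bank.
function handleRequest(request: { userId: string; prompt: string }) {
  const hindsight = createHindsightTools({ bankId: request.userId });
  // Pass hindsight's tools to your generateText / streamText call here.
  return hindsight;
}
```

Creating the tools inside the handler, rather than once at module load, is what keeps one user's memories from leaking into another's requests.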
See the Hindsight provider documentation for full setup and configuration.
When to use memory providers: these providers are a good fit when you want memory without building any storage infrastructure. The tradeoff is that the provider controls memory behavior, so you have less visibility into what gets stored and how it is retrieved. You also take on a dependency on an external service.
Custom Tool
Building your own memory tool from scratch is the most flexible approach. You control the storage format, the interface, and the retrieval logic. This requires the most upfront work but gives you full ownership of how memory works, with no provider lock-in and no external dependencies. There are two common patterns:

- Structured actions: you define explicit operations (`view`, `create`, `update`, `search`) and handle structured input yourself. Safe by design since you control every operation.
- Bash-backed: you give the model a sandboxed bash environment to compose shell commands (`cat`, `grep`, `sed`, `echo`) for flexible memory access. More powerful but requires command validation for safety.
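For the bash-backed pattern, the command validation step might look like the following sketch. It is deliberately conservative, assuming writes happen through a separate path: it rejects pipes, chaining, and redirection outright, allows only an explicit command allowlist, and requires every path argument to stay inside the memory directory.

```typescript
// Hypothetical validator for model-composed shell commands. A real
// deployment would still run the command inside a sandbox; validation is
// a first line of defense, not a substitute for isolation.
const ALLOWED_COMMANDS = new Set(["cat", "grep", "sed", "echo"]);
const MEMORY_DIR = "/memories/";

function validateMemoryCommand(commandLine: string): boolean {
  // Reject shell metacharacters that enable chaining, piping, or redirection.
  if (/[;&|`$<>]/.test(commandLine)) return false;
  const [binary, ...args] = commandLine.trim().split(/\s+/);
  if (!ALLOWED_COMMANDS.has(binary)) return false;
  // Every absolute path argument must stay inside the memory directory.
  return args
    .filter((arg) => arg.startsWith("/"))
    .every((arg) => arg.startsWith(MEMORY_DIR) && !arg.includes(".."));
}
```

A structured-actions tool avoids this validation problem entirely, which is why it is the safer default; the bash-backed pattern earns its keep only when the model genuinely needs to compose ad hoc queries over its memory files.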