Observational Memory by Mastra
Give your AI agents human-like memory
Summary: Observational Memory is a state-of-the-art memory system for AI agents that mimics human memory by compressing and reorganizing conversations, retaining important information while discarding irrelevant details. It achieves a 95% score on LongMemEval, the highest recorded, enabling stable, prompt-cacheable context windows without the typical tradeoffs between memory, cost, and coherence.
What it does
It uses two background agents: the Observer compresses conversations into dense, timestamped observations, reducing token counts by 6-40x, and the Reflector reorganizes long-term memory by merging related items and removing outdated information. Together they keep the context window relevant and compact over time.
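The Observer/Reflector split can be sketched as a two-stage pipeline. The sketch below is illustrative only: the names (`observe`, `reflect`, `Observation`) and the topic-keyed merge are assumptions, not Mastra's actual API, and the real system uses LLM agents for both steps where this sketch uses plain string handling to show the data flow.

```typescript
interface Message { role: "user" | "assistant"; content: string; }
interface Observation { timestamp: string; text: string; }

// Observer stage (simulated): condense a batch of raw messages into dense,
// timestamped observations. A real Observer prompts an LLM to summarize;
// here we simply keep user-stated facts and discard assistant turns.
function observe(messages: Message[], now: Date): Observation[] {
  return messages
    .filter((m) => m.role === "user")
    .map((m) => ({ timestamp: now.toISOString(), text: m.content.trim() }));
}

// Reflector stage (simulated): reorganize long-term memory by merging
// related observations and dropping entries superseded by newer ones.
// `topicOf` is a hypothetical key function standing in for the LLM's
// judgment of which observations describe the same thing.
function reflect(
  memory: Observation[],
  topicOf: (o: Observation) => string
): Observation[] {
  const latest = new Map<string, Observation>();
  for (const obs of memory) {
    latest.set(topicOf(obs), obs); // later observations overwrite older ones
  }
  return [...latest.values()];
}
```

The key property this models is that memory shrinks as it ages: stale observations about the same topic collapse into the newest one, so the context stays dense rather than growing without bound.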
Who it's for
It is designed for developers building AI agents who need efficient, coherent long-term memory without the latency, cost, or information loss of traditional retrieval or compaction methods.
Why it matters
Observational Memory solves the tradeoff between memory capacity, cost, and coherence by providing a scalable, human-like memory approach that preserves critical details and supports prompt caching.