Chinese Researchers Unveil MemOS, the First “Memory Operating System” for AI

In early July 2025, a team of researchers from Shanghai Jiao Tong University, Zhejiang University, and partners introduced MemOS, the first operational “memory operating system” designed to bring AI a level of human-like persistent recall (VentureBeat).

What is MemOS?

Traditional large language models (LLMs) rely on short-lived context windows or retrieval workarounds, neither of which provides true memory retention. MemOS reimagines memory as a fundamental resource, managed through MemCubes: self-contained memory units that pair content with metadata such as provenance, versioning, and governance rules (Reddit).

This transforms memory into a schedulable, shareable, and evolvable element—much like how operating systems manage CPU or storage.
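To make the idea concrete, here is a minimal sketch of what a self-contained memory unit pairing content with provenance, versioning, and governance metadata might look like. The field names and the `update` method are illustrative assumptions, not MemOS's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MemCube:
    """Toy model of a memory unit: content plus governance metadata (hypothetical fields)."""
    content: str                    # the stored memory (plaintext here)
    provenance: str                 # where the memory came from
    version: int = 1                # bumped on each update
    access_policy: str = "private"  # simple stand-in for usage/privacy rules
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def update(self, new_content: str) -> "MemCube":
        """Return a new version instead of mutating in place, preserving an audit trail."""
        return MemCube(
            content=new_content,
            provenance=self.provenance,
            version=self.version + 1,
            access_policy=self.access_policy,
        )

cube = MemCube(content="User prefers morning meetings", provenance="chat:2025-07-03")
cube_v2 = cube.update("User prefers afternoon meetings")
print(cube_v2.version)  # 2
```

Versioned, append-style updates like this are one way the lifecycle and auditing properties described above could be supported.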

Why It Matters

  • Unified short- and long-term memory: MemOS brings transient activations, persistent plaintext, and parameter-based memories under one structure (Reddit).
  • Lifecycle control & governance: MemCubes support scheduling, migration, auditing, privacy and usage policies—so memory isn’t just stored; it’s automatically managed over time (HyperAI超神经).
  • Cross-platform portability: MemOS dissolves AI "memory islands" by enabling memory migration across different tools and platforms.
  • Expert memory marketplace: Imagine buying a “medical knowledge cube” from specialists—that vision of monetized, installable memory modules is in the researchers’ roadmap (VentureBeat).
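The portability point above amounts to serializing a memory unit on one platform and importing it on another. A toy sketch of that round trip, using an illustrative record layout (not MemOS's actual wire format):

```python
import json

# Hypothetical portable memory record; field names are illustrative only.
memory = {
    "content": "User prefers metric units.",
    "provenance": "assistant_a/session_42",
    "version": 3,
    "policy": {"share": "allowed", "expires": None},
}

# Export on platform A...
payload = json.dumps(memory)

# ...import on platform B, appending a provenance note recording the migration.
imported = json.loads(payload)
imported["provenance"] += " -> assistant_b"
print(imported["provenance"])  # assistant_a/session_42 -> assistant_b
```

Carrying the policy block along with the content is what would let the receiving platform keep enforcing the original usage rules.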

Impressive Gains in Testing

  • 159% boost in temporal reasoning tasks versus OpenAI's memory system
  • 38.9% overall improvement on the LOCOMO benchmark
  • Up to 94% reduction in latency through efficient KV-cache injections (VentureBeat)

These results show that treating memory as a first-class computational resource significantly boosts AI reasoning and performance.
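The latency figure comes from injecting precomputed key-value states rather than re-encoding memory text on every turn. The caching principle can be sketched in a few lines; the "encoder" below is a stand-in for an expensive transformer pass, not MemOS's actual mechanism:

```python
import hashlib

# Cache of precomputed "KV states", keyed by a hash of the memory text.
_kv_cache: dict[str, list[float]] = {}
encode_calls = 0

def expensive_encode(text: str) -> list[float]:
    """Pretend transformer pass producing key/value states for `text`."""
    global encode_calls
    encode_calls += 1
    return [float(b) for b in hashlib.sha256(text.encode()).digest()[:4]]

def get_kv(text: str) -> list[float]:
    """Inject cached KV states when available instead of re-encoding."""
    key = hashlib.sha256(text.encode()).hexdigest()
    if key not in _kv_cache:
        _kv_cache[key] = expensive_encode(text)
    return _kv_cache[key]

memory = "User is allergic to penicillin."
first = get_kv(memory)   # cache miss: encodes once
second = get_kv(memory)  # cache hit: no re-encoding
print(encode_calls)  # 1
```

Because stable memories are reused across many turns, skipping the repeated encoding step is where the latency savings would come from.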

Architecture Overview

MemOS follows a three-layer OS-like architecture (VentureBeat):

  • Interface: LLM-friendly APIs for memory operations
  • Operation: MemScheduler, MemLifecycle, and governance logic
  • Infrastructure: storage engines (vector DBs, file systems), policy enforcement, cross-device support
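A minimal sketch of how these three layers could compose, with each class standing in for one layer. The class and method names are illustrative (only MemScheduler appears in the reported architecture), and the "vector store" is a naive keyword matcher:

```python
class VectorStore:
    """Infrastructure layer: in-memory stand-in for a vector DB."""
    def __init__(self):
        self._items: list[str] = []
    def put(self, text: str) -> None:
        self._items.append(text)
    def search(self, query: str) -> list[str]:
        # Naive keyword match in place of vector similarity.
        words = query.lower().split()
        return [t for t in self._items if any(w in t.lower() for w in words)]

class MemScheduler:
    """Operation layer: decides how reads and writes are routed (trivially here)."""
    def __init__(self, store: VectorStore):
        self.store = store
    def route_write(self, text: str) -> None:
        self.store.put(text)
    def route_read(self, query: str) -> list[str]:
        return self.store.search(query)

class MemoryAPI:
    """Interface layer: the LLM-facing calls."""
    def __init__(self, scheduler: MemScheduler):
        self.scheduler = scheduler
    def remember(self, text: str) -> None:
        self.scheduler.route_write(text)
    def recall(self, query: str) -> list[str]:
        return self.scheduler.route_read(query)

api = MemoryAPI(MemScheduler(VectorStore()))
api.remember("Project deadline is 15 August.")
print(api.recall("deadline"))  # ['Project deadline is 15 August.']
```

The layering mirrors an OS: the interface exposes simple calls, the operation layer applies scheduling and lifecycle policy, and the infrastructure layer owns the actual storage backends.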

Community Reaction

On r/singularity, a user summarized:

“MemOS positions ‘memory’ as a first-class operating‑system resource for LLM agents… conceptually elegant and empirically promising.” (Reddit)

Another noted limitations:

“Advertised numbers come from GPT‑4o‑mini… where their MemCube method will scale we’ll only know with time.” (Reddit)

Open-Source & Roadmap

MemOS is open-source, available under the MIT license on GitHub, and supports LLMs via Hugging Face, OpenAI, and Ollama. Linux support is ready; Windows and macOS versions are in development (VentureBeat).

Future directions include:

  • Cross‑model memory transfer
  • Self‑evolving memory blocks
  • A marketplace ecosystem for “paid memory modules” (HyperAI超神经, VentureBeat)

Why This Could Be a Game-Changer

AI developers and enterprises have a long-standing memory gap in multi-session workflows—be it ongoing customer service, personal assistants, education tools, or diagnostic systems. MemOS’s structured, persistent, governed memory could finally bridge that gap and redefine long-term AI interaction.