MemGPT

Towards LLMs as Operating Systems

Paper · Discord · GitHub · Dataset

Teach LLMs to manage their own memory for unbounded context!

[Figure: MemGPT system overview]


  • LLMs are increasingly being used for perpetual chats
  • Limited context lengths make perpetual chat challenging
  • MemGPT manages a virtual context (inspired by virtual memory in operating systems) to create unbounded LLM context
  • With MemGPT, we demonstrate that LLMs can be taught to manage their own memory!


Large language models (LLMs) have revolutionized AI, but are constrained by limited context windows, hindering their utility in tasks like extended conversations and document analysis. To enable using context beyond limited context windows, we propose virtual context management, a technique drawing inspiration from hierarchical memory systems in traditional operating systems, which provide the illusion of an extended virtual memory via paging between physical memory and disk. Using this technique, we introduce MemGPT (MemoryGPT), a system that intelligently manages different storage tiers in order to effectively provide extended context within the LLM's limited context window. We evaluate our OS-inspired design in two domains where the limited context windows of modern LLMs severely handicap their performance: document analysis, where MemGPT is able to analyze large documents that far exceed the underlying LLM's context window, and multi-session chat, where MemGPT can create conversational agents that remember, reflect, and evolve dynamically through long-term interactions with their users. We release MemGPT code and data for our experiments at
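The paging analogy above can be illustrated with a minimal sketch: keep recent messages in a bounded "main context" (analogous to physical memory) and page older ones out to archival storage (analogous to disk), from which they can later be searched and recalled. All names here (`VirtualContext`, `append`, `recall`) and the word-count token budget are illustrative assumptions, not MemGPT's actual API.

```python
# Hypothetical sketch of virtual context management; not MemGPT's real
# implementation. Token counting is approximated by word count.
from collections import deque


class VirtualContext:
    """Bounded in-context message queue plus unbounded archival storage."""

    def __init__(self, budget_tokens: int):
        self.budget = budget_tokens
        self.main = deque()   # in-context messages (the "RAM" tier)
        self.archive = []     # paged-out messages (the "disk" tier)

    @staticmethod
    def _tokens(text: str) -> int:
        # Crude stand-in for a real tokenizer.
        return len(text.split())

    def used(self) -> int:
        return sum(self._tokens(m) for m in self.main)

    def append(self, message: str) -> None:
        self.main.append(message)
        # Page the oldest messages out once the context budget is exceeded.
        while self.used() > self.budget and len(self.main) > 1:
            self.archive.append(self.main.popleft())

    def recall(self, query: str) -> list[str]:
        # Naive keyword search over archival storage; a real system would
        # use embeddings or a database query.
        return [m for m in self.archive if query.lower() in m.lower()]
```

After enough appends, early messages land in `archive` yet remain retrievable via `recall`, giving the appearance of a context larger than the budget allows.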


@article{packer2023memgpt,
  title={{MemGPT}: Towards LLMs as Operating Systems},
  author={Packer, Charles and Wooders, Sarah and Lin, Kevin and Fang, Vivian and Patil, Shishir G. and Stoica, Ion and Gonzalez, Joseph E.},
  journal={arXiv preprint arXiv:2310.08560},
  year={2023}
}