RetentionLabs

Expanding the Memory of Artificial Intelligence

About RetentionLabs

RetentionLabs is an open-source project group focused on researching and developing AI memory solutions. Our mission is to overcome the context-length limitations of the self-attention mechanism in large language models.

We believe that by enhancing AI systems with better memory capabilities, we can create more helpful, consistent, and personalized AI experiences.

Long-term Memory

Enabling AI to retain information across extended interactions

Weight-Based Storage

Storing memories directly in model weights rather than external databases

Open Source

Collaborative development for the advancement of AI memory research

Our Projects

RetentionEngine 

Active

A PyTorch adapter that enhances language models with memory capabilities. Unlike retrieval-augmented generation (RAG), which keeps information in an external index and retrieves it at query time, RetentionEngine stores information directly in the adapter weights, allowing for more efficient and integrated memory retention.

Conversation Memory
Weight-Based Storage
Easy Integration
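To illustrate the weight-based storage idea in general terms: one common way to write information into a small set of trainable weights is a low-rank adapter attached to a frozen base layer, updated by a few gradient steps on the content to be remembered. The sketch below is a minimal, hypothetical illustration of that pattern, not the actual RetentionEngine code; the class name `MemoryAdapter` and the `memorize` method are assumptions made for this example.

```python
import torch
import torch.nn as nn

class MemoryAdapter(nn.Module):
    """Hypothetical sketch: a low-rank adapter that 'memorizes' by
    fitting its own weights, leaving the base layer frozen."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # memories live only in the adapter
        # Low-rank factors; B starts at zero so the adapter is a no-op
        # until something has been memorized.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base output plus the low-rank "memory" correction.
        return self.base(x) + x @ self.A.t() @ self.B.t()

    def memorize(self, x: torch.Tensor, target: torch.Tensor,
                 steps: int = 200, lr: float = 1e-2) -> float:
        # Write (x -> target) into the adapter weights via gradient descent.
        opt = torch.optim.Adam([self.A, self.B], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = nn.functional.mse_loss(self(x), target)
            loss.backward()
            opt.step()
        return loss.item()
```

Because the second factor is zero-initialized, wrapping an existing layer leaves the model's behavior unchanged until `memorize` is called, which is one way an adapter of this kind can be dropped into a pretrained model without disruption.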

Future Projects

Planned

We're constantly exploring new approaches to AI memory. Stay tuned for more innovative projects that will push the boundaries of what's possible with artificial intelligence memory systems.

Get Involved

Interested in contributing to RetentionLabs or have questions about our projects?
