Lyellr88/MARM-Systems
Turn AI into a persistent, memory-powered collaborator. A universal MCP server (supporting HTTP, STDIO, and WebSocket transports) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built on the MARM protocol for structured reasoning that evolves with your work.
Overview
Lyellr88/MARM-Systems is a Python MCP server licensed under MIT. Topics: claude-code, developer-tools, docker-image, embeddings, fastapi, gemini-cli, knowledge-based-systems, mcp-server, memory-management, semantic-search, marm, mcp, openai, opensource, claude.
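As an MCP server, MARM can be registered with any MCP-compatible client. Below is a minimal sketch of how a client such as Claude Desktop typically declares an MCP server over the STDIO transport; the `command`, `args`, and module name `marm_mcp` are hypothetical placeholders, not the project's documented launch command (check the repository's README for the actual one):

```json
{
  "mcpServers": {
    "marm": {
      "command": "python",
      "args": ["-m", "marm_mcp"]
    }
  }
}
```

With an entry like this in the client's configuration file, the client spawns the server process and exchanges MCP messages over its stdin/stdout; the HTTP and WebSocket transports would instead be configured with a server URL.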
Ranked #242 out of 25,632 indexed tools (top 1%).
From the README
# MARM: The AI That Remembers Your Conversations

Memory Accurate Response Mode v2.2.6 is the intelligent persistent-memory system for AI agents (supports HTTP and STDIO): stop fighting your AI's memory and start controlling it. Get long-term recall, session continuity, and reliable conversation history, so your LLMs never lose track of what matters.

**Note:** This is the *official* MARM repository. All official versions and releases are managed here. Forks may experiment, but official updates will always come from this repo.

---

## 📢 Project Update (December 2025)

I've been focused on developing a new build powered by MARM Systems as its memory layer, a real-world application of the technology I've been creating. This deep dive into production memory systems has given me valuable insights into how MARM performs under real workflows. I'm returning focus to MARM-MCP in **Q1-2 2026** with lessons learned and new im…