albert-ying/agentic-cortex
A personal AI operating system built on OpenClaw/Claude Code, MCP servers, and structured markdown. Feedback RL, voice cloning, Screenpipe integration.
Overview
albert-ying/agentic-cortex is a Python MCP server licensed under MIT: a personal AI operating system built on OpenClaw/Claude Code, MCP servers, and structured markdown, featuring feedback-driven RL, voice cloning, and Screenpipe integration.
Ranked #6 out of 104 indexed tools, placing it in the top 6% of all indexed tools.
Actively maintained with commits in the last week.
Score Breakdown
6 stars → early stage
Last commit 1d ago → actively maintained
No issues filed → no history to score
1 contributor → solo project
No dependents → no downstream usage
Weights: Freshness 25% · Issue Health 25% · Dependents 25% · Stars 15% · Contributors 10%
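The weights above describe how the component scores combine. As a sketch (assuming each component is normalized to a 0–1 score, which the page does not specify), the combination could look like this:

```python
# Hypothetical sketch of the listed scoring weights. The component
# normalization (each score mapped to 0..1) is an assumption; only the
# weight percentages come from the listing.
WEIGHTS = {
    "freshness": 0.25,
    "issue_health": 0.25,
    "dependents": 0.25,
    "stars": 0.15,
    "contributors": 0.10,
}

def weighted_score(components: dict[str, float]) -> float:
    """Combine normalized 0..1 component scores using the listed weights."""
    return sum(WEIGHTS[name] * components.get(name, 0.0) for name in WEIGHTS)

# Example roughly matching this listing: fresh commits, but a solo
# project with few stars and no dependents (component values invented).
score = weighted_score({
    "freshness": 1.0,     # last commit 1d ago
    "issue_health": 0.5,  # no issue history to score
    "dependents": 0.0,    # no downstream usage
    "stars": 0.1,         # 6 stars, early stage
    "contributors": 0.1,  # single contributor
})
print(round(score, 3))  # → 0.4
```

The freshness-heavy profile explains why a week-old solo project can still rank well: three of the five weights reward activity and ecosystem signals rather than raw popularity.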
From the README
# 🧠 Agentic Cortex

> 🦞 Built for [OpenClaw](https://github.com/nicobailon/openclaw) / [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — a personal AI operating system that shares your memory, your voice, and your taste.

**[Quick Start](#-quick-start)** · **[Installation](#-installation)** · **[Architecture](#architecture)** · **[Chapters](#chapters)** · **[FAQ](#faq)**

## What Makes This Different

Storing notes in markdown is not new. Obsidian, Logseq, and Dendron all do it. What none of them do is close the loop: structured markdown as persistent AI memory + MCP integrations for live data + a structured feedback mechanism that reshapes AI behavior from natural language corrections + voice cloning extracted from your writing history + ambient intelligence from screen and audio recording. The result is an AI agent that has the same memory as you — and even more precise memory.

Read the full README on GitHub →
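The core idea in the excerpt, structured markdown as persistent AI memory that both a human and an MCP tool can read and append to, can be sketched minimally. The `MarkdownMemory` class, file layout, and method names below are hypothetical illustrations, not the repo's actual API:

```python
from datetime import date
from pathlib import Path

# Hypothetical sketch of markdown-as-memory: each note is an H2 section
# in one plain markdown file, so the store stays human-editable while an
# MCP tool could expose remember/recall as callable tools.
class MarkdownMemory:
    def __init__(self, path: str = "memory.md"):
        self.path = Path(path)

    def remember(self, topic: str, note: str) -> None:
        """Append a dated H2 section for this note."""
        entry = f"\n## {topic} ({date.today().isoformat()})\n{note}\n"
        with self.path.open("a", encoding="utf-8") as f:
            f.write(entry)

    def recall(self, query: str) -> list[str]:
        """Return every section whose text matches the query."""
        if not self.path.exists():
            return []
        sections = self.path.read_text(encoding="utf-8").split("\n## ")
        return [s for s in sections if query.lower() in s.lower()]
```

The "close the loop" part described in the README would then layer on top: corrections and live MCP data get written back into the same markdown store, so the agent's memory and the user's notes never diverge.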