prih/mcp-graph-memory
MCP server that builds a semantic graph memory from your project — indexes docs, code, and files, exposes 57 tools for search, knowledge, tasks, and skills.
Overview
prih/mcp-graph-memory is a TypeScript MCP server that builds a semantic graph memory from your project: it indexes docs, code, and files, and exposes 57 tools for search, knowledge, tasks, and skills. Topics: graph, knowledge-graph, mcp, mcp-server, react, semantic-search, typescript.
Ranked #57 out of 104 indexed tools.
Actively maintained with commits in the last week.
Score Breakdown
1 stars → early stage
Last commit today → actively maintained
No issues filed → no history to score
1 contributor → solo project
No dependents → no downstream usage
Weights: Freshness 25% · Issue Health 25% · Dependents 25% · Stars 15% · Contributors 10%
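The weights above can be read as a weighted average of per-signal scores. The sketch below shows one plausible way to combine them; the component names, the 0–1 normalization, and the ×100 scale are assumptions for illustration, not the site's actual formula.

```typescript
// Hypothetical combination of the listed weights into an overall score.
// Each component is assumed to be normalized to 0..1; the real
// normalization used by the listing site is not documented here.
type Components = {
  freshness: number;    // from last-commit recency
  issueHealth: number;  // from issue history
  dependents: number;   // from downstream usage
  stars: number;        // from star count
  contributors: number; // from contributor count
};

const WEIGHTS: Record<keyof Components, number> = {
  freshness: 0.25,
  issueHealth: 0.25,
  dependents: 0.25,
  stars: 0.15,
  contributors: 0.10,
};

function overallScore(c: Components): number {
  return (Object.keys(WEIGHTS) as (keyof Components)[])
    .reduce((sum, k) => sum + WEIGHTS[k] * c[k], 0) * 100;
}

// A fresh solo project with no dependents scores mostly on freshness:
const s = overallScore({
  freshness: 1, issueHealth: 0, dependents: 0, stars: 0.1, contributors: 0.1,
});
// s ≈ 27.5: 25 (freshness) + 1.5 (stars) + 1 (contributors)
```

This matches the breakdown shown: with only one star and one contributor, freshness dominates the result.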
From the README
# mcp-graph-memory
An MCP server that builds a **semantic graph memory** from a project directory.
It indexes markdown documentation and TypeScript/JavaScript source code into graph structures,
then exposes them as MCP tools that any AI assistant can use to navigate and search the project.
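The "graph structures" the README mentions can be pictured as typed nodes (documents, files, symbols) connected by edges. The sketch below is a minimal illustration of that idea; the type names, edge kinds, and methods are assumptions, not the server's actual schema or API.

```typescript
// Illustrative graph-memory index: docs and code symbols become nodes,
// relationships between them become edges. All names here are
// hypothetical stand-ins for whatever schema the real server uses.
type NodeKind = "doc" | "file" | "symbol";
type EdgeKind = "contains" | "references";

interface GraphNode { id: string; kind: NodeKind; label: string; }
interface GraphEdge { from: string; to: string; kind: EdgeKind; }

class GraphMemory {
  private nodes = new Map<string, GraphNode>();
  private edges: GraphEdge[] = [];

  addNode(n: GraphNode): void { this.nodes.set(n.id, n); }
  addEdge(e: GraphEdge): void { this.edges.push(e); }

  // Substring label match — a crude stand-in for the semantic search
  // the real server would expose as MCP tools.
  search(query: string): GraphNode[] {
    const q = query.toLowerCase();
    return [...this.nodes.values()].filter(n =>
      n.label.toLowerCase().includes(q));
  }

  // Follow outgoing edges to navigate the project graph.
  neighbors(id: string): GraphNode[] {
    return this.edges
      .filter(e => e.from === id)
      .map(e => this.nodes.get(e.to))
      .filter((n): n is GraphNode => n !== undefined);
  }
}

const g = new GraphMemory();
g.addNode({ id: "src/index.ts", kind: "file", label: "index.ts" });
g.addNode({ id: "sym:createServer", kind: "symbol", label: "createServer" });
g.addEdge({ from: "src/index.ts", to: "sym:createServer", kind: "contains" });

console.log(g.search("server").map(n => n.id));          // [ 'sym:createServer' ]
console.log(g.neighbors("src/index.ts").map(n => n.label)); // [ 'createServer' ]
```

An assistant navigating such a graph can hop from a file node to the symbols it contains, which is the kind of traversal the MCP tools make available.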
## Quick start with Docker
### 1. Create a config file
Create `graph-memory.yaml` — paths must be relative to the container filesystem:
```yaml
server:
  host: "0.0.0.0"
  port: 3000
  modelsDir: "/data/models"

projects:
  my-app:
    projectDir: "/data/projects/my-app"
    docsPattern: "docs/**/*.md"
    codePattern: "src/**/*.{ts,tsx}"
    excludePattern: "node_modules/**,dist/**"
```
### 2. Run with Docker
```bash
docker run -d \
  --name graph-memory \
  -p 3000:3000 \
  -v $(pwd)/graph-memory.yaml:/data/config/graph-memory.yaml:ro \
  -v /path/to/my-app:/data/projects/my-app:ro \
  -v graph-memory-models:/data/models \
  ghcr.io/prih/mcp-graph-memory
```
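Once the container is up, an MCP client needs to be pointed at it. A hedged sketch of a client configuration, assuming a client that supports HTTP transport via a `url` field (key names vary by client, and the `/mcp` path is an assumption, not documented here):

```json
{
  "mcpServers": {
    "graph-memory": {
      "type": "http",
      "url": "http://localhost:3000/mcp"
    }
  }
}
```

Check your client's documentation for its exact server-registration format before relying on these keys.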
Three mounts, matching the `-v` flags above:

| Mount | Container path |
| --- | --- |
| `graph-memory.yaml` config (read-only) | `/data/config/graph-memory.yaml` |
| Project source (read-only) | `/data/projects/my-app` |
| `graph-memory-models` named volume | `/data/models` |

Read full README on GitHub →