smat-dev/jinni
Bring your project into LLM context - tool and MCP server
Overview
smat-dev/jinni is a Python MCP server licensed under Apache-2.0.
Ranked #486 of 25,632 indexed tools (top 2%).
Ecosystem: Python (Apache-2.0)
Signal Breakdown

| Signal | Value |
| --- | --- |
| Stars | 271 |
| Freshness | 3 months ago |
| Issue health | 80% |
| Contributors | 3 |
| Dependents | 5 |
| Forks | 19 |
| Description | Good |
| License | Apache-2.0 |
How to Improve

* Description: low impact
* Freshness: high impact
From the README
# Jinni: Bring Your Project Into Context
Jinni is a tool to efficiently provide Large Language Models the context of your projects. It gives a consolidated view of relevant project files, overcoming the limitations and inefficiencies of reading files one by one. Each file's content is preceded by a simple header indicating its path:
````
```path=src/app.py
print("hello")
```
````
The philosophy behind this tool is that LLM context windows are large, models are smart, and directly seeing your project best equips the model to help with anything you throw at it.
There is an MCP (Model Context Protocol) server for integration with AI tools and a command-line utility (CLI) for manual use that copies project context to the clipboard ready to paste wherever you need it.
These tools are opinionated about what counts as relevant project context to best work out of the box in most use cases, automatically excluding:
* Binary files
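The consolidated output format described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Jinni's actual implementation: it walks a directory, skips files that look binary (a crude NUL-byte heuristic standing in for Jinni's real exclusion rules), and emits each remaining file's content preceded by its `path=` header.

```python
import os

def looks_binary(data: bytes) -> bool:
    # Crude heuristic: a NUL byte usually indicates a binary file.
    return b"\x00" in data

def build_context(root: str) -> str:
    """Concatenate text files under `root`, each preceded by a path header."""
    parts = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            full = os.path.join(dirpath, name)
            with open(full, "rb") as f:
                data = f.read()
            if looks_binary(data):
                continue  # exclude binary files, as Jinni does by default
            rel = os.path.relpath(full, root)
            parts.append(
                f"```path={rel}\n{data.decode('utf-8', 'replace')}\n```"
            )
    return "\n".join(parts)
```

The real tool layers far more on top of this (MCP server, CLI with clipboard support, configurable exclusion rules), but the core idea is just this path-headed concatenation.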