giuliolibrando/llm-rag-mcp-example
A complete deployment of AnythingLLM with local LLM, RAG and MCP (Model Context Protocol) servers. This stack provides a self-hosted AI platform with multi-user chat interface, vector database for semantic search, automated data ingestion from Redmine and Wiki.js, and integration with multiple MCP servers for extended functionality.
Overview
giuliolibrando/llm-rag-mcp-example is a Python MCP server.
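A stack like this is typically wired together with Docker Compose. The fragment below is a minimal sketch of how the AnythingLLM and local-LLM pieces might connect; the service names, images, ports, and environment variables are assumptions for illustration, not taken from the repository's actual compose file.

```yaml
# Hypothetical sketch only — the repository's real docker-compose.yml
# may use different services, images, and settings.
version: "3.8"
services:
  ollama:                               # assumed local LLM runtime
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama       # persist downloaded models
  anythingllm:                          # chat UI + RAG front end
    image: mintplexlabs/anythingllm
    ports:
      - "3001:3001"                     # web interface
    environment:
      - LLM_PROVIDER=ollama             # point AnythingLLM at the local LLM
      - OLLAMA_BASE_PATH=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama_data:
```

In this layout the vector database, the Redmine/Wiki.js ingestion jobs, and each MCP server would be added as further services on the same Compose network, so AnythingLLM can reach them by service name.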
Ranked #13041 out of 25632 indexed tools.