The reputation layer for AI skills, tools & agents

patruff/ollama-mcp-bridge

Score: 25.8 Rank #8452

Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools

Overview

patruff/ollama-mcp-bridge is a TypeScript MCP server licensed under MIT. It bridges Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools.

Ranked #8452 out of 25,632 indexed tools.

Ecosystem

TypeScript MIT

Signal Breakdown

Stars 970
Freshness 11mo ago
Issue Health 14%
Contributors 3
Dependents 0
Forks 113
Description Good
License MIT

How to Improve

Description low impact

Expand your description to 150+ characters for better discoverability

Freshness high impact

Last commit was 330 days ago — a recent commit would boost your freshness score

Issue Health high impact

You have 18 open vs 3 closed issues — triaging stale issues improves health

Badge

AgentRank score for patruff/ollama-mcp-bridge
[![AgentRank](https://agentrank-ai.com/api/badge/tool/patruff--ollama-mcp-bridge)](https://agentrank-ai.com/tool/patruff--ollama-mcp-bridge)
<a href="https://agentrank-ai.com/tool/patruff--ollama-mcp-bridge"><img src="https://agentrank-ai.com/api/badge/tool/patruff--ollama-mcp-bridge" alt="AgentRank"></a>

Matched Queries

"mcp server", "mcp-server", "model context protocol", "model-context-protocol"

From the README

# MCP-LLM Bridge

A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants.

## Overview

This project bridges local Large Language Models with MCP servers that provide various capabilities like:
- Filesystem operations
- Brave web search
- GitHub interactions
- Google Drive & Gmail integration
- Memory/storage
- Image generation with Flux

The bridge translates between the LLM's outputs and MCP's JSON-RPC protocol, allowing any Ollama-compatible model to use these tools just as Claude does.
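To make the translation step concrete, here is a minimal sketch in TypeScript. The `OllamaToolCall` shape, the `toMcpRequest` helper, and the `read_file` tool name are illustrative assumptions, not the bridge's actual internals; the outgoing message shape follows the MCP `tools/call` JSON-RPC 2.0 request format.

```typescript
// Sketch: map a model-emitted tool call onto an MCP JSON-RPC 2.0 request.
// All type/function names here are hypothetical; only the JSON-RPC
// envelope ("jsonrpc", "id", "method", "params") follows the MCP spec.

interface OllamaToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

interface McpRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

let nextId = 0;

function toMcpRequest(call: OllamaToolCall): McpRequest {
  return {
    jsonrpc: "2.0",
    id: ++nextId, // each JSON-RPC request needs a unique id
    method: "tools/call",
    params: {
      name: call.function.name,
      arguments: call.function.arguments,
    },
  };
}

// Example: the model asks to read a file via the filesystem MCP server.
const req = toMcpRequest({
  function: { name: "read_file", arguments: { path: "notes.txt" } },
});
console.log(JSON.stringify(req));
```

The server's JSON-RPC response would then be unwrapped and fed back to the model as a tool result, closing the loop.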

## Current Setup

- **LLM**: Using Qwen 2.5 7B (qwen2.5-coder:7b-instruct) through Ollama
- **MCPs**:
  - Filesystem operations (`@modelcontextprotocol/server-filesystem`)
  - Brave Search (`@modelcontextprotocol/server-brave-search`)
  - GitHub (`@modelcontextprotocol/server-github`)
  - Memory (`@modelcontextprotocol
Read full README on GitHub →