bartolli/mcp-llm-bridge
MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs
Overview
bartolli/mcp-llm-bridge is a Python MCP server, licensed under MIT, that enables communication between MCP servers and OpenAI-compatible LLMs.
Ranked #3159 out of 25632 indexed tools.
Ecosystem
Python MIT
Signal Breakdown
Stars: 335
Freshness: 11mo ago
Issue Health: 71%
Contributors: 2
Dependents: 0
Forks: 42
Description: Good
License: MIT
How to Improve
Freshness: high impact
Dependents: medium impact
Description: low impact
From the README
# MCP LLM Bridge

A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs. Primary support is for the OpenAI API, with additional compatibility for local endpoints that implement the OpenAI API specification.

The implementation provides a bidirectional protocol translation layer between MCP and OpenAI's function-calling interface. It converts MCP tool specifications into OpenAI function schemas and maps function invocations back to MCP tool executions. This enables any OpenAI-compatible language model to leverage MCP-compliant tools through a standardized interface, whether using cloud-based models or local implementations like Ollama.

Read more about MCP by Anthropic here:

- [Resources](https://modelcontextprotocol.io/docs/concepts/resources)
- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts)
- [Tools](https://modelcontextprotocol.io/docs/concepts/tools)
- [Sampling](https://modelcontextprotocol.io/docs/concepts/sampling)
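The translation layer described above can be sketched in a few lines. This is a minimal illustration of the idea, not the repo's actual code: an MCP tool spec carries its parameters as JSON Schema in `inputSchema`, which is the same format OpenAI's function-calling `parameters` field expects, so the conversion is largely a re-wrapping. The tool name `query_db` and both helper functions here are hypothetical.

```python
import json


def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Wrap an MCP tool spec ({name, description, inputSchema})
    in the OpenAI tools format ({type: "function", function: {...}})."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is already JSON Schema, which is exactly
            # what OpenAI's `parameters` field expects.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }


def openai_call_to_mcp_request(call: dict) -> dict:
    """Map an OpenAI function-call payload (name + JSON-encoded arguments)
    back to an MCP tools/call request."""
    return {
        "method": "tools/call",
        "params": {
            "name": call["name"],
            "arguments": json.loads(call.get("arguments", "{}")),
        },
    }


# Hypothetical MCP tool spec:
mcp_tool = {
    "name": "query_db",
    "description": "Run a read-only SQL query",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}

openai_tool = mcp_tool_to_openai_function(mcp_tool)
print(openai_tool["function"]["name"])  # query_db
```

The reverse direction mirrors this: when the model emits a function call, its JSON-string `arguments` are decoded and forwarded as an MCP `tools/call` request, which is what lets Ollama-style local endpoints drive MCP tools unchanged.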