disler/just-prompt
just-prompt is an MCP server that provides a unified interface to top LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama).
Overview
disler/just-prompt is a Python MCP server that provides a unified interface to top LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama).
Ranked #15060 out of 25632 indexed tools.
Ecosystem
Python · No license
Signal Breakdown
Stars 719
Freshness 7mo ago
Issue Health 0%
Contributors 1
Dependents 0
Forks 133
Description Good
License None
How to Improve
Description low impact
License low impact
Freshness high impact
From the README
# Just Prompt - A lightweight MCP server for LLM providers
`just-prompt` is a Model Context Protocol (MCP) server that provides a unified interface to various Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. See how we use the `ceo_and_board` tool to make [hard decisions easy with o3 here](https://youtu.be/LEMLntjfihA).
## Tools
The following MCP tools are available in the server:
- **`prompt`**: Send a prompt to multiple LLM models
- Parameters:
- `text`: The prompt text
- `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
- **`prompt_from_file`**: Send a prompt from a file to multiple LLM models
- Parameters:
- `abs_file_path`: Absolute path to the file containing the prompt (must be an absolute path, not relative)
- `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
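A `prompt` tool call is ultimately a standard MCP `tools/call` request carrying the parameters listed above. As a minimal sketch, the snippet below builds such a request as a plain JSON-RPC payload; the `build_prompt_call` helper and the provider-prefix strings (`openai:…`, `anthropic:…`) are illustrative assumptions, not part of just-prompt's documented API — check the full README for the exact prefixes.

```python
import json

def build_prompt_call(text, models=None):
    """Build a hypothetical JSON-RPC 2.0 tools/call request for the `prompt` tool."""
    arguments = {"text": text}
    if models is not None:
        # Omitting the list falls back to the server's default models.
        arguments["models_prefixed_by_provider"] = models
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": "prompt", "arguments": arguments},
    }

payload = build_prompt_call(
    "Compare these two designs.",
    models=["openai:gpt-4o", "anthropic:claude-3-5-haiku"],  # assumed prefix format
)
print(json.dumps(payload, indent=2))
```

In practice an MCP client (e.g. Claude Desktop) constructs and sends this request for you; the sketch only shows how the tool name and parameters map onto the wire format.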