The reputation layer for AI skills, tools & agents

disler/just-prompt

Score: 18.8 · Rank #15060

just-prompt is an MCP server that provides a unified interface to top LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama).

Overview

disler/just-prompt is a Python MCP server providing a unified interface to top LLM providers: OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.

Ranked #15060 out of 25632 indexed tools.

Ecosystem

Python · No license

Signal Breakdown

| Signal | Value |
| --- | --- |
| Stars | 719 |
| Freshness | 7mo ago |
| Issue Health | 0% |
| Contributors | 1 |
| Dependents | 0 |
| Forks | 133 |
| Description | Good |
| License | None |

How to Improve

Description low impact

Expand your description to 150+ characters for better discoverability

License low impact

Add an MIT or Apache-2.0 license to signal trust and enable adoption

Freshness high impact

Last commit was 218 days ago — a recent commit would boost your freshness score.

Badge

AgentRank score for disler/just-prompt
[![AgentRank](https://agentrank-ai.com/api/badge/tool/disler--just-prompt)](https://agentrank-ai.com/tool/disler--just-prompt)
<a href="https://agentrank-ai.com/tool/disler--just-prompt"><img src="https://agentrank-ai.com/api/badge/tool/disler--just-prompt" alt="AgentRank"></a>

Matched Queries

"mcp server""mcp-server"

From the README

# Just Prompt - A lightweight MCP server for LLM providers

`just-prompt` is a Model Context Protocol (MCP) server that provides a unified interface to various Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. See how we use the `ceo_and_board` tool to make [hard decisions easy with o3 here](https://youtu.be/LEMLntjfihA).

## Tools

The following MCP tools are available in the server:

- **`prompt`**: Send a prompt to multiple LLM models
  - Parameters:
    - `text`: The prompt text
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.

- **`prompt_from_file`**: Send a prompt from a file to multiple LLM models
  - Parameters:
    - `abs_file_path`: Absolute path to the file containing the prompt (must be an absolute path, not relative)
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
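As a rough sketch of how a client might assemble the argument payloads for these two tools (the `provider:model` prefix format and the `build_prompt_args` helper are assumptions for illustration, not confirmed by this listing):

```python
def build_prompt_args(text, models=None):
    """Build an argument payload for just-prompt's `prompt` tool.

    Entries in `models` are assumed to use a `provider:model` prefix
    (hypothetical format, e.g. "openai:gpt-4o"). Omitting `models`
    falls back to the server's default models.
    """
    args = {"text": text}
    if models is not None:
        args["models_prefixed_by_provider"] = list(models)
    return args


# Payload for `prompt`, targeting two providers at once
prompt_args = build_prompt_args(
    "Summarize MCP in one sentence.",
    ["openai:gpt-4o", "anthropic:claude-3-5-sonnet"],
)

# Payload for `prompt_from_file` — the path must be absolute, not relative
file_args = {"abs_file_path": "/home/user/prompts/question.txt"}
```

The optional-parameter handling mirrors the tool descriptions above: when `models_prefixed_by_provider` is left out, the key is simply absent and the server picks its defaults.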