The reputation layer for AI skills, tools & agents

jonigl/mcp-client-for-ollama

Score: 49.7 Rank #270

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server, model switching, streaming responses, tool management, human-in-the-loop, thinking mode, model params config, MCP prompts, custom system prompt and saved preferences. Built for developers working with local LLMs.

Overview

jonigl/mcp-client-for-ollama is a Python MCP client licensed under MIT. It provides a text-based user interface (TUI) for interacting with MCP servers using Ollama, with agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmation, thinking mode, model parameter configuration, MCP prompts, a custom system prompt, and saved preferences. Built for developers working with local LLMs. Topics: ai, command-line-tool, mcp, mcp-client, mcp-server, open-source, pypi-package, model-context-protocol, tool-management, generative-ai, ollama, llm, local-llm, sse, stdio, streamable-http, linux, macos, windows, agentic-ai.

Ranked #270 out of 25632 indexed tools.

In the top 2% of all indexed tools.
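The top-2% claim follows directly from the rank; a quick arithmetic check (using the rank and index size reported above):

```python
# Percentile position: rank 270 out of 25632 indexed tools
rank = 270
total = 25632

top_percent = round(rank / total * 100, 1)
print(top_percent)  # → 1.1, comfortably within the top 2%
```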

Ecosystem

Python MIT
ai, command-line-tool, mcp, mcp-client, mcp-server, open-source, pypi-package, model-context-protocol, tool-management, generative-ai, ollama, llm, local-llm, sse, stdio, streamable-http, linux, macos, windows, agentic-ai

Signal Breakdown

Stars 563
Freshness 24d ago
Issue Health 47%
Contributors 5
Dependents 2
Forks 82
Description Detailed
License MIT

How to Improve

Issue Health high impact

You have 28 open issues against 25 closed; triaging stale issues improves this metric.
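The 47% figure above is consistent with a simple closed-issue ratio. A minimal sketch, assuming Issue Health is computed as closed / (open + closed) (the exact formula is not documented here):

```python
def issue_health(open_issues: int, closed_issues: int) -> float:
    """Fraction of all issues that have been closed (assumed metric)."""
    total = open_issues + closed_issues
    return closed_issues / total if total else 1.0

# 28 open vs 25 closed issues, as reported above
print(round(issue_health(28, 25) * 100))  # → 47
```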

Badge

AgentRank score for jonigl/mcp-client-for-ollama
Markdown:

```markdown
[![AgentRank](https://agentrank-ai.com/api/badge/tool/jonigl--mcp-client-for-ollama)](https://agentrank-ai.com/tool/jonigl--mcp-client-for-ollama)
```

HTML:

```html
<a href="https://agentrank-ai.com/tool/jonigl--mcp-client-for-ollama"><img src="https://agentrank-ai.com/api/badge/tool/jonigl--mcp-client-for-ollama" alt="AgentRank"></a>
```

Matched Queries

"mcp server""mcp-server"

From the README

<p align="center">
<i>A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools.</i>
</p>

---

# MCP Client for Ollama (ollmcp)

<p align="center">
  <a href="https://asciinema.org/a/jxc6N8oKZAWrzH8aK867zhXdO" target="_blank">🎥 Watch this demo as an Asciinema recording</a>
</p>

## Table of Contents

- [Overview](#overview)
- [Features](#features)
- [Requirements](#requirements)
- [Quick Start](#quick-start)
- [Usage](#usage)
  - [Command-line Arguments](#command-line-arguments)
  - [Usage Examples](#usage-examples)
  - [How Tool Calls Work](#how-tool-calls-work)
  - ✨**NEW** [Agent Mode](#agent-mode)
- [Interactive Commands](#interactive-commands)
  - [Tool and Server Selection](#tool-and-server-selection)
  - [Model Selection](#model-selection)
  - [Advanced Model Configuration](#advanced-model-configuration)
  - [Server Reloading for Development

Read full README on GitHub →