ollama-mcp
by: rawveg
An MCP Server for Ollama
📌 Overview
Purpose: The Ollama MCP Server facilitates seamless communication between Ollama's local LLM models and MCP-compatible applications.
Overview: This server implements the Model Context Protocol (MCP), allowing users to easily access and interact with various Ollama models and integrate them into applications such as Claude Desktop.
Key Features:
- Model Listing: Allows users to view all available Ollama models, enhancing discoverability and ease of use.
- Model Management: Supports pulling and updating models from Ollama, ensuring that users have access to the latest models.
- Chat Interaction: Provides an API for chatting with models, enabling interactive user experiences.
- Detailed Model Info: Offers endpoints for retrieving detailed information about specific models, aiding in informed decision-making.
- Automatic Port Management: Simplifies server setup by automatically managing ports, preventing conflicts during runtime.
- Configuration Flexibility: Supports environment variable configuration for easy customization of server settings.
Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration
Prerequisites
- Node.js (v16 or higher)
- npm
- Ollama installed and running locally
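Before installing, you can confirm the last prerequisite (a running Ollama instance) with a quick check against Ollama's default endpoint. This sketch assumes the default port 11434 and Node.js 18+ for the built-in fetch:

// check-ollama.ts - quick sanity check (assumes Ollama's default port 11434)
fetch("http://localhost:11434")
  .then((res) => res.text())
  .then((body) => console.log(body)) // Ollama answers GET / with "Ollama is running"
  .catch(() => console.error("Ollama does not appear to be running on :11434"));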
Installation
Manual Installation
Install globally via npm:
npm install -g @rawveg/ollama-mcp
Installing in Other MCP Applications
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
The settings file location varies by application:
- Claude Desktop: claude_desktop_config.json in the Claude app data directory
- Cline: cline_mcp_settings.json in the VS Code global storage
Usage
Starting the Server
Simply run:
ollama-mcp
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
PORT=3457 ollama-mcp
Environment Variables
- PORT: Server port (default: 3456). Set it when launching directly, e.g. PORT=3457 ollama-mcp
- OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)
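For orientation, here is a minimal sketch of how a Node.js server typically reads these variables. This is an assumption about the implementation, not the project's actual source:

// Defaults mirror the documented values above.
const PORT = Number(process.env.PORT ?? 3456);                         // server port
const OLLAMA_API = process.env.OLLAMA_API ?? "http://localhost:11434"; // Ollama endpoint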
API Endpoints
- GET /models - List available models
- POST /models/pull - Pull a new model
- POST /chat - Chat with a model
- GET /models/:name - Get model details
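As a sketch of how these endpoints might be called from Node.js 18+ (built-in fetch), the snippet below lists models and sends a chat message. The request body for /chat is an assumption modeled on Ollama's own chat API, and the model name "llama3" is illustrative; check the server source for the exact contract:

const BASE = "http://localhost:3456"; // default port; adjust if you set PORT

// List the models the server exposes.
async function listModels() {
  const res = await fetch(`${BASE}/models`);
  if (!res.ok) throw new Error(`GET /models failed: ${res.status}`);
  return res.json();
}

// Chat with a model. The body shape mirrors Ollama's chat API
// (an assumption; the server may expect a different payload).
async function chat(model: string, prompt: string) {
  const res = await fetch(`${BASE}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  if (!res.ok) throw new Error(`POST /chat failed: ${res.status}`);
  return res.json();
}

listModels().then(console.log);
chat("llama3", "Hello!").then(console.log);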
Development
- Clone the repository:
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
- Install dependencies:
npm install
- Build the project:
npm run build
- Start the server:
npm start
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
However, welcoming contributions does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. Recent experiences with services attempting to use this project commercially without consent have prompted a license change.
To protect the integrity of the project and its contributors, the license has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This ensures that any commercial or service-based use either complies with the AGPL terms or obtains a separate commercial license. If you wish to include this project in a commercial offering, please contact the maintainers first.
License
AGPL v3.0
Related
- Ollama: https://ollama.ai
- Model Context Protocol: https://github.com/anthropics/model-context-protocol
This project was previously MIT-licensed. As of 20th April 2025, it is licensed under AGPL-3.0 to prevent unauthorized commercial exploitation. If your use predates this change, please refer to the relevant Git tag or commit for the applicable license.