mcp-ollama-agent
by: ausboss
A TypeScript example showcasing the integration of Ollama with Model Context Protocol (MCP) servers. This project provides an interactive command-line interface for an AI agent that can use tools from multiple MCP servers.
📌 Overview
Purpose: This project integrates AI models with external tools through a unified interface, using the Model Context Protocol (MCP) and Ollama.
Overview: The TypeScript MCP Agent with Ollama integration lets AI models perform file system operations and web research. Configuration is simple, a standalone demo mode allows testing the tools without a model, and a chat interface handles interaction with Ollama.
Key Features:
- Multi-Server Support: Works with multiple MCP servers (tested with uvx and npx), allowing flexible deployment.
- Tool Operations: Built-in support for file system operations and web research.
- Simple Configuration: Tools and the Ollama model are set up in a single `mcp-config.json` file.
- Interactive Chat Interface: Chat with Ollama using any compatible tools.
- Standalone Demo Mode: Test the web and filesystem tools without a language model.
TypeScript MCP Agent with Ollama Integration
This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.
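The bridge between the two is small: an MCP client reports the tools a server exposes, and those definitions map onto Ollama's function-calling format. Below is a minimal sketch of that mapping using the `@modelcontextprotocol/sdk` client; the `toOllamaTools` helper is a made-up name illustrating the idea, not the project's actual code.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Map the tools advertised by an MCP server onto the shape Ollama
// expects for function calling. Hypothetical helper, not project code.
async function toOllamaTools(client: Client) {
  const { tools } = await client.listTools();
  return tools.map((tool) => ({
    type: "function" as const,
    function: {
      name: tool.name,
      description: tool.description ?? "",
      parameters: tool.inputSchema, // MCP tools already carry a JSON Schema
    },
  }));
}
```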
✨ Features
- Supports multiple MCP servers (both uvx and npx tested)
- Built-in support for file system operations and web research
- Easy configuration through `mcp-config.json`, similar to `claude_desktop_config.json`
- Interactive chat interface with Ollama integration that supports any tools
- Standalone demo mode for testing web and filesystem tools without an LLM
🚀 Getting Started
Prerequisites
- Node.js (version 18 or higher)
- Ollama installed and running
- Install the MCP tools you want to use globally:

```bash
# For filesystem operations
npm install -g @modelcontextprotocol/server-filesystem

# For web research
npm install -g @mzxrai/mcp-webresearch
```
Clone and install
```bash
git clone https://github.com/ausboss/mcp-ollama-agent.git
cd mcp-ollama-agent
npm install
```
Configure tools and the Ollama model in `mcp-config.json`
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "./"]
    },
    "webresearch": {
      "command": "npx",
      "args": ["-y", "@mzxrai/mcp-webresearch"]
    }
  },
  "ollama": {
    "host": "http://localhost:11434",
    "model": "qwen2.5:latest"
  }
}
```
Run the demo (without an LLM)
```bash
npx tsx ./src/demo.ts
```
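Conceptually, a no-LLM demo only needs an MCP client: connect to a server, list its tools, and call one directly. A rough sketch of that flow follows (the `list_directory` call mirrors the example transcript further down; this is not the actual `demo.ts`):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the filesystem server over stdio and connect to it.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@modelcontextprotocol/server-filesystem", "./"],
});
const client = new Client({ name: "demo", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// List the available tools, then call one directly, no LLM involved.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

const result = await client.callTool({
  name: "list_directory",
  arguments: { path: "./" },
});
console.log(result.content);
```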
Start the chat interface with Ollama
```bash
npm start
```
⚙️ Configuration
- Add any MCP-compatible server to the `mcpServers` section (a loading sketch follows this list)
- Configure host and model for Ollama (model must support function calling)
- Supports both Python (uvx) and Node.js (npx) MCP servers
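To show how entries in `mcpServers` can drive server startup, here is a sketch that reads `mcp-config.json` and spawns one stdio client per configured server. This illustrates the pattern under stated assumptions; it is not the project's actual loader.

```typescript
import { readFileSync } from "node:fs";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const config = JSON.parse(readFileSync("mcp-config.json", "utf-8"));

// One MCP client per configured server, each spawned over stdio.
const clients: Record<string, Client> = {};
for (const [name, server] of Object.entries<{ command: string; args: string[] }>(
  config.mcpServers
)) {
  const transport = new StdioClientTransport({
    command: server.command,
    args: server.args,
  });
  const client = new Client({ name, version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);
  clients[name] = client;
}
```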
💡 Example Usage
This example uses the model `qwen2.5:latest`.
```
Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
```
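The transcript reflects a standard tool-call loop: send the conversation plus tool definitions to Ollama, execute any requested tools through MCP, feed the results back, and repeat until the model answers in plain text. A sketch of one iteration, assuming an `mcpClient` and `tools` prepared as in the earlier sketches (hypothetical variable names, not the project's exact code):

```typescript
import ollama from "ollama";
import type { Message, Tool } from "ollama";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Assumed to be set up as in the earlier sketches.
declare const mcpClient: Client;
declare const tools: Tool[];

const messages: Message[] = [
  { role: "user", content: "what's in test-directory?" },
];

const response = await ollama.chat({ model: "qwen2.5:latest", messages, tools });
messages.push(response.message);

// Execute each requested tool through MCP and hand the result back.
for (const call of response.message.tool_calls ?? []) {
  const result = await mcpClient.callTool({
    name: call.function.name,
    arguments: call.function.arguments,
  });
  messages.push({ role: "tool", content: JSON.stringify(result.content) });
}
// When tool_calls is non-empty, call ollama.chat again with the updated messages.
```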
System Prompts
Customize the system prompt in `ChatManager.ts` if local models need help with tool selection.
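For example, a more prescriptive prompt can nudge small local models toward the right tool. A hypothetical prompt for the tools above; the real one lives in `ChatManager.ts`:

```typescript
// Hypothetical replacement prompt; tune the wording for your model.
const systemPrompt = `You are a helpful assistant with access to tools.
Explore folders with list_directory before reading files with read_file.
Only call a tool when it is needed to answer the question.`;
```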
🤝 Contributing
Contributions are welcome! Feel free to submit issues or pull requests.