dolphin-mcp
by: cognitivecomputations
📌 Overview
Purpose:
Enable seamless interaction with multiple Model Context Protocol (MCP) servers using any LLM model, via both a Python library and a command-line interface.
Overview:
Dolphin MCP is a flexible Python library and CLI tool designed to connect to any number of MCP servers and make their integrated tools accessible to popular language models (such as OpenAI, Anthropic, Ollama, and LMStudio). It provides a natural language conversational interface for querying and manipulating data from these servers, supporting robust and scalable multi-server workflows for both developers and end-users.
Key Features:
- Multiple Provider Support: Integrates with various LLM providers including OpenAI, Anthropic, Ollama, and LMStudio, allowing users to choose the best model for their use case.
- MCP Server Integration: Simultaneously connects to multiple configurable MCP servers and exposes their tools and services through the language model interface, enhancing data accessibility and interactivity.
- Dual Interface (Library & CLI): Offers both a clean Python API for programmatic access and a powerful command-line tool for quick, interactive queries.
- Automatic Tool Discovery: Automatically detects and presents available tools from all connected MCP servers, enabling function calling and dynamic tool utilization by LLMs.
- Flexible, Modular Architecture: Clean separation of provider-specific modules and a configurable JSON-based setup make Dolphin MCP easy to extend, configure, and secure (with environment variable support for sensitive information).
Dolphin MCP
A flexible Python library and CLI tool for interacting with Model Context Protocol (MCP) servers using any LLM model.
Overview
Dolphin MCP is a Python library and command-line tool that enables querying and interacting with MCP servers via natural language. It connects to multiple configured MCP servers, makes their tools available to language models (OpenAI, Anthropic, Ollama, LMStudio), and offers a conversational interface for accessing and manipulating server data.
Key Capabilities
- Connect to multiple MCP servers simultaneously
- Automatically discover and use tools from connected servers
- Function calling for interaction with external data sources
- Dual interface: usable as a Python library or CLI tool
- Modular architecture with provider-specific modules
- Flexible configuration via JSON
- Secure API key management through environment variables
Requirements
- Python 3.8+
- API key for supported providers (e.g., OpenAI)
- [Optional] SQLite for demo database
Core dependencies are automatically installed with pip install.
Installation
Recommended: Install from PyPI
pip install dolphin-mcp
This installs both the Python library and the dolphin-mcp-cli command-line tool.
Alternative: Install from Source
git clone https://github.com/cognitivecomputations/dolphin-mcp.git
cd dolphin-mcp
pip install -e .
Set up your API keys by copying .env.example to .env and editing as needed. Optional: Initialize the demo database with:
python setup_db.py
Configuration
Two main config files are used:
- .env – Stores API keys and settings:
  OPENAI_API_KEY=your_openai_api_key_here
  OPENAI_MODEL=gpt-4o
- mcp_config.json – Defines the MCP servers to connect to:
  {
    "mcpServers": {
      "server1": {
        "command": "command-to-start-server",
        "args": ["arg1", "arg2"],
        "env": { "ENV_VAR1": "value1" }
      }
    }
  }
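Multiple servers can be declared side by side under mcpServers, each needing only the command, args, and optional env fields shown above. The example below is a sketch: the server names, commands, and paths are placeholders, not servers that ship with Dolphin MCP.
{
  "mcpServers": {
    "dolphin-db": {
      "command": "python",
      "args": ["path/to/dolphin_db_server.py"],
      "env": { "DB_PATH": "dolphins.db" }
    },
    "docs": {
      "command": "node",
      "args": ["path/to/docs_server.js"]
    }
  }
}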
Usage
CLI Example
Query via CLI:
dolphin-mcp-cli "What dolphin species are endangered?"
Common CLI options:
--model <name> Specify the model to use
--quiet Suppress intermediate output
--config <file> Specify config file (default: mcp_config.json)
--help, -h Show help
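Options can be combined in a single invocation; the config file name below is a placeholder:
dolphin-mcp-cli --config my_config.json --model gpt-4o --quiet "List all dolphin species in the Atlantic Ocean"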
Programmatic Usage
Use Dolphin MCP in your Python scripts:
import asyncio
from dolphin_mcp import run_interaction

async def main():
    result = await run_interaction(
        user_query="What dolphin species are endangered?",
    )
    print(result)

asyncio.run(main())
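Because run_interaction is an ordinary awaitable, it slots into any asyncio script. The batch example below is a sketch that uses only the user_query argument shown above:
import asyncio
from dolphin_mcp import run_interaction

async def main():
    # Run a few queries in sequence; each call uses only the documented
    # user_query argument from the example above.
    queries = [
        "What dolphin species are endangered?",
        "List all dolphin species in the Atlantic Ocean",
    ]
    for query in queries:
        result = await run_interaction(user_query=query)
        print(query, "->", result)

asyncio.run(main())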
Legacy Script
You can also use:
python dolphin_mcp.py "Your query here"
Example Queries
With the demo or your own MCP servers:
dolphin-mcp-cli "What dolphin species are endangered?"
dolphin-mcp-cli --model gpt-4o "What are the evolutionary relationships between dolphin species?"
dolphin-mcp-cli --quiet "List all dolphin species in the Atlantic Ocean"
Demo Database
Running setup_db.py
creates a sample SQLite database of dolphin species as a demonstration, including information about species, evolutionary relationships, and conservation status. You can connect Dolphin MCP to any MCP-compatible server to access various data sources or services.
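To peek at what the demo database contains, Python's standard sqlite3 module can list its tables. The file name dolphins.db below is an assumption; check setup_db.py for the path it actually writes:
import sqlite3

# Inspect the demo database created by setup_db.py.
# NOTE: "dolphins.db" is an assumed file name; adjust to the real path.
conn = sqlite3.connect("dolphins.db")
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
print("Tables in the demo database:", [name for (name,) in tables])
conn.close()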
How It Works
Structure
dolphin_mcp/
- client.py – Core implementation
- cli.py – CLI interface
- utils.py – Utilities
- providers/ – Integrations for OpenAI, Anthropic, Ollama, LMStudio
Execution Flow
- The CLI parses arguments and calls the run_interaction function.
- The library loads configuration and connects to the defined MCP servers.
- It retrieves tool definitions from servers.
- User queries and available tools are sent to the selected language model.
- Tool calls are routed to the corresponding server and results are returned.
- The model provides a final conversational answer.
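To make that flow concrete, here is a small self-contained sketch of the loop. It is illustrative only: the stub names (TOOLS, fake_llm, run_flow) are not part of Dolphin MCP's API, and the real logic lives in dolphin_mcp/client.py.
import asyncio

# Stand-in tool registry, as if discovered from the connected MCP servers.
TOOLS = {"list_endangered_species": lambda args: ["species A", "species B"]}

async def fake_llm(messages, tools):
    # Stand-in for a provider call: request one tool, then give a final answer.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "list_endangered_species", "args": {}}}
    data = [m["content"] for m in messages if m["role"] == "tool"]
    return {"content": f"Endangered species found: {data}"}

async def run_flow(user_query):
    messages = [{"role": "user", "content": user_query}]
    while True:
        reply = await fake_llm(messages, TOOLS)   # query + tool list go to the model
        if "tool_call" not in reply:              # no tool call means a final answer
            return reply["content"]
        call = reply["tool_call"]                 # route the call to the owning server
        result = TOOLS[call["name"]](call["args"])
        messages.append({"role": "tool", "name": call["name"], "content": result})

print(asyncio.run(run_flow("What dolphin species are endangered?")))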
Contributing
Contributions are welcome! Please submit a pull request.
License
[Add your license information here]