MCP-Bridge
by: SecretiveShell
A middleware to provide an OpenAI-compatible endpoint that can call MCP tools
MCP-Bridge acts as a bridge between the OpenAI API and MCP tools, allowing developers to leverage MCP tools through the OpenAI API interface.
Overview
MCP-Bridge integrates MCP tools with the OpenAI API by providing OpenAI-compatible endpoints, so any client that supports the OpenAI API can use MCP tools without explicit MCP support.
Current Features
Working features:
- Non-streaming chat completions with MCP
- Streaming chat completions with MCP
- Non-streaming completions without MCP
- MCP tools
- MCP sampling
- SSE Bridge for external clients
Planned features:
- Streaming completions
- MCP resources support
Installation
The recommended installation method is Docker.
Docker Installation
1. Clone the repository
2. Edit the compose.yml file

Add a reference to the config.json file in one of the following ways:
- Mount the config file via a volume
- Provide an HTTP URL to download the config.json
- Embed the config JSON as an environment variable
Example environment variable options:
environment:
  - MCP_BRIDGE__CONFIG__FILE=config.json
  - MCP_BRIDGE__CONFIG__HTTP_URL=http://10.88.100.170:8888/config.json
  - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}
Example volume mount:
volumes:
  - ./config.json:/mcp_bridge/config.json
3. Run the service:
   docker-compose up --build -d
Manual Installation (No Docker)
1. Clone the repository
2. Set up dependencies:
   uv sync
3. Create config.json in the root directory
Example config:
{
  "inference_server": {
    "base_url": "http://example.com/v1",
    "api_key": "None"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
4. Run the application:
   uv run mcp_bridge/main.py
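To verify the service is up, you can request the interactive docs page mentioned below. This is a minimal sketch assuming the default host and port (localhost:8000):

# Minimal liveness check for MCP-Bridge (assumes default localhost:8000).
import requests

resp = requests.get("http://localhost:8000/docs", timeout=5)
print(f"MCP-Bridge docs endpoint returned HTTP {resp.status_code}")
resp.raise_for_status()  # raises if the bridge is not serving the docs page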
Usage
Once running, interact with MCP-Bridge using the OpenAI API.
Documentation is available at:
http://yourserver:8000/docs
This includes an endpoint to list all available MCP tools for testing.
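For example, here is a minimal sketch using the official openai Python client. The /v1 path prefix and the model name are assumptions, not confirmed by this README; check the interactive docs and your inference server for the actual values.

# Sketch: chat completion through MCP-Bridge's OpenAI-compatible API.
# Assumptions: bridge at localhost:8000, routes under /v1, backend model "gpt-4o".
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # point the client at MCP-Bridge
    api_key="unused",                     # required by the client; ignored when auth is disabled
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Fetch https://example.com and summarise it"}],
)
print(response.choices[0].message.content)

The model can now call any tool exposed by your configured MCP servers; MCP-Bridge handles the tool-call round trips before returning the final message.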
REST API Endpoints
MCP-Bridge exposes REST API endpoints for interacting with native MCP primitives, letting you offload MCP server management to the bridge while retaining full functionality.
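As an illustration, here is a sketch of listing the tools the bridge has discovered over plain HTTP. The /mcp/tools route is a placeholder assumption, not confirmed by this README; consult the interactive docs at /docs for the actual paths.

# Hypothetical sketch: list MCP tools via the bridge's REST API.
# The "/mcp/tools" path is an assumption - check /docs for the real route.
import requests

resp = requests.get("http://localhost:8000/mcp/tools", timeout=5)
resp.raise_for_status()
print(resp.json())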
SSE Bridge
MCP-Bridge provides an SSE bridge for external clients, enabling chat apps with MCP support to use MCP-Bridge as an MCP server.
Use the SSE endpoint:
http://yourserver:8000/mcp-server/sse
Test your configuration with tools like:
npx @wong2/mcp-cli --sse http://localhost:8000/mcp-server/sse
For STDIO-only MCP clients (e.g., Claude Desktop), use a tool such as mcp-gateway.
Configuration
To add MCP servers, edit config.json.
API Key Authentication
Enable API key authentication by adding an api_key field to config.json:
{
  "api_key": "your-secure-api-key-here"
}
Include the API key in requests via the Authorization header:
Authorization: Bearer your-secure-api-key-here
If api_key is missing or empty, authentication is disabled for compatibility.
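For instance, a minimal sketch of an authenticated request using the requests library; the host, /v1 route, and model name are assumptions to swap for your own values:

# Sketch: authenticated chat completion (assumes localhost:8000 and a /v1 route).
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    headers={"Authorization": "Bearer your-secure-api-key-here"},
    json={
        "model": "gpt-4o",  # assumed model name on the backing inference server
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])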
Full Configuration Example
{
  "inference_server": {
    "base_url": "http://localhost:8000/v1",
    "api_key": "None"
  },
  "sampling": {
    "timeout": 10,
    "models": [
      {
        "model": "gpt-4o",
        "intelligence": 0.8,
        "cost": 0.9,
        "speed": 0.3
      },
      {
        "model": "gpt-4o-mini",
        "intelligence": 0.4,
        "cost": 0.1,
        "speed": 0.7
      }
    ]
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch"
      ]
    }
  },
  "network": {
    "host": "0.0.0.0",
    "port": 9090
  },
  "logging": {
    "log_level": "DEBUG"
  },
  "api_key": "your-secure-api-key-here"
}
| Section | Description |
|---|---|
| inference_server | Inference server configuration |
| sampling | Sampling configuration |
| mcp_servers | MCP server configuration |
| network | Network (uvicorn) configuration |
| logging | Logging configuration |
| api_key | API key for server authentication |
How It Works
The application acts as a proxy between the OpenAI API and the inference engine. It adds tool definitions for MCP tools to incoming requests, forwards them to the inference engine, manages tool calls, and then returns the combined response.
Sequence:
sequenceDiagram
    participant OpenWebUI as Open Web UI
    participant MCPProxy as MCP Proxy
    participant MCPserver as MCP Server
    participant InferenceEngine as Inference Engine

    OpenWebUI ->> MCPProxy: Request
    MCPProxy ->> MCPserver: list tools
    MCPserver ->> MCPProxy: list of tools
    MCPProxy ->> InferenceEngine: Forward Request
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> MCPserver: call tool
    MCPserver ->> MCPProxy: tool response
    MCPProxy ->> InferenceEngine: llm uses tool response
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> OpenWebUI: Return Response
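In code terms, the proxy's tool-call loop looks roughly like the sketch below. The helper names (list_mcp_tools, call_mcp_tool, call_inference_engine) are illustrative assumptions, not MCP-Bridge's actual internals:

# Illustrative sketch of the proxy's tool-call loop (not MCP-Bridge's real code).
# Assumes helpers list_mcp_tools, call_mcp_tool, and call_inference_engine exist.

def handle_chat_request(request: dict) -> dict:
    # Inject the MCP tool definitions into the incoming OpenAI-style request.
    request["tools"] = list_mcp_tools()
    messages = request["messages"]

    while True:
        response = call_inference_engine({**request, "messages": messages})
        message = response["choices"][0]["message"]

        tool_calls = message.get("tool_calls")
        if not tool_calls:
            return response  # model answered directly; return it to the client

        # Execute each requested tool on its MCP server and feed results back.
        messages.append(message)
        for call in tool_calls:
            result = call_mcp_tool(call["function"]["name"], call["function"]["arguments"])
            messages.append({"role": "tool", "tool_call_id": call["id"], "content": result})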
Contribution Guidelines
Contributions are welcome! To contribute:
- Fork the repository.
- Create a new branch for your feature or bug fix.
- Make your changes and commit them.
- Push to your fork.
- Create a pull request.
License
MCP-Bridge is licensed under the MIT License. See the LICENSE file for details.
Support
If you encounter issues, please open an issue or join the Discord:
https://discord.gg/4NVQHqNxSZ