
MCP-Bridge

by: SecretiveShell

A middleware that provides an OpenAI-compatible endpoint that can call MCP tools


📌 Overview

Purpose: MCP-Bridge connects the OpenAI API to MCP tools, making MCP resources usable through a simplified interface.

Overview: MCP-Bridge integrates MCP tools with the OpenAI API by providing a set of compatible endpoints, so developers can interact with MCP tools from any client that supports the OpenAI API.

Key Features:

  • Non-streaming and Streaming Completions: Offers both non-streaming and streaming chat completions with MCP, making it versatile for various use cases.

  • SSE Bridge for External Clients: Provides an SSE endpoint that allows external chat applications to utilize MCP tools effectively, fostering better integration and usability.


MCP-Bridge

MCP-Bridge acts as a bridge between the OpenAI API and MCP tools, allowing developers to leverage MCP tools through the OpenAI API interface.

Overview

MCP-Bridge integrates MCP tools with the OpenAI API by exposing OpenAI-compatible endpoints. This allows any client to use MCP tools without explicit support for MCP.

Current Features

Working features:

  • Non-streaming chat completions with MCP
  • Streaming chat completions with MCP (see the sketch after this list)
  • Non-streaming completions without MCP
  • MCP tools
  • MCP sampling
  • SSE Bridge for external clients
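
For illustration, a streaming chat completion through the bridge might look like this in Python (a minimal sketch, assuming the bridge listens on http://localhost:8000, the openai client library is installed, and the model name is one your inference server actually serves):

# Stream a chat completion through MCP-Bridge; the model can call MCP tools
# (such as fetch) transparently during generation.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="None")

stream = client.chat.completions.create(
    model="gpt-4o",  # placeholder: use a model your inference server serves
    messages=[{"role": "user", "content": "Fetch https://example.com and summarise it."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)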

Planned features:

  • Streaming completions
  • MCP resources support

Installation

The recommended method is using Docker.

Docker Installation

  1. Clone the repository

  2. Edit the compose.yml file

Add a reference to the config.json file in one of the following ways:

  • Mount the config file via volume
  • Provide a HTTP URL to download the config.json
  • Embed the config JSON as an environment variable

Example environment variable options:

environment:
  - MCP_BRIDGE__CONFIG__FILE=config.json
  - MCP_BRIDGE__CONFIG__HTTP_URL=http://10.88.100.170:8888/config.json
  - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}

Example volume mount:

volumes:
  - ./config.json:/mcp_bridge/config.json

  3. Run the service
docker-compose up --build -d

Manual Installation (No Docker)

  1. Clone the repository

  2. Set up dependencies:

uv sync

  3. Create a config.json file in the root directory

Example config:

{
   "inference_server": {
      "base_url": "http://example.com/v1",
      "api_key": "None"
   },
   "mcp_servers": {
      "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"]
      }
   }
}

  4. Run the application:
uv run mcp_bridge/main.py

Usage

Once running, interact with MCP-Bridge using the OpenAI API.

Documentation is available at:
http://yourserver:8000/docs

This includes an endpoint to list all available MCP tools for testing.
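
For example, a plain (non-streaming) request works with any OpenAI-compatible client. A minimal sketch with the openai Python library (the base URL and model name are assumptions; adjust them to your deployment):

from openai import OpenAI

client = OpenAI(base_url="http://yourserver:8000/v1", api_key="None")

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": "What tools do you have access to?"}],
)
print(response.choices[0].message.content)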

REST API Endpoints

MCP-Bridge exposes REST API endpoints for interacting with MCP native primitives, allowing you to outsource MCP server complexity while retaining full functionality.
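
As a rough sketch of calling such an endpoint from Python (the path below is hypothetical; check http://yourserver:8000/docs for the actual routes):

import requests

# Hypothetical route for listing MCP tools -- consult /docs for the real path.
resp = requests.get("http://yourserver:8000/mcp/tools")
resp.raise_for_status()
print(resp.json())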

SSE Bridge

MCP-Bridge provides an SSE bridge for external clients, enabling chat apps with MCP support to use MCP-Bridge as an MCP server.

Use the SSE endpoint:
http://yourserver:8000/mcp-server/sse

Test your configuration with tools like:
npx @wong2/mcp-cli --sse http://localhost:8000/mcp-server/sse

For STDIO-only MCP clients (e.g., Claude Desktop), use tools such as mcp-gateway.
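
Programmatic MCP clients can also connect to the SSE endpoint directly. A minimal sketch with the official mcp Python SDK (assuming it is installed; the URL matches the endpoint above):

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Open an SSE connection to MCP-Bridge and list the tools it exposes.
    async with sse_client("http://localhost:8000/mcp-server/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())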

Configuration

To add MCP servers, edit config.json.

API Key Authentication

Enable API key authentication by adding an api_key field to config.json:

{
    "api_key": "your-secure-api-key-here"
}

Include the API key in requests via the Authorization header:

Authorization: Bearer your-secure-api-key-here

If api_key is missing or empty, authentication is disabled for compatibility.
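
With the openai Python client, passing the key as api_key sends it as a Bearer token automatically (a sketch; host, port, and model are placeholders):

from openai import OpenAI

# The client sends "Authorization: Bearer your-secure-api-key-here" on each request.
client = OpenAI(base_url="http://yourserver:8000/v1", api_key="your-secure-api-key-here")

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)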

Full Configuration Example

{
    "inference_server": {
        "base_url": "http://localhost:8000/v1",
        "api_key": "None"
    },
    "sampling": {
        "timeout": 10,
        "models": [
            {
                "model": "gpt-4o",
                "intelligence": 0.8,
                "cost": 0.9,
                "speed": 0.3
            },
            {
                "model": "gpt-4o-mini",
                "intelligence": 0.4,
                "cost": 0.1,
                "speed": 0.7
            }
        ]
    },
    "mcp_servers": {
        "fetch": {
            "command": "uvx",
            "args": [
                "mcp-server-fetch"
            ]
        }
    },
    "network": {
        "host": "0.0.0.0",
        "port": 9090
    },
    "logging": {
        "log_level": "DEBUG"
    },
    "api_key": "your-secure-api-key-here"
}
Section             Description
------------------  -----------------------------------
inference_server    Inference server configuration
sampling            Sampling configuration
mcp_servers         MCP server configuration
network             Network (uvicorn) configuration
logging             Logging configuration
api_key             API key for server authentication

How It Works

The application acts as a proxy between the OpenAI API and the inference engine. It adds tool definitions for MCP tools to incoming requests, forwards them to the inference engine, manages tool calls, and then returns the combined response.

Sequence:

sequenceDiagram
    participant OpenWebUI as Open Web UI
    participant MCPProxy as MCP Proxy
    participant MCPserver as MCP Server
    participant InferenceEngine as Inference Engine

    OpenWebUI ->> MCPProxy: Request
    MCPProxy ->> MCPserver: list tools
    MCPserver ->> MCPProxy: list of tools
    MCPProxy ->> InferenceEngine: Forward Request
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> MCPserver: call tool
    MCPserver ->> MCPProxy: tool response
    MCPProxy ->> InferenceEngine: llm uses tool response
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> OpenWebUI: Return Response
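
The same loop, as a conceptual Python sketch (illustrative only, not MCP-Bridge's actual internals; call_inference and call_mcp_tool stand in for the real HTTP and MCP plumbing):

from typing import Any, Callable

def run_proxy_loop(
    request: dict[str, Any],
    mcp_tools: list[dict[str, Any]],
    call_inference: Callable[[dict[str, Any]], dict[str, Any]],
    call_mcp_tool: Callable[[str, str], str],
) -> dict[str, Any]:
    # 1. Inject the MCP tool definitions into the client's request.
    request = {**request, "tools": mcp_tools}
    while True:
        # 2. Forward the augmented request to the inference engine.
        response = call_inference(request)
        message = response["choices"][0]["message"]
        tool_calls = message.get("tool_calls")
        if not tool_calls:
            # 4. No tool calls: return the final response to the client.
            return response
        # 3. Run each requested tool on its MCP server, append the results,
        #    and loop so the model can use them in the next turn.
        request["messages"].append(message)
        for call in tool_calls:
            result = call_mcp_tool(call["function"]["name"], call["function"]["arguments"])
            request["messages"].append(
                {"role": "tool", "tool_call_id": call["id"], "content": result}
            )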

Contribution Guidelines

Contributions are welcome! To contribute:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and commit them.
  4. Push to your fork.
  5. Create a pull request.

License

MCP-Bridge is licensed under the MIT License. See the LICENSE file for details.

Support

If you encounter issues, please open an issue or join the Discord:
https://discord.gg/4NVQHqNxSZ