
MCP-Bridge

by: SecretiveShell

A middleware to provide an OpenAI-compatible endpoint that can call MCP tools


📌 Overview

Purpose: MCP-Bridge serves as a connector between the OpenAI API and MCP tools, making MCP resources usable through a simplified interface.

Overview: MCP-Bridge integrates MCP tools with the OpenAI API by providing a set of compatible endpoints, so developers can interact with MCP tools from any client that supports the OpenAI API.

Key Features:

  • Non-streaming and Streaming Completions: Offers both non-streaming and streaming chat completions with MCP, making it versatile for various use cases.

  • SSE Bridge for External Clients: Provides an SSE endpoint that allows external chat applications to utilize MCP tools effectively, fostering better integration and usability.


MCP-Bridge

MCP-Bridge acts as a bridge between the OpenAI API and MCP (Model Context Protocol) tools, allowing developers to leverage MCP tools through the OpenAI API interface.

Overview

MCP-Bridge integrates MCP tools with the OpenAI API, providing a set of endpoints for interacting with them. This allows any client that supports the OpenAI API to use MCP tools, even without explicit MCP support.

Current Features

Working Features

  • Non-streaming chat completions with MCP
  • Streaming chat completions with MCP
  • Non-streaming completions without MCP
  • MCP tools interaction
  • MCP sampling
  • SSE Bridge for external clients

Planned Features

  • Streaming completions without MCP
  • Support for MCP resources

Installation

Recommended Installation via Docker

  1. Clone the repository

  2. Edit the compose.yml file to reference your config.json file in one of the following ways:

    • Mount the config.json using a volume
    • Provide an HTTP URL to download config.json
    • Directly set config.json as an environment variable

    Example:

    environment:
      - MCP_BRIDGE__CONFIG__FILE=config.json
      - MCP_BRIDGE__CONFIG__HTTP_URL=http://your.url/config.json
      - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}
    
  3. Run the service

    docker-compose up --build -d
    

Manual Installation (No Docker)

  1. Clone the repository
  2. Install dependencies
    uv sync
    
  3. Create a config.json file in the root directory with the necessary configuration.
  4. Run the application
    uv run mcp_bridge/main.py
    

Usage

Interact with the application using the OpenAI API. Interactive API documentation is available at http://yourserver:8000/docs, which includes an endpoint for listing all available MCP tools.
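
For example, any standard OpenAI client library can be pointed at MCP-Bridge by changing its base URL. The sketch below uses the official openai Python package; the server address, model name, and prompt are placeholders, and the available model names depend on the inference server configured behind the bridge.

    from openai import OpenAI

    # Point a standard OpenAI client at MCP-Bridge instead of api.openai.com.
    # The base URL and model name are placeholders for your own deployment.
    client = OpenAI(base_url="http://yourserver:8000/v1", api_key="None")

    # A normal chat completion request; MCP-Bridge resolves any MCP tool calls
    # the model makes before returning the final response.
    response = client.chat.completions.create(
        model="your-model-name",
        messages=[{"role": "user", "content": "Fetch https://example.com and summarize it."}],
    )

    print(response.choices[0].message.content)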

REST API Endpoints

MCP-Bridge exposes REST API endpoints for interacting with native MCP primitives directly, simplifying work with MCP servers without giving up functionality.
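
As a sketch of how such an endpoint might be called over plain HTTP, the snippet below lists the available MCP tools using the requests library. The /mcp/tools path is assumed for illustration; consult the interactive documentation at /docs for the routes your version actually exposes.

    import requests

    # Hypothetical route for listing MCP tools; check http://yourserver:8000/docs
    # for the endpoints actually exposed by your MCP-Bridge version.
    resp = requests.get("http://yourserver:8000/mcp/tools")
    resp.raise_for_status()

    # Print whatever the server returns describing the available tools.
    print(resp.json())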

SSE Bridge

MCP-Bridge provides an SSE bridge for external chat applications to access MCP tools. To test your configuration, you may use tools like wong2/mcp-cli.
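
Any MCP client that speaks the SSE transport should also be able to connect directly. The sketch below uses the official MCP Python SDK; the SSE endpoint path is an assumption, so verify the actual URL against your instance's documentation.

    import asyncio
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def main():
        # The SSE endpoint path below is an assumption; verify it against your
        # MCP-Bridge instance before relying on it.
        async with sse_client("http://yourserver:8000/mcp/server/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())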

Configuration

Edit the config.json file to add new MCP servers. API key authentication can be enabled by adding an api_key field.

Full Configuration Example

{
    "inference_server": {
        "base_url": "http://localhost:8000/v1",
        "api_key": "None"
    },
    "mcp_servers": {
        "fetch": {
            "command": "uvx",
            "args": ["mcp-server-fetch"]
        }
    },
    "network": {
        "host": "0.0.0.0",
        "port": 9090
    },
    "logging": {
        "log_level": "DEBUG"
    },
    "api_key": "your-secure-api-key-here"
}
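
With the api_key field set as above, clients must present that key when calling the bridge. Assuming the key is accepted as a standard bearer token (the usual convention for OpenAI-compatible servers), an OpenAI client would be configured roughly like this, using the port from the network section above:

    from openai import OpenAI

    # Matches the network section and api_key from the config above; whether the
    # key is checked as a bearer token is an assumption of this sketch.
    client = OpenAI(
        base_url="http://localhost:9090/v1",
        api_key="your-secure-api-key-here",
    )

Requests made with this client then flow through the bridge exactly as in the Usage section.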

Support

For issues, please open an issue on GitHub or join the Discord community.

Contribution Guidelines

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and commit them.
  4. Push your changes to your fork.
  5. Create a pull request to the main repository.

License

MCP-Bridge is licensed under the MIT License. See the LICENSE file for more information.