gomcptest

by: owulveryck

A proof-of-concept demonstrating a custom-built host implementing an OpenAI-compatible API with Google Vertex AI, function calling, and interaction with MCP servers.

Created: 27/01/2025
Tags: OpenAI, VertexAI

📌 Overview

Purpose: The main goal of this project is to demonstrate the implementation of a Model Context Protocol (MCP) with a custom host, facilitating the testing and development of various agentic systems.

Overview: This project is a proof of concept showing how to use MCP with a custom-built host. The codebase is written largely from scratch so the underlying mechanisms are easy to understand, enabling the creation and testing of specialized agents for diverse applications.

Key Features:

  • OpenAI Compatibility: Ensures interoperability with the OpenAI v1 chat completion format, allowing seamless integration.

  • Google Gemini Integration: Leverages VertexAI API for interaction with Google Gemini models, enhancing capabilities with advanced AI features.

  • Streaming Support: Facilitates real-time data streaming responses from the server, optimizing user interactions.

  • Function Calling: Supports the ability for Gemini to invoke external functions, integrating their results into chat responses efficiently.

  • MCP Server Interaction: Provides a practical demonstration of interacting with MCP servers to execute tools and feed their results back into the conversation.

  • Single Chat Session: Maintains a single chat session throughout interactions; new conversations do not spawn additional sessions.


gomcptest: Proof of Concept for MCP with Custom Host

This project is a proof of concept (POC) demonstrating how to implement a Model Context Protocol (MCP) with a custom-built host to experiment with agentic systems. The code is primarily written from scratch to provide a clear understanding of the underlying mechanisms.

See the auto-generated documentation on the experimental website at https://owulveryck.github.io/gomcptest/

Goal

The primary goal is to enable easy testing of agentic systems through the Model Context Protocol. For example:

  • Use the dispatch_agent to scan codebases for security vulnerabilities.
  • Create code review agents that analyze pull requests for potential issues.
  • Build data analysis agents that process and visualize complex datasets.
  • Develop automated documentation agents that generate comprehensive docs from code.

These specialized agents can be easily tested and iterated upon using the tools provided in this repository.

Prerequisites

  • Go >= 1.21
  • Access to the Vertex AI API on Google Cloud Platform
  • github.com/mark3labs/mcp-go

The tools use the default GCP login credentials configured by gcloud auth login.
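
For example, a minimal credential setup might look like this (a sketch: the project ID is a placeholder, and the config step is only needed if your default project is not the one with Vertex AI access):

# Authenticate with Google Cloud
gcloud auth login

# Point gcloud at the project that has the Vertex AI API enabled
gcloud config set project your-project-id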

Project Structure

  • host/openaiserver: Implements a custom host that mimics the OpenAI API, using Google Gemini and function calling. This is the core of the POC.
  • tools: Contains various MCP-compatible tools usable with the host (a raw stdio probe sketch follows this list):
    • Bash: Execute bash commands
    • Edit: Edit file contents
    • GlobTool: Find files matching glob patterns
    • GrepTool: Search file contents with regular expressions
    • LS: List directory contents
    • Replace: Replace entire file contents
    • View: View file contents
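
Since the host launches each tool as an MCP server over stdio (see the cliGCP example further down), a built tool binary can be probed directly with raw JSON-RPC messages. The sketch below is illustrative only; the protocol version string and the exact responses depend on the MCP library version:

# From the bin directory, after building: ask the LS server which tools it exposes
(
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.1"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
) | ./LS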

Components

Key Features

  • OpenAI Compatibility: API is compatible with the OpenAI v1 chat completion format (see the example request after this list).
  • Google Gemini Integration: Utilizes the VertexAI API to interact with Google Gemini models.
  • Streaming Support: Supports streaming responses.
  • Function Calling: Allows Gemini to call external functions and incorporate their results into chat responses.
  • MCP Server Interaction: Demonstrates interaction with MCP (Model Context Protocol) servers for tool execution.
  • Single Chat Session: Uses a single chat session; new conversations do not trigger new sessions.
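
To illustrate the OpenAI compatibility, a request in the standard v1 chat completion shape should work against a running server (see the Quickstart below). This is a sketch: the /v1/chat/completions path follows the OpenAI convention and the model name matches the GEMINI_MODELS example, both assumptions rather than guaranteed values:

curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "messages": [
      {"role": "user", "content": "List the files in the current directory"}
    ]
  }'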

Building the Tools

Build all tools using the included Makefile:

# Build all tools
make all

# Or build individual tools
make Bash
make Edit
make GlobTool
make GrepTool
make LS
make Replace
make View

Configuration

Set the required environment variables as shown in .envrc in the bin directory:

export GCP_PROJECT=your-project-id
export GCP_REGION=your-region
export GEMINI_MODELS=gemini-2.0-flash
export IMAGEN_MODELS=imagen-3.0-generate-002
export IMAGE_DIR=/tmp/images
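
If you use direnv, running direnv allow in the bin directory loads these automatically; otherwise the file can be sourced by hand (a sketch assuming the variables live in bin/.envrc):

# Load the environment variables into the current shell
cd bin
source .envrc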

Testing the CLI

Test the CLI tool from the bin directory:

./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./dispatch_agent -glob-path ./GlobTool -grep-path ./GrepTool -ls-path ./LS -view-path ./View;./Bash;./Replace"

Caution

⚠️ WARNING: These tools can execute commands and modify files on your system. Preferably run them inside a chroot or container environment to prevent potential damage to your system.
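
One way to follow this advice is to run everything inside a throwaway container with only the repository mounted. The image tag and mount point below are assumptions, not project requirements:

# Start an interactive Go container with the repository mounted at /work
docker run --rm -it -v "$PWD":/work -w /work golang:1.23 bash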

Quickstart for openaiserver

Prerequisites

  • Go installed and configured.
  • Environment variables properly set.

Running the Server

  1. Navigate to the host/openaiserver directory:

    cd host/openaiserver
    
  2. Set the required environment variables. For example:

    export GCP_PROJECT=your-gcp-project-id
    export IMAGE_DIR=/path/to/your/image/directory  # Directory must exist
    
  3. Run the server:

    go run .
    

    or

    go run main.go
    

The server will start and listen on the configured port (default: 8080).
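
From another terminal you can then check that the server answers, including in streaming mode. As in the earlier example, the endpoint path and model name are assumptions based on the OpenAI v1 convention and the configured GEMINI_MODELS:

# -N disables curl's output buffering so streamed chunks appear as they arrive
curl -N http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.0-flash", "messages": [{"role": "user", "content": "Say hello"}], "stream": true}'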

openaiserver Configuration

Global Configuration

Variable  | Description                          | Default | Required
----------|--------------------------------------|---------|---------
PORT      | The port the server listens on       | 8080    | No
LOG_LEVEL | Log level (DEBUG, INFO, WARN, ERROR) | INFO    | No
IMAGE_DIR | Directory to store images            |         | Yes

GCP Configuration

Variable      | Description                           | Default                         | Required
--------------|---------------------------------------|---------------------------------|---------
GCP_PROJECT   | Google Cloud Project ID               |                                 | Yes
GEMINI_MODELS | Comma-separated list of Gemini models | gemini-1.5-pro,gemini-2.0-flash | No
GCP_REGION    | Google Cloud Region                   | us-central1                     | No
IMAGEN_MODELS | Comma-separated list of Imagen models |                                 | No
IMAGE_DIR     | Directory to store images             |                                 | Yes
PORT          | The port the server listens on        | 8080                            | No

Notes

  • This is a proof of concept and has limitations.
  • The code is provided as-is for educational purposes to understand MCP implementation with a custom host.