
mcp-web-ui

by: MegaGrindStone

MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.

Created: 12/01/2025

📌 Overview

Purpose: To provide a user-friendly web interface for interacting with Large Language Models (LLMs) while managing context aggregation within the Model Context Protocol (MCP) architecture.

Overview: MCP Web UI simplifies AI language model interactions by providing a unified platform for multiple LLM providers, with real-time streaming chat, flexible model management, and robust context handling through the MCP protocol.

Key Features:

  • Multi-Provider LLM Integration: Supports various models from Anthropic, OpenAI, Ollama, and OpenRouter, allowing users to switch easily between them.

  • Intuitive Chat Interface: Provides an accessible, interactive chat environment for communicating with AI models.

  • Real-time Response Streaming: Uses Server-Sent Events (SSE) to stream responses to the browser as they are generated.

  • Dynamic Configuration Management: Offers customizable model settings, including parameters for fine-tuning AI behavior.

  • Advanced Context Aggregation: Uses the MCP protocol to aggregate context across interactions, maintaining conversational continuity.

  • Persistent Chat History: Stores past conversations in BoltDB so they can be revisited and reviewed later.

  • Flexible Model Selection: Lets users choose and configure different language models to suit their needs.


MCP Web UI

MCP Web UI is a web-based user interface designed to facilitate interaction with Large Language Models (LLMs) within the Model Context Protocol (MCP) architecture.

Overview

MCP Web UI simplifies interactions with AI language models by providing:

  • A unified interface for multiple LLM providers
  • Real-time, streaming chat experiences
  • Flexible configuration and model management
  • Robust context handling using the MCP protocol

Features

  • Multi-Provider LLM Integration:
    • Anthropic (Claude models)
    • OpenAI (GPT models)
    • Ollama (local models)
    • OpenRouter (multiple providers)
  • Intuitive Chat Interface
  • Real-time Response Streaming
  • Dynamic Configuration Management
  • Advanced Context Aggregation
  • Persistent Chat History
  • Flexible Model Selection

Prerequisites

  • Go 1.23+
  • Docker (optional)
  • API keys for desired LLM providers
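
If you want to verify the toolchain first, a quick check might look like this (Docker is only required if you plan to use the container deployment below):

go version        # should report go1.23 or newer
docker --version  # optional; only needed for the Docker deployment path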

Installation

Quick Start

  1. Clone the repository:

    git clone https://github.com/MegaGrindStone/mcp-web-ui.git
    cd mcp-web-ui
    
  2. Configure your environment:

    mkdir -p $HOME/.config/mcpwebui
    cp config.example.yaml $HOME/.config/mcpwebui/config.yaml
    
  3. Set up API keys:

    export ANTHROPIC_API_KEY=your_anthropic_key
    export OPENAI_API_KEY=your_openai_key
    export OPENROUTER_API_KEY=your_openrouter_key
    
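
If you plan to use the Ollama provider, no API key is required; instead, make sure a local Ollama instance is running and has the model referenced in your configuration. The model name below is only an example:

ollama serve &        # start the local Ollama server if it is not already running
ollama pull llama3.1  # example model; pull whichever model your config names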

Running the Application

Local Development

go mod download
go run ./cmd/server/main.go
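
If you prefer a compiled binary, the equivalent steps are shown below; the binary name is arbitrary, and running it assumes the config created earlier at $HOME/.config/mcpwebui/config.yaml:

go build -o mcp-web-ui ./cmd/server
./mcp-web-ui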

Docker Deployment

docker build -t mcp-web-ui .
docker run -p 8080:8080 \
  -v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
  -e ANTHROPIC_API_KEY \
  -e OPENAI_API_KEY \
  -e OPENROUTER_API_KEY \
  mcp-web-ui
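
For repeatable deployments, the same container can be described with Docker Compose. The snippet below is only a sketch that mirrors the docker run command above, not a file shipped with the repository:

services:
  mcp-web-ui:
    build: .
    ports:
      - "8080:8080"
    volumes:
      - ${HOME}/.config/mcpwebui/config.yaml:/app/config.yaml
    environment:
      - ANTHROPIC_API_KEY
      - OPENAI_API_KEY
      - OPENROUTER_API_KEY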

Configuration

The configuration file (config.yaml) controls how MCP Web UI runs. Key settings include:

Server Configuration

  • port: Server port (default: 8080)
  • logLevel: Logging verbosity (options: debug, info, warn, error; default: info)

LLM Configuration

  • provider: One of ollama, anthropic, openai, or openrouter
  • model: Name of the specific model
  • parameters: Customize model behavior (e.g., temperature, maxTokens)

Example Configuration Snippet

port: 8080
logLevel: info
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  parameters:
    temperature: 0.7
    maxTokens: 1000
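
Switching providers only changes the llm block. As a sketch, a local Ollama setup could look like the following; the model name is illustrative, and any provider-specific connection settings (such as the Ollama endpoint) should follow config.example.yaml:

llm:
  provider: ollama
  model: llama3.1   # illustrative; use a model you have pulled locally
  parameters:
    temperature: 0.7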

Project Structure

  • cmd/: Application entry point
  • internal/handlers/: Web request handlers
  • internal/models/: Data models
  • internal/services/: LLM provider integrations
  • static/: Static assets (CSS)
  • templates/: HTML templates

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push and create a Pull Request

License

MIT License