
MCP Web UI

MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.

🌟 Overview

MCP Web UI is designed to simplify and enhance interactions with AI language models by providing:

  • A unified interface for multiple LLM providers
  • Real-time, streaming chat experiences
  • Flexible configuration and model management
  • Robust context handling using the MCP protocol

πŸš€ Features

  • πŸ€– Multi-Provider LLM Integration:
    • Anthropic (Claude models)
    • OpenAI (GPT models)
    • Ollama (local models)
    • OpenRouter (multiple providers)
  • πŸ’¬ Intuitive Chat Interface
  • πŸ”„ Real-time Response Streaming via Server-Sent Events (SSE)
  • πŸ”§ Dynamic Configuration Management
  • πŸ“Š Advanced Context Aggregation
  • πŸ’Ύ Persistent Chat History using BoltDB
  • 🎯 Flexible Model Selection
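
The streaming feature above relies on the SSE wire format, in which the server emits `data:` lines separated by blank lines. As a rough illustration only (this is not the project's actual code; `parseSSE` is a hypothetical helper), a client can reassemble streamed chunks like this:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseSSE extracts the data payloads from a raw Server-Sent Events
// stream. Events are separated by blank lines; each "data:" line
// carries one chunk of the streamed response.
func parseSSE(stream string) []string {
	var events []string
	var buf strings.Builder
	scanner := bufio.NewScanner(strings.NewReader(stream))
	for scanner.Scan() {
		line := scanner.Text()
		switch {
		case strings.HasPrefix(line, "data:"):
			buf.WriteString(strings.TrimSpace(strings.TrimPrefix(line, "data:")))
		case line == "" && buf.Len() > 0:
			// A blank line terminates the current event.
			events = append(events, buf.String())
			buf.Reset()
		}
	}
	if buf.Len() > 0 {
		events = append(events, buf.String())
	}
	return events
}

func main() {
	raw := "data: Hello\n\ndata: world\n\n"
	fmt.Println(parseSSE(raw)) // prints [Hello world]
}
```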

πŸ“‹ Prerequisites

  • Go 1.23+
  • Docker (optional)
  • API keys for desired LLM providers

πŸ›  Installation

Quick Start

  1. Clone the repository:

    git clone https://github.com/MegaGrindStone/mcp-web-ui.git
    cd mcp-web-ui
    
  2. Configure your environment:

    mkdir -p $HOME/.config/mcpwebui
    cp config.example.yaml $HOME/.config/mcpwebui/config.yaml
    
  3. Set up API keys:

    export ANTHROPIC_API_KEY=your_anthropic_key
    export OPENAI_API_KEY=your_openai_key
    export OPENROUTER_API_KEY=your_openrouter_key
    

Running the Application

Local Development

go mod download
go run ./cmd/server/main.go

Docker Deployment

docker build -t mcp-web-ui .
docker run -p 8080:8080 \
  -v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
  -e ANTHROPIC_API_KEY \
  -e OPENAI_API_KEY \
  -e OPENROUTER_API_KEY \
  mcp-web-ui
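
If you prefer Docker Compose, the `docker run` command above can be expressed roughly as follows (a hypothetical `docker-compose.yaml`; the project does not necessarily ship one):

```yaml
services:
  mcp-web-ui:
    build: .
    ports:
      - "8080:8080"
    volumes:
      - $HOME/.config/mcpwebui/config.yaml:/app/config.yaml
    environment:
      - ANTHROPIC_API_KEY
      - OPENAI_API_KEY
      - OPENROUTER_API_KEY
```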

πŸ”§ Configuration

The configuration file (config.yaml) provides comprehensive settings for customizing the MCP Web UI.

Server Configuration

  • port: The port on which the server will run (default: 8080)
  • logLevel: Logging verbosity (debug, info, warn, error; default: info)
  • logMode: Log output format (json, text; default: text)
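
Taken together, these keys might look like this in `config.yaml` (values illustrative):

```yaml
port: 8080
logLevel: info   # debug, info, warn, error
logMode: text    # json, text
```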

Prompt Configuration

  • systemPrompt: Default system prompt for the AI assistant
  • titleGeneratorPrompt: Prompt used to generate chat titles
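
For example (prompt text illustrative):

```yaml
systemPrompt: You are a helpful assistant.
titleGeneratorPrompt: Generate a short, descriptive title for this conversation.
```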

LLM (Language Model) Configuration

The llm section supports multiple providers with provider-specific configurations:

Common LLM Parameters

  • provider: Choose from ollama, anthropic, openai, openrouter
  • model: Specific model name (e.g., 'claude-3-5-sonnet-20241022')
  • parameters: Fine-tune model behavior:
    • temperature: Randomness of responses (0.0-1.0)
    • topP: Nucleus sampling threshold
    • topK: Number of highest probability tokens to keep
    • frequencyPenalty: Reduce repetition
    • presencePenalty: Encourage new topics
    • maxTokens: Maximum response length
    • stop: Sequences to stop generation
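
Put together, an `llm` section using these common keys might look like this (values illustrative; the keys are the ones documented above):

```yaml
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  parameters:
    temperature: 0.7
    topP: 0.9
    maxTokens: 1000
    stop:
      - "User:"
```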

Provider-Specific Configurations

  • Ollama: see config.example.yaml for its provider-specific settings.

  • Anthropic:

    • apiKey: Anthropic API key (can use ANTHROPIC_API_KEY env variable)
    • maxTokens: Maximum token limit
    • Stop sequences consisting only of whitespace are ignored; other stop sequences have surrounding whitespace trimmed.
  • OpenAI:

    • apiKey: OpenAI API key (can use OPENAI_API_KEY env variable)
    • endpoint: OpenAI API endpoint (default: https://api.openai.com/v1)
    • For alternative OpenAI-compatible APIs, see the project's discussions.
  • OpenRouter:

    • apiKey: OpenRouter API key (can use OPENROUTER_API_KEY env variable)

Title Generator Configuration

The genTitleLLM section allows separate configuration for title generation, defaulting to the main LLM if not specified.

MCP Server Configurations

  • mcpSSEServers: Configure Server-Sent Events (SSE) servers

    • url: SSE server URL
    • maxPayloadSize: Maximum payload size
  • mcpStdIOServers: Configure Standard Input/Output servers

    • command: Command to run server
    • args: Arguments for the server command

Example MCP Server Configurations

SSE Server Example:

mcpSSEServers:
  filesystem:
    url: https://yoursseserver.com
    maxPayloadSize: 1048576 # 1MB

StdIO Server Examples:

  1. Using the official filesystem MCP server:

    mcpStdIOServers:
      filesystem:
        command: npx
        args:
          - -y
          - "@modelcontextprotocol/server-filesystem"
          - "/path/to/your/files"

  2. Using a go-mcp filesystem MCP server:

    mcpStdIOServers:
      filesystem:
        command: go
        args:
          - run
          - github.com/your_username/your_app # Replace with your app
          - -path
          - "/data/mcp/filesystem"

Example Configuration Snippet

port: 8080
logLevel: info
systemPrompt: You are a helpful assistant.

llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  maxTokens: 1000
  parameters:
    temperature: 0.7

genTitleLLM:
  provider: openai
  model: gpt-3.5-turbo

πŸ— Project Structure

  • cmd/: Application entry point
  • internal/handlers/: Web request handlers
  • internal/models/: Data models
  • internal/services/: LLM provider integrations
  • static/: Static assets (CSS)
  • templates/: HTML templates

🀝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push and create a Pull Request

πŸ“„ License

MIT License