MCP Hub

chatmcp
by: daodao97

ChatMCP is an AI chat client implementing the Model Context Protocol (MCP).

created 08/12/2024
Tags: AI, chat
📌 Overview

Purpose:
To provide a cross-platform AI chat client enabling users to connect with multiple LLMs (Large Language Models) and MCP servers on macOS, Windows, Linux, iOS, and Android devices.

Overview:
chatmcp is a versatile, open-source chat client designed for seamless interaction with various AI models and servers. By supporting popular platforms and offering easy configuration, it empowers users to chat, generate images, preview code, and leverage advanced AI functionalities in a unified interface.

Key Features:

  • Cross-platform Compatibility:
    Runs on macOS, Windows, Linux, iOS, and Android, ensuring a consistent user experience across devices.

  • Multi-Model and Server Support:
    Integrates with several LLM providers (OpenAI, Claude, Ollama, DeepSeek) and allows chatting with different MCP servers for flexible AI interaction.

  • Interactive Chat and History:
    Facilitates real-time conversations and stores chat history for easy access and continuity.

  • Advanced Artifact Generation:
    Includes image generation (DALL·E, etc.), HTML code preview, and diagram support (e.g., Mermaid), expanding the scope of AI-assisted tasks.

  • Customizable Themes:
    Offers dark and light modes for improved usability and comfort.


chatmcp

Cross-platform AI Chat Client: macOS, Windows, Linux, iOS, Android

Install

macOS      Windows    Linux       iOS          Android
Release    Release    Release¹    TestFlight   Release

¹ On Linux, install libsqlite3-0 and libsqlite3-dev:

sudo apt-get install libsqlite3-0 libsqlite3-dev

Features

  • Chat with MCP Server
  • SSE MCP Transport Support
  • Auto Choose MCP Server
  • Chat History
  • OpenAI, Claude, Ollama, DeepSeek LLM Model Support
  • Dark/Light Theme
  • Upcoming: MCP Server Market, Auto Install MCP Server, RAG, Better UI Design

Feedback and contributions are welcome via GitHub Issues.

Usage

Ensure your system has uvx or npx installed:

# uvx
brew install uv

# npx
brew install node

Steps:

  1. Configure your LLM API Key and Endpoint on the Settings page
  2. Install an MCP Server from the MCP Server page
  3. Start chatting with the MCP Server
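Step 2 depends on uvx or npx being available to launch MCP servers. A quick generic shell check (not part of chatmcp itself) to confirm both runners are on your PATH before installing servers:

```shell
# Report whether the runners chatmcp uses to launch MCP servers are installed.
for tool in uvx npx; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: NOT found -- install it first (see above)"
  fi
done
```

If either runner is reported missing, install it with the brew commands shown earlier.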

Debugging and App Data Locations

  • macOS: ~/Library/Application Support/ChatMcp
  • Windows: %APPDATA%\ChatMcp
  • Linux: ~/.local/share/ChatMcp
  • Mobile: Application Documents Directory

To reset the app:

macOS:

rm -rf ~/Library/Application\ Support/ChatMcp

Windows:

rd /s /q "%APPDATA%\ChatMcp"

Linux:

rm -rf ~/.local/share/ChatMcp

Development

flutter pub get
flutter run -d macos

On macOS, the MCP server configuration file is at ~/Library/Application Support/ChatMcp/mcp_server.json (the other platforms use the corresponding app data directories listed above).
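The exact schema of mcp_server.json is defined by chatmcp, but MCP clients conventionally store a map of server entries, each with a launch command and arguments. A hypothetical entry might look like the following (the server name, command, args, and path are illustrative assumptions, not the app's defaults):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```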

MCP Server Market

Install and use MCP Server from the MCP Server Market to chat with different data sources.

Thanks

License

This project is licensed under the Apache License 2.0.