unichat-mcp-server
by: amidabuddha
📌 Overview
Purpose: The Unichat MCP Server facilitates interaction with various AI models through the MCP protocol, enabling developers to send requests and retrieve responses efficiently.
Overview: Unichat MCP Server is a Python-based server designed for seamless integration with multiple AI APIs such as OpenAI, MistralAI, and Google AI. Beyond raw requests, it exposes a set of predefined prompts for common code tasks such as review, documentation, explanation, and rework.
Key Features:
- Multiple AI Integration: Connects to various AI service providers through the MCP protocol, with standardized requests and responses.
- Customizable Prompts: Provides a set of predefined prompts (code_review, document_code, explain_code, and code_rework) to support code review and documentation workflows.
- Easy Installation and Configuration: Supports straightforward setup via Smithery or a manual JSON configuration, making it accessible for developers.
- Robust Tool Support: Implements a single tool, unichat, for sending messages to the selected model and returning its response.
Unichat MCP Server in Python
Also available in TypeScript
Send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, or Inception using the MCP protocol, either via a tool or via predefined prompts. A vendor API key is required.
Tools
The server implements one tool:
unichat: Send a request to unichat
- Takes "messages" as a required string argument
- Returns a response
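As an illustration, any MCP client can invoke this tool over stdio. The sketch below uses the official MCP Python SDK and is not part of this repository; it first prints the tool's advertised input schema and then calls unichat. The role/content message structure shown is an assumption and should be checked against the printed schema.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the published server over stdio. The model and API key values
# here are placeholders; pick a supported model (see Quickstart below).
server_params = StdioServerParameters(
    command="uvx",
    args=["unichat-mcp-server"],
    env={"UNICHAT_MODEL": "gpt-4o-mini", "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"},
)

async def call_unichat():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Print the tool's input schema so the argument shape can be verified.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, tool.inputSchema)
            # Call the unichat tool; the message structure below is an assumption.
            result = await session.call_tool(
                "unichat",
                arguments={
                    "messages": [
                        {"role": "system", "content": "You are a helpful assistant."},
                        {"role": "user", "content": "Say hello in one short sentence."},
                    ]
                },
            )
            print(result.content)

asyncio.run(call_unichat())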
Prompts
code_review
- Review code for best practices, potential issues, and improvements
- Arguments:
  - code (string, required): The code to review
document_code
- Generate documentation for code including docstrings and comments
- Arguments:
  - code (string, required): The code to comment
explain_code
- Explain how a piece of code works in detail
- Arguments:
  - code (string, required): The code to explain
code_rework
- Apply requested changes to the provided code
- Arguments:
  - changes (string, optional): The changes to apply
  - code (string, required): The code to rework
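These prompts are served through the standard MCP prompt interface, so a client can fetch a rendered prompt by name. A minimal sketch, reusing the imports and server_params from the tool example above; the prompt and argument names follow the list above:

async def review_snippet():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Retrieve the rendered code_review prompt for a small code snippet.
            prompt = await session.get_prompt(
                "code_review",
                arguments={"code": "def add(a, b):\n    return a + b"},
            )
            for message in prompt.messages:
                print(message.role, message.content)

asyncio.run(review_snippet())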
Quickstart
Install
Claude Desktop
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Supported Models:
A list of currently supported models to be used as "SELECTED_UNICHAT_MODEL"
may be found here:
https://github.com/amidabuddha/unichat/blob/main/unichat/models.py
Please make sure to add the relevant vendor API key as "YOUR_UNICHAT_API_KEY".
Example:
"env": {
"UNICHAT_MODEL": "gpt-4o-mini",
"UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
}
Development/Unpublished Servers Configuration:
"mcpServers": {
"unichat-mcp-server": {
"command": "uv",
"args": [
"--directory",
"{{your source code local directory}}/unichat-mcp-server",
"run",
"unichat-mcp-server"
],
"env": {
"UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
"UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
}
}
}
Published Servers Configuration:
"mcpServers": {
"unichat-mcp-server": {
"command": "uvx",
"args": [
"unichat-mcp-server"
],
"env": {
"UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
"UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
}
}
}
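For reference, the "mcpServers" block above sits at the top level of claude_desktop_config.json, so a complete file for the published server would look roughly like this (model and key values are placeholders):

{
  "mcpServers": {
    "unichat-mcp-server": {
      "command": "uvx",
      "args": ["unichat-mcp-server"],
      "env": {
        "UNICHAT_MODEL": "gpt-4o-mini",
        "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
      }
    }
  }
}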
Installing via Smithery
To install Unichat for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install unichat-mcp-server --client claude
Development
Building and Publishing
To prepare the package for distribution:
- Remove older builds:
rm -rf dist
- Sync dependencies and update lockfile:
uv sync
- Build package distributions:
uv build
This will create source and wheel distributions in the dist/ directory.
- Publish to PyPI:
uv publish --token {{YOUR_PYPI_API_TOKEN}}
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector uv --directory {{your source code local directory}}/unichat-mcp-server run unichat-mcp-server
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.