MCP-server-Deepseek_R1
by: 66julienmartin
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
Overview
Purpose: To implement a Model Context Protocol (MCP) server for the DeepSeek-R1 language model, optimized for reasoning tasks with a context window of 8192 tokens.
Overview: The Deepseek R1 MCP Server serves as a robust platform for leveraging the Deepseek R1 language model, built on Node.js and TypeScript. This framework supports the execution of advanced text generation tasks, integrates with Claude Desktop, and allows for easy configuration and setup.
Key Features:
- Advanced Text Generation: supports generation with an 8192-token context window, enabling detailed and extensive text outputs for various applications.
- Configurable Parameters: allows customization of generation parameters such as max_tokens and temperature, facilitating tailored behavior for different use cases.
- Robust Error Handling: provides detailed error messages for common API-related issues, enhancing the developer experience by simplifying troubleshooting.
- Full MCP Protocol Support: ensures seamless integration and operation within the MCP framework.
- Claude Desktop Integration: offers compatibility with Claude Desktop, enhancing user accessibility and experience.
- Model Flexibility: supports switching between the DeepSeek-R1 and DeepSeek-V3 models for diverse application needs.
Why Node.js?
This implementation uses Node.js/TypeScript for its stable integration with MCP servers, providing better type safety, error handling, and compatibility with Claude Desktop.
Quick Start
Installing Manually
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd MCP-server-Deepseek_R1
npm install
# Set up environment
cp .env.example .env # Then add your API key
# Build and run
npm run build
Prerequisites
- Node.js (v18 or higher)
- npm
- Claude Desktop
- Deepseek API key
Model Selection
By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3, modify the model name in src/index.ts:
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
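The two model identifiers above can also be kept behind a small helper so the rest of the server never hard-codes a string. A minimal sketch (the `resolveModel` helper and its alias names are illustrative, not code from this repository):

```typescript
// Illustrative helper: map a friendly alias to the DeepSeek API model id.
// "deepseek-reasoner" (R1) and "deepseek-chat" (V3) are the ids shown above;
// the alias names and this function are hypothetical conveniences.
type ModelAlias = "r1" | "v3";

function resolveModel(alias: ModelAlias): string {
  switch (alias) {
    case "r1":
      return "deepseek-reasoner"; // DeepSeek-R1 (the default)
    case "v3":
      return "deepseek-chat"; // DeepSeek-V3
  }
}
```

With a helper like this, switching models becomes a one-word change at the call site instead of an edit to a string literal.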
Project Structure
deepseek-r1-mcp/
├── src/
│   └── index.ts          # Main server implementation
├── build/                # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Configuration
1. Create a .env file:

   DEEPSEEK_API_KEY=your-api-key-here

2. Update Claude Desktop configuration:

   {
     "mcpServers": {
       "deepseek_r1": {
         "command": "node",
         "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
         "env": {
           "DEEPSEEK_API_KEY": "your-api-key"
         }
       }
     }
   }
Development
npm run dev # Watch mode
npm run build # Build for production
Features
- Advanced text generation with Deepseek R1 (8192 token context window)
- Configurable parameters (max_tokens, temperature)
- Robust error handling
- Full MCP protocol support
- Claude Desktop integration
- Support for both DeepSeek-R1 and DeepSeek-V3 models
API Usage
{
"name": "deepseek_r1",
"arguments": {
"prompt": "Your prompt here",
"max_tokens": 8192, // Maximum tokens to generate
"temperature": 0.2 // Controls randomness
}
}
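Before forwarding a request to the API, the arguments above can be sanity-checked. The sketch below is illustrative, not code from src/index.ts: the 8192-token ceiling comes from this README, and the 0–2 temperature range is an assumption based on common OpenAI-style APIs.

```typescript
// Illustrative validation for the tool arguments shown above.
// Limits: max_tokens bounded by the 8192-token context window (per this
// README); temperature assumed to follow the OpenAI-style 0-2 range.
interface GenerateArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

function validateArgs(args: GenerateArgs): string[] {
  const errors: string[] = [];
  if (!args.prompt || args.prompt.trim() === "") {
    errors.push("prompt must be a non-empty string");
  }
  if (args.max_tokens !== undefined && (args.max_tokens < 1 || args.max_tokens > 8192)) {
    errors.push("max_tokens must be between 1 and 8192");
  }
  if (args.temperature !== undefined && (args.temperature < 0 || args.temperature > 2)) {
    errors.push("temperature must be between 0 and 2");
  }
  return errors;
}
```

Returning a list of messages rather than throwing on the first problem lets the server report every invalid parameter at once.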
The Temperature Parameter
The default temperature is set to 0.2. Recommended values based on use case:

| Use case | Temperature |
|---|---|
| Coding / Math | 0.0 |
| Data Cleaning / Data Analysis | 1.0 |
| General Conversation | 1.3 |
| Translation | 1.3 |
| Creative Writing / Poetry | 1.5 |
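The recommendations above can be encoded as a simple lookup that falls back to the server default. The map keys and function name are illustrative, not part of this repository:

```typescript
// Recommended temperatures from the table above; keys are illustrative.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  "coding/math": 0.0,
  "data-analysis": 1.0,
  "conversation": 1.3,
  "translation": 1.3,
  "creative-writing": 1.5,
};

function temperatureFor(useCase: string): number {
  // Fall back to the server default (0.2) for unrecognized use cases.
  return RECOMMENDED_TEMPERATURE[useCase] ?? 0.2;
}
```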
Error Handling
The server provides detailed error messages for:
- API authentication errors
- Invalid parameters
- Rate limiting
- Network issues
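One common way to surface these four categories is to map HTTP status codes to user-facing messages. This is a sketch of that pattern, not the actual code in src/index.ts, and the messages here are hypothetical:

```typescript
// Illustrative mapping from API failure categories to messages.
// Status codes follow common HTTP conventions; the real messages in
// src/index.ts may differ.
function describeApiError(status?: number): string {
  if (status === undefined) {
    return "Network issue: could not reach the DeepSeek API";
  }
  if (status === 401 || status === 403) {
    return "API authentication error: check DEEPSEEK_API_KEY";
  }
  if (status === 400 || status === 422) {
    return "Invalid parameters: check prompt, max_tokens, and temperature";
  }
  if (status === 429) {
    return "Rate limited: retry after a short delay";
  }
  return `Unexpected API error (HTTP ${status})`;
}
```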
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT