
MCP-server-Deepseek_R1

by: 66julienmartin

A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)

created: 05/02/2025

📌 Overview

Purpose: To implement a Model Context Protocol (MCP) server for the DeepSeek R1 language model, optimized for reasoning tasks with an 8192-token context window.

Overview: The Deepseek R1 MCP Server serves as a robust platform for leveraging the Deepseek R1 language model, built on Node.js and TypeScript. This framework supports the execution of advanced text generation tasks, integrates with Claude Desktop, and allows for easy configuration and setup.

Key Features:

  • Advanced Text Generation: Supports generation with an 8192 token context window, enabling detailed and extensive text outputs for various applications.

  • Configurable Parameters: Allows customization of generation parameters such as max_tokens and temperature, facilitating tailored behavior for different use cases.

  • Robust Error Handling: Provides detailed error messages for common API-related issues, enhancing the developer experience by simplifying troubleshooting.

  • Full MCP Protocol Support: Ensures seamless integration and operation within the MCP framework.

  • Claude Desktop Integration: Offers compatibility with Claude Desktop, enhancing user accessibility and experience.

  • Model Flexibility: Supports switching between the Deepseek R1 and DeepSeek-V3 models for diverse application needs.


Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model, optimized for reasoning tasks with a context window of 8192 tokens.

Why Node.js?

This implementation uses Node.js/TypeScript for its stable integration with MCP servers, providing better type safety, error handling, and compatibility with Claude Desktop.

Quick Start

Installing Manually

# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build and run
npm run build
node build/index.js

Prerequisites

  • Node.js (v18 or higher)
  • npm
  • Claude Desktop
  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, change the model name in src/index.ts:

// For DeepSeek-R1 (default)
model: "deepseek-reasoner"

// For DeepSeek-V3
model: "deepseek-chat"
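The mapping above can be sketched as a small helper. This is a hypothetical refactoring, not code from the project's src/index.ts; the type and function names (`DeepseekModel`, `resolveModel`) are illustrative.

```typescript
// Hypothetical sketch: centralize the model choice instead of editing
// the string literal in place. Names are illustrative assumptions.
type DeepseekModel = "deepseek-reasoner" | "deepseek-chat";

function resolveModel(preferV3: boolean): DeepseekModel {
  // "deepseek-reasoner" selects DeepSeek-R1 (the default);
  // "deepseek-chat" selects DeepSeek-V3.
  return preferV3 ? "deepseek-chat" : "deepseek-reasoner";
}
```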

Project Structure

deepseek-r1-mcp/
├── src/
│   └── index.ts             # Main server implementation
├── build/                   # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json

Configuration

  1. Create a .env file:

    DEEPSEEK_API_KEY=your-api-key-here
    
  2. Update Claude Desktop configuration:

    {
      "mcpServers": {
        "deepseek_r1": {
          "command": "node",
          "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
          "env": {
            "DEEPSEEK_API_KEY": "your-api-key"
          }
        }
      }
    }
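Since the server reads DEEPSEEK_API_KEY from the environment in both setups above, a startup check that fails fast with a clear message is a reasonable pattern. This is a hedged sketch; `requireApiKey` is a hypothetical name, not a function from the project.

```typescript
// Hypothetical startup check, assuming the key arrives via the
// environment (from .env or the Claude Desktop "env" block above).
function requireApiKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.DEEPSEEK_API_KEY;
  if (!key || key.trim() === "") {
    // Failing at startup gives a clearer error than a 401 from the API later.
    throw new Error(
      "DEEPSEEK_API_KEY is not set; add it to .env or the Claude Desktop config"
    );
  }
  return key;
}
```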
    

Development

npm run dev     # Watch mode
npm run build   # Build for production

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)
  • Configurable parameters (max_tokens, temperature)
  • Robust error handling
  • Full MCP protocol support
  • Claude Desktop integration
  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,    // Maximum tokens to generate
    "temperature": 0.2     // Controls randomness
  }
}
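A server handling this request shape would typically validate and default the arguments before calling the API. The sketch below is an assumption about how that could look, not the project's actual validation logic; `normalizeArgs` and `DeepseekArgs` are illustrative names. The 0.2 default and 8192-token ceiling come from this README.

```typescript
// Hypothetical argument normalization mirroring the request shape above.
interface DeepseekArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

const CONTEXT_WINDOW = 8192; // documented context window

function normalizeArgs(args: DeepseekArgs): Required<DeepseekArgs> {
  return {
    prompt: args.prompt,
    // Never request more tokens than the 8192-token window allows.
    max_tokens: Math.min(args.max_tokens ?? CONTEXT_WINDOW, CONTEXT_WINDOW),
    // 0.2 is the server's documented default temperature.
    temperature: args.temperature ?? 0.2,
  };
}
```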

The Temperature Parameter

The default temperature is set to 0.2. Recommended values based on use case:

| Use case                       | Temperature |
|--------------------------------|-------------|
| Coding / Math                  | 0.0         |
| Data Cleaning / Data Analysis  | 1.0         |
| General Conversation           | 1.3         |
| Translation                    | 1.3         |
| Creative Writing / Poetry      | 1.5         |
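The recommendations above could be encoded as a lookup when choosing a temperature per request. This is an illustrative sketch; the keys and the `temperatureFor` helper are assumptions, not part of the server.

```typescript
// Hypothetical lookup encoding the recommended temperatures above.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  "coding-math": 0.0,
  "data-analysis": 1.0,
  "conversation": 1.3,
  "translation": 1.3,
  "creative-writing": 1.5,
};

function temperatureFor(useCase: string): number {
  // Unknown use cases fall back to the server's 0.2 default.
  return RECOMMENDED_TEMPERATURE[useCase] ?? 0.2;
}
```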

Error Handling

The server provides detailed error messages for:

  • API authentication errors
  • Invalid parameters
  • Rate limiting
  • Network issues
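One common way to produce messages for those four categories is to branch on the HTTP status of the failed request. The sketch below assumes the DeepSeek API uses conventional status codes (401/403, 400/422, 429); `classifyApiError` is a hypothetical helper, not the server's actual error handler.

```typescript
// Hypothetical error classifier covering the categories listed above.
// A null status represents a request that never reached the API.
function classifyApiError(status: number | null): string {
  if (status === null) {
    return "Network issue: could not reach the DeepSeek API";
  }
  if (status === 401 || status === 403) {
    return "API authentication error: check DEEPSEEK_API_KEY";
  }
  if (status === 400 || status === 422) {
    return "Invalid parameters: check prompt, max_tokens, and temperature";
  }
  if (status === 429) {
    return "Rate limited: retry after a backoff";
  }
  return `Unexpected API error (HTTP ${status})`;
}
```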

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT