MCP Hub

deepseek-thinker-mcp

by: ruixingshi

An MCP provider delivering Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's chain of thought (CoT) from the Deepseek API service or a local Ollama server.

created: 13/02/2025

📌Overview

Purpose: To provide reasoning content from Deepseek to MCP-enabled AI clients using the Model Context Protocol.

Overview: The Deepseek Thinker MCP Server offers dual-mode access to Deepseek's reasoning capabilities, catering to both remote API and local server configurations. It enables AI clients to retrieve structured and thoughtful responses by leveraging Deepseek's underlying thought processes.

Key Features:

  • Dual Mode Support: Allows operation in OpenAI API mode for remote access, or Ollama local mode for on-premise reasoning, accommodating different user preferences.

  • Focused Reasoning: Captures and outputs Deepseek's reasoning process, ensuring detailed and structured responses that enhance the interaction quality with AI clients.


Deepseek Thinker MCP Server

An MCP (Model Context Protocol) provider delivering Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. Supports access to Deepseek's thought processes through the Deepseek API service or a local Ollama server.

Core Features

  • 🤖 Dual Mode Support

    • OpenAI API mode
    • Ollama local mode
  • 🎯 Focused Reasoning

    • Captures Deepseek's thinking process
    • Provides reasoning output
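
To illustrate the idea, here is a minimal sketch (the types and function below are illustrative, not the project's actual code): Deepseek's OpenAI-compatible API streams chain-of-thought in a `reasoning_content` field on each delta, separate from the final `content`, so the reasoning can be collected independently.

```typescript
// Assumed delta shape, modeled on Deepseek's OpenAI-compatible streaming API.
interface Delta {
  reasoning_content?: string | null; // chain-of-thought fragment
  content?: string | null; // final-answer fragment
}

// Concatenate only the reasoning portion of a stream of deltas,
// ignoring the final-answer content.
function collectReasoning(deltas: Delta[]): string {
  return deltas.map((d) => d.reasoning_content ?? "").join("");
}
```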

Available Tools

get-deepseek-thinker

  • Description: Perform reasoning using the Deepseek model
  • Input Parameters:
    • originPrompt (string): User's original prompt
  • Returns: Structured text response containing the reasoning process
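
Over MCP's JSON-RPC transport, a `tools/call` request for this tool would look roughly like the following (the request id and prompt are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-deepseek-thinker",
    "arguments": {
      "originPrompt": "Why is the sky blue?"
    }
  }
}
```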

Environment Configuration

OpenAI API Mode

Set the following environment variables:

API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
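
For example, when launching the server directly from a shell (the key below is a placeholder; `https://api.deepseek.com` is Deepseek's OpenAI-compatible endpoint, but verify the URL against your provider):

```shell
# Placeholders — substitute your own key and endpoint
export API_KEY="<your-api-key>"
export BASE_URL="https://api.deepseek.com"
npx -y deepseek-thinker-mcp
```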

Ollama Mode

Set the following environment variable:

USE_OLLAMA=true
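
In Ollama mode the reasoning arrives differently: DeepSeek-R1 models served by Ollama typically wrap their chain of thought in `<think>...</think>` tags in the raw response text. A minimal sketch of extracting it (illustrative, not the project's actual code, and assuming that tag convention):

```typescript
// Pull the chain-of-thought out of a raw Ollama response that uses
// <think>...</think> delimiters; returns "" if no such block exists.
function extractThink(raw: string): string {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  return match ? match[1].trim() : "";
}
```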

Usage

Integration with an AI client, such as Claude Desktop


Add the following configuration to your claude_desktop_config.json:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Using Ollama Mode

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}

Local Server Configuration

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": [
        "/your-path/deepseek-thinker-mcp/build/index.js"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Development Setup

# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js
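
To exercise the server interactively during development, the MCP Inspector (a separate tool from the MCP project, not bundled with this repository) can be pointed at the built entry:

```shell
# Launch the MCP Inspector against the local build
npx @modelcontextprotocol/inspector node build/index.js
```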

FAQ

Error response: "MCP error -32001: Request timed out"

This error occurs when the Deepseek API response is too slow or the reasoning content output is too long, causing the MCP server to timeout.

Tech Stack

  • TypeScript
  • @modelcontextprotocol/sdk
  • OpenAI API
  • Ollama
  • Zod (parameter validation)

License

This project is licensed under the MIT License. See the LICENSE file for details.