huggingface-mcp-server

by: shreyaskarnik
created 19/03/2025
📌Overview

Purpose: The Hugging Face MCP Server provides read-only access to the Hugging Face Hub APIs, letting LLMs interact with Hugging Face's models, datasets, Spaces, papers, and collections.

Overview: The server exposes Hugging Face resources through a custom URI scheme and supplements them with tools and prompt templates for searching, inspecting, and comparing AI resources.

Key Features:

  • Resource Access: Utilizes a custom hf:// URI scheme to access models, datasets, spaces, and collections with descriptive naming and JSON content type, facilitating straightforward resource identification.

  • Prompt Templates: Offers templates like compare-models for model comparisons and summarize-paper for paper summaries, streamlining the analysis and retrieval of insights from various Hugging Face resources.


🤗 Hugging Face MCP Server 🤗

A Model Context Protocol (MCP) server that provides read-only access to the Hugging Face Hub APIs. This server allows LLMs like Claude to interact with Hugging Face's models, datasets, spaces, papers, and collections.

Components

Resources

The server exposes popular Hugging Face resources:

  • Custom hf:// URI scheme for accessing resources
  • Models with hf://model/{model_id} URIs
  • Datasets with hf://dataset/{dataset_id} URIs
  • Spaces with hf://space/{space_id} URIs
  • All resources have descriptive names and JSON content type
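
To make the URI scheme above concrete, here is a minimal sketch of how an hf:// URI might be translated into a public Hugging Face Hub REST API URL. The function name and the mapping table are illustrative assumptions, not taken from the server's source:

```python
from urllib.parse import urlparse

API_BASE = "https://huggingface.co/api"

# Resource kinds exposed by the server mapped to Hub API path segments.
RESOURCE_PATHS = {
    "model": "models",
    "dataset": "datasets",
    "space": "spaces",
}

def hf_uri_to_api_url(uri: str) -> str:
    """Translate e.g. hf://model/bert-base-uncased into a Hub API URL."""
    parsed = urlparse(uri)
    if parsed.scheme != "hf":
        raise ValueError(f"not an hf:// URI: {uri}")
    kind = parsed.netloc                   # "model", "dataset", or "space"
    resource_id = parsed.path.lstrip("/")  # may contain "/" (org/name)
    if kind not in RESOURCE_PATHS or not resource_id:
        raise ValueError(f"unsupported hf:// URI: {uri}")
    return f"{API_BASE}/{RESOURCE_PATHS[kind]}/{resource_id}"
```

For example, `hf://model/bert-base-uncased` would resolve to `https://huggingface.co/api/models/bert-base-uncased`.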

Prompts

The server provides two prompt templates:

  • compare-models: Generates a comparison between multiple Hugging Face models

    • Requires model_ids argument (comma-separated model IDs)
    • Retrieves model details and formats them for comparison
  • summarize-paper: Summarizes a research paper from Hugging Face

    • Requires arxiv_id argument for paper identification
    • Optional detail_level argument (brief/detailed) to control summary depth
    • Combines paper metadata with implementation details
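
The argument handling described above can be sketched as follows. These helper names and validation rules are hypothetical, inferred only from the argument descriptions in this README:

```python
def parse_compare_models_args(model_ids: str) -> list[str]:
    """Split the comma-separated model_ids argument into clean IDs."""
    ids = [m.strip() for m in model_ids.split(",") if m.strip()]
    if len(ids) < 2:
        raise ValueError("compare-models needs at least two model IDs")
    return ids

def parse_summarize_paper_args(arxiv_id: str, detail_level: str = "brief") -> dict:
    """Validate summarize-paper arguments; detail_level is optional."""
    if detail_level not in ("brief", "detailed"):
        raise ValueError("detail_level must be 'brief' or 'detailed'")
    return {"arxiv_id": arxiv_id, "detail_level": detail_level}
```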

Tools

The server implements several tool categories:

  • Model Tools

    • search-models: Search models with filters for query, author, tags, and limit
    • get-model-info: Get detailed information about a specific model
  • Dataset Tools

    • search-datasets: Search datasets with filters
    • get-dataset-info: Get detailed information about a specific dataset
  • Space Tools

    • search-spaces: Search Spaces with filters including SDK type
    • get-space-info: Get detailed information about a specific Space
  • Paper Tools

    • get-paper-info: Get information about a paper and its implementations
    • get-daily-papers: Get the list of curated daily papers
  • Collection Tools

    • search-collections: Search collections with various filters
    • get-collection-info: Get detailed information about a specific collection
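
As a rough sketch of how a tool like search-models could assemble its request, the snippet below builds a query URL for the public Hub endpoint GET /api/models. The `search`, `author`, `filter`, and `limit` parameters are standard Hub API query parameters; the helper function itself is an assumption, not the server's actual code:

```python
from urllib.parse import urlencode

def build_model_search_url(query=None, author=None, tags=None, limit=10):
    """Assemble a Hub API model-search URL from optional filters."""
    params = {}
    if query:
        params["search"] = query
    if author:
        params["author"] = author
    if tags:
        params["filter"] = ",".join(tags)  # tags map to the filter parameter
    params["limit"] = limit
    return "https://huggingface.co/api/models?" + urlencode(params)
```

The dataset and Space search tools would presumably follow the same pattern against /api/datasets and /api/spaces.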

Configuration

The server does not require configuration but supports optional Hugging Face authentication:

  • Set HF_TOKEN environment variable with your Hugging Face API token to:
    • Get higher API rate limits
    • Access private repositories (if authorized)
    • Improve reliability for high-volume requests
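
A minimal sketch of how the optional token could be picked up, assuming the server passes HF_TOKEN as a standard Bearer token (the scheme the Hub API accepts):

```python
import os

def auth_headers() -> dict:
    """Return an Authorization header if HF_TOKEN is set, else no auth."""
    token = os.environ.get("HF_TOKEN")
    return {"Authorization": f"Bearer {token}"} if token else {}
```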

Quickstart

Install

Installing via Smithery

To install huggingface-mcp-server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @shreyaskarnik/huggingface-mcp-server --client claude

Claude Desktop Configuration

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration
"mcpServers": {
  "huggingface": {
    "command": "uv",
    "args": [
      "--directory",
      "/absolute/path/to/huggingface-mcp-server",
      "run",
      "huggingface_mcp_server.py"
    ],
    "env": {
      "HF_TOKEN": "your_token_here"
    }
  }
}

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This creates source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: Set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Because MCP servers communicate over stdio, debugging can be challenging; using the MCP Inspector is recommended.

Launch the MCP Inspector via npm with:

npx @modelcontextprotocol/inspector uv --directory /path/to/huggingface-mcp-server run huggingface_mcp_server.py

The Inspector will display a URL to access in your browser to begin debugging.

Example Prompts for Claude

Try these example prompts when using this server with Claude:

  • Search for BERT models on Hugging Face with less than 100 million parameters
  • Find the most popular datasets for text classification on Hugging Face
  • What are today's featured AI research papers on Hugging Face?
  • Summarize the paper with arXiv ID 2307.09288 using the Hugging Face MCP server
  • Compare the Llama-3-8B and Mistral-7B models from Hugging Face
  • Show me the most popular Gradio spaces for image generation
  • Find collections created by TheBloke that include Mixtral models

Troubleshooting

If you encounter issues:

  1. Check server logs in Claude Desktop:

    • macOS: ~/Library/Logs/Claude/mcp-server-huggingface.log
    • Windows: %APPDATA%\Claude\logs\mcp-server-huggingface.log
  2. For API rate limiting errors, consider adding a Hugging Face API token

  3. Ensure your machine has internet connectivity to reach the Hugging Face API

  4. If a particular tool is failing, try accessing the same data through the Hugging Face website to confirm it exists