
quick-mcp-example

by: ALucek

Short and sweet example MCP server / client implementation for Tools, Resources and Prompts.

created 04/03/2025

📌Overview

Purpose: To establish a standardized framework that allows applications to seamlessly integrate with Large Language Models (LLMs) by providing context and functionality through a unified protocol.

Overview: The Model Context Protocol (MCP) serves as an open framework that facilitates interactions between applications and LLMs. By highlighting the roles of MCP Servers, Clients, and Hosts, the protocol encourages modular development while ensuring interoperability across various components. This design enables developers to create specialized MCP Servers for different functionalities and easily connect them to supported applications, enhancing the capabilities and user experiences of LLM-based systems.

Key Features:

  • MCP Servers: Central to the protocol, they expose standardized tools, resources, and prompts, allowing easy integration and execution of functions by LLMs through defined interfaces.

  • Tools: Functions that LLMs can invoke; they enable dynamic interactions with external systems and databases to retrieve information or perform actions.

  • Resources: Data sources accessible to client applications, identified by URIs, that provide context without requiring direct function calls.

  • Prompts: Reusable templates for common interaction patterns; they standardize conversation flows and can be surfaced to users through UI elements, making routine tasks more consistent and efficient.


Standardizing LLM Interaction with MCP Servers

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). It offers a unified framework for LLM-based applications to connect to data sources, get context, use tools, and execute standard prompts.

MCP Ecosystem Components

The MCP ecosystem consists of three main components:

  • MCP Servers: Handle tool availability, execution of functions, static content resources, and preset prompts.

  • Clients: Manage connections to servers, LLM integration, and message passing.

  • Hosts: Provide frontend interfaces, exposing MCP functionality to users and integrating with the overall ecosystem.

This architecture allows for a modular system where components can be developed independently while ensuring interoperability.
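Under the hood, the message passing between clients and servers uses JSON-RPC 2.0 framing, with method names such as `tools/list` and `tools/call` defined by the MCP specification. As a rough sketch (the real transport and framing are handled by the MCP SDK, so this helper is illustrative, not part of any API):

```python
import json

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request of the kind an MCP client sends to a server."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# A client listing the server's tools, then invoking one:
print(make_request(1, "tools/list"))
print(make_request(2, "tools/call", {"name": "add", "arguments": {"a": 2, "b": 3}}))
```

In practice the SDK's client and server classes construct and route these messages for you; the point is only that every component speaks the same wire format, which is what makes independently developed servers and clients interoperable.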

MCP Server Components

MCP servers expose standardized capabilities through defined interfaces, leaving developers free to build varied server implementations and user experiences on top of the same protocol.

Tools

Tools are functions that LLMs can invoke to perform actions or retrieve information. Each tool is defined with:

{
  name: string;          // Unique identifier for the tool
  description?: string;  // Human-readable description
  inputSchema: {         // JSON Schema for the tool's parameters
    type: "object",
    properties: { ... }  // Tool-specific parameters
  }
}
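To make the schema concrete, here is a minimal sketch of how a server might register a tool definition alongside the function that implements it, and dispatch an invocation by name. The registry and helper names (`TOOLS`, `register_tool`, `call_tool`) are illustrative, not part of the MCP SDK:

```python
# Hypothetical in-memory tool registry; the MCP SDK provides its own
# registration decorators, so this only illustrates the data shape above.
TOOLS = {}

def register_tool(name, description, input_schema, handler):
    """Store a tool definition together with the function that implements it."""
    TOOLS[name] = {
        "name": name,
        "description": description,
        "inputSchema": input_schema,
        "handler": handler,
    }

def call_tool(name, arguments):
    """Look up a tool by name and invoke its handler with the given arguments."""
    return TOOLS[name]["handler"](**arguments)

register_tool(
    name="add",
    description="Add two integers",
    input_schema={
        "type": "object",
        "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
        "required": ["a", "b"],
    },
    handler=lambda a, b: a + b,
)

print(call_tool("add", {"a": 2, "b": 3}))  # 5
```

The `inputSchema` is what the LLM sees when deciding how to call the tool, which is why a precise JSON Schema matters as much as the handler itself.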

Resources

Resources represent data sources accessible by the client application and are identified by URIs. They can include:

{
  uri: string;           // Unique identifier for the resource
  name: string;          // Human-readable name
  description?: string;  // Optional description
  mimeType?: string;     // Optional MIME type
}
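A resource registry can be sketched the same way: metadata keyed by URI, with the content fetched only when a client reads the resource. The `kb://` scheme and the entries below are made up for illustration:

```python
# Hypothetical resource registry mirroring the fields above; real servers
# would back this with files, databases, or other data sources.
RESOURCES = {
    "kb://docs/readme": {
        "uri": "kb://docs/readme",
        "name": "Project README",
        "description": "Top-level project documentation",
        "mimeType": "text/markdown",
        "content": "# Quick MCP Example\n...",
    },
}

def list_resources():
    """Return resource metadata only (everything except the content itself)."""
    return [
        {k: v for k, v in r.items() if k != "content"}
        for r in RESOURCES.values()
    ]

def read_resource(uri):
    """Fetch a resource's content by its URI."""
    return RESOURCES[uri]["content"]
```

Separating listing from reading matters: clients can show users the available resources cheaply, and only pull content into the LLM's context when a resource is actually selected.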

Prompts

Prompts are reusable templates defining specific interaction patterns:

{
  name: string;              // Unique identifier for the prompt
  description?: string;      // Human-readable description
  arguments?: [              // Optional list of arguments
    {
      name: string;          // Argument identifier
      description?: string;  // Argument description
      required?: boolean;    // Whether argument is required
    }
  ]
}

Prompts facilitate consistent interactions for common tasks.
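As a sketch of how a server might fill in such a template, the snippet below validates the declared arguments and renders the result. The `summarize` prompt and its template field are hypothetical:

```python
# Hypothetical prompt definition with an added "template" field; the MCP
# schema above defines only the metadata, so the rendering is illustrative.
PROMPT = {
    "name": "summarize",
    "description": "Summarize a topic at a given length",
    "arguments": [
        {"name": "topic", "description": "Subject to summarize", "required": True},
        {"name": "length", "description": "Target length in words", "required": False},
    ],
    "template": "Summarize {topic} in about {length} words.",
}

def render_prompt(prompt, **kwargs):
    """Fill the template after checking that all required arguments are present."""
    for arg in prompt["arguments"]:
        if arg.get("required") and arg["name"] not in kwargs:
            raise ValueError(f"missing required argument: {arg['name']}")
    kwargs.setdefault("length", "100")  # illustrative default for the optional argument
    return prompt["template"].format(**kwargs)

print(render_prompt(PROMPT, topic="MCP"))
# -> Summarize MCP in about 100 words.
```

Because the argument list is part of the prompt definition, a host UI can render a form for the user instead of requiring free-text input.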

More details and additional functionality can be found in the MCP Official Documentation.

Setting Up Our Example

Our MCP server demonstrates tools, resources, and prompts through a knowledge-base chatbot flow:

  1. Allow the LLM to query a vector database for RAG responses.
  2. Enable users to choose existing resources for context.
  3. Facilitate the execution of standard prompts for complex analytical workflows.

Refer to mcp_server.py and the corresponding CLI client in client.py for implementation.
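The retrieval step (item 1 above) can be sketched without the real stack: the toy below replaces ChromaDB and learned embeddings with bag-of-words vectors and cosine similarity, just to show the shape of a query tool the LLM could call. All names and documents here are illustrative, not taken from mcp_server.py:

```python
import math
from collections import Counter

# Toy corpus standing in for the vector database.
DOCS = [
    "MCP servers expose tools, resources, and prompts",
    "ChromaDB stores document embeddings for retrieval",
    "Clients manage LLM integration and message passing",
]

def embed(text):
    """Bag-of-words 'embedding' — a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def query_docs(query, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

print(query_docs("how do clients talk to the LLM"))
```

In the actual server this function would be exposed as an MCP tool and backed by the ChromaDB collection built in MCP_setup.ipynb, letting the LLM ground its answers in retrieved passages.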

Setup and Installation

  1. Clone the Repository

    git clone https://github.com/ALucek/quick-mcp-example.git
    cd quick-mcp-example
    
  2. Create the ChromaDB Database

    Follow the instructions in MCP_setup.ipynb to create the vector database.

  3. Create a Virtual Environment and Install Packages

    # Using uv (recommended)
    uv venv
    source .venv/bin/activate  # On macOS/Linux
    # OR
    .venv\Scripts\activate     # On Windows
    
    # Install dependencies
    uv sync
    
  4. Run the Client & Server

    python client.py mcp_server.py