quick-mcp-example

by: ALucek

Short and sweet example MCP server / client implementation for Tools, Resources and Prompts.

created 04/03/2025
📌 Overview

Purpose: To establish a standardized framework that allows applications to seamlessly integrate with Large Language Models (LLMs) by providing context and functionality through a unified protocol.

Overview: The Model Context Protocol (MCP) is an open protocol that facilitates interactions between applications and LLMs. By defining distinct roles for MCP Servers, Clients, and Hosts, it encourages modular development while ensuring interoperability across components. This design lets developers create specialized MCP Servers for different functionalities and connect them to any supported application, extending the capabilities and user experience of LLM-based systems.

Key Features:

  • MCP Servers: Central to the protocol, they expose standardized tools, resources, and prompts, allowing easy integration and execution of functions by LLMs through defined interfaces.

  • Tools: Functions that LLMs can invoke; they enable dynamic interactions with external systems and databases to retrieve information or perform actions.

  • Resources: Data sources accessible to client applications, identified by URIs, that provide context without requiring direct function calls.

  • Prompts: Reusable templates for interaction patterns, which standardize conversation flows and facilitate user engagement through UI elements, enhancing the overall efficiency of communication tasks.


Standardizing LLM Interaction with MCP Servers

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. It provides a unified framework for LLM-based applications to connect to data sources, get context, use tools, and execute standard prompts.

MCP Ecosystem Components

The MCP ecosystem includes three main components:

  • MCP Servers: Handle tool availability (exposing available functions), tool execution, static content as resources, and preset prompts.
  • Clients: Manage connections to servers, LLM integration, and message passing between components.
  • Hosts: Provide frontend interfaces, surface MCP functionality to users, and serve as integration points in the ecosystem.

This modular architecture allows different components to be developed independently while maintaining interoperability. It enables users to create MCP servers for various LLM-related functionalities and plug them into a range of supported applications, commonly integrating service APIs, tools, or local data sources.
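Under the hood, these components exchange JSON-RPC 2.0 messages over a transport such as stdio. As a rough illustration (the exact method names and field shapes are defined by the MCP specification; the tool name and arguments here are hypothetical), a client asking a server to run a tool sends a request like:

```python
import json

# A sketch of the JSON-RPC 2.0 request a client sends to invoke a
# server-side tool, following MCP's tools/call invocation pattern.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_knowledgebase",           # the tool's unique identifier
        "arguments": {"query": "What is MCP?"},  # must match the tool's inputSchema
    },
}

# Messages are serialized to JSON before being written to the transport
# (e.g. the stdio pipe between client and server).
wire_message = json.dumps(request)
print(wire_message)
```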

MCP Server Components

MCP servers form the foundation of the protocol by exposing standardized capabilities through well-defined interfaces. Hosts and clients connect to these servers using the protocol, while the user experience and implementation details remain flexible—ranging from command line interfaces to graphical applications or embedded systems.

The main components of an MCP server include:

Tools

Tools are functions that the LLM can invoke to perform actions or retrieve information. Each tool is defined as:

{
  name: string;          // Unique identifier for the tool
  description?: string;  // Human-readable description
  inputSchema: {         // JSON Schema for the tool's parameters
    type: "object",
    properties: { ... }  // Tool-specific parameters
  }
}

Tools let LLMs interact with external systems by executing code, querying databases, or performing calculations, representing actions that produce effects or generate new information.
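To make this concrete, a tool definition and its server-side dispatch can be modeled in plain Python. The tool name and handler below are illustrative, not from the example repo:

```python
# Illustrative tool definition mirroring the structure above.
add_tool = {
    "name": "add_numbers",
    "description": "Add two numbers and return the sum",
    "inputSchema": {
        "type": "object",
        "properties": {
            "a": {"type": "number"},
            "b": {"type": "number"},
        },
        "required": ["a", "b"],
    },
}

# The server maps tool names to the functions that implement them.
TOOL_HANDLERS = {"add_numbers": lambda args: args["a"] + args["b"]}

def call_tool(name: str, arguments: dict):
    """Dispatch a tool call the way a server would on a tools/call request."""
    if name not in TOOL_HANDLERS:
        raise ValueError(f"Unknown tool: {name}")
    return TOOL_HANDLERS[name](arguments)

print(call_tool("add_numbers", {"a": 2, "b": 3}))  # → 5
```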

Resources

Resources are data sources accessible by the client application, identified by URIs:

{
  uri: string;           // Unique identifier for the resource
  name: string;          // Human-readable name
  description?: string;  // Optional description
  mimeType?: string;     // Optional MIME type
}

Resources can be static (e.g., configuration files) or dynamic (e.g., database records or API responses) and provide context to the LLM without requiring function calls.
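A server can be thought of as holding a URI-keyed registry of such resources. The sketch below (with a made-up URI and content) shows how a resource lookup might work:

```python
# Illustrative resource metadata keyed by URI (the URI is hypothetical).
RESOURCES = {
    "file:///config/app.json": {
        "uri": "file:///config/app.json",
        "name": "Application config",
        "description": "Static configuration file",
        "mimeType": "application/json",
    },
}

# Contents are looked up by URI when the client requests a resource.
RESOURCE_CONTENTS = {
    "file:///config/app.json": '{"debug": false}',
}

def read_resource(uri: str) -> str:
    """Return a resource's content, as a server would on a resources/read request."""
    if uri not in RESOURCE_CONTENTS:
        raise KeyError(f"Unknown resource: {uri}")
    return RESOURCE_CONTENTS[uri]

print(read_resource("file:///config/app.json"))
```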

Prompts

Prompts are reusable templates defining specific interaction patterns:

{
  name: string;              // Unique identifier for the prompt
  description?: string;      // Human-readable description
  arguments?: [              // Optional list of arguments
    {
      name: string;          // Argument identifier
      description?: string;  // Argument description
      required?: boolean;    // Whether argument is required
    }
  ]
}

Prompts enable consistent, purpose-built interactions for common tasks, triggered through UI elements such as slash commands.
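Rendering a prompt amounts to validating its arguments and filling a template. A minimal sketch (the prompt name, template text, and default are invented for illustration):

```python
# Illustrative prompt definition following the structure above.
summarize_prompt = {
    "name": "summarize_document",
    "description": "Summarize a document at a given length",
    "arguments": [
        {"name": "topic", "description": "Subject to focus on", "required": True},
        {"name": "length", "description": "Target length", "required": False},
    ],
}

TEMPLATE = "Summarize the document, focusing on {topic}. Target length: {length}."

def render_prompt(prompt: dict, args: dict) -> str:
    """Fill the template after checking that required arguments are present."""
    for spec in prompt.get("arguments", []):
        if spec.get("required") and spec["name"] not in args:
            raise ValueError(f"Missing required argument: {spec['name']}")
    merged = {"length": "one paragraph", **args}  # default for the optional arg
    return TEMPLATE.format(**merged)

print(render_prompt(summarize_prompt, {"topic": "MCP servers"}))
```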

Note: Tools are designed specifically for LLM interaction (similar to function calling), while prompts and resources serve different purposes. Prompts are user-controlled templates invoked via UI, and resources are application-controlled data sources presented for user selection before inclusion in the LLM context.

More details and additional functionality are available in the MCP Official Documentation.


Setting Up Our Example

Our example MCP Server demonstrates tools, resources, and prompts by implementing a simple knowledgebase chatbot flow with these capabilities:

  1. Allow LLMs to use tools to query a vector database for retrieval-augmented generation (RAG) responses.
  2. Enable users to select existing resources to provide context.
  3. Allow users to execute standard prompts for complex analytical workflows.
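The retrieval tool in step 1 can be sketched with a naive keyword match standing in for the vector search; in the actual server the chunks live in a ChromaDB collection and are ranked by embedding similarity, so treat this purely as a toy stand-in:

```python
# Tiny in-memory "knowledgebase" of document chunks.
DOCUMENTS = [
    "MCP servers expose tools, resources, and prompts.",
    "Clients manage connections and LLM integration.",
    "Hosts provide the frontend interface for users.",
]

def query_knowledgebase(query: str, k: int = 1) -> list[str]:
    """Return the k chunks sharing the most words with the query (toy ranking)."""
    terms = set(query.lower().split())
    scored = sorted(
        DOCUMENTS,
        key=lambda doc: len(terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

print(query_knowledgebase("what do MCP servers expose?"))
```

The LLM would call this tool, receive the top-ranked chunks, and ground its answer in them, which is the RAG loop the example implements.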

This example is implemented in mcp_server.py with a corresponding simple CLI client in client.py.

As a useful resource, check out MCP's Server List for official integrations and community-made servers:
https://github.com/modelcontextprotocol/servers


Setup and Installation

  1. Clone the Repository

git clone https://github.com/ALucek/quick-mcp-example.git
cd quick-mcp-example

  2. Create the ChromaDB Database

Follow the instructions in MCP_setup.ipynb to create the vector database and embed a PDF into it.

  3. Create the Virtual Environment and Install Packages

# Using uv (recommended)
uv venv
source .venv/bin/activate  # On macOS/Linux
# OR
.venv\Scripts\activate     # On Windows

# Install dependencies
uv sync

  4. Run the Client & Server

python client.py mcp_server.py