
mcp-aoai-web-browsing

by: kimtth

A minimal Model Context Protocol 🖥️ server/client 🧑‍💻 with Azure OpenAI and 🌐 web browser control via Playwright.

Created: 13/12/2024

📌Overview

Purpose: This implementation facilitates secure and efficient interactions between AI applications and resources using the Model Context Protocol (MCP) with Azure OpenAI.

Overview: This document outlines a server/client implementation that leverages the Model Context Protocol (MCP) and integrates with Azure OpenAI. It highlights the use of FastMCP for building the server and features such as conversion of MCP tool responses to the OpenAI function calling format, enabling seamless interaction between the MCP server and OpenAI-compatible models.

Key Features:

  • FastMCP Integration: Utilizes FastMCP to build the MCP server, ensuring high performance and ease of use for interacting with AI resources.

  • OpenAI Function Calling Compatibility: Transforms MCP responses into the OpenAI function calling format, allowing for straightforward communication and function execution in AI applications.

  • Stable Connection via MCP-LLM Bridge: Employs a bridge that ensures stable connections by directly integrating the MCP server object, enhancing communication reliability.

  • Community and Official Resources: Links to repositories and community resources support users in furthering their knowledge and development efforts related to MCP and Azure OpenAI integrations.


MCP Server & Client Implementation for Using Azure OpenAI

A minimal server/client application implementation utilizing the Model Context Protocol (MCP) and Azure OpenAI.

  1. The MCP server is built with FastMCP (a minimal sketch follows this list).
  2. Playwright is an open source, end-to-end testing framework by Microsoft for testing modern web applications.
  3. The MCP server's tool descriptions are converted to the OpenAI function calling format.
  4. The bridge that performs this conversion is a customized version of the MCP-LLM Bridge implementation.
  5. To ensure a stable connection, the server object is passed directly into the bridge.
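
To make items 1 and 2 concrete, here is a minimal sketch of a FastMCP server exposing a single Playwright-backed navigation tool. It is illustrative rather than the repository's exact code, and it assumes the FastMCP and Playwright packages are installed; launching a fresh browser on every call is done only for brevity.

    from fastmcp import FastMCP  # also available as mcp.server.fastmcp in the official SDK
    from playwright.async_api import async_playwright

    mcp = FastMCP("web-browsing")

    @mcp.tool()
    async def playwright_navigate(url: str, timeout=30000, wait_until="load"):
        """Navigate to a URL and return the page title."""
        async with async_playwright() as p:
            browser = await p.chromium.launch()
            page = await browser.new_page()
            await page.goto(url, timeout=timeout, wait_until=wait_until)
            title = await page.title()
            await browser.close()
            return title

    if __name__ == "__main__":
        mcp.run()  # stdio is the default transport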

Model Context Protocol (MCP)

Model Context Protocol (MCP) is an open protocol enabling secure, controlled interactions between AI applications and local or remote resources.

Official Repositories

  • MCP Python SDK
  • Create Python Server
  • MCP Servers

Community Resources

  • Awesome MCP Servers
  • MCP on Reddit

Related Projects

  • FastMCP: The fast, Pythonic way to build MCP servers.
  • Chat MCP: MCP client.
  • MCP-LLM Bridge: MCP implementation enabling communication between MCP servers and OpenAI-compatible LLMs.

MCP Playwright

  • MCP Playwright server
  • Microsoft Playwright for Python

Configuration

As of development in December 2024, the Python project should be initialized with uv; other dependency management tools such as pip and poetry are not yet fully supported by the MCP CLI.

  1. Rename .env.template to .env, then update the Azure OpenAI values (see the sketch after these steps):

    AZURE_OPEN_AI_ENDPOINT=
    AZURE_OPEN_AI_API_KEY=
    AZURE_OPEN_AI_DEPLOYMENT_MODEL=
    AZURE_OPEN_AI_API_VERSION=
    
  2. Install uv for Python library management:

    pip install uv
    uv sync
    
  3. Run the application:

    python chatgui.py
    
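
For reference, here is a minimal sketch of how these values might be consumed on the client side, assuming the openai and python-dotenv packages; the actual wiring in chatgui.py may differ.

    import os
    from dotenv import load_dotenv  # assumes python-dotenv is installed
    from openai import AzureOpenAI

    load_dotenv()  # read the .env file from step 1

    client = AzureOpenAI(
        azure_endpoint=os.getenv("AZURE_OPEN_AI_ENDPOINT"),
        api_key=os.getenv("AZURE_OPEN_AI_API_KEY"),
        api_version=os.getenv("AZURE_OPEN_AI_API_VERSION"),
    )

    response = client.chat.completions.create(
        model=os.getenv("AZURE_OPEN_AI_DEPLOYMENT_MODEL"),
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)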

Note on 'stdio' and JSON-RPC

stdio is a transport layer (raw data flow), while JSON-RPC is an application protocol (structured messages). They are distinct layers that are often combined, as in "JSON-RPC over stdio."
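
As a purely didactic illustration of that layering (a real MCP session begins with an initialize exchange, omitted here, and the server command is a placeholder), a client could write a single JSON-RPC message over a child process's stdio like this:

    import json
    import subprocess

    # stdio is the transport: raw lines flowing over the child process's stdin/stdout.
    proc = subprocess.Popen(
        ["python", "server.py"],  # placeholder MCP server command
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )

    # JSON-RPC is the application protocol: a structured message serialized as one line.
    request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()

    print(proc.stdout.readline())  # the JSON-RPC response arrives as another line over stdio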

Tool Description Example

@self.mcp.tool()
async def playwright_navigate(url: str, timeout=30000, wait_until="load"):
    """Navigate to a URL."""

The docstring provides the tool's description, which may be used in a mechanism similar to function calling in LLMs.

Output example:

Tool(name='playwright_navigate', description='Navigate to a URL.', inputSchema={'properties': {'url': {'title': 'Url', 'type': 'string'}, 'timeout': {'default': 30000, 'title': 'timeout', 'type': 'string'}}})
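
As a rough sketch of the conversion the bridge performs, a Tool entry of this shape can be mapped to the OpenAI function calling tools format roughly as follows; the function name is illustrative and the actual bridge code may differ in details such as schema cleanup.

    def mcp_tool_to_openai(tool) -> dict:
        """Map an MCP Tool (name, description, inputSchema) to an OpenAI `tools` entry."""
        return {
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description or "",
                "parameters": tool.inputSchema or {"type": "object", "properties": {}},
            },
        }

The resulting entries can then be passed as the tools argument of a chat completions request.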

Tips for Using uv

  • uv run: Run a script.
  • uv venv: Create a new virtual environment (default: .venv).
  • uv add: Add a dependency to the project.
  • uv remove: Remove a dependency from the project.
  • uv sync: Sync the project's dependencies with the environment.

More features can be found in the documentation.

Additional Tips

  • To forcibly terminate Python processes on Windows:

    taskkill /IM python.exe /F
    
  • Visual Studio Code Python Debugger: Use a launch.json configuration in .vscode to start debugging sessions.