📌 Overview
Purpose: This project serves as a proof of concept (POC) for building Model Context Protocol (MCP) servers that integrate with Uber Eats.
Overview: The Model Context Protocol (MCP) is an open protocol designed to facilitate seamless connections between large language model (LLM) applications and external tools. This implementation applies MCP to the Uber Eats ecosystem, demonstrating how LLM applications can interact with a real-world service through a standardized tool interface.
Key Features:
- Seamless Integration: MCP lets LLM applications connect to external tools through a single open protocol, giving developers a flexible foundation for building richer user experiences.
- Straightforward Setup: Getting started only requires a virtual environment and a few clearly documented install commands, so developers can be up and running quickly.
Uber Eats MCP Server
This is a proof of concept (POC) demonstrating how to build MCP servers on top of Uber Eats.
What is MCP?
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools. More information can be found at https://modelcontextprotocol.io/.
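To make this concrete, here is a minimal sketch of what an MCP server tool can look like using the official Python SDK's FastMCP helper. The tool name and behavior below are illustrative assumptions, not the actual tools exposed by this repo's server.py.

```python
# Minimal MCP server sketch (Python MCP SDK, FastMCP).
# find_menu_options is a hypothetical placeholder tool, not this repo's implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("uber-eats-demo")

@mcp.tool()
async def find_menu_options(search_term: str) -> str:
    """Search Uber Eats for menu items matching the search term (stubbed here)."""
    return f"Results for '{search_term}' would go here."

if __name__ == "__main__":
    # stdio transport, matching how MCP clients launch and talk to this kind of server
    mcp.run(transport="stdio")
```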
Prerequisites
- Python 3.12 or higher
- An Anthropic API key (or a key from another supported LLM provider)
Setup
1. Ensure you have a virtual environment activated:

   uv venv
   source .venv/bin/activate  # On Unix/Mac

2. Install the required packages:

   uv pip install -r requirements.txt
   playwright install

3. Update the .env file with your API key (see the loading sketch below):

   ANTHROPIC_API_KEY=your_anthropic_api_key_here
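For reference, here is a minimal sketch of how the key could be read at startup, assuming python-dotenv is used; whether server.py loads the key exactly this way is an assumption.

```python
# Sketch: load ANTHROPIC_API_KEY from .env (assumes the python-dotenv package).
import os

from dotenv import load_dotenv

load_dotenv()  # copies key=value pairs from .env into the process environment

api_key = os.environ.get("ANTHROPIC_API_KEY")
if not api_key:
    raise RuntimeError("ANTHROPIC_API_KEY is not set; add it to your .env file")
```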
Note
Since stdio is used as the MCP transport, all browser output has been disabled.
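This matters because, over the stdio transport, the client reads MCP protocol messages from the server's stdout, so any stray print or browser log written to stdout would corrupt the stream. One way to keep diagnostics out of the way is to route logging to stderr; the sketch below shows that pattern, though the exact setup in this repo may differ.

```python
# Keep diagnostics on stderr so they never mix with MCP messages on stdout.
import logging
import sys

logging.basicConfig(
    stream=sys.stderr,  # stdout is reserved for the MCP stdio transport
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
```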
Debugging
Run the MCP inspector tool with the following command:
uv run mcp dev server.py
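Beyond the inspector, the server can also be exercised programmatically. The sketch below uses the Python MCP SDK's stdio client to launch server.py and list its tools; the launch command mirrors the inspector invocation above and is an assumption about how the server is started.

```python
# Sketch: connect to server.py over stdio and list its tools (Python MCP SDK client).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server as a subprocess and speak MCP over its stdin/stdout.
    params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```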