uber-eats-mcp-server
by: ericzakariasson
📌Overview
Purpose: This project is a proof of concept (POC) for building Model Context Protocol (MCP) servers that integrate with Uber Eats.
Overview: The Model Context Protocol (MCP) is an open protocol designed to facilitate seamless connections between large language model (LLM) applications and external tools. This implementation applies MCP to the Uber Eats ecosystem, demonstrating how LLM applications can interact with Uber Eats through MCP tools.
Key Features:
- Seamless Integration: MCP lets LLM applications connect to external tools, giving developers a flexible way to build richer user experiences.
- Straightforward Setup: Setup requires only a virtual environment and a short dependency install, so developers can get started quickly.
Uber Eats MCP Server
This is a proof of concept for building MCP servers on top of Uber Eats.
What is MCP?
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools.
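To make this concrete, here is a minimal sketch of an MCP server written with the official MCP Python SDK's FastMCP helper. The server name and the `find_food` tool are hypothetical illustrations and are not taken from this repository's `server.py`:

```python
# minimal_server.py -- minimal MCP server sketch (hypothetical tool, not this repo's server.py)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("uber-eats-demo")

@mcp.tool()
async def find_food(query: str) -> str:
    """Search for food matching the query (placeholder logic for illustration)."""
    return f"Results for '{query}' would be returned here."

if __name__ == "__main__":
    # stdio is the transport this project uses, so stdout carries the protocol messages.
    mcp.run(transport="stdio")
```

An MCP-capable client (such as Claude Desktop or the MCP inspector) launches this process and calls the registered tools over the chosen transport.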
Prerequisites
- Python 3.12 or higher
- An Anthropic API key (or a key for another supported LLM provider)
Setup
1. Ensure you have a virtual environment activated:

   ```bash
   uv venv
   source .venv/bin/activate  # On Unix/Mac
   ```
2. Install required packages:

   ```bash
   uv pip install -r requirements.txt
   playwright install
   ```
3. Update the `.env` file with your API key:

   ```
   ANTHROPIC_API_KEY=your_anthropic_api_key_here
   ```
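For reference, a server can read this key at startup with python-dotenv. This is only a hedged sketch of how the value might be loaded, not necessarily how `server.py` does it:

```python
# load_key.py -- illustrative only; assumes python-dotenv is installed
import os
from dotenv import load_dotenv

load_dotenv()  # reads ANTHROPIC_API_KEY from the .env file into the environment

api_key = os.environ.get("ANTHROPIC_API_KEY")
if not api_key:
    raise RuntimeError("ANTHROPIC_API_KEY is not set; check your .env file")
```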
Note
Since we're using stdio as the MCP transport, all output from browser-use is disabled so that stray prints and logs do not corrupt the protocol messages on stdout.
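One way such suppression can be arranged (a sketch under assumptions, not necessarily what this repository does; the logger names are guesses) is to silence the noisy loggers and route anything that must be emitted to stderr:

```python
# quiet_output.py -- hedged sketch: keep stdout free for MCP JSON-RPC messages
import logging
import sys

# Route any diagnostic logging to stderr, never stdout.
logging.basicConfig(stream=sys.stderr, level=logging.CRITICAL)

# Silence the browser automation loggers specifically (logger names assumed).
for name in ("browser_use", "playwright"):
    logging.getLogger(name).setLevel(logging.CRITICAL)
```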
Debugging
Run the MCP inspector tool with the following command:

```bash
uv run mcp dev server.py
```
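Beyond the inspector, you can also exercise the server programmatically over stdio using the MCP Python SDK's client helpers. This is a minimal sketch; the launch command is an assumption, and the tools it lists depend on what `server.py` actually registers:

```python
# client_check.py -- hedged sketch: connect to the server over stdio and list its tools
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server as a subprocess and communicate over stdio.
    params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```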