aws-cost-explorer-mcp-server
by: aarora79
MCP server for understanding AWS spend
📌Overview
Purpose: Provide an efficient way to analyze and visualize AWS cloud spending data, using Anthropic's Claude model as a natural language interface.
Overview: The tool runs as an MCP server that exposes AWS Cost Explorer and Amazon Bedrock usage data, letting users ask about their AWS costs and spending patterns through natural language queries.
Key Features:
- Amazon EC2 Spend Analysis: Enables users to view and analyze detailed breakdowns of EC2 spending from the last day, facilitating immediate insights into costs.
- Amazon Bedrock Spend Analysis: Provides a comprehensive breakdown of spending by region, user, and model for the past 30 days, helping to understand usage patterns.
- Service Spend Reports: Allows analysis of spending across all AWS services for the last 30 days, offering a holistic view of expenditures.
- Detailed Cost Breakdown: Delivers granular cost data categorized by day, region, service, and instance type, enabling precise financial tracking.
- Interactive Interface: Utilizes Claude's natural language processing capabilities to transform complex spending queries into user-friendly interactions, simplifying data access and understanding.
AWS Cost Explorer and Amazon Bedrock Model Invocation Logs MCP Server & Client
An MCP server for retrieving AWS spend data via Cost Explorer and Amazon Bedrock usage data via model invocation logs in Amazon CloudWatch, exposed through Anthropic's MCP (Model Context Protocol). You can run the MCP server over HTTPS as described in the Secure Remote MCP Server section.
flowchart LR
User([User]) --> UserApp[User Application]
UserApp --> |Queries| Host[Host]
subgraph "Claude Desktop"
Host --> MCPClient[MCP Client]
end
MCPClient --> |MCP Protocol over HTTPS| MCPServer[AWS Cost Explorer MCP Server]
subgraph "AWS Services"
MCPServer --> |API Calls| CostExplorer[(AWS Cost Explorer)]
MCPServer --> |API Calls| CloudWatchLogs[(AWS CloudWatch Logs)]
end
You can run the MCP server locally and access it via Claude Desktop or run a Remote MCP server on Amazon EC2 and access it via an MCP client built into a LangGraph Agent.
🚨 You can also use this MCP server to get AWS spend information from other accounts as long as the IAM role used by the MCP server can assume roles in those other accounts. 🚨
Overview
This tool provides a convenient way to analyze and visualize AWS cloud spending data using Anthropic's Claude model as an interactive interface. It functions as an MCP server exposing AWS Cost Explorer API functionality to Claude Desktop, allowing natural language queries about your AWS spend.
Features
- Amazon EC2 Spend Analysis: View detailed EC2 spending breakdowns for the last day.
- Amazon Bedrock Spend Analysis: View breakdowns by region, users, and models over the last 30 days.
- Service Spend Reports: Analyze spending across all AWS services for the last 30 days.
- Detailed Cost Breakdown: Get granular cost data by day, region, service, and instance type.
- Interactive Interface: Use Claude to query your cost data via natural language.
Requirements
- Python 3.12
- AWS credentials with Cost Explorer access
- Anthropic API access (for Claude integration)
- Optional: Amazon Bedrock access (for LangGraph Agent)
- Optional: Amazon EC2 for running a remote MCP server
Installation
- Install uv:
  # macOS and Linux
  curl -LsSf https://astral.sh/uv/install.sh | sh
  # Windows
  powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
- Clone the repository:
  git clone https://github.com/aarora79/aws-cost-explorer-mcp.git
  cd aws-cost-explorer-mcp
- Set up a Python virtual environment and install dependencies:
  uv venv --python 3.12 && source .venv/bin/activate && uv pip install --requirement pyproject.toml
- Configure AWS credentials in ~/.aws/credentials and ~/.aws/config. For AWS IAM Identity Center, follow the AWS CLI documentation to configure short-term credentials.
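Optionally, before wiring the server into Claude Desktop, you can verify that the configured credentials can reach Cost Explorer. The snippet below is a minimal, hypothetical check (not part of this repository) using boto3:

# sanity_check.py -- hypothetical helper, not part of this repository.
# Confirms that the configured AWS credentials can call Cost Explorer.
from datetime import date, timedelta
import boto3

ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer API is served from us-east-1
end = date.today()
start = end - timedelta(days=1)
response = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
)
for period in response["ResultsByTime"]:
    amount = float(period["Total"]["UnblendedCost"]["Amount"])
    print(f"{period['TimePeriod']['Start']}: ${amount:.2f} (unblended)")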
Usage
Prerequisites
- Setup model invocation logs in Amazon CloudWatch.
- Ensure the IAM user/role has full read-only access to Amazon Cost Explorer and Amazon CloudWatch. Refer to AWS documentation for example policies.
- To access AWS spend information from other accounts, set the CROSS_ACCOUNT_ROLE_NAME environment variable when starting the server.
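The exact cross-account logic lives in server.py and is not reproduced here; the sketch below only illustrates the general pattern that the CROSS_ACCOUNT_ROLE_NAME setting implies, i.e. assuming a role of that name in the target account before calling Cost Explorer. The role name, function name, and account ID are placeholders.

# Illustrative only: query Cost Explorer in another account by assuming a role there.
import os
import boto3

def cost_explorer_client_for(account_id: str):
    """Return a Cost Explorer client, assuming a cross-account role when one is configured."""
    role_name = os.environ.get("CROSS_ACCOUNT_ROLE_NAME")
    if not role_name:
        return boto3.client("ce", region_name="us-east-1")  # same-account access
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/{role_name}",
        RoleSessionName="aws-cost-explorer-mcp",
    )["Credentials"]
    return boto3.client(
        "ce",
        region_name="us-east-1",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )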
Local Setup
Uses stdio as the MCP transport; both server and client run locally.
Starting the Server Locally
export MCP_TRANSPORT=stdio
export BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP_NAME
export CROSS_ACCOUNT_ROLE_NAME=ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS # optional
python server.py
Claude Desktop Configuration
There are two ways to configure this tool with Claude Desktop:
Option 1: Using Docker
Add the following to your Claude Desktop configuration file:
{
"mcpServers": {
"aws-cost-explorer": {
"command": "docker",
"args": [ "run", "-i", "--rm", "-e", "AWS_ACCESS_KEY_ID", "-e", "AWS_SECRET_ACCESS_KEY", "-e", "AWS_REGION", "-e", "BEDROCK_LOG_GROUP_NAME", "-e", "MCP_TRANSPORT", "-e", "CROSS_ACCOUNT_ROLE_NAME", "aws-cost-explorer-mcp:latest" ],
"env": {
"AWS_ACCESS_KEY_ID": "YOUR_ACCESS_KEY_ID",
"AWS_SECRET_ACCESS_KEY": "YOUR_SECRET_ACCESS_KEY",
"AWS_REGION": "us-east-1",
"BEDROCK_LOG_GROUP_NAME": "YOUR_CLOUDWATCH_BEDROCK_MODEL_INVOCATION_LOG_GROUP_NAME",
"CROSS_ACCOUNT_ROLE_NAME": "ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS",
"MCP_TRANSPORT": "stdio"
}
}
}
}
Important: Replace credentials with your actual AWS credentials; do not commit them to version control.
Option 2: Using UV (without Docker)
{
"mcpServers": {
"aws_cost_explorer": {
"command": "uv",
"args": [
"--directory",
"/path/to/aws-cost-explorer-mcp-server",
"run",
"server.py"
],
"env": {
"AWS_ACCESS_KEY_ID": "YOUR_ACCESS_KEY_ID",
"AWS_SECRET_ACCESS_KEY": "YOUR_SECRET_ACCESS_KEY",
"AWS_REGION": "us-east-1",
"BEDROCK_LOG_GROUP_NAME": "YOUR_CLOUDWATCH_BEDROCK_MODEL_INVOCATION_LOG_GROUP_NAME",
"CROSS_ACCOUNT_ROLE_NAME": "ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS",
"MCP_TRANSPORT": "stdio"
}
}
}
}
Replace the directory path with your repository location.
Remote Setup
Uses sse (Server-Sent Events) as the MCP transport; the MCP server runs on EC2 and the client runs locally.
Starting the Server Remotely
export MCP_TRANSPORT=sse
export BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP_NAME
export CROSS_ACCOUNT_ROLE_NAME=ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS # optional
python server.py
- MCP server listens on TCP port 8000.
- Configure EC2 security group to allow inbound TCP port 8000 from your client machine.
The MCP protocol uses JSON-RPC 2.0 and has no built-in authentication, so avoid transmitting sensitive data over MCP.
Testing with CLI MCP Client
MCP_SERVER_HOSTNAME=YOUR_MCP_SERVER_EC2_HOSTNAME
AWS_ACCOUNT_ID=AWS_ACCOUNT_ID_TO_GET_INFO_ABOUT
python mcp_sse_client.py --host $MCP_SERVER_HOSTNAME --aws-account-id $AWS_ACCOUNT_ID
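mcp_sse_client.py is the client shipped with this repository; if you want to build your own, the sketch below shows roughly what such a client does using the official mcp Python SDK. The hostname is a placeholder, and the repository's script may differ in detail.

# Minimal SSE client sketch using the official MCP Python SDK (placeholder URL).
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    url = "http://YOUR_MCP_SERVER_EC2_HOSTNAME:8000/sse"
    async with sse_client(url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())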
Testing with Chainlit App
app.py provides a Chainlit chatbot that uses a LangGraph agent with the LangChain MCP Adapter to interact with the MCP server.
Run:
chainlit run app.py --port 8080
Then use the chatbot at localhost:8080 to get AWS spend details.
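The agent wiring lives in app.py and is not shown here. As a rough, hypothetical sketch of how such pieces typically fit together (an MCP SSE session, LangChain MCP adapters to expose the server's tools, and a LangGraph ReAct agent), something like the following could be used; the model ID and server URL are placeholders, and the real app.py may be structured differently.

# Hypothetical wiring of an MCP SSE session into a LangGraph ReAct agent.
import asyncio
from langchain_aws import ChatBedrockConverse
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)  # expose MCP tools as LangChain tools
            model = ChatBedrockConverse(model="YOUR_BEDROCK_MODEL_ID", region_name="us-east-1")
            agent = create_react_agent(model, tools)
            result = await agent.ainvoke(
                {"messages": [("user", "What was my EC2 spend yesterday?")]}
            )
            print(result["messages"][-1].content)

asyncio.run(main())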
Available Tools
- get_ec2_spend_last_day(): EC2 spending data for the previous day.
- get_detailed_breakdown_by_day(days=7): Cost analysis by region, service, and instance type over a period.
- get_bedrock_daily_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Daily model usage breakdown.
- get_bedrock_hourly_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Hourly model usage breakdown.
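For debugging a single tool without Claude Desktop in the loop, you can invoke it directly over stdio with the official mcp Python SDK. The snippet below is hypothetical and not part of the repository; note that the env dict replaces the child process environment, so add AWS credential variables there if they are not available through other means.

# Hypothetical direct invocation of one tool over stdio, using the official MCP Python SDK.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(
        command="python",
        args=["server.py"],
        env={
            "MCP_TRANSPORT": "stdio",
            "BEDROCK_LOG_GROUP_NAME": "YOUR_BEDROCK_CW_LOG_GROUP_NAME",
            # add AWS credential variables here if needed
        },
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("get_detailed_breakdown_by_day", {"days": 7})
            print(result.content)

asyncio.run(main())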
Example Queries
- "Help me understand my Bedrock spend over the last few weeks"
- "What was my EC2 spend yesterday?"
- "Show me my top 5 AWS services by cost for the last month"
- "Analyze my spending by region for the past 14 days"
- "Which instance types are costing me the most money?"
- "Which services had the highest month-over-month cost increase?"
Docker Support
Build and run with Docker:
docker build -t aws-cost-explorer-mcp .
docker run -v ~/.aws:/root/.aws aws-cost-explorer-mcp
Development
Project Structure
- server.py: Main MCP server implementation with tools.
- pyproject.toml: Project dependencies.
- Dockerfile: Container configuration.
Adding New Cost Analysis Tools
- Add new functions to server.py.
- Annotate them with @mcp.tool().
- Implement the AWS Cost Explorer API calls.
- Format results for readability.
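As a hypothetical example (not from the repository), a new tool could look like the sketch below. It assumes the FastMCP instance in server.py is named mcp and that boto3 is available; the tool name, grouping dimension, and output format are illustrative only.

# Hypothetical new tool: total unblended cost per region for the previous 7 days.
from datetime import date, timedelta
import boto3

@mcp.tool()  # assumes the FastMCP instance in server.py is named `mcp`
def get_spend_by_region_last_week() -> str:
    """Return unblended cost per AWS region for the previous 7 days."""
    ce = boto3.client("ce", region_name="us-east-1")
    end = date.today()
    start = end - timedelta(days=7)
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "REGION"}],
    )
    totals: dict[str, float] = {}
    for period in response["ResultsByTime"]:
        for group in period["Groups"]:
            region = group["Keys"][0]
            totals[region] = totals.get(region, 0.0) + float(
                group["Metrics"]["UnblendedCost"]["Amount"]
            )
    # Format results for readability so Claude can summarize them easily.
    return "\n".join(f"{region}: ${amount:.2f}" for region, amount in sorted(totals.items()))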
Secure Remote MCP Server
Use nginx as a reverse proxy to provide HTTPS access to the MCP server.
Setup Steps
- Allow inbound TCP port 443 in the EC2 security group for your MCP client's IP address.
- Obtain an SSL certificate and private key for your MCP server domain.
- Install nginx:
  sudo apt-get install nginx
  sudo nginx -t
  sudo systemctl reload nginx
- Get your EC2 public hostname:
  TOKEN=$(curl -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
  curl -H "X-aws-ec2-metadata-token: $TOKEN" -s http://169.254.169.254/latest/meta-data/public-hostname
- Create /etc/nginx/conf.d/ec2.conf with:
  server {
      listen 80;
      server_name YOUR_EC2_HOSTNAME;
      return 301 https://$host$request_uri;
  }
  server {
      listen 443 ssl;
      server_name YOUR_EC2_HOSTNAME;
      ssl_certificate /etc/ssl/certs/cert.pem;
      ssl_certificate_key /etc/ssl/privatekey/privkey.pem;
      ssl_protocols TLSv1.2 TLSv1.3;
      ssl_ciphers HIGH:!aNULL:!MD5;
      location / {
          proxy_pass http://127.0.0.1:8000;
          proxy_http_version 1.1;
          proxy_set_header Host $host;
          proxy_set_header X-Real-IP $remote_addr;
          proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      }
  }
- Restart nginx:
  sudo systemctl start nginx
- Start your MCP server as described in the Remote Setup section.
- Access the MCP server over HTTPS at https://your-mcp-server-domain-name.com/sse.
Client Configuration Examples
Run the CLI MCP client:
MCP_SERVER_HOSTNAME=YOUR_MCP_SERVER_DOMAIN_NAME
AWS_ACCOUNT_ID=AWS_ACCOUNT_ID_TO_GET_INFO_ABOUT
python mcp_sse_client.py --host $MCP_SERVER_HOSTNAME --port 443 --aws-account-id $AWS_ACCOUNT_ID
Run Chainlit app:
export MCP_SERVER_URL=YOUR_MCP_SERVER_DOMAIN_NAME
export MCP_SERVER_PORT=443
chainlit run app.py --port 8080
Run LangGraph Agent:
python langgraph_agent_mcp_sse_client.py --host $MCP_SERVER_HOSTNAME --port 443 --aws-account-id $AWS_ACCOUNT_ID
License
MIT License
Acknowledgments
- Uses Anthropic's MCP framework
- Powered by AWS Cost Explorer API
- Built with FastMCP for server
- README generated using GitIngest and Claude