aws-cost-explorer-mcp-server
by: aarora79
MCP server for understanding AWS spend
📌Overview
Purpose: Provide an efficient way to analyze and visualize AWS cloud spending data, using Anthropic's Claude model as the natural language interface.
Overview: The tool runs as an MCP server that draws on AWS Cost Explorer and Amazon Bedrock usage data, letting users ask about their AWS costs and spending patterns in natural language.
Key Features:
- Amazon EC2 Spend Analysis: Enables users to view and analyze detailed breakdowns of EC2 spending for the last day, facilitating immediate insights into costs.
- Amazon Bedrock Spend Analysis: Provides a comprehensive breakdown of spending by region, user, and model for the past 30 days, helping to understand usage patterns.
- Service Spend Reports: Allows analysis of spending across all AWS services for the last 30 days, offering a holistic view of expenditures.
- Detailed Cost Breakdown: Delivers granular cost data categorized by day, region, service, and instance type, enabling precise financial tracking.
- Interactive Interface: Uses Claude's natural language processing to turn complex spending queries into user-friendly interactions, simplifying data access and understanding.
AWS Cost Explorer and Amazon Bedrock Model Invocation Logs MCP Server & Client
An MCP server and client for accessing AWS spending data via Cost Explorer, and Amazon Bedrock usage data via model invocation logs in Amazon CloudWatch, built on Anthropic's MCP (Model Context Protocol).
Overview
This tool provides a way to analyze and visualize AWS cloud spending data using the Claude model as an interactive interface. It functions as an MCP server that exposes AWS Cost Explorer API functionality to Claude, allowing users to inquire about AWS costs in natural language.
Features
- Amazon EC2 Spend Analysis: View detailed breakdowns of EC2 spending for the last day.
- Amazon Bedrock Spend Analysis: Analyze breakdowns by region, users, and models over the last 30 days.
- Service Spend Reports: Evaluate spending across all AWS services for the last 30 days.
- Detailed Cost Breakdown: Access granular cost data by day, region, service, and instance type.
- Interactive Interface: Query cost data through natural language using Claude.
Requirements
- Python 3.12
- AWS credentials with Cost Explorer access
- Anthropic API access (for Claude integration)
- (Optional) Amazon Bedrock access
- (Optional) Amazon EC2 for running a remote MCP server
Installation
1. Install uv:
   - macOS and Linux:
     curl -LsSf https://astral.sh/uv/install.sh | sh
   - Windows:
     powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
2. Clone the repository:
   git clone https://github.com/aarora79/aws-cost-explorer-mcp.git
   cd aws-cost-explorer-mcp
3. Set up the Python virtual environment and install dependencies:
   uv venv --python 3.12 && source .venv/bin/activate && uv pip install --requirement pyproject.toml
4. Configure your AWS credentials:
   mkdir -p ~/.aws # Set up your credentials in ~/.aws/credentials and ~/.aws/config
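Once credentials are in place, a quick sanity check is to confirm that boto3 resolves them. The snippet below is a minimal, hypothetical helper (not part of the repository); it only assumes boto3 is installed in the active environment.

```python
# check_credentials.py -- hypothetical helper, not included in the repository.
import boto3

# If credentials are configured correctly, this prints the caller's IAM ARN;
# otherwise boto3 raises NoCredentialsError or a ClientError.
identity = boto3.client("sts").get_caller_identity()
print(f"Using identity: {identity['Arn']} (account {identity['Account']})")
```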
Usage
Prerequisites
- Set up model invocation logs in Amazon CloudWatch (one programmatic way to do this is sketched after this list).
- Ensure that the IAM user/role being used has read-only access to Amazon Cost Explorer and Amazon CloudWatch.
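Model invocation logging is a one-time setup step, normally done from the Bedrock console by an administrator (it requires write permissions beyond the read-only runtime access above). As an illustration, the same configuration can be applied with boto3 roughly as follows; the log group name and role ARN are placeholders, and the role must allow Bedrock to write to CloudWatch Logs.

```python
# enable_bedrock_logging.py -- illustrative sketch; values below are placeholders.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        "cloudWatchConfig": {
            # Create this log group first and point BEDROCK_LOG_GROUP_NAME at it.
            "logGroupName": "BedrockModelInvocationLogGroup",
            # Placeholder role that Bedrock assumes to write the logs.
            "roleArn": "arn:aws:iam::123456789012:role/BedrockCloudWatchLoggingRole",
        },
        "textDataDeliveryEnabled": True,
        "imageDataDeliveryEnabled": False,
        "embeddingDataDeliveryEnabled": False,
    }
)

# Confirm the configuration took effect.
print(bedrock.get_model_invocation_logging_configuration()["loggingConfig"])
```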
Local Setup
To run the server locally, use stdio as the transport.
Starting the Server (local)
Run the server using:
export MCP_TRANSPORT=stdio
export BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP_NAME
python server.py
Remote Setup
Use sse as the transport for MCP.
Starting the Server (remote)
Run the server with MCP_TRANSPORT set to sse:
export MCP_TRANSPORT=sse
export BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP_NAME
python server.py
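Both modes are handled by server.py; the details there may differ, but a FastMCP-based server typically picks its transport from the MCP_TRANSPORT environment variable along these lines (the server name below is an assumption, and this is a sketch rather than the repository's actual code):

```python
# Sketch of transport selection with FastMCP; not the repository's exact code.
import os

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("aws-cost-explorer")  # assumed server name

# ... @mcp.tool() functions would be registered here ...

if __name__ == "__main__":
    # "stdio" for the local setup above, "sse" for the remote setup.
    transport = os.environ.get("MCP_TRANSPORT", "stdio")
    mcp.run(transport=transport)
```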
Available Tools
The server exposes the following tools for Claude:
- get_ec2_spend_last_day(): Retrieves EC2 spending data for the previous day.
- get_detailed_breakdown_by_day(days=7): Analyzes costs by region, service, and instance type.
- get_bedrock_daily_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Delivers a per-day breakdown of model usage.
- get_bedrock_hourly_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Provides a detailed analysis of hourly model usage.
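As an illustration of what the Bedrock tools do behind the scenes (the exact implementation in server.py may differ), usage statistics can be aggregated from the model invocation log group with a CloudWatch Logs Insights query; the field names below follow the Bedrock invocation-log schema and may need adjusting:

```python
# Illustrative sketch only -- not the repository's exact implementation.
import time
from datetime import datetime, timedelta, timezone

import boto3

def bedrock_usage_by_model(log_group_name: str, days: int = 7, region: str = "us-east-1"):
    """Aggregate invocation counts and token usage per model from invocation logs."""
    logs = boto3.client("logs", region_name=region)
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days)
    query = (
        "stats count(*) as invocations, "
        "sum(input.inputTokenCount) as input_tokens, "
        "sum(output.outputTokenCount) as output_tokens by modelId"
    )
    query_id = logs.start_query(
        logGroupName=log_group_name,
        startTime=int(start.timestamp()),
        endTime=int(end.timestamp()),
        queryString=query,
    )["queryId"]
    # Poll until the Logs Insights query finishes.
    while True:
        resp = logs.get_query_results(queryId=query_id)
        if resp["status"] in ("Complete", "Failed", "Cancelled"):
            return resp["results"]
        time.sleep(1)
```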
Example Queries
Questions you can ask through Claude include:
- "What was my Bedrock spend last month?"
- "What was my EC2 spending yesterday?"
- "List my top 5 AWS services by cost for the last month."
- "Analyze my spending by region for the past 14 days."
Docker Support
For containerized deployment:
docker build -t aws-cost-explorer-mcp .
docker run -v ~/.aws:/root/.aws aws-cost-explorer-mcp
Development
Project Structure
- server.py: Main server implementation.
- pyproject.toml: Project dependencies.
- Dockerfile: Container definition for deployments.
Adding New Tools
To extend functionality:
- Add new functions to server.py.
- Annotate them with @mcp.tool().
- Implement the API calls and format results for readability.
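For example, a new Cost Explorer-backed tool might look like the following sketch. The tool name, the S3 filter, and the standalone FastMCP instance are illustrative assumptions; in the actual project the decorator would be applied to the mcp instance already defined in server.py.

```python
# Hypothetical example of a new tool; reuse the existing FastMCP instance in server.py.
from datetime import date, timedelta

import boto3
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("aws-cost-explorer")  # stand-in for the instance defined in server.py

@mcp.tool()
def get_s3_spend_last_30_days() -> str:
    """Return total Amazon S3 spend over the last 30 days as a readable string."""
    ce = boto3.client("ce", region_name="us-east-1")
    end = date.today()
    start = end - timedelta(days=30)
    resp = ce.get_cost_and_usage(
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        Filter={
            "Dimensions": {
                "Key": "SERVICE",
                "Values": ["Amazon Simple Storage Service"],
            }
        },
    )
    total = sum(float(r["Total"]["UnblendedCost"]["Amount"]) for r in resp["ResultsByTime"])
    return f"Amazon S3 spend for the last 30 days: ${total:.2f}"
```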
License
Acknowledgments
- Uses Anthropic's MCP framework
- Powered by AWS Cost Explorer API
- Built with FastMCP for server implementation