# swagger-mcp

by: danishjsheikh

An MCP server that dynamically defines tools based on a Swagger specification.
## 📌 Overview

Purpose: swagger-mcp aims to dynamically generate MCP tools from Swagger UI by scraping the swagger.json file, simplifying tool selection for MCP clients.

Overview: swagger-mcp streamlines the integration of Swagger UI with MCP clients by automatically generating well-defined tools at runtime, improving the user experience and making APIs easier to consume.

Key Features:
- Dynamic Tool Generation: Automatically generates MCP tools at runtime from swagger.json, ensuring quick and efficient tool access.
- Integration with MCP Clients: Easily integrates with any MCP client, extending its functionality to support a wide range of use cases.
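As a sketch of the idea, a minimal swagger.json like the one below would yield one MCP tool per operation. The `/health` endpoint and `getHealth` operation are hypothetical, invented for illustration:

```json
{
  "swagger": "2.0",
  "paths": {
    "/health": {
      "get": {
        "operationId": "getHealth",
        "summary": "Health check"
      }
    }
  }
}
```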
## swagger-mcp

### Overview

swagger-mcp is a tool that scrapes Swagger UI by extracting the swagger.json file and dynamically generating well-defined MCP tools at runtime. These tools can then be used by an MCP client for tool selection.
## Demo Video

Check out the demo video showcasing the project in action:
## Support

If you find this project valuable, please support me on LinkedIn by:
- Liking and sharing the demo post
- Leaving your thoughts and feedback in the comments
- Connecting with me for future updates

Your support will help this project reach more people and keep improving!
## Prerequisites

To use swagger-mcp, ensure you have the following dependencies:
- LLM Model API Key / Local LLM: requires access to OpenAI, Claude, or Ollama models.
- Any MCP Client: this project was tested with mcphost from mark3labs.
## Installation and Setup

Install and run swagger-mcp with:

```shell
go install github.com/danishjsheikh/swagger-mcp@latest
swagger-mcp
```
## MCP Configuration

To integrate with mcphost, include the following configuration in .mcp.json:

```json
{
  "mcpServers": {
    "swagger_loader": {
      "command": "swagger-mcp",
      "args": ["--specUrl=<swagger/doc.json_url>"]
    }
  }
}
```
## Demo Flow

1. Run the backend demo:

   ```shell
   go install github.com/danishjsheikh/go-backend-demo@latest
   go-backend-demo
   ```

2. Run Ollama:

   ```shell
   ollama run llama3.2
   ```

3. Run the MCP client:

   ```shell
   go install github.com/mark3labs/mcphost@latest
   mcphost -m ollama:llama3.2 --config <.mcp.json_file_path>
   ```
## Need Help

Work is ongoing to improve the tool definitions, specifically:
- Better error handling for more accurate responses
- LLM behavior control, so the model relies only on API responses rather than its own memory
- Preventing hallucinations and fabricated data by enforcing strict data retrieval from APIs
If you have insights or suggestions on improving these aspects, please contribute by:
- Sharing your experience with similar implementations
- Suggesting modifications to tool definitions
- Providing feedback on current limitations
Your input will be invaluable in making this tool more reliable and effective! 🚀