openai-mcp

by: arthurcolle

OpenAI Code Assistant Model Context Protocol (MCP) Server

Created: 07/03/2025
📌Overview

Purpose: This framework serves as a powerful coding assistant that facilitates software development tasks through a natural language interface, supporting multiple large language model (LLM) providers.

Overview: The MCP Coding Assistant is a Python-based tool designed to enhance developer productivity by providing real-time code assistance, visualization, and cost management features. It integrates the Model Context Protocol (MCP) for interoperability with various clients and agents, making it a versatile solution for developers.

Key Features:

  • Multi-Provider Support: Compatible with various LLM providers such as OpenAI and Anthropic, allowing users to choose their preferred model for coding assistance.

  • Model Context Protocol Integration: Functions as both an MCP server and client, enabling communication with multiple agents for complex problem-solving and collaboration on development tasks.


MCP Coding Assistant

A powerful Python tool that provides a natural language interface for software development tasks, supporting multiple LLM providers.

Key Features

  • Multi-Provider Support: Works with OpenAI, Anthropic, and other LLM providers.
  • Model Context Protocol Integration:
    • MCP server for use with clients.
    • Multi-agent synchronization for complex problem-solving.
  • Real-Time Tool Visualization: See execution progress and results.
  • Cost Management: Track token usage and expenses with budget controls.
  • Comprehensive Tool Suite: File operations, search, command execution, and more.
  • Enhanced UI: Rich terminal interface with progress indicators and syntax highlighting.
  • Context Optimization: Smart conversation compaction and memory management.
  • Agent Coordination: Specialized agents collaborate on tasks.

Installation

  1. Clone this repository.

  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file with your API keys:

    OPENAI_API_KEY=your_openai_api_key_here
    ANTHROPIC_API_KEY=your_anthropic_api_key_here
    OPENAI_MODEL=gpt-4o
    ANTHROPIC_MODEL=claude-3-opus-20240229
    
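
The tool likely loads these variables with a library such as python-dotenv, but the file format itself is simple. As an illustrative stdlib-only sketch (not the tool's actual loader), parsing reduces to splitting KEY=VALUE lines:

```python
# Minimal .env parser sketch (illustrative; the project may use python-dotenv instead).
import os

def load_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# "sk-example" is a placeholder, not a real key.
env = load_env("OPENAI_API_KEY=sk-example\nOPENAI_MODEL=gpt-4o")
os.environ.update(env)
```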

Usage

CLI Mode

Run with default provider:

python claude.py chat

Specify provider and model:

python claude.py chat --provider openai --model gpt-4o

Set a budget limit:

python claude.py chat --budget 5.00
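
Conceptually, the budget flag caps spending by keeping a running cost total. The sketch below is illustrative only; the class name and per-token price are hypothetical, not the tool's actual implementation:

```python
# Hypothetical budget tracker sketch; prices and names are illustrative.
class BudgetTracker:
    def __init__(self, budget_usd: float, price_per_1k_tokens: float = 0.01):
        self.budget_usd = budget_usd
        self.price_per_1k_tokens = price_per_1k_tokens
        self.spent_usd = 0.0

    def record(self, tokens: int) -> None:
        """Add the cost of `tokens` tokens to the running total."""
        self.spent_usd += tokens / 1000 * self.price_per_1k_tokens

    def over_budget(self) -> bool:
        return self.spent_usd >= self.budget_usd

tracker = BudgetTracker(budget_usd=5.00)
tracker.record(100_000)  # 100k tokens ~= $1.00 at the assumed rate
```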

MCP Server Mode

Run as a Model Context Protocol server:

python claude.py serve

Start in development mode:

python claude.py serve --dev

MCP Client Mode

Connect to an MCP server:

python claude.py mcp-client path/to/server.py
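
Under the hood, an MCP client opens a JSON-RPC 2.0 handshake with the server. As a rough sketch, the first message on the wire looks like this (field values are illustrative; consult the MCP specification for the exact handshake):

```python
# Sketch of the JSON-RPC "initialize" request an MCP client sends on connect.
# Values are illustrative, not taken from this project's code.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# The MCP stdio transport frames messages as newline-delimited JSON.
wire = json.dumps(request) + "\n"
```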

Multi-Agent MCP Mode

Launch a multi-agent client:

python claude.py mcp-multi-agent path/to/server.py

Available Tools

  • View: Read files.
  • Edit: Modify files.
  • Replace: Create or overwrite files.
  • GlobTool: Find files by pattern.
  • GrepTool: Search contents using regex.
  • LS: List directory contents.
  • Bash: Execute shell commands.
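
To make the tool suite concrete, here is a stdlib sketch of what GlobTool- and GrepTool-style behavior could look like. This is illustrative, not the repository's actual code:

```python
# Illustrative stdlib sketch of GlobTool/GrepTool-style behavior.
import re
from pathlib import Path

def glob_tool(root: str, pattern: str) -> list[str]:
    """Find files under `root` matching a glob pattern."""
    return sorted(str(p) for p in Path(root).rglob(pattern))

def grep_tool(path: str, regex: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs whose line matches `regex`."""
    matcher = re.compile(regex)
    hits = []
    with open(path, encoding="utf-8") as f:
        for number, line in enumerate(f, start=1):
            if matcher.search(line):
                hits.append((number, line.rstrip("\n")))
    return hits
```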

Chat Commands

  • /help: Show available commands.
  • /compact: Compress conversation history.
  • /version: Show version information.
  • /providers: List available LLM providers.
  • /cost: Show cost and usage information.
  • /budget [amount]: Set a budget limit.
  • /quit, /exit: Exit the application.
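
Slash commands like these are typically routed through a small dispatcher. A minimal sketch, assuming a plain name-to-handler mapping (the wiring here is hypothetical):

```python
# Sketch of a slash-command dispatcher; handlers are illustrative stand-ins.
def dispatch(line: str, commands: dict) -> str:
    name, _, arg = line.strip().partition(" ")
    handler = commands.get(name)
    if handler is None:
        return f"Unknown command: {name}"
    return handler(arg)

commands = {
    "/help": lambda _: "Commands: /help /compact /version /cost /budget /quit",
    "/budget": lambda amount: f"Budget set to ${float(amount):.2f}",
    "/version": lambda _: "mcp-coding-assistant 0.1.0 (example)",
}

print(dispatch("/budget 5.00", commands))
```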

Architecture

The MCP Coding Assistant is built with a modular architecture:

/claude_code/
  /lib/
    /providers/      # LLM provider implementations
    /tools/          # Tool implementations
    /context/        # Context management
    /ui/             # UI components
    /monitoring/     # Cost tracking & metrics
  /commands/         # CLI commands
  /config/           # Configuration management
  /util/             # Utility functions
  claude.py          # Main CLI entry point
  mcp_server.py      # Model Context Protocol server
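
The /lib/providers/ layer presumably hides the differences between backends behind a common interface. A minimal sketch of that pattern, with hypothetical class names (not the repo's actual API):

```python
# Hypothetical provider abstraction sketch; names do not come from the repository.
from typing import Protocol

class LLMProvider(Protocol):
    def complete(self, prompt: str) -> str: ...

class EchoProvider:
    """Stand-in provider used purely for illustration."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def get_provider(name: str) -> LLMProvider:
    # A real registry would map "openai", "anthropic", etc. to concrete clients.
    providers = {"echo": EchoProvider}
    return providers[name]()

reply = get_provider("echo").complete("hello")
```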

Contributing

  1. Fork the repository.
  2. Create a feature branch.
  3. Implement changes with tests.
  4. Submit a pull request.

License

MIT