mcp-neurolora
by: aindreyway
An intelligent MCP server that provides tools for collecting and documenting code from directories
📌 Overview
Purpose: To provide an intelligent MCP server that utilizes OpenAI API for advanced code analysis, code collection, and documentation generation.
Overview: MCP Neurolora is a powerful server designed to enhance coding workflows by integrating tools for assessing code quality, collecting and documenting code efficiently. It harnesses the capabilities of OpenAI to deliver insights and improvements for developers.
Key Features:
- Code Analysis: Uses the OpenAI API to analyze code and provide detailed feedback, including best-practice recommendations and GitHub issue generation.
- Code Collection: Gathers code from specified directories into a single markdown file with syntax highlighting and navigation, and supports pattern-based filtering of files.
🚀 Installation Guide
Follow these steps to install MCP Neurolora.
Step 1: Install Node.js
macOS
- Install Homebrew if not installed:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
- Install Node.js 18:
brew install node@18
echo 'export PATH="/opt/homebrew/opt/node@18/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
Windows
- Download Node.js 18 LTS from https://nodejs.org/
- Run the installer
- Open a new terminal to apply changes
Linux (Ubuntu/Debian)
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs
Step 2: Install uv and uvx
All Operating Systems
- Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
- uvx: the uv installer above also installs uvx, so no separate install step is needed.
Step 3: Verify Installation
Run these commands to verify the installation:
node --version # Should show v18.x.x
npm --version # Should show 9.x.x or higher
uv --version # Should show uv installed
uvx --version # Should show uvx installed
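The version checks above can also be scripted so setup fails fast on an outdated runtime. A minimal sketch, assuming `node` is on your PATH:

```shell
# Fail fast if the installed Node.js is older than v18
required=18
major=$(node --version | sed 's/^v\([0-9]*\).*/\1/')
if [ "$major" -lt "$required" ]; then
  echo "Node.js v$required+ required, found v$major" >&2
  exit 1
fi
echo "Node.js version OK"
```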
Step 4: Configure MCP Server
Your assistant will help you:
- Find your Cline settings file, locations include:
  - VSCode macOS:
    ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
  - Claude Desktop macOS:
    ~/Library/Application Support/Claude/claude_desktop_config.json
  - VSCode Windows:
    %APPDATA%/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
  - Claude Desktop Windows:
    %APPDATA%/Claude/claude_desktop_config.json
- Add this configuration:
{
  "mcpServers": {
    "aindreyway-mcp-neurolora": {
      "command": "npx",
      "args": ["-y", "@aindreyway/mcp-neurolora@latest"],
      "env": {
        "NODE_OPTIONS": "--max-old-space-size=256",
        "OPENAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
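After editing the settings file, it is worth confirming it is still valid JSON before restarting the editor. A minimal sketch (the path shown is the macOS VSCode location from the list above; substitute your own):

```shell
# Validate the MCP settings file before restarting the editor
CONFIG="$HOME/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json"
if python3 -m json.tool "$CONFIG" > /dev/null; then
  echo "settings JSON is valid"
else
  echo "settings JSON is broken" >&2
fi
```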
Step 5: Install Base Servers
Ask your assistant: "Please install the base MCP servers for my environment"
Your assistant will:
- Find your settings file
- Run the install_base_servers tool
- Configure all necessary servers automatically
After installation:
- Close VSCode completely (Cmd+Q on macOS, Alt+F4 on Windows)
- Reopen VSCode
- The new servers will be ready to use
Important: A complete restart of VSCode is required after installing the base servers for proper initialization.
This server uses npx for direct npm package execution, which is optimal for Node.js/TypeScript MCP servers and provides seamless integration with the npm ecosystem.
Base MCP Servers
The following base servers will be automatically installed and configured:
- fetch: Basic HTTP request functionality
- puppeteer: Browser automation for web interaction and testing
- sequential-thinking: Advanced problem-solving tools
- github: GitHub integration features
- git: Git operations support for version control
- shell: Basic shell command execution with commands such as:
- ls, cat, pwd, grep, wc, touch, find
🎯 What Your Assistant Can Do
Ask your assistant to:
- Analyze code and suggest improvements
- Install base MCP servers for your environment
- Collect code from your project directory
- Create documentation for your codebase
- Generate a markdown file with all your code
🛠 Available Tools
analyze_code
Analyzes code using OpenAI API and generates detailed feedback.
Parameters:
- codePath (required): Path to the code file or directory
Example usage:
{
"codePath": "/path/to/your/code.ts"
}
The tool will generate:
- Issues and recommendations
- Best practices violations
- Impact analysis
- Steps to fix
Output files:
- LAST_RESPONSE_OPENAI.txt (human-readable analysis)
- LAST_RESPONSE_OPENAI_GITHUB_FORMAT.json (structured data for GitHub issues)
Requires OpenAI API key in environment configuration.
collect_code
Collects code from a directory into a single markdown file with syntax highlighting and navigation.
Parameters:
- directory (required): Directory path to collect code from
- outputPath (optional): Path to save the output markdown file
- ignorePatterns (optional): Array of patterns to ignore
Example usage:
{
"directory": "/path/to/project/src",
"outputPath": "/path/to/project/src/FULL_CODE_SRC_2024-12-20.md",
"ignorePatterns": ["*.log", "temp/", "__pycache__", "*.pyc", ".git"]
}
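For intuition, the pattern-based filtering above behaves roughly like pruning those names from a directory walk. A sketch in plain shell (illustration only, not the tool's actual implementation; uses the example path and patterns from above):

```shell
# List files under src/ while skipping the example ignore patterns
find /path/to/project/src \
  \( -name .git -o -name __pycache__ -o -name temp \) -prune -o \
  \( -name '*.log' -o -name '*.pyc' \) -prune -o \
  -type f -print
```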
install_base_servers
Installs base MCP servers to your configuration file.
Parameters:
- configPath (required): Path to MCP settings configuration file
Example usage:
{
"configPath": "/path/to/cline_mcp_settings.json"
}
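Since this tool rewrites your configuration file, it is a good habit to keep a backup first. A minimal sketch (using the example path from above):

```shell
# Back up the MCP settings before running install_base_servers
CONFIG=/path/to/cline_mcp_settings.json
cp "$CONFIG" "$CONFIG.bak"
# Restore later if needed:
# cp "$CONFIG.bak" "$CONFIG"
```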
🔧 Features
- Code Analysis:
- OpenAI API integration
- Structured feedback
- Best practices recommendations
- GitHub issues generation
- Code Collection:
- Directory traversal
- Syntax highlighting
- Navigation generation
- Pattern-based filtering
- Base Server Management:
- Automatic installation
- Configuration handling
- Version management
📄 License
MIT License - feel free to use this in your projects!
👤 Author
Aindreyway
- GitHub: https://github.com/aindreyway
⭐️ Support
Give a ⭐️ if this project helped you!