llm-context-py
by: cyberchitta
Share code with LLMs via Model Context Protocol or clipboard. Rule-based customization enables easy switching between different tasks (like code review and documentation). Includes smart code outlining.
📌 Overview
Purpose:
LLM Context is designed to help developers inject relevant project content into Large Language Model chat interfaces efficiently and intelligently.
Overview:
LLM Context streamlines the integration of code and text project content into LLM-based chat systems. It supports both direct integration with advanced chat platforms (via the Model Context Protocol) and a universal command-line clipboard workflow. By leveraging .gitignore patterns for smart file selection and customizable rule-based profiles, it ensures only the most pertinent project data is included, enhancing AI-assisted development and collaboration.
Key Features:
- Smart File Selection: Automatically selects relevant files for LLM context based on .gitignore patterns, minimizing noise and focusing on essential project content.
- Seamless Integration & Workflows: Offers both native integration with platforms like Claude Desktop (via MCP) and universal CLI/clipboard workflows, making it compatible with a wide range of LLM chat interfaces and developer environments.
LLM Context
LLM Context is a tool that helps developers inject relevant content from code or text projects into Large Language Model (LLM) chat interfaces. It uses .gitignore patterns for smart file selection and provides both a streamlined command-line workflow and direct LLM integration through the Model Context Protocol (MCP).
What's New in v0.3.0
Configuration now uses Markdown rule files with YAML front matter, replacing the previous TOML/YAML profiles. This is a breaking change. See the User Guide for details.
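As an illustration only (the file name and field names below are hypothetical; the actual rule schema is defined in the User Guide), a rule pairs YAML front matter for file selection with Markdown instructions for the LLM:

```markdown
---
# Hypothetical rule file, e.g. .llm-context/rules/code-review.md
# Field names are illustrative; consult the User Guide for the real schema.
description: Focus the context on source code for review sessions
exclude:
  - "*.lock"
  - "docs/**"
---
Review the selected files for correctness, clarity, and test coverage.
Prefer concrete, file-specific suggestions over general advice.
```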
Why Use LLM Context?
For more background on LLM Context’s approach to AI-assisted development, see:
- LLM Context: Harnessing Vanilla AI Chats for Development
- Full Context Magic - When AI Finally Understands Your Entire Project
Usage Patterns
- Direct LLM Integration: Native support for Claude Desktop via the MCP protocol.
- Chat Interface Support: Works with any LLM chat interface via CLI/clipboard; optimized for persistent context interfaces like Claude Projects and Custom GPTs.
- Supported Projects: Suitable for code repositories and collections of text, markdown, or HTML documents.
- Project Size: Best for projects that fit within the LLM context window (large project support is upcoming).
Installation
With uv:
uv tool install "llm-context>=0.3.0"
To upgrade:
uv tool upgrade llm-context
Warning: LLM Context is under active development. Updates may overwrite configuration files prefixed with lc-, so keep your configuration files under version control.
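For example, assuming your configuration lives in the .llm-context/ directory created by lc-init (see the CLI workflow below), a minimal way to track it is:

```bash
# Track llm-context configuration so a tool update that rewrites
# lc-* files can be reviewed (and reverted) with normal git diffs.
git add .llm-context/
git commit -m "Track llm-context configuration"
```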
Quickstart
Using MCP with Claude Desktop
Add this to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```
Then, either:
- Say: "I would like to work with my project" (Claude will prompt for the project root), or
- Specify: "I would like to work with my project /path/to/your/project"
For more, see Full Context Magic.
CLI Workflow
- Navigate to the project root.
- Initialize (once): lc-init
- Select files: lc-sel-files
- (Optional) Review your selection in .llm-context/curr_ctx.yaml.
- Generate context: lc-context (add -p for prompts, -u for user notes). A typical session is sketched after this list.
- Paste the context into your LLM's knowledge section or chat interface.
- If the LLM requests additional files:
  - Copy the file list from the LLM.
  - Run lc-clip-files.
  - Paste the contents back to the LLM.
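A minimal end-to-end sketch of that workflow, assuming a Unix-like shell and a project at ~/myproject (the path is illustrative):

```bash
cd ~/myproject     # project root (illustrative path)
lc-init            # one-time: create the .llm-context/ configuration
lc-sel-files       # select files according to the active rule
lc-context -p -u   # generate context with prompt instructions and user
                   # notes, and copy it for pasting into the chat
```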
Core Commands
- lc-init: Initialize project configuration
- lc-set-rule <n>: Switch rules
- lc-sel-files: Select files for context
- lc-sel-outlines: Select files for outline generation
- lc-context [-p] [-u] [-f FILE]: Generate and copy context (see the example below)
  - -p: Include prompt instructions
  - -u: Include user notes
  - -f FILE: Output to file
- lc-prompt: Generate project instructions for LLMs
- lc-clip-files: Process LLM file requests
- lc-changed: List recently modified files
- lc-outlines: Generate code outlines
- lc-clip-implementations: Extract code requested by LLMs (not supported for C/C++)
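For instance, the lc-context flags can be combined; this sketch writes a context that includes prompt instructions and user notes to a file (the filename context.md is illustrative):

```bash
# Combine the documented flags: -p (prompt instructions),
# -u (user notes), -f FILE (output to a file).
lc-context -p -u -f context.md
```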
Features & Advanced Usage
- Smart file selection via .gitignore patterns
- Rule-based context profiles (system- and user-defined)
- Code Navigation:
  - Smart Code Outlines: Automatic, high-level codebase structure summaries (see the sketch below)
  - Implementation Extraction: Returns specific implementations the LLM asks for, using lc-clip-implementations
- Customizable templates and prompts
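For example, an outline-oriented session might look like the following sketch, built from the commands listed above; how selection feeds outline generation, and what the output contains, depends on your project and active rule (see the User Guide):

```bash
lc-sel-outlines           # select files for outline generation
lc-outlines               # generate code outlines for the selection
# If the LLM then asks for specific implementations from an outline:
lc-clip-implementations   # extract the requested code (not for C/C++)
```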
See the User Guide for documentation.
Similar Tools
Explore alternative tools in this comprehensive list.
Acknowledgments
LLM Context builds on prior AI-assisted development tools:
- Successor to LLM Code Highlighter.
- Inspired by work on RubberDuck, Continue, and Aider Chat.
- Uses tree-sitter tag query files from Aider Chat.
Thanks to the open-source community and LLMs (including Claude-3.5-Sonnet) for their support in this project's development.
License
Licensed under the Apache License, Version 2.0. See the LICENSE file for details.