
labs-ai-tools-for-devs

by: docker

An MCP server & prompt runner for all of Docker. Simple Markdown. BYO LLM.

193 · created 10/07/2024
Tags: Docker, Markdown

📌 Overview

Purpose: To provide a framework that leverages Docker containers to create and manage complex AI workflows through markdown prompts.

Overview: This framework lets developers combine Dockerized tools and large language models (LLMs) to build new workflows. Markdown serves as a common language for both humans and LLMs, so workflows can be executed consistently across environments.

Key Features:

  • Markdown Workflows: Allows users to define intricate workflows within markdown files that can be executed using chosen LLMs, promoting ease of use and flexibility.

  • Multi-Model Support: Configurable prompts to run with multiple LLMs, enabling the selection of the most suitable model for specific tasks and fostering the creation of multi-agent workflows.

  • Project Context Extraction: Docker images can extract contextual information from projects, allowing for more informed interactions and assistance tailored to the project's requirements.

  • Versioned Prompts Repository: Enables prompts to be stored in a Git repository for version control and sharing, facilitating collaboration among developers.


AI Tools for Developers

Agentic AI workflows enabled by Docker containers.

MCP

You can expose prompts and their tools as MCP servers by running serve mode with the --mcp flag. Register prompts by git reference or local path with the --register <ref> option.

serve --mcp --register github:docker/labs-ai-tools-for-devs?path=prompts/examples/generate_dockerfile.md --register /Users/ai-overlordz/some/local/prompt.md

What is this?

This is a simple Docker image that enables novel workflows by combining Dockerized Tools, Markdown, and your chosen LLM (Large Language Model).

Markdown is the Language

Markdown allows you to write complex workflows in files, which can be executed with your LLM in any environment thanks to Docker.
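For illustration, a minimal prompt file might look like the sketch below. The front-matter keys and the `# prompt user` heading mirror the examples shipped in this repository, but treat the exact schema as an assumption and check the repository's prompt examples for the authoritative format:

```markdown
---
model: gpt-4
tools:
  - name: curl
---

# prompt user

Fetch https://example.com with curl and summarize the response headers.
```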

Dockerized Tools

By utilizing Docker, you can enable LLMs to:

  • Take complex actions
  • Acquire more context with fewer tokens
  • Operate across diverse environments
  • Function in a sandboxed environment

Conversation Loop

Each workflow runs in a conversation loop that passes markdown prompts, tool results, and agent responses back to the LLM. When a tool call fails, the agent can retry with different parameters or a different tool.
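A sketch of that loop in pseudocode (an illustration of the idea, not this project's implementation):

```
prompt   = render(markdown_file, project_context)
messages = [prompt]
loop:
    response = llm(messages, available_tools)
    if response is a tool call:
        result = run_tool_container(response)   # runs in a sandboxed container; may fail
        append response and result to messages  # errors included, so the agent
                                                # can retry with other parameters
    else:
        return response                         # final answer ends the loop
```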

Multi-Model Agents

Each prompt can run with various LLM models or families, facilitating the use of the best tool for the task. Docker enables different models to plan and execute collaboratively.

Project-First Design

An extractor is a Docker image that runs against a project to extract information into JSON context.
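As an illustration of the idea only (this project's extractor images define their own output schema, not shown here), a trivial stand-in "extractor" could gather a couple of project facts and emit them as JSON context:

```shell
# Hypothetical stand-in for an extractor image: collect a couple of
# project facts and emit them as a JSON "context" object on stdout.
project_dir="."
file_count=$(find "$project_dir" -type f | wc -l | tr -d ' ')
context_json=$(printf '{"project":"%s","files":%s}' \
  "$(basename "$(pwd)")" "$file_count")
echo "$context_json"
```

A real extractor would run inside a container against the mounted project and emit richer context (languages, build files, dependencies) for the LLM to consume.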

Prompts as a Trackable Artifact

Prompts can be versioned and shared in a Git repository, allowing anyone to run them in their environment.

Get Started

Using VSCode

  1. Install Extension
    Get the latest release and install it:

    code --install-extension 'labs-ai-tools-vscode-<version>.vsix'
    
  2. Running:

    • Open or create a markdown file in VSCode.
    • Set the OpenAI API key: >Docker AI: Set OpenAI API Key.
    • Select your target project: >Docker AI: Select target project.
    • Start the conversation loop: >Docker AI: Run Prompt.

Using CLI

These instructions assume Docker Desktop is running.

  1. Set the OpenAI key:

    echo $OPENAI_API_KEY > $HOME/.openai-api-key
    
  2. Run the container in your project directory:

    docker run --rm --pull=always -it \
      -v /var/run/docker.sock:/var/run/docker.sock \
      --mount type=volume,source=docker-prompts,target=/prompts \
      --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
      vonwig/prompts:latest \
        run --host-dir $PWD --user $USER --platform "$(uname -o)" \
        --prompts "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"
    

See docs for further details on running the conversation loop.

Building

Build the Docker image:

docker build -t vonwig/prompts:local -f Dockerfile .

Conclusion

This project provides innovative tools for developers to enhance their workflows through the power of AI and Docker.