
tome

by: runebookai

A magical tool for using local LLMs with MCP servers.

52 · created 25/04/2025

Tags: LLM, local

📌 Overview

Purpose:
Tome aims to make working with local LLMs and MCP servers easy and accessible for all users, regardless of technical background.

Overview:
Tome is a macOS application (with Windows and Linux support planned) that streamlines the setup and management of local language models and MCP servers. Built by Runebook, it replaces manual configuration with a user-friendly interface for connecting to Ollama and integrating MCP servers, so users can quickly deploy, manage, and interact with LLM-powered tools.

Key Features:

  • Seamless MCP Server Management:
    Allows users to easily connect to and manage multiple MCP servers without manual setup, code, or configuration files.

  • Easy Integration with Ollama:
    Simplifies the connection to local or remote Ollama instances, supporting quick model deployment and management through a graphical interface (see the sketch below).
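
Tome hides this plumbing entirely. Purely as an illustration (this is not Tome's code), the sketch below calls the Ollama /api/chat endpoint that such an integration talks to; http://localhost:11434 is Ollama's default local address, and qwen2.5:14b stands in for whichever model you have installed.

    import json
    import urllib.request

    # Default address of a local Ollama instance.
    OLLAMA_URL = "http://localhost:11434/api/chat"

    payload = {
        "model": "qwen2.5:14b",  # any installed, tool-capable model tag
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,         # request a single JSON response
    }

    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    print(reply["message"]["content"])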


Tome

A magical tool for using local LLMs with MCP servers.


Overview

Tome is a macOS app (Windows and Linux support coming soon) for working with local LLMs and MCP servers, built by the team at Runebook. Tome manages your MCP servers, allowing you to connect to Ollama, copy/paste MCP servers, and chat with an MCP-powered model easily.

Getting Started

Requirements

  • macOS
  • Ollama (a local or remote instance)

Quickstart

  1. Install Tome and Ollama.
  2. Install a model that supports tools (Qwen2.5 14B or 7B is recommended, depending on your RAM).
  3. Open the MCP tab in Tome and install your first MCP server (for example, uvx mcp-server-fetch; see the sketch after this list).
  4. Chat with your MCP-powered model.
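
Tome performs step 3 for you. Purely to illustrate what installing an MCP server wires up, here is a sketch that uses the official MCP Python SDK to launch mcp-server-fetch over stdio and list its tools; it shows the protocol handshake, not Tome's internal implementation.

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        # Launch the server the same way the quickstart does: uvx mcp-server-fetch
        params = StdioServerParameters(command="uvx", args=["mcp-server-fetch"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()          # MCP handshake
                tools = await session.list_tools()  # discover the server's tools
                print([tool.name for tool in tools.tools])

    asyncio.run(main())

Once the server is installed, its tools (here, a fetch tool for retrieving web pages) become available to the model during chat.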

Vision

We aim to make local LLMs and MCP accessible to everyone. Tome is designed for creativity and ease of use, whether you're an engineer, tinkerer, or hobbyist.

Core Principles

  • Tome is local-first: you control your data.
  • Tome is for everyone: no need to manage programming languages, package managers, or config files.

Roadmap

  • Expand support for more LLM engines and possibly cloud models.
  • Add Windows and Linux support.
  • Introduce tools beyond chat interfaces for creating powerful applications and workflows.
  • Community feedback is welcome.

Community

Discord
Bluesky
Twitter