Memgpt-MCP-Server
by: Vic563
A Model Context Protocol (MCP) server that provides persistent memory and multi-model LLM support.
📌 Overview
Purpose: Provide a robust TypeScript-based MCP server that enables interaction with multiple LLM providers while effectively managing conversation history.
Overview: The MemGPT MCP Server facilitates seamless chatting with various LLMs while maintaining a comprehensive memory system that enhances the user experience by remembering past conversations.
Key Features:
- Chat Tool: Send messages to the chosen LLM provider, with support for providers such as OpenAI and Anthropic for flexible communication.
- Memory Management: The `get_memory` tool retrieves conversation history with a configurable limit, allowing efficient management of past interactions; the `clear_memory` tool resets conversation history entirely.
- Flexible Provider and Model Usage: Switch between LLM providers and models tailored to specific tasks. The system supports multiple models from providers such as OpenAI and Anthropic.
MemGPT MCP Server
A TypeScript-based MCP server that implements a memory system for Large Language Models (LLMs). It provides tools for chatting with various LLM providers while maintaining conversation history.
Features
Tools
- `chat`: Send a message to the current LLM provider.
- `get_memory`: Retrieve conversation history.
  - Optional `limit` parameter for the number of memories to retrieve; use `limit: null` for unlimited retrieval.
- `clear_memory`: Clear conversation history.
- `use_provider`: Switch between different LLM providers (OpenAI, Anthropic, OpenRouter, Ollama).
- `use_model`: Switch to a different model for the current provider:
  - Anthropic Claude models:
    - Claude 3 Series: `claude-3-haiku`, `claude-3-sonnet`, `claude-3-opus`
    - Claude 3.5 Series: `claude-3.5-haiku`, `claude-3.5-sonnet`
  - OpenAI: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
  - OpenRouter: Any model in `provider/model` format (e.g., `openai/gpt-4`, `anthropic/claude-2`)
  - Ollama: Any locally available model (e.g., `llama2`, `codellama`)
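The following is a minimal, hypothetical TypeScript sketch of how a client could drive these tools using the `@modelcontextprotocol/sdk` stdio client. The tool argument names (`provider`, `model`, `message`) are assumptions inferred from the descriptions above rather than confirmed schemas, and the server path is a placeholder.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the built server over stdio (path is a placeholder) and pass the
  // provider key through to the server process.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/memgpt-server/build/index.js"],
    env: { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "" },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Assumed argument shapes: pick a provider and model, then chat.
  await client.callTool({ name: "use_provider", arguments: { provider: "anthropic" } });
  await client.callTool({ name: "use_model", arguments: { model: "claude-3.5-sonnet" } });

  const reply = await client.callTool({
    name: "chat",
    arguments: { message: "Hello! Please remember that my favorite color is green." },
  });
  console.log(JSON.stringify(reply, null, 2));

  await client.close();
}

main().catch(console.error);
```

When the server is registered with Claude Desktop (see the configuration below), Claude handles this wiring for you; the sketch is only meant to show the tool surface.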
Development
Installation
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
Configuration for Claude Desktop
Add the server config to:
- MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```
Environment Variables
- `OPENAI_API_KEY`: Your OpenAI API key
- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `OPENROUTER_API_KEY`: Your OpenRouter API key
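As a rough illustration of how these variables map to providers, here is a hypothetical TypeScript helper (not the server's actual code); Ollama runs locally and needs no key.

```typescript
// Hypothetical helper: resolve the API key for the active provider from the
// environment. Returns undefined for Ollama, which needs no key.
type Provider = "openai" | "anthropic" | "openrouter" | "ollama";

function apiKeyFor(provider: Provider): string | undefined {
  const keys: Record<Provider, string | undefined> = {
    openai: process.env.OPENAI_API_KEY,
    anthropic: process.env.ANTHROPIC_API_KEY,
    openrouter: process.env.OPENROUTER_API_KEY,
    ollama: undefined,
  };
  return keys[provider];
}
```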
Debugging
Because MCP servers communicate over stdio, debugging can be challenging. Use the MCP Inspector:
npm run inspector
Recent Updates
Claude 3 and 3.5 Series Support (March 2024)
- Added support for Claude 3 and 3.5 models.
Unlimited Memory Retrieval
- Added support for retrieving unlimited conversation history.
- Use `{ "limit": null }` with `get_memory` to retrieve all stored memories; the default limit is 10 if not specified.
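Using the same `@modelcontextprotocol/sdk` client shown under Tools, an unlimited retrieval might look like the following sketch; the helper name is illustrative and only the `limit` argument is taken from this README.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Illustrative helper: pass limit: null to retrieve every stored memory;
// omit the limit argument to fall back to the default of the 10 most recent.
export async function fetchAllMemories(client: Client) {
  return client.callTool({
    name: "get_memory",
    arguments: { limit: null },
  });
}
```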