
lyraios

by: GalaxyLLMCI

LYRAI is a Model Context Protocol (MCP) operating system for multi-AI agents, designed to extend the functionality of AI applications by enabling them to interact with financial networks and blockchain public chains. The server offers a range of advanced AI assistants, including support for blockchain public chain operations (Solana, ETH, BSC, etc.).

217 Β· created 20/02/2025
Tags: AI, blockchain

πŸ“Œ Overview

Purpose: LYRAIOS aims to serve as a comprehensive operating system for AI applications, enhancing their functionality to facilitate interaction with financial networks and blockchain technology.

Overview: LYRAIOS (LLM-based Your Reliable AI Operating System) is an advanced AI assistant platform with a modular design capable of integrating various AI applications. It extends capabilities for financial operations and market analysis, making it suitable for a variety of sectors, including education and fintech.

Key Features:

  • AI Process Management: Facilitates dynamic task allocation, multi-assistant coordination, resource optimization, and state management, ensuring efficient operation across various AI tasks.

  • AI Memory System: Utilizes both short-term and long-term memory storage, preserving conversation context and integrating knowledge bases for improved interaction.

  • AI I/O System: Supports multi-modal input processing (text, files, APIs) and structured output formatting, enhancing user experience and interaction functionality.

  • Built-in Tools: Offers integrated utilities such as advanced calculators, web search functionalities, financial analysis tools, file management, and research aids, streamlining user tasks.

  • FastAPI and Docker Support: Enables a robust backend powered by FastAPI and easy deployment through Docker, enhancing scalability and maintainability.

  • Open Protocol Architecture: Provides a pluggable structure for easy integration of third-party tools and services, significantly increasing extensibility compared to traditional systems.


LYRAIOS

Overview & Technical Foundation

LYRAI is a Model Context Protocol (MCP) operating system for multi-AI agents, designed to extend AI applications (such as Claude Desktop and Cursor) by enabling interaction with financial networks and blockchain public chains. It offers advanced AI assistants supporting blockchain operations (Solana, ETH, etc.), fintech market analysis, and learning and training systems for the education sector.

In the future, advanced VIP features will be payable exclusively with LYRAI on Solana.

Core Innovations & Differentiated Value

LYRAIOS aims to create the next-generation AI Agent OS with innovations in:

  1. Open Protocol Architecture: Modular integration protocol supporting plug-and-play third-party tools/services with multi-modal interfaces; 80%+ improved extensibility.
  2. Multi-Agent Collaboration Engine: Distributed task orchestration enabling dynamic multi-agent collaboration, enterprise-grade workflow automation, and conflict resolution.
  3. Cross-Platform Runtime Environment: Smooth migration from personal assistants to enterprise digital employees across multiple scenarios (finance, healthcare, manufacturing).

For detailed architecture, see Architecture Documentation.

System Architecture

LYRAIOS adopts a layered architecture comprising:

User Interface Layer

Provides interaction modes:

  • Web UI (Streamlit)
  • Mobile UI
  • CLI
  • API Clients for third-party integration

Core OS Layer

Implements basic AI OS functions:

  • Process Management: Task scheduling, resource allocation, state management.
  • Memory System: Short-term memory, long-term storage, knowledge base.
  • I/O System: Multi-modal input, structured output, event handling.
  • Security & Access Control: Authentication, authorization, rate limiting.

MCP Integration Layer

Core innovation enabling external service integration via Model Context Protocol:

  • MCP Client: Protocol handling, connection, and message routing.
  • Tool Registry: Tool registration, capability discovery, manifest validation.
  • Tool Executor: Execution environment, resource management, error handling.
  • Adapters: REST API, Python plugins, and custom integrations.

External Services Layer

Services integrated through the MCP protocol include:

  • File system
  • Database
  • Web search
  • Code editor
  • Browser
  • Custom services

Tool Integration Protocol

Standardized protocol for integrating third-party tools:

  • Standardized JSON Tool Manifest
  • Pluggable adapter system
  • Secure, resource-limited execution environment
  • Versioning, dependency management
  • Monitoring and logging

Steps for integration:

  1. Define Tool Manifest (JSON)
  2. Implement Tool
  3. Register Tool with LYRAIOS
  4. Use Tool within LYRAIOS agents
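
Step 1 and part of step 3 can be sketched like this. The manifest field names below are illustrative assumptions, not the project's official schema.

```python
import json

# Hypothetical tool manifest (step 1). Field names are assumptions.
MANIFEST = json.loads("""
{
  "name": "calculator",
  "version": "1.0.0",
  "description": "Evaluates basic arithmetic expressions",
  "inputs": {"expression": "string"},
  "outputs": {"result": "number"}
}
""")

REQUIRED_FIELDS = {"name", "version", "description", "inputs", "outputs"}


def validate_manifest(manifest: dict) -> bool:
    """Manifest validation a registry might perform during step 3."""
    missing = REQUIRED_FIELDS - manifest.keys()
    if missing:
        raise ValueError(f"Manifest missing fields: {sorted(missing)}")
    return True


validate_manifest(MANIFEST)  # passes; an incomplete manifest would raise
```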

MCP Protocol Overview

The Model Context Protocol (MCP) uses a client-server architecture to connect LLM applications with integrations:

  • Hosts: LLM apps initiating connections
  • Clients: Maintain connection with servers
  • Servers: Provide context, tools, prompts

Supported MCP functions:

  • Resources (attach local files/data)
  • Prompts (templates)
  • Tools (execute commands)
  • Sampling (planned)
  • Roots (planned)

Data Flow

User Request Processing:

  1. User sends request via interface
  2. Core OS processes request
  3. For external tools, forward to MCP layer
  4. MCP client connects to MCP server
  5. External service executes request
  6. Result returned to user

Tool Execution:

  1. AI Agent selects tool
  2. Tool registry finds tool capabilities
  3. Tool executor prepares environment
  4. Adapter formats request
  5. Tool executes and returns results
  6. AI Agent processes results
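
Steps 4 and 5 of the tool-execution flow can be sketched as below. Both function names and the payload shape are hypothetical; a real adapter and executor would handle transport, sandboxing, and errors.

```python
def adapt_request(tool_name: str, args: dict) -> dict:
    """Step 4: the adapter turns (tool, args) into a REST-style payload."""
    return {"endpoint": f"/tools/{tool_name}", "method": "POST", "body": args}


def execute(payload: dict) -> dict:
    """Step 5 stand-in: a real executor would perform the call in a
    resource-limited environment and surface errors to the agent."""
    return {"status": "ok", "echo": payload["body"]}


payload = adapt_request("calculator", {"expression": "2 + 2"})
result = execute(payload)
```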

LYRAIOS Overview

An advanced AI assistant platform, built with Streamlit, that serves as an AI operating system.

Core OS Features

  • AI Process Management: Dynamic task scheduling, multi-assistant coordination, resource optimization, state persistence.
  • AI Memory System: Short-term memory, vector DB long-term storage, session context, knowledge base integration.
  • AI I/O System: Multi-modal input processing, structured output, stream processing, event-driven architecture.

Built-in Tools

  • Calculator (advanced math)
  • Web Search (DuckDuckGo)
  • Financial Analysis (real-time stock data, analyst info, news)
  • File Management (workspace files)
  • Research Tools (Exa integration)

Specialized Assistants

  • Python Assistant (live code execution, charting, package management)
  • Research Assistant (report generation, web research, structured outputs, source citation)

Technical Architecture

  • FastAPI backend
  • Streamlit frontend
  • PGVector vector DB
  • PostgreSQL storage
  • Docker support

System Features

  • Knowledge management (PDF, websites, semantic search, knowledge graph)
  • Process control (task scheduling, error handling, monitoring)
  • Security & access control (API keys, auth, rate limiting, secure storage)

Security Considerations

  • Transmission: Use TLS, verify source, authenticate
  • Message Validation: Validate and clean input, check size and format
  • Resource Protection: Access control, path verification, monitor usage, rate limit
  • Error Handling: Avoid info leaks, log errors, cleanup, DoS prevention

Roadmap

  • Core Platform: Mostly complete with ongoing improvements (multimodal input, error handling, scaling)
  • AI Process Management: Basic functions done; advanced scheduling, optimization, visualization planned
  • Memory System: Basic integration done; optimization, cross-session learning planned
  • Tools & Integrations: Core tools complete; advanced visualization, API framework, media processing planned
  • Security & Access Control: Basic API keys and auth done; rate limiting, role-based access, audit logging planned
  • Open Protocol Architecture: Partial module interface standards; tool integration protocol and service discovery in development
  • Multi-Agent Collaboration: Basic team structure done; communication, task decomposition, conflict resolution advancing
  • Cross-Platform Support: Web interface complete; API access partial; responsiveness, desktop, CLI, IoT, voice planned

Setup Workspace

git clone https://github.com/GalaxyLLMCI/lyraios
cd lyraios

python3 -m venv aienv
source aienv/bin/activate

pip install 'phidata[aws]'

phi ws setup

cp workspace/example_secrets workspace/secrets
cp example.env .env

phi ws up
# Open localhost:8501 to view the app

phi ws down

Run Lyraios Locally

  1. Install Docker Desktop

  2. Export credentials:

export OPENAI_API_KEY=sk-***
export EXA_API_KEY=xxx      # For Exa research
export GOOGLE_API_KEY=xxx   # For Gemini research

# Or set in .env file
OPENAI_API_KEY=xxx
EXA_API_KEY=xxx
GOOGLE_API_KEY=xxx

phi ws up
# Open localhost:8501 to view the app

phi ws down

API Documentation

REST API Endpoints

  • POST /api/v1/assistant/chat: Chat with AI assistant, context-aware, returns structured responses including tool usage
  • GET /api/v1/health: System health status, version info

Interactive API docs available at /docs and /redoc
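
A call to the chat endpoint might look like the sketch below. The base URL uses the default API_SERVER_PORT; the request body shape (`{"message": ...}`) is an assumption, so check the interactive docs at /docs for the actual schema.

```python
BASE = "http://localhost:8000"  # default API_SERVER_PORT from the environment config


def build_chat_request(message: str, base: str = BASE):
    """Build (url, body) for the chat endpoint; body shape is assumed."""
    return f"{base}/api/v1/assistant/chat", {"message": message}


url, body = build_chat_request("Summarize today's SOL news")
# Send with any HTTP client, e.g. with requests:
#   requests.post(url, json=body).json()
#   requests.get(f"{BASE}/api/v1/health").json()
```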

Development Guide

Project Structure

lyraios/
β”œβ”€β”€ ai/                     # AI core functionality
β”‚   β”œβ”€β”€ assistants.py       # Assistant implementations
β”‚   β”œβ”€β”€ llm/                # LLM integration
β”‚   └── tools/              # AI tools implementations
β”œβ”€β”€ app/                    # Main application
β”‚   β”œβ”€β”€ components/         # UI components
β”‚   β”œβ”€β”€ config/             # App config
β”‚   β”œβ”€β”€ db/                 # Database models
β”‚   β”œβ”€β”€ styles/             # UI styling
β”‚   β”œβ”€β”€ utils/              # Utility functions
β”‚   └── main.py             # Main app entry
β”œβ”€β”€ assets/                 # Static assets
β”œβ”€β”€ data/                   # Data storage
β”œβ”€β”€ tests/                  # Tests
β”œβ”€β”€ workspace/              # Workspace config
β”‚   β”œβ”€β”€ dev_resources/
β”‚   β”œβ”€β”€ settings.py
β”‚   └── secrets/            # Secret config (gitignored)
β”œβ”€β”€ docker/                 # Docker config
β”œβ”€β”€ scripts/                # Utility scripts
β”œβ”€β”€ .env                    # Env variables
β”œβ”€β”€ requirements.txt        # Dependencies
└── README.md               # Documentation

Environment Configuration

Copy and edit .env:

cp example.env .env

# Required
EXA_API_KEY=your_exa_api_key
OPENAI_API_KEY=your_openai_api_key
OPENAI_BASE_URL=optional_custom_api_endpoint

# OpenAI models
OPENAI_CHAT_MODEL=gpt-4-turbo-preview
OPENAI_VISION_MODEL=gpt-4-vision-preview
OPENAI_EMBEDDING_MODEL=text-embedding-3-small

# Optional ports
STREAMLIT_SERVER_PORT=8501
API_SERVER_PORT=8000

Example configurations are included for standard OpenAI, Azure OpenAI, and other providers.
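
For illustration, here is a minimal `.env`-style parser and the fallback pattern the models section implies. In practice the python-dotenv package (`load_dotenv()`) does this for you; this hand-rolled version exists only so the mechanics are visible.

```python
def load_env(text: str) -> dict:
    """Parse KEY=value lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


cfg = load_env("""
# Required
OPENAI_API_KEY=sk-test
OPENAI_CHAT_MODEL=gpt-4-turbo-preview
""")
# Fall back to the documented default when a variable is unset:
model = cfg.get("OPENAI_CHAT_MODEL", "gpt-4-turbo-preview")
```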

Streamlit Configuration

mkdir -p ~/.streamlit

cat > ~/.streamlit/config.toml << EOL
[browser]
gatherUsageStats = false
EOL

Development Scripts

Run frontend and backend using:

python -m scripts.dev run

Options to run only the frontend or only the backend, as well as custom ports, are also available.

Manual start:

streamlit run app/app.py
uvicorn api.main:app --reload

Dependencies Management

pip install -r requirements.txt
pip install -r requirements-dev.txt
pip install -e .
pip install python-dotenv black isort mypy pytest

Development Best Practices

  • Follow PEP 8 style, use type hints and docstrings
  • Use black and isort for formatting
  • Use pytest for testing with optional coverage
  • Use pre-commit hooks to enforce standards

Deployment Guide

Docker Deployment

Development:

docker build -f docker/Dockerfile.dev -t lyraios:dev .
docker-compose -f docker-compose.dev.yml up

Production:

docker build -f docker/Dockerfile.prod -t lyraios:prod .
docker-compose -f docker-compose.prod.yml up -d

Configuration Options

Environment variables control application, AI, and database settings.

Scaling options are configurable via worker and memory environment variables.

Monitoring and Maintenance

  • Health checks at /health
  • Monitor with Prometheus or similar tools
  • Logs located at /var/log/lyraios/

Backup and recovery scripts are available.

Database Configuration

Supports SQLite (default) and PostgreSQL.

SQLite example:

DATABASE_TYPE=sqlite
DATABASE_PATH=data/lyraios.db

PostgreSQL example:

DATABASE_TYPE=postgres
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=lyraios
POSTGRES_USER=postgres
POSTGRES_PASSWORD=your_password

If no PostgreSQL config is provided, defaults to SQLite.
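
The fallback rule above can be expressed as a small helper. The variable names come from the examples; the connection-URL format is an assumption (SQLAlchemy-style), not necessarily what this project uses internally.

```python
def database_url(env: dict) -> str:
    """Postgres when configured, otherwise the SQLite default."""
    if env.get("DATABASE_TYPE") == "postgres" and env.get("POSTGRES_HOST"):
        return (
            f"postgresql://{env['POSTGRES_USER']}:{env['POSTGRES_PASSWORD']}"
            f"@{env['POSTGRES_HOST']}:{env.get('POSTGRES_PORT', '5432')}/{env['POSTGRES_DB']}"
        )
    return f"sqlite:///{env.get('DATABASE_PATH', 'data/lyraios.db')}"
```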

Contributing

Contributions welcome! See CONTRIBUTING.md for guidelines.

License

Licensed under Apache License 2.0. See LICENSE file for details.