Meta-MCP: MCP Tools Manager
A powerful Model Context Protocol (MCP) server that manages and provides intelligent access to multiple MCP tools, solving the context overflow problem when AI agents face dozens of tool definitions.
🎯 Problem Statement
AI Agents using MCP face challenges when dealing with many tools:
- Each tool's name/description/schema consumes valuable context space
- Most queries only need a few specific tools
- Loading all tools upfront is inefficient
💡 Solution
Meta-MCP acts as an intelligent tool selection layer:
- Smart Retrieval: Uses LangChain + Vector Store for semantic search
- Unified Proxy: All tool executions route through Meta-MCP
- Web Management: Easy GUI for managing MCP servers and viewing logs
✨ Features
Core Features
- 🔍 Semantic Tool Search: Find tools using natural language queries
- 🤖 LLM Query Optimization: Two-stage search with GPT-5 for intelligent query refinement
- 🔗 Multi-Server Management: Connect to multiple MCP servers simultaneously
- 📊 Usage Tracking: Monitor tool usage statistics and favorites
- 🔒 Secure Configuration: Encrypted storage for API keys
- 📝 Execution Logging: Complete audit trail of all tool executions
- ⚡ Fast API: RESTful API for all operations
- 🌐 Web Management Interface: React-based UI for easy configuration and monitoring
MCP Tools Exposed
Meta-MCP exposes these tools to AI agents:
- search_tools: Semantic/keyword search for tools
- get_tool_details: Get full schemas for selected tools
- execute_tool: Execute tools on downstream servers
- list_categories: Browse tools by category
- get_favorites: Access frequently used tools
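A typical agent session chains these tools: search first, fetch full schemas only for the tools it selected, then execute. A minimal in-memory sketch of that flow (the registry, tool names, and handlers here are hypothetical placeholders, not Meta-MCP's actual implementation):

```python
# Hypothetical in-memory registry illustrating the search -> details -> execute flow.
REGISTRY = {
    "filesystem_read_file": {
        "description": "Read the contents of a file from disk",
        "schema": {"type": "object", "properties": {"path": {"type": "string"}}},
        "handler": lambda params: f"<contents of {params['path']}>",
    },
    "web_search": {
        "description": "Search the web for a query",
        "schema": {"type": "object", "properties": {"query": {"type": "string"}}},
        "handler": lambda params: [f"result for {params['query']}"],
    },
}

def search_tools(query: str) -> list[str]:
    """Keyword match against descriptions (stand-in for semantic search)."""
    words = [w for w in query.lower().split() if len(w) > 2]
    return [tid for tid, t in REGISTRY.items()
            if any(w in t["description"].lower() for w in words)]

def get_tool_details(tool_id: str) -> dict:
    """Return the full schema only for tools the agent actually selected."""
    t = REGISTRY[tool_id]
    return {"tool_id": tool_id, "description": t["description"], "schema": t["schema"]}

def execute_tool(tool_id: str, parameters: dict):
    """Route execution to the downstream handler."""
    return REGISTRY[tool_id]["handler"](parameters)

# Agent flow: search, inspect, execute.
hits = search_tools("read a file")
details = get_tool_details(hits[0])
result = execute_tool(hits[0], {"path": "/tmp/test.txt"})
```

Only the matched tool's schema ever enters the agent's context, which is the point of the design.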
🏗️ Architecture
AI Agent
↓
Meta-MCP Server (MCP Tools Interface)
↓
Query Optimizer (GPT-5) → Embedding Service → Tool Registry
↓
MCP Client Manager
↓
Multiple Downstream MCP Servers (filesystem, database, web search, etc.)
Two-Stage Search Architecture:
- Stage 1: User query → LLM (GPT-5) extracts concise keywords
- Stage 2: Keywords → Semantic search via embeddings → Relevant tools
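The two stages can be sketched roughly as follows. Here `extract_keywords` is a stopword-stripping stand-in for the GPT-5 call, and `embed` is a toy bag-of-words stand-in for the real embedding model:

```python
import math
from collections import Counter

def extract_keywords(query: str) -> str:
    """Stage 1 stand-in: the real system asks an LLM (GPT-5) to distill the query."""
    stopwords = {"i", "need", "to", "a", "the", "please", "want", "on", "disk"}
    return " ".join(w for w in query.lower().split() if w not in stopwords)

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; Meta-MCP uses a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

TOOLS = {
    "filesystem_read_file": "read file contents from the filesystem",
    "db_query": "run sql query against a database",
}

def search(query: str, top_k: int = 1) -> list[str]:
    """Stage 2: rank tool descriptions by similarity to the distilled keywords."""
    q = embed(extract_keywords(query))
    ranked = sorted(TOOLS, key=lambda t: cosine(q, embed(TOOLS[t])), reverse=True)
    return ranked[:top_k]
```

Distilling the query first matters because raw user phrasing ("I need to read a file on disk") carries filler that dilutes the embedding; the keyword stage strips it before ranking.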
See the two-stage search architecture documentation for details.
🚀 Quick Start
Prerequisites
- Python 3.13+
- uv package manager
- Node.js 20+ (for downstream MCP servers)
Installation
# Clone repository
git clone <repository-url>
cd Meta-MCP
# Install dependencies with uv
uv sync
# Initialize database
uv run python -c "from src.storage.database import Database; import asyncio; asyncio.run(Database('sqlite+aiosqlite:///./data/meta_mcp.db').init_db())"
Configuration
- Configure OpenAI API (for semantic search):
Create .env.local file:
OPENAI_API_ENDPOINT=https://api.openai.com/v1
OPENAI_API_KEY=your-api-key-here
OPENAI_MODEL=text-embedding-3-small
Or configure via Web UI (recommended).
- Set Encryption Key (for secure storage):
export META_MCP_ENCRYPTION_KEY=$(uv run python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())")
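Fernet is symmetric, so the same key that encrypts an API key must be present to decrypt it; if the key is lost or regenerated, previously stored secrets become unrecoverable. A minimal round-trip sketch (standalone, not Meta-MCP's actual `crypto.py`):

```python
from cryptography.fernet import Fernet

# Generate once and keep it stable across restarts
# (e.g. via the META_MCP_ENCRYPTION_KEY environment variable).
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"sk-your-api-key")   # ciphertext, safe to persist in the database
plaintext = f.decrypt(token)            # requires the same key
```

Keep the exported key somewhere durable (a secrets manager or `.env` file outside version control), not just in the current shell session.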
Running the Server
Option 1: Quick Start Script (Recommended - runs both backend & frontend)
./scripts/start.sh
- Backend API: http://localhost:8000
- Frontend UI: http://localhost:5173
- API Docs: http://localhost:8000/docs
Option 2: FastAPI Web Server Only (for Web UI and REST API)
uv run uvicorn src.api.main:app --reload --port 8000
Option 3: MCP Server Only (for AI agents)
uv run python src/mcp_server/server.py
Option 4: Using Claude Code
./scripts/start-mcp-server.sh
See the Claude Code integration guide for details.
📖 Usage
1. Add MCP Servers via API
curl -X POST http://localhost:8000/api/mcp-servers \
-H "Content-Type: application/json" \
-d '{
"name": "filesystem-mcp",
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
"env": {}
}'
2. Start MCP Server
curl -X POST http://localhost:8000/api/mcp-servers/{server_id}/start
3. Search for Tools
curl "http://localhost:8000/api/tools/search?q=read+file&use_semantic=true"
4. Execute Tools
curl -X POST http://localhost:8000/api/tools/execute \
-H "Content-Type: application/json" \
-d '{
"tool_id": "filesystem_read_file",
"parameters": {"path": "/tmp/test.txt"}
}'
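The same flow can be driven from Python. This sketch only builds the request URLs and JSON bodies, so it runs without a live server; the endpoint paths match the API above, but the helper functions themselves are hypothetical:

```python
import json
from urllib.parse import urlencode

BASE = "http://localhost:8000"

def search_url(query: str, use_semantic: bool = True) -> str:
    """Build the GET /api/tools/search URL with an encoded query string."""
    qs = urlencode({"q": query, "use_semantic": str(use_semantic).lower()})
    return f"{BASE}/api/tools/search?{qs}"

def execute_payload(tool_id: str, parameters: dict) -> str:
    """Build the JSON body for POST /api/tools/execute."""
    return json.dumps({"tool_id": tool_id, "parameters": parameters})

# Send these with urllib.request or httpx against a running server:
url = search_url("read file")
body = execute_payload("filesystem_read_file", {"path": "/tmp/test.txt"})
```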
Using with AI Agents
Configure Claude or other MCP-compatible AI to connect to Meta-MCP:
{
"mcpServers": {
"meta-mcp": {
"command": "uv",
"args": ["run", "python", "src/mcp_server/server.py"],
"cwd": "/path/to/Meta-MCP"
}
}
}
🧪 Testing
# Run all tests
PYTHONPATH=/path/to/Meta-MCP uv run pytest tests/ -v
# Run specific test file
PYTHONPATH=/path/to/Meta-MCP uv run pytest tests/test_tool_registry.py -v
# Run with coverage
PYTHONPATH=/path/to/Meta-MCP uv run pytest --cov=src --cov-report=html
Current test suite: 34 tests passing ✅
📁 Project Structure
Meta-MCP/
├── src/
│ ├── api/ # FastAPI REST API
│ │ ├── main.py # API entry point
│ │ └── routes/ # API route handlers
│ ├── core/ # Core business logic
│ │ ├── client_manager.py # MCP client connections
│ │ ├── tool_registry.py # Tool storage & search
│ │ ├── execution_router.py # Tool execution routing
│ │ ├── embeddings.py # Semantic search
│ │ ├── query_optimizer.py # LLM query optimization
│ │ └── logger.py # Structured logging
│ ├── mcp_server/ # Meta-MCP Server
│ │ ├── server.py # MCP server entry point
│ │ ├── tools.py # Tool definitions
│ │ └── config.py # Configuration management
│ ├── storage/ # Data persistence
│ │ ├── database.py # Database management
│ │ └── models.py # SQLAlchemy models
│ └── utils/ # Utilities
│ └── crypto.py # Encryption utilities
├── web/ # React Frontend
│ ├── src/
│ │ ├── components/ # React components
│ │ └── pages/ # Page components
│ └── package.json
├── scripts/ # Utility scripts
│ ├── start.sh # Start both backend & frontend
│ └── start-mcp-server.sh # Start MCP server for Claude Code
├── tests/ # Test suite
├── data/ # Runtime data (git-ignored)
├── config/ # Configuration files
└── docker/ # Docker configuration
🔧 API Endpoints
Settings
- GET /api/settings/openai - Get OpenAI settings
- PUT /api/settings/openai - Update OpenAI settings
- POST /api/settings/openai/test - Test OpenAI connection
MCP Servers
- GET /api/mcp-servers - List all MCP servers
- POST /api/mcp-servers - Create new MCP server
- GET /api/mcp-servers/{id} - Get server details
- DELETE /api/mcp-servers/{id} - Delete server
- POST /api/mcp-servers/{id}/start - Start/connect server
- POST /api/mcp-servers/{id}/stop - Stop/disconnect server
- POST /api/mcp-servers/{id}/test - Test server connection
Tools
- GET /api/tools - List all tools
- GET /api/tools/search - Search tools (semantic/keyword)
- GET /api/tools/{id} - Get tool details
- POST /api/tools/{id}/favorite - Toggle favorite status
- GET /api/tools/categories/list - List categories
- GET /api/tools/favorites/list - List favorite tools
🛠️ Development
Code Quality
# Format code
uv run black src tests
# Lint code
uv run ruff check src tests
# Type checking
uv run mypy src
Git Workflow
# Commit with conventional commits format
git commit -m "feat(component): description"
# Types: feat, fix, docs, test, refactor, chore
🐳 Docker Deployment
# Build and run with Docker Compose
docker-compose up -d
# View logs
docker-compose logs -f
# Stop services
docker-compose down
🔒 Security
- API keys encrypted using Fernet (symmetric encryption)
- Environment variables for sensitive data
- MCP servers run in isolated processes
- Input validation on all API endpoints
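In a FastAPI app, input validation usually comes from Pydantic request models, which reject malformed bodies before any handler code runs. A sketch of what the execute endpoint's model might look like (field names mirror the API above; the model itself is illustrative, not Meta-MCP's actual code):

```python
from pydantic import BaseModel, ValidationError

class ExecuteRequest(BaseModel):
    """Illustrative request model; FastAPI validates request bodies like this automatically."""
    tool_id: str
    parameters: dict

# Well-formed input parses cleanly.
ok = ExecuteRequest(tool_id="filesystem_read_file", parameters={"path": "/tmp/test.txt"})

# Malformed input is rejected with a ValidationError (FastAPI turns this into a 422).
rejected = False
try:
    ExecuteRequest(tool_id="filesystem_read_file", parameters="not-a-dict")
except ValidationError:
    rejected = True
```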
📝 License
MIT License - See LICENSE file for details
🤝 Contributing
Contributions welcome! Please read CONTRIBUTING.md for guidelines.
📚 Documentation
- Quick start guide
- Claude Code integration
- Two-stage LLM search architecture
- Web UI user guide
- Development progress
🙏 Acknowledgments
Built with:
- FastAPI - Modern web framework
- LangChain - AI application framework
- Chroma - Vector database
- SQLAlchemy - Database ORM
- uv - Fast Python package manager
Status: 🚀 Production Ready Version: 1.0.0 Last Updated: 2025-10-06