# Agent Aggregator

An MCP server that aggregates tools from multiple MCP servers, acting as a proxy to provide unified access to various AI agents and tools.
## 🎯 Features
- Multi-Agent Aggregation: Connects to multiple MCP servers simultaneously
- Unified Tool Interface: Exposes all tools through a single MCP interface
- AI Model Integration: Each agent can have an associated AI model via OpenRouter
- Dynamic Configuration: Supports runtime configuration of connected agents
- Error Handling: Robust error handling and connection management
- Modern Node.js: Built with ES modules and modern JavaScript features
- OpenRouter Support: Integrated support for AI models through OpenRouter API
## 📁 Project Structure
```
agent-aggregator/
├── src/
│   ├── index.js                 # Main MCP server entry point
│   ├── aggregator/
│   │   ├── AgentAggregator.js   # Core aggregation logic
│   │   ├── MCPConnection.js     # Individual MCP server connection
│   │   └── OpenRouterClient.js  # OpenRouter API integration
│   ├── config/
│   │   └── ConfigLoader.js      # Configuration management
│   └── mcp-servers/             # Custom MCP server implementations
│       ├── README.md            # MCP servers documentation
│       └── qwen_mcp_server.py   # Qwen AI MCP server
├── config/
│   └── agents.json              # Agent configuration file
├── tests/
│   └── integration.test.js      # Integration tests with real services
├── scripts/
│   └── test-server.js           # Manual server testing script
└── docs/                        # Documentation
```
## 🚀 Quick Start
### Installation
```bash
# Install globally from npm
npm install -g agent-aggregator

# Or clone the repository for development
git clone https://github.com/rnd-pro/agent-aggregator.git
cd agent-aggregator

# Install dependencies
npm install
```
### Quick Start with Cursor
1. Add to your Cursor MCP configuration (`~/.cursor/mcp.json`):

   ```json
   {
     "mcpServers": {
       "agent-aggregator": {
         "command": "npx",
         "args": ["agent-aggregator"],
         "env": {
           "OPENROUTER_API_KEY": "your-openrouter-api-key",
           "NODE_ENV": "production"
         }
       }
     }
   }
   ```

2. Set your OpenRouter API key:
   - Get a key from https://openrouter.ai/
   - Replace `your-openrouter-api-key` with your actual key

3. Restart Cursor, and you'll have access to 14+ tools from the connected MCP servers:
   - Filesystem operations
   - Code analysis tools
   - AI assistance tools
   - And more, depending on your configuration
### Configuration
Edit `config/agents.json` to configure which MCP servers to connect to:

```json
{
  "agents": [
    {
      "name": "filesystem",
      "type": "mcp",
      "enabled": true,
      "description": "File system operations server",
      "connection": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        "env": {}
      },
      "model": {
        "provider": "openrouter",
        "name": "qwen/qwen3-coder:free",
        "apiKey": "${OPENROUTER_API_KEY}"
      }
    }
  ],
  "aggregator": {
    "timeout": 30000,
    "retryAttempts": 3,
    "retryDelay": 1000
  },
  "defaults": {
    "model": {
      "provider": "openrouter",
      "name": "qwen/qwen3-coder:free",
      "apiKey": "${OPENROUTER_API_KEY}",
      "baseUrl": "https://openrouter.ai/api/v1"
    }
  }
}
```
### Environment Variables
Set up your OpenRouter API key:
```bash
# For the current session
export OPENROUTER_API_KEY="sk-or-v1-your-actual-key-here"

# Or create a .env file in the project root:
echo "OPENROUTER_API_KEY=sk-or-v1-your-actual-key-here" > .env

# For a permanent setup (add to ~/.bashrc or ~/.zshrc):
echo 'export OPENROUTER_API_KEY="sk-or-v1-your-actual-key-here"' >> ~/.zshrc
```
Important: Never commit your actual API key to version control!
### Running
```bash
# Start the MCP server
npm start

# Test the server
npm run test:server

# Run integration tests
npm test

# Development mode with auto-reload
npm run dev
```
## 🔧 Usage
### As an MCP Server
Add to your MCP client configuration (e.g., Cursor):
```json
{
  "mcpServers": {
    "agent-aggregator": {
      "command": "npx",
      "args": ["agent-aggregator"]
    }
  }
}
```
### Supported MCP Servers
Currently configured to work with:
- **Filesystem**: `@modelcontextprotocol/server-filesystem` - File system operations
- **Claude Code MCP**: `@kunihiros/claude-code-mcp` - Claude Code wrapper
You can add any MCP server that supports the standard MCP protocol. Popular options include:
- `@modelcontextprotocol/server-github` - GitHub API operations
- `@modelcontextprotocol/server-memory` - Memory management
- `@modelcontextprotocol/server-fetch` - HTTP requests and web fetching
## 🏗️ Architecture
```
┌─────────────────┐      ┌────────────────────┐      ┌─────────────────┐
│   MCP Client    │◄────►│  Agent Aggregator  │◄────►│   Filesystem    │
│    (Cursor)     │      │   (This Server)    │      │   MCP Server    │
└─────────────────┘      │                    │      └─────────────────┘
                         │                    │      ┌─────────────────┐
                         │                    │◄────►│     Qwen AI     │
                         │                    │      │   MCP Server    │
                         │                    │      └─────────────────┘
                         │                    │      ┌─────────────────┐
                         │                    │◄────►│   Claude Code   │
                         │                    │      │   MCP Server    │
                         │                    │      └─────────────────┘
                         │                    │      ┌─────────────────┐
                         │                    │◄────►│   OpenRouter    │
                         │                    │      │    AI Models    │
                         └────────────────────┘      └─────────────────┘
```
The Agent Aggregator:
- Connects to multiple downstream MCP servers
- Aggregates their tools into a unified list
- Routes tool calls to the appropriate server
- Provides AI model access via OpenRouter for each agent
- Returns results back to the client
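Conceptually, the aggregation and routing can be sketched as follows. This is an illustrative sketch only, not the project's actual implementation in `src/aggregator/AgentAggregator.js`: the `ToolRouter` class and its `listTools`/`callTool` helpers are hypothetical, while the `agent-name__tool-name` prefixing matches the convention noted in the Troubleshooting section.

```javascript
// Illustrative sketch only -- the real logic lives in src/aggregator/AgentAggregator.js.
// Tools are exposed as "<agent-name>__<tool-name>" so each call can be routed
// back to the downstream MCP server that owns it.

class ToolRouter {
  constructor() {
    this.connections = new Map(); // agentName -> connection (hypothetical interface)
  }

  // Collect tools from every connected agent and prefix them with the agent name.
  async listAllTools() {
    const tools = [];
    for (const [agentName, connection] of this.connections) {
      const agentTools = await connection.listTools(); // hypothetical helper
      for (const tool of agentTools) {
        tools.push({ ...tool, name: `${agentName}__${tool.name}` });
      }
    }
    return tools;
  }

  // Split the prefixed name and forward the call to the owning agent.
  async callTool(prefixedName, args) {
    const [agentName, toolName] = prefixedName.split('__');
    const connection = this.connections.get(agentName);
    if (!connection) {
      throw new Error(`Unknown agent: ${agentName}`);
    }
    return connection.callTool(toolName, args); // hypothetical helper
  }
}

export { ToolRouter };
```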
## 🤖 AI Model Integration
Each MCP server can have an associated AI model that runs via OpenRouter. The default model is `qwen/qwen3-coder:free`.
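OpenRouter exposes an OpenAI-compatible chat completions endpoint at the `baseUrl` shown in the configuration (`https://openrouter.ai/api/v1`). A minimal sketch of such a call, for illustration only (the project's real client lives in `src/aggregator/OpenRouterClient.js` and may differ):

```javascript
// Illustrative sketch of an OpenRouter chat completion request (Node.js 18+ global fetch).
async function chatCompletion(prompt, model = 'qwen/qwen3-coder:free') {
  const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.OPENROUTER_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  if (!response.ok) {
    throw new Error(`OpenRouter API error: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}
```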
### Custom Methods
The aggregator provides custom MCP methods for AI interactions:
- `custom/agents/list` - List all available agents and their capabilities
- `custom/model/generate` - Generate text using an agent's model
- `custom/model/chat` - Send chat completion requests
- `custom/models/info` - Get information about all models
- `custom/status` - Get detailed status of all connections
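Like the standard MCP methods, these are invoked as JSON-RPC 2.0 requests over the server's transport. For example (the empty `params` object here is an assumption; the exact parameter shapes are defined by the aggregator):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "custom/agents/list",
  "params": {}
}
```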
## 🐛 Debugging
If you encounter issues, you can inspect the MCP server:
```bash
# Debug with the MCP Inspector
npx @modelcontextprotocol/inspector node src/index.js
```
## 🛠️ Development
For developers who want to extend or contribute:
### Adding New MCP Servers
1. Add the server configuration to `config/agents.json` (see the example entry below)
2. Install the MCP server package
3. Test the connection
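For example, an entry for the memory server mentioned above might look like the following (an untested sketch; omit the `model` block to fall back to the `defaults` section, or add one as shown in the filesystem example):

```json
{
  "name": "memory",
  "type": "mcp",
  "enabled": true,
  "description": "Memory management server",
  "connection": {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-memory"],
    "env": {}
  }
}
```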
### Contributing
- Fork the repository
- Create a feature branch
- Test your changes
- Submit a pull request
## 📋 Configuration Options
### Agent Configuration
```json
{
  "name": "unique-agent-name",
  "type": "mcp",
  "enabled": true,
  "description": "Agent description",
  "connection": {
    "command": "command-to-run",
    "args": ["--arg1", "--arg2"],
    "env": {
      "ENV_VAR": "value"
    }
  },
  "model": {
    "provider": "openrouter",
    "name": "qwen/qwen3-coder:free",
    "apiKey": "${OPENROUTER_API_KEY}"
  }
}
```
### Aggregator Configuration
```jsonc
{
  "aggregator": {
    "timeout": 30000,            // Connection timeout in ms
    "retryAttempts": 3,          // Number of retry attempts
    "retryDelay": 1000,          // Delay between retries in ms
    "concurrentConnections": 2   // Max concurrent connections
  },
  "defaults": {
    "model": {
      "provider": "openrouter",
      "name": "qwen/qwen3-coder:free",
      "apiKey": "${OPENROUTER_API_KEY}",
      "baseUrl": "https://openrouter.ai/api/v1"
    }
  }
}
```
### Available Models
The system uses the OpenRouter API, which supports many models:

- `qwen/qwen3-coder:free` (default) - Free Qwen 3 Coder model
- `openai/gpt-4o-mini` - OpenAI GPT-4o Mini
- `anthropic/claude-3.5-sonnet` - Claude 3.5 Sonnet
- `meta-llama/llama-3.1-8b-instruct:free` - Free Llama model
- And many more - see OpenRouter Models
## 🔍 Troubleshooting
### Common Issues
1. **"Could not attach to MCP server"**
- Check that the MCP server package is installed
- Verify the command and arguments in configuration
- Ensure the server supports the MCP protocol
2. **"Connection timeout"**
- Increase timeout in aggregator configuration
- Check that the MCP server starts properly
- Verify network connectivity
3. **"Tool not found"**
- Ensure the downstream MCP server is connected
- Check tool name prefixing (format: `agent-name__tool-name`)
- Verify the tool exists in the downstream server
4. **"OpenRouter API error"**
- Verify your OPENROUTER_API_KEY is set correctly
- Check that you have credits/access to the specified model
- Ensure the model name is correct (e.g., `qwen/qwen3-coder:free`)
5. **"No AI model configured"**
- Add a `model` section to your agent configuration
- Ensure the model configuration includes provider, name, and apiKey
- Check that environment variables are properly expanded
### Debug Mode
Enable debug logging by setting environment variables:
```bash
DEBUG=1 npm start
```
## 🤝 Contributing
- Follow the established code style
- Add tests for new functionality
- Update documentation
- Test with real MCP servers
## 🔗 Links
- npm Package - Install from npm registry
- GitHub Repository - Source code and issues
- OpenRouter API - Get your API key for AI models
## 📄 License
MIT License