multi-ai-advisor-mcp
A Model Context Protocol (MCP) server that queries multiple Ollama models and combines their responses, providing diverse AI perspectives on a single question.
The Multi-Model Advisor (锵锵四人行) queries multiple Ollama models simultaneously and synthesizes their responses, following a "council of advisors" approach in which each model contributes its own perspective on a given question. The server integrates with Claude for Desktop, so answers returned to the user can draw on several viewpoints at once, and assigning a different role or persona to each model encourages a well-rounded analysis of any query. System prompts and model assignments are configurable via environment variables. The server requires Node.js and a running Ollama instance, and can be installed automatically via Smithery.
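As a sketch of the Claude for Desktop integration, the server would be registered in `claude_desktop_config.json` along these lines. The path, model list, and environment variable names below are illustrative assumptions, not the server's documented configuration:

```json
{
  "mcpServers": {
    "multi-model-advisor": {
      "command": "node",
      "args": ["/path/to/multi-ai-advisor-mcp/build/index.js"],
      "env": {
        "OLLAMA_API_URL": "http://localhost:11434",
        "OLLAMA_MODELS": "llama3,gemma,mistral"
      }
    }
  }
}
```

Consult the project's own README for the exact keys it reads from the environment.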
Features
- Query multiple Ollama models with a single question
- Assign different roles/personas to each model
- View all available Ollama models on your system
- Customize system prompts for each model
- Integrate seamlessly with Claude for Desktop
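To make the role-assignment idea concrete, here is a minimal TypeScript sketch of how a per-model system prompt might be composed. This is not the server's actual code; the `ModelRole` shape, the prompt wording, and the model names are illustrative:

```typescript
// Illustrative only: the real server's types and prompt text may differ.
interface ModelRole {
  model: string; // Ollama model tag, e.g. "llama3"
  persona: string; // the advisory role this model plays
}

// Build the system prompt sent to each model so that every advisor
// answers the same question from its own assigned perspective.
function buildSystemPrompt(role: ModelRole): string {
  return `You are ${role.persona}. Answer the user's question from that perspective.`;
}

const council: ModelRole[] = [
  { model: "llama3", persona: "a pragmatic engineer" },
  { model: "gemma", persona: "a cautious risk analyst" },
];

const prompts = council.map(buildSystemPrompt);
```

Each entry in `council` pairs one installed Ollama model with one persona, so the same question yields deliberately different answers.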
Tools
list-available-models
Show all available Ollama models on the system
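A tool like this would typically read Ollama's `GET /api/tags` endpoint, which returns a `models` array of installed models. The endpoint and response shape are Ollama's documented API; the formatting helper itself is a hypothetical sketch:

```typescript
// The relevant part of Ollama's GET /api/tags response.
interface OllamaTagsResponse {
  models: { name: string }[];
}

// Turn the raw response into a readable list for the tool's output.
function formatModelList(resp: OllamaTagsResponse): string {
  if (resp.models.length === 0) return "No Ollama models installed.";
  return resp.models.map((m) => `- ${m.name}`).join("\n");
}

// Example with a canned response instead of a live Ollama server:
const sample: OllamaTagsResponse = {
  models: [{ name: "llama3:latest" }, { name: "gemma:latest" }],
};
const listing = formatModelList(sample);
```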
query-models
Send a question to multiple models and collect their responses
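The fan-out behind this tool can be sketched as a parallel query over the configured models, with each answer labeled by its source so a synthesis step (or Claude itself) can weigh the perspectives. `askModel` below is a stand-in for the real Ollama call, not the server's actual function:

```typescript
// Stand-in for a real call to Ollama's /api/generate; the actual server
// would POST { model, prompt, system } and read back the response text.
type AskModel = (model: string, question: string) => Promise<string>;

// Query every model in parallel and combine the answers into one
// labeled transcript.
async function queryModels(
  models: string[],
  question: string,
  askModel: AskModel
): Promise<string> {
  const answers = await Promise.all(
    models.map(async (m) => `### ${m}\n${await askModel(m, question)}`)
  );
  return answers.join("\n\n");
}

// Demo with a mock backend instead of a live Ollama server:
const mockAsk: AskModel = async (model, q) => `${model} considered: ${q}`;
```

With a live backend, `queryModels(["llama3", "gemma"], question, askOllama)` would return one document containing each advisor's labeled answer.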