Decision Matrix MCP
A Model Context Protocol (MCP) server for structured decision analysis using parallel criterion evaluation. Make better decisions through systematic comparison of options across weighted criteria.
Features
- Thread Orchestration: Each criterion evaluates options in parallel threads
- Weighted Scoring: 1-10 scale with importance weights (0.1-10.0); a scoring sketch follows this list
- Graceful Abstention: Criteria can return `[NO_RESPONSE]` when not applicable
- Multi-Backend Support: Bedrock, LiteLLM, and Ollama
- Session Management: UUID-based sessions with TTL cleanup
- Professional Presentation: Clear rankings and recommendations
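To make the scoring model concrete, here is a minimal sketch of how weighted scores with abstention could be aggregated. The function and data shapes are illustrative assumptions, not the server's actual implementation.

```python
# Illustrative only: aggregating 1-10 scores with importance weights,
# skipping criteria that abstained with [NO_RESPONSE].
NO_RESPONSE = "[NO_RESPONSE]"

def weighted_totals(scores, weights):
    """scores[option][criterion] is a 1-10 score or NO_RESPONSE."""
    totals = {}
    for option, by_criterion in scores.items():
        num = den = 0.0
        for criterion, score in by_criterion.items():
            if score == NO_RESPONSE:  # abstaining criteria are simply skipped
                continue
            weight = weights.get(criterion, 1.0)
            num += weight * float(score)
            den += weight
        totals[option] = num / den if den else 0.0
    return totals

# Cost weighted 3x, ease of use 2x, features 1x (as in the example below)
weights = {"cost": 3.0, "ease_of_use": 2.0, "features": 1.0}
scores = {
    "AWS": {"cost": 5, "ease_of_use": 6, "features": 9},
    "GCP": {"cost": 6, "ease_of_use": 7, "features": 8},
    "Azure": {"cost": 5, "ease_of_use": 6, "features": NO_RESPONSE},
}
print(weighted_totals(scores, weights))  # {'AWS': 6.0, 'GCP': 6.67, 'Azure': 5.4}
```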
Quick Start
- Install:

```bash
git clone https://github.com/democratize-technology/decision-matrix-mcp.git
cd decision-matrix-mcp
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
```
- Configure Claude Desktop:

```json
{
  "mcpServers": {
    "decision-matrix": {
      "command": "/path/to/decision-matrix-mcp/run.sh",
      "args": []
    }
  }
}
```
- Use in Claude:
"Help me decide between AWS, GCP, and Azure for our startup"
Available Tools
- `start_decision_analysis` - Initialize decision matrix
- `add_criterion` - Add weighted evaluation criteria
- `evaluate_options` - Run parallel evaluation
- `get_decision_matrix` - View results and rankings
- `add_option` - Add new alternatives
- `list_sessions` - Manage sessions
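For illustration, these tools can be called from any MCP client, not just Claude Desktop. The sketch below uses the official Python MCP SDK over stdio; the tool argument names (`topic`, `options`) are assumptions for illustration, so check the schemas reported by `list_tools` for the real parameters.

```python
# Hedged sketch: calling the server's tools via the Python MCP SDK over stdio.
# Tool argument names ("topic", "options") are assumptions, not confirmed schemas.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    server = StdioServerParameters(command="/path/to/decision-matrix-mcp/run.sh")
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # inspect the real tool schemas
            result = await session.call_tool(
                "start_decision_analysis",
                {"topic": "Cloud provider", "options": ["AWS", "GCP", "Azure"]},
            )
            print(result)

asyncio.run(main())
```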
Example Usage
User: "Help me choose a cloud provider for our startup"
Claude: *creates decision matrix with AWS, GCP, Azure, DigitalOcean*
User: "Cost is most important (weight 3), then ease of use (2), then features"
Claude: *adds three criteria with appropriate weights*
User: "Evaluate the options"
Claude: *runs parallel evaluation across all criteria*
User: "Show me the results"
Claude: *displays ranked matrix with scores and justifications*
Configuration
Set environment variables for LLM backends:
- `AWS_PROFILE` - For Bedrock access
- `LITELLM_API_KEY` - For OpenAI/Anthropic via LiteLLM
- `OLLAMA_HOST` - For local Ollama models
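A quick, hedged way to see which of these backends your current shell environment would enable (the server's actual selection logic may differ):

```python
# Illustrative check of which backend env vars are set in the current environment;
# the server's own backend-selection logic may differ.
import os

backends = {
    "bedrock": bool(os.environ.get("AWS_PROFILE")),
    "litellm": bool(os.environ.get("LITELLM_API_KEY")),
    "ollama": bool(os.environ.get("OLLAMA_HOST")),
}
print({name: "configured" if ok else "not configured" for name, ok in backends.items()})
```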
Troubleshooting
Common Issues
MCP Server Not Found in Claude Desktop
Symptom: Claude Desktop doesn't recognize the decision-matrix server
Solution:
- Verify the path in your Claude Desktop config is absolute, not relative
- Ensure `run.sh` has execute permissions: `chmod +x run.sh`
- Check the server starts manually: `python -m decision_matrix_mcp`
- Review Claude Desktop logs for error messages
LLM Backend Connection Errors
AWS Bedrock Issues:
```bash
# Check AWS credentials
aws sts get-caller-identity

# Ensure region has Bedrock access
export AWS_REGION=us-east-1  # or another supported region
```
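If the CLI checks pass but the server still reports Bedrock errors, a direct check from Python can help isolate the problem (assumes `boto3` is installed and your region supports Bedrock):

```python
# Quick Bedrock access check with boto3 (assumes credentials and region are configured).
import boto3

client = boto3.client("bedrock", region_name="us-east-1")
models = client.list_foundation_models()
print(f"{len(models['modelSummaries'])} foundation models visible")
```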
LiteLLM/OpenAI Issues:
```bash
# Test API key
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```
Ollama Connection Issues:
```bash
# Check Ollama is running
curl http://localhost:11434/api/tags

# If using Docker, use host networking
docker run --network host ...
```
Session Expired Errors
Symptom: "Session not found or expired" after some time
Solution:
- Sessions expire after 24 hours by default
- Use `list_sessions` to see active sessions
- Start a new analysis if needed
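For context, "UUID-based sessions with TTL cleanup" means expired sessions are pruned rather than kept forever, so an old session ID simply disappears. A minimal sketch of that pattern (illustrative; names and structure are assumptions):

```python
# Illustrative TTL-based session cleanup; names and defaults are assumptions.
import time
import uuid

TTL_SECONDS = 24 * 60 * 60  # sessions expire after 24 hours by default

sessions = {}  # session_id -> {"created": timestamp, ...}

def create_session():
    session_id = str(uuid.uuid4())
    sessions[session_id] = {"created": time.time()}
    return session_id

def cleanup_expired():
    now = time.time()
    expired = [sid for sid, s in sessions.items() if now - s["created"] > TTL_SECONDS]
    for session_id in expired:
        del sessions[session_id]
```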
Evaluation Timeout
Symptom: Evaluations take too long or timeout
Solutions:
- Reduce number of options or criteria
- Use a faster LLM model
- Check network connectivity
- Consider using Ollama for local execution
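Because every option/criterion pair is evaluated concurrently, total wall-clock time is roughly bounded by the slowest single LLM call plus any backend rate limits. A rough sketch of that pattern with a per-call timeout (illustrative only; the function names are assumptions, not the server's actual code):

```python
# Illustrative concurrent evaluation with a per-call timeout.
import asyncio

async def evaluate(option, criterion):
    ...  # call the configured LLM backend here

async def evaluate_all(options, criteria, timeout_seconds=60):
    tasks = [asyncio.wait_for(evaluate(o, c), timeout_seconds)
             for o in options for c in criteria]
    # return_exceptions=True keeps one slow or failed criterion from sinking the rest
    return await asyncio.gather(*tasks, return_exceptions=True)
```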
Import Errors
Symptom: `ModuleNotFoundError` when starting the server
Solution:
```bash
# Ensure you're in the virtual environment
source .venv/bin/activate  # or .venv\Scripts\activate on Windows

# Reinstall dependencies
pip install -e .
```
Debug Mode
Enable detailed logging by setting the environment variable:
```bash
export LOG_LEVEL=DEBUG
```
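If you run the module directly while debugging, the same variable can be mirrored into Python's standard logging configuration (a sketch assuming the server uses the stdlib `logging` module):

```python
# Sketch: feed LOG_LEVEL into stdlib logging (assumes the server logs via `logging`).
import logging
import os

logging.basicConfig(level=os.environ.get("LOG_LEVEL", "INFO").upper())
```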
Getting Help
- Search existing issues
- Ask in Discussions
- Report bugs by opening a new issue
Documentation
The repository documentation covers:
- Step-by-step setup and first analysis
- Complete tool documentation
- System design and components
- How to contribute to the project
License
MIT License - See LICENSE file for details