cort-mcp
CoRT MCP Server is a Chain-of-Recursive-Thoughts server that enhances AI thinking by making it argue with itself repeatedly.
The CoRT MCP Server enhances an AI's problem solving by implementing the Chain-of-Recursive-Thoughts (CoRT) methodology: for each request, the model generates multiple alternative responses, evaluates them against one another, and keeps the best one over several rounds. The server supports multi-LLM inference, drawing alternatives from different language models to make them more diverse, and its evaluation step uses a richer prompt that encourages the model to weigh multiple perspectives and contexts. It requires an OPENROUTER_API_KEY to run and offers configurable logging. The server builds on the original CoRT work by PhialsBasement, adding multi-LLM inference and the improved evaluation prompt.
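To make the loop concrete, here is a minimal sketch of what a CoRT-style round could look like against OpenRouter's OpenAI-compatible chat completions endpoint. The `call_openrouter` helper, the model list, the prompts, and the round count are illustrative assumptions, not the server's actual implementation.

```python
import os
import requests

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = os.environ["OPENROUTER_API_KEY"]  # the server also requires this variable

def call_openrouter(model: str, prompt: str) -> str:
    """Send one chat completion request to OpenRouter (OpenAI-compatible API)."""
    resp = requests.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def cort_answer(question: str, models: list[str], rounds: int = 3) -> str:
    """Illustrative CoRT loop: generate alternatives with several models,
    ask an evaluator to pick the best candidate, and repeat for a few rounds."""
    best = call_openrouter(models[0], question)  # initial answer
    for _ in range(rounds):
        # Multi-LLM inference: each model proposes an alternative to the current best.
        alternatives = [
            call_openrouter(m, f"Question: {question}\nCurrent answer: {best}\n"
                               "Propose a better alternative answer.")
            for m in models
        ]
        candidates = [best] + alternatives
        # Evaluation step: judge the candidates from several perspectives.
        numbered = "\n".join(f"{i}: {c}" for i, c in enumerate(candidates))
        verdict = call_openrouter(
            models[0],
            f"Question: {question}\nCandidates:\n{numbered}\n"
            "Considering correctness, completeness, and different perspectives, "
            "reply with only the number of the best candidate.",
        )
        try:
            best = candidates[int(verdict.strip())]
        except (ValueError, IndexError):
            pass  # keep the current best if the verdict cannot be parsed
    return best
```

Passing several OpenRouter model identifiers to `cort_answer` is, conceptually, what the multi-LLM inference feature listed below amounts to.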
Features
- CoRT method available via MCP Server for enhanced AI thinking.
- Multi-LLM inference for generating diverse alternatives.
- Enhanced evaluation prompts for better response selection.
- Configurable logging options for operational transparency.
- Fallback processing for model and provider resolution (see the sketch after this list).
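The fallback behavior is not described in detail here; the following is a minimal sketch of one plausible provider/model fallback scheme, assuming OpenRouter-style "provider/model" identifiers and a generic call function. The candidate list, function names, and error handling are assumptions.

```python
from typing import Callable

import requests

def resolve_with_fallback(prompt: str,
                          candidates: list[tuple[str, str]],
                          call: Callable[[str, str], str]) -> str:
    """Try (provider, model) pairs in order; return the first successful response.
    `call(model_id, prompt)` is any LLM call, e.g. the call_openrouter helper above."""
    last_error: Exception | None = None
    for provider, model in candidates:
        try:
            # OpenRouter-style identifiers are usually "provider/model".
            return call(f"{provider}/{model}", prompt)
        except (requests.RequestException, KeyError) as exc:
            last_error = exc  # provider unavailable or response malformed; try the next pair
    raise RuntimeError("all fallback candidates failed") from last_error
```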
Tools
{toolname}.simple
Returns only the final selected alternative, without intermediate details.
{toolname}.details
Includes the LLM response history in the output.
{toolname}.mixed.llm
Uses multi-LLM inference to generate alternatives.
{toolname}.neweval
Uses the enhanced evaluation prompt.
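As a usage illustration, the sketch below shows how an MCP client could launch the server over stdio and invoke one of these tools using the official `mcp` Python SDK. The launch command, the concrete tool name, and the `prompt` argument are placeholders, not confirmed names; list the tools at runtime to discover the actual identifiers.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# How the server is launched is an assumption; adjust command, args, and env to your setup.
server = StdioServerParameters(
    command="cort-mcp",
    args=[],
    env={"OPENROUTER_API_KEY": "sk-or-..."},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the real tool names exposed by the server.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Tool and argument names below are illustrative placeholders.
            result = await session.call_tool(
                "cort.simple", arguments={"prompt": "Design a caching strategy."}
            )
            print(result)

asyncio.run(main())
```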