ephor-mcp
A Model Context Protocol (MCP) server that allows multiple AI agents to share and read each other's responses to the same prompt.
The LLM Responses MCP Server lets multiple AI agents collaborate on the same prompt by sharing their answers with one another. It exposes two primary tool calls: 'submit-response', which records an LLM's response to a prompt, and 'get-responses', which retrieves all responses other LLMs have submitted for that prompt. This lets an agent reflect on and learn from its peers' answers, producing a more comprehensive, better-informed response of its own.

The server is built in TypeScript and can be deployed with Docker, making it suitable for a variety of server environments, including EC2 instances. It also supports the MCP Inspector for testing and debugging, which provides a user-friendly interface for exploring the available tools and resources.
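To make the design concrete, here is a minimal sketch of how these two tools could be registered with the MCP TypeScript SDK. The argument names (prompt, response, llmId) and the in-memory store are illustrative assumptions, not this server's actual schema:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Hypothetical in-memory store mapping each prompt to the responses
// submitted for it; the real server may persist responses differently.
const responses = new Map<string, { llmId: string; response: string }[]>();

const server = new McpServer({ name: "llm-responses", version: "1.0.0" });

// 'submit-response': record one LLM's answer to a prompt.
server.tool(
  "submit-response",
  { prompt: z.string(), response: z.string(), llmId: z.string() },
  async ({ prompt, response, llmId }) => {
    const list = responses.get(prompt) ?? [];
    list.push({ llmId, response });
    responses.set(prompt, list);
    return { content: [{ type: "text", text: "Response recorded." }] };
  }
);

// 'get-responses': return every response submitted for a prompt.
server.tool(
  "get-responses",
  { prompt: z.string() },
  async ({ prompt }) => ({
    content: [
      { type: "text", text: JSON.stringify(responses.get(prompt) ?? [], null, 2) },
    ],
  })
);
```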
Features
- Lets multiple AI agents share and read one another's responses to the same prompt.
- Implements 'submit-response' and 'get-responses' tool calls.
- Supports MCP Inspector for testing and debugging.
- Built with TypeScript and deployable using Docker.
- Accessible via Server-Sent Events and HTTP endpoints.
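Since the server is reachable over Server-Sent Events, a client could connect with the MCP TypeScript SDK roughly as follows; the endpoint URL and /sse path are assumptions and depend on where the server is deployed:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Hypothetical endpoint: substitute the host/port of your deployment.
const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));

const client = new Client(
  { name: "example-agent", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// List the tools the server exposes ('submit-response' and 'get-responses').
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```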
Tools
submit-response
Allows an LLM to submit its response to a prompt.
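Continuing from the client in the connection sketch above, a submission might look like this (the argument names are assumed, as the schema is not spelled out here):

```typescript
// Hypothetical argument shape for 'submit-response'.
await client.callTool({
  name: "submit-response",
  arguments: {
    llmId: "agent-a",
    prompt: "What are the trade-offs of microservices?",
    response: "Microservices improve scalability but add operational overhead...",
  },
});
```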
get-responses
Retrieves all responses from other LLMs for a specific prompt.
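And, with the same assumed argument names, retrieving the peer responses for that prompt:

```typescript
// Fetch every response other agents have submitted for this prompt.
const result = await client.callTool({
  name: "get-responses",
  arguments: { prompt: "What are the trade-offs of microservices?" },
});
console.log(result.content);
```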