ephor-mcp-collaboration

A Model Context Protocol (MCP) server that enables collaborative debates between multiple AI agents, allowing them to discuss and reach consensus on user prompts.

The LLM Responses MCP Server facilitates multi-turn conversations between large language models (LLMs) by letting them engage in collaborative debates. LLMs register as participants in a debate session, share their responses in real time, and deliberate on a user's question until they reach consensus. The server exposes four tools: register-participant, submit-response, get-responses, and get-session-status, which let an LLM join a session, submit follow-up responses, retrieve the other participants' responses, and check the session's status.
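
To make the tool surface concrete, here is a minimal server-side sketch using the TypeScript MCP SDK (@modelcontextprotocol/sdk). The tool names match those listed below, but the parameter names (sessionId, name, initialResponse) and the in-memory session store are assumptions made for this sketch, not a description of the server's actual implementation.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Assumed in-memory session store; the real server may organize its state differently.
const sessions = new Map<
  string,
  { participants: string[]; responses: { from: string; text: string }[] }
>();

const server = new McpServer({ name: "llm-responses", version: "1.0.0" });

// register-participant: join a debate session with an initial response.
server.tool(
  "register-participant",
  { sessionId: z.string(), name: z.string(), initialResponse: z.string() },
  async ({ sessionId, name, initialResponse }) => {
    const session = sessions.get(sessionId) ?? { participants: [], responses: [] };
    session.participants.push(name);
    session.responses.push({ from: name, text: initialResponse });
    sessions.set(sessionId, session);
    return { content: [{ type: "text", text: `Registered ${name} in session ${sessionId}` }] };
  }
);

// get-responses: retrieve everything submitted to the session so far.
server.tool("get-responses", { sessionId: z.string() }, async ({ sessionId }) => ({
  content: [{ type: "text", text: JSON.stringify(sessions.get(sessionId)?.responses ?? []) }],
}));

// submit-response and get-session-status would be registered the same way.

await server.connect(new StdioServerTransport());
```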

Features

  • Session-based collaboration: LLMs can register as participants in a debate session.
  • Deliberative consensus: LLMs can engage in extended discussions to reach agreement.
  • Real-time response sharing: All participants can view and respond to each other's contributions (one possible session record supporting this is sketched after this list).
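
One way to picture how these features fit together is a small per-session record plus a registration window, as sketched below. The field names and the fixed waiting period are illustrative assumptions, not the server's actual data model.

```typescript
// Illustrative session state; the real server may structure this differently.
interface DebateSession {
  createdAt: number;                                           // when the session was opened
  participants: string[];                                      // registered LLM participants
  responses: { from: string; text: string; round: number }[];  // visible to all participants in real time
}

const REGISTRATION_WAIT_MS = 30_000; // hypothetical registration waiting period

// The kind of check get-session-status would perform before the debate phase begins.
function sessionStatus(session: DebateSession): "registration-open" | "debate-in-progress" {
  const elapsed = Date.now() - session.createdAt;
  return elapsed < REGISTRATION_WAIT_MS ? "registration-open" : "debate-in-progress";
}
```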

Tools

  1. register-participant

    Allows an LLM to join a collaboration session with its initial response.

  2. submit-response

    Allows an LLM to submit follow-up responses during the debate.

  3. get-responses

    Allows an LLM to retrieve all responses from other LLMs in the session.

  4. get-session-status

    Allows an LLM to check if the registration waiting period has completed (a typical call sequence using all four tools is sketched after this list).
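
Putting the four tools together, an agent-side client could drive one debate round roughly as follows. This sketch uses the TypeScript MCP client SDK (@modelcontextprotocol/sdk); the launch command, session ID, argument names, and status strings are assumptions, so the server's actual tool schemas should be treated as authoritative.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "debate-agent", version: "1.0.0" }, { capabilities: {} });
await client.connect(
  new StdioClientTransport({ command: "node", args: ["dist/index.js"] }) // assumed launch command
);

// 1. Join the session with an initial answer.
await client.callTool({
  name: "register-participant",
  arguments: { sessionId: "demo", name: "agent-a", initialResponse: "My first take on the question..." },
});

// 2. Poll until the registration waiting period has completed (status text is assumed).
let registrationOpen = true;
while (registrationOpen) {
  const status = await client.callTool({ name: "get-session-status", arguments: { sessionId: "demo" } });
  registrationOpen = JSON.stringify(status.content).includes("registration-open");
  if (registrationOpen) await new Promise((resolve) => setTimeout(resolve, 2_000));
}

// 3. Read the other participants' responses, then submit a follow-up.
const others = await client.callTool({ name: "get-responses", arguments: { sessionId: "demo" } });
console.log(others.content);

await client.callTool({
  name: "submit-response",
  arguments: { sessionId: "demo", name: "agent-a", response: "After reading the others, I now think..." },
});
```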