watercooler-cloud

File-based collaboration protocol for agentic coding projects, enabling seamless integration with AI agents through the Model Context Protocol (MCP).

Python MCP

Installation · Memory Backends


Example workflow:

Your Task → Claude plans → Codex implements → Claude reviews → State persists in Git

Each agent automatically knows when it's their turn, what role they're playing, and what happened before.


Quick Start

1. Authentication Setup

One-time GitHub authorization enables seamless access for all your AI agents:

  1. Visit the Watercooler Website
  2. Click "Sign in with GitHub"
  3. Grant access to your organizations
  4. Download credentials file from Settings → GitHub Connection
  5. Place it at ~/.watercooler/credentials.json

That's it! All MCP servers will automatically authenticate using this file.
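
If an agent later fails to authenticate, the first thing to check is that the file landed where the servers look for it. A quick sanity check (a sketch; the path comes from step 5 above):

# Confirm the credentials file exists at the expected location and parses as JSON.
import json
from pathlib import Path

creds_path = Path.home() / ".watercooler" / "credentials.json"
if creds_path.exists():
    json.loads(creds_path.read_text())  # raises if the download was truncated or corrupted
    print(f"Credentials found at {creds_path}")
else:
    print(f"No credentials file at {creds_path} - repeat step 4 above")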

2. Configure Your AI Agents

Minimal configuration - once authenticated, setup is just command + args!

Claude Code

Add this entry under "mcpServers" in ~/.claude.json:

    "watercooler-cloud": {
      "type": "stdio",
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mostlyharmless-ai/watercooler-cloud@stable",
        "watercooler-mcp"
      ]
    }

Codex

Update ~/.codex/config.toml:

[mcp_servers.watercooler_cloud]
command = "uvx"
args = ["--from", "git+https://github.com/mostlyharmless-ai/watercooler-cloud@stable", "watercooler-mcp"]

Cursor

Edit ~/.cursor/mcp.json:

{
  "mcpServers": {
    "watercooler-cloud": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mostlyharmless-ai/watercooler-cloud@stable",
        "watercooler-mcp"
      ]
    }
  }
}

Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "watercooler-cloud": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mostlyharmless-ai/watercooler-cloud@stable",
        "watercooler-mcp"
      ]
    }
  }
}

Other MCP Clients

See the installation docs for:

  • Helper scripts (macOS/Linux/Windows)
  • fastmcp setup
  • Optional environment variables

Create Your First Thread

Most collaborators never touch the raw CLI anymore—we stay inside Codex, Claude, Cursor, etc., and let them call the MCP tools for us. A typical spin-up looks like this:

  1. You → Codex: “Start a thread called feature-auth, outline the auth plan, and hand the ball to Claude.”
  2. Codex: Calls watercooler_say (with your agent_func) which creates the thread, writes the entry, commits, and pushes via run_with_sync.
  3. Claude: Sees the ball, continues refining the plan in the same thread, again using watercooler_say so git stays in sync.
  4. Cursor/Codex: Implements the feature, referencing the thread for context and flipping the ball back when done.

That’s the workflow we recommend because the MCP layer enforces formatting, branch pairing, git commits, and identity footers automatically. If you do need to work manually (for example, repairing a thread offline), the legacy CLI is still available:

watercooler init-thread feature-auth --ball Claude
watercooler say feature-auth \
  --agent Claude \
  --role planner \
  --title "Authentication Design" \
  --body "Proposing OAuth2 with JWT tokens"

See the CLI reference for every flag if you go that route.
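
If you want to see exactly what an agent sends, you can also drive the same tool from a short script with the MCP Python SDK. This is a sketch: the server launch mirrors the configs above, but the watercooler_say argument names are an assumption inferred from the CLI flags, so check the tool schema your client reports (list_tools) before relying on them.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the same server command the agent configs above use.
server = StdioServerParameters(
    command="uvx",
    args=[
        "--from",
        "git+https://github.com/mostlyharmless-ai/watercooler-cloud@stable",
        "watercooler-mcp",
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are inferred from the CLI flags shown above and may differ.
            result = await session.call_tool(
                "watercooler_say",
                arguments={
                    "thread": "feature-auth",
                    "agent": "Claude",
                    "role": "planner",
                    "title": "Authentication Design",
                    "body": "Proposing OAuth2 with JWT tokens",
                },
            )
            print(result)

asyncio.run(main())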


Example: Multi-Agent Collaboration

  1. You ask Codex: “Plan the payments feature in the feature-payment thread.” Codex hits watercooler_say, adds the plan entry, and the ball flips to Claude.
  2. Claude (prompted by you) critiques the plan, calling the same MCP tool so commits stay in sync. Ball now sits with Cursor.
  3. Cursor/Codex implements the feature, updates tests, and posts a completion note via watercooler_say, flipping the ball to Claude for review.
  4. Claude runs watercooler_ack to approve, then watercooler_set_status to mark the thread CLOSED after merge.

No manual git work, no hand-written metadata—each MCP call bundles the entry, ball movement, commit footers, and push.


Memory Backends

Watercooler supports pluggable memory backends for advanced knowledge retrieval and semantic search. The backend architecture uses Python Protocols for clean decoupling - swap implementations without changing application code.

Installation

# Install with all memory backends
pip install 'watercooler-cloud[memory]'

# Install specific backends
pip install 'watercooler-cloud[leanrag]'   # LeanRAG only
pip install 'watercooler-cloud[graphiti]'  # Graphiti only

Quick Usage Example

from pathlib import Path
from watercooler_memory.backends import get_backend, LeanRAGConfig

# Initialize backend
config = LeanRAGConfig(work_dir=Path("./memory"))
backend = get_backend("leanrag", config)

# Prepare, index, and query; corpus, chunks, and queries come from your own
# ingestion pipeline (see docs/examples/BACKEND_USAGE.md for full examples)
backend.prepare(corpus)
backend.index(chunks)
results = backend.query(queries)
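
The decoupling comes from every backend exposing the same structural interface. A rough sketch of its shape, with method names taken from the example above (the exact signatures are an assumption; the authoritative version is the backend contract specification):

from typing import Any, Protocol

class MemoryBackend(Protocol):
    """Structural contract a memory backend satisfies (sketch, not the real definition)."""

    def prepare(self, corpus: Any) -> None:
        """Ingest raw documents and build intermediate artifacts."""
        ...

    def index(self, chunks: Any) -> None:
        """Index prepared chunks for later retrieval."""
        ...

    def query(self, queries: Any) -> list[Any]:
        """Run retrieval queries and return results."""
        ...

Because application code is typed against this contract rather than a concrete class, get_backend("leanrag", ...) can be swapped for get_backend("graphiti", ...) without touching callers.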

Available Backends

LeanRAG - Hierarchical Graph RAG

Entity extraction with hierarchical semantic clustering. Ideal for large document corpora with redundancy reduction.

Features:

  • Hierarchical semantic clustering (~46% redundancy reduction)
  • Batch document processing
  • Optional vector search with Milvus

Setup:
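
A minimal sketch, using the leanrag install extra from the Installation section and the LeanRAGConfig shown in the Quick Usage example (only work_dir is shown there; other options are not covered here):

# Install: pip install 'watercooler-cloud[leanrag]'
from pathlib import Path
from watercooler_memory.backends import LeanRAGConfig, get_backend

config = LeanRAGConfig(work_dir=Path("./memory/leanrag"))  # work_dir as in the Quick Usage example
backend = get_backend("leanrag", config)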

Graphiti - Episodic Memory

Temporal entity tracking with hybrid search. Ideal for conversation tracking and time-aware retrieval.

Features:

  • Episodic ingestion with temporal reasoning
  • Automatic fact extraction and deduplication
  • Hybrid semantic + graph search

Setup:
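
Along the same lines, a sketch using the graphiti install extra; the concrete config class for this backend is not covered above, so it is left as a placeholder here:

# Install: pip install 'watercooler-cloud[graphiti]'
from watercooler_memory.backends import get_backend

config = ...  # build the Graphiti backend's config object here (class not covered above)
backend = get_backend("graphiti", config)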

Learn More

  • Practical code examples and patterns (docs/examples/BACKEND_USAGE.md)
  • Architecture, comparison, and API reference
  • Backend contract specification

Contributing

We welcome contributions! Please see:

  • Contribution guidelines and DCO requirements
  • Community standards (code of conduct)
  • Security policy

License

Apache 2.0 License - see the LICENSE file for details.