CAPE MCP Worker

Cloudflare Worker that exposes the full CAPE Sandbox API surface as Model Context Protocol (MCP) tools. It serves MCP metadata and executes tool calls over HTTPS without requiring any Node/Express runtime.

✨ Highlights

  • 32 MCP tools mapped directly to CAPE endpoints (task submission, lifecycle, files, reports, platform status).
  • Zod validation for every tool input and automatic JSON Schema metadata at /.well-known/mcp.json.
  • Worker-first HTTP implementation (Request/Response) with zero Node polyfills.
  • Binary artifact streaming with size guardrails and base64 packaging for MCP clients.
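
For orientation, the Worker-first design means the entry point follows the standard Cloudflare module-Worker shape sketched below. This is illustrative only: the real dispatch logic lives in src/mcp/router.ts, and the Env fields shown are simply the bindings from the configuration table further down.

// Illustrative sketch of the module-Worker entry point; actual routing lives in src/mcp/router.ts.
interface Env {
  CAPE_BASE_URL: string;    // CAPE API base URL (include /apiv2)
  CAPE_API_TOKEN?: string;  // optional auth token
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { pathname } = new URL(request.url);
    if (pathname === "/healthz") {
      return new Response("ok");
    }
    // /.well-known/mcp.json, /mcp/tools/list and /mcp/tools/call are dispatched
    // to the MCP metadata and tool-call handlers from here.
    return new Response("Not found", { status: 404 });
  },
};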

Requirements

  • Node.js 20+ for TypeScript tooling and Vitest.
  • Wrangler 3.0+ for local dev and deploys (npm install -g wrangler).
  • A reachable CAPE Sandbox instance plus optional remote-download API keys.

Configuration

The Worker reads all configuration from environment bindings (Wrangler vars/secrets). The same names can be placed in .dev.vars for local wrangler dev.

Variable           Required   Description
CAPE_BASE_URL      yes        CAPE API base URL (include /apiv2).
CAPE_API_TOKEN     optional   API token for CAPE deployments that require auth.
CAPE_VT_API_KEY    optional   Default VirusTotal/MalwareBazaar key used by cape.file.remote-create when the caller omits apiKey.
MAX_BINARY_BYTES   optional   Maximum size in bytes allowed when proxying binary artifacts (default 25 MB).
HTTP_TIMEOUT_MS    optional   CAPE HTTP timeout in milliseconds (default 60,000).

Tip: Copy .env.example to .dev.vars and adjust values for local testing. Secrets should be stored with wrangler secret put NAME in production.
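
For example, a local .dev.vars could look like the following; every value is a placeholder to replace with your own deployment details:

CAPE_BASE_URL="https://cape.example.internal/apiv2"
CAPE_API_TOKEN="replace-with-your-cape-token"
CAPE_VT_API_KEY="replace-with-your-virustotal-key"
MAX_BINARY_BYTES=26214400
HTTP_TIMEOUT_MS=60000

MAX_BINARY_BYTES above is 25 MB (25 × 1024 × 1024 bytes), matching the documented default.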


Setup & Local Development

npm install

Start a locally emulated Worker (watches and hot-reloads):

npm run dev

This runs wrangler dev, exposing the MCP HTTP surface on http://127.0.0.1:8787:

Method   Path                    Description
GET      /healthz                Worker health check.
GET      /.well-known/mcp.json   MCP metadata blob (tools + schemas).
POST     /mcp/tools/list         Returns a { tools } array describing every tool.
POST     /mcp/tools/call         Executes a tool via { toolName, arguments }.

Example invocation:

curl -X POST http://127.0.0.1:8787/mcp/tools/call \
   -H "Content-Type: application/json" \
   -d '{"toolName":"cape.tasks.status","arguments":{"taskId":1234}}'

Deploying to Cloudflare

npm run deploy    # wraps `wrangler publish`

wrangler.toml already points to src/worker.ts and sets a default compatibility_date. Provide production bindings/secrets via wrangler.toml, wrangler secret put, or the Cloudflare Dashboard before publishing.
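
A typical first deploy therefore looks roughly like this, using the secret names from the configuration table (set only the ones your CAPE instance actually needs):

wrangler secret put CAPE_API_TOKEN
wrangler secret put CAPE_VT_API_KEY
npm run deploy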


Testing

npm test

Vitest validates the tool registry (unique names + JSON Schema emission). Add new tests alongside tests/mcp.tools.test.ts as you extend the worker.
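
A new test can follow the same pattern. The sketch below assumes the registry is exported as an array named tools from src/mcp/tools.ts and that every tool keeps the cape.* prefix seen in this README; adjust both to the project's actual conventions.

// Hypothetical sketch: align the import with the real export in src/mcp/tools.ts.
import { describe, expect, it } from "vitest";
import { tools } from "../src/mcp/tools";

describe("tool registry", () => {
  it("keeps the cape.* naming convention", () => {
    for (const tool of tools) {
      expect(tool.name).toMatch(/^cape\./);
    }
  });
});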


Extending the Worker

  1. New CAPE operation: add a descriptor in src/mcp/tools.ts and a companion method in src/services/mcpService.ts (reusing CapeApi); a sketch follows this list.
  2. Expose configuration: extend ConfigSchema in src/config.ts, then document the binding in this README and wrangler.toml.
  3. Hardening: enhance src/mcp/router.ts to emit richer error codes or add rate limiting if needed.
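
As a sketch of step 1, a new descriptor could pair a Zod input schema with a thin handler that delegates to the service layer. The shape below (the name, description, inputSchema and handler fields, the cape.tasks.reboot tool name, and the rebootTask service method) is assumed for illustration; mirror whatever the existing entries in src/mcp/tools.ts actually look like.

// Hypothetical descriptor; copy the field names used by the existing tools instead.
import { z } from "zod";
import { mcpService } from "../services/mcpService"; // assumed export

const rebootTaskInput = z.object({
  taskId: z.number().int().positive().describe("CAPE task ID to re-run with reboot analysis"),
});

export const rebootTaskTool = {
  name: "cape.tasks.reboot",
  description: "Re-run an existing task as a reboot analysis",
  inputSchema: rebootTaskInput,
  handler: (args: z.infer<typeof rebootTaskInput>) => mcpService.rebootTask(args.taskId),
};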

Because the Worker is stateless, multiple deployments can run in parallel without coordination. Keep MAX_BINARY_BYTES conservative to avoid excessive memory usage when proxying large dumps.

Happy reversing! 🕵️‍♀️