OpenCode MCP Tool
📚 Documentation Available in docs/ folder - Examples, FAQ, Troubleshooting, Best Practices
This is a Model Context Protocol (MCP) server that allows AI assistants to interact with the OpenCode CLI. It enables AI assistants to leverage multiple AI models through a unified interface, with features like plan mode for structured thinking and extensive model selection.
- Ask questions through multiple AI models via Claude or other MCP clients
- Use plan mode for structured analysis and safer operations
TLDR:
+ Multiple AI Models via OpenCode
Goal: Use OpenCode's multi-model capabilities directly in Claude Code with flexible model selection and plan mode features.
Prerequisites
Before using this tool, ensure you have:
- Node.js (v16.0.0 or higher)
- OpenCode CLI installed and configured
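A quick way to sanity-check both prerequisites from a terminal (a sketch; it only reports status and never fails the shell):

```shell
# Preflight check: report whether Node.js and the OpenCode CLI are on PATH.
NODE_STATUS=$(command -v node >/dev/null 2>&1 && echo found || echo missing)
OPENCODE_STATUS=$(command -v opencode >/dev/null 2>&1 && echo found || echo missing)
echo "node ($NODE_STATUS): $(node --version 2>/dev/null || echo 'install v16.0.0 or higher')"
echo "opencode: $OPENCODE_STATUS"
```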
One-Line Setup
claude mcp add opencode -- npx -y opencode-mcp-tool -- --model google/gemini-2.5-pro
Verify Installation
Type /mcp inside Claude Code to verify the opencode MCP server is active.
Alternative: Import from Claude Desktop
If you already have it configured in Claude Desktop:
- Add to your Claude Desktop config:
"opencode": {
"command": "npx",
"args": ["-y", "opencode-mcp-tool", "--", "--model", "google/gemini-2.5-pro"]
}
- Import to Claude Code:
claude mcp add-from-claude-desktop
Configuration
Register the MCP server with your MCP client. Note: The server requires a primary model to be specified.
For NPX Usage (Recommended)
Add this configuration to your Claude Desktop config file:
{
"mcpServers": {
"opencode": {
"command": "npx",
"args": ["-y", "opencode-mcp-tool", "--", "--model", "google/gemini-2.5-pro", "--fallback-model", "google/gemini-2.5-flash"]
}
}
}
For Global Installation
If you installed the package globally (e.g., with npm install -g opencode-mcp-tool), use this configuration instead:
{
"mcpServers": {
"opencode": {
"command": "opencode-mcp",
"args": ["--model", "google/gemini-2.5-pro", "--fallback-model", "google/gemini-2.5-flash"]
}
}
}
Configuration File Locations:
- Claude Desktop:
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
  - Linux: ~/.config/claude/claude_desktop_config.json
After updating the configuration, restart your terminal session.
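Before restarting, it can help to confirm that the config file parses as valid JSON and contains the opencode entry. A minimal sketch (it writes the example config to /tmp for demonstration; point CONFIG at your real config file instead):

```shell
# Sketch: validate a claude_desktop_config.json before restarting.
# CONFIG defaults to a demo path; set it to your platform's real file,
# e.g. ~/Library/Application Support/Claude/claude_desktop_config.json on macOS.
CONFIG="${CONFIG:-/tmp/claude_desktop_config.json}"
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "opencode-mcp-tool", "--", "--model", "google/gemini-2.5-pro"]
    }
  }
}
EOF
# Parse the JSON and confirm the "opencode" server entry exists.
python3 -c "import json, sys; cfg = json.load(open(sys.argv[1])); assert 'opencode' in cfg['mcpServers']; print('config OK')" "$CONFIG"
```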
Example Workflow
- Natural language: "use opencode to explain index.html", "understand the massive project using opencode", "ask opencode to search for latest news"
- Claude Code: Type /opencode and commands will populate in Claude Code's interface.
Usage Examples
With File References (using @ syntax)
ask opencode to analyze @src/main.js and explain what it does
use opencode to summarize @. the current directory
analyze @package.json and tell me about dependencies
General Questions (without files)
ask opencode to search for the latest tech news
use opencode to explain div centering
ask opencode about best practices for React development related to @file_im_confused_about
Using OpenCode's Plan Mode
Plan mode lets you safely test code changes, run scripts, or execute potentially risky operations with structured planning.
use opencode plan mode to create and run a Python script that processes data
ask opencode to safely test @script.py and explain what it does
use opencode plan mode to install numpy and create a data visualization
test this code safely: Create a script that makes HTTP requests to an API
Tools (for the AI)
These tools are designed to be used by the AI assistant.
ask-opencode: Execute OpenCode with model selection and mode control. Uses plan mode by default for structured analysis.
- prompt (required): The analysis request. Use the @ syntax to include file or directory references (e.g., @src/main.js explain this code) or ask general questions (e.g., Please use a web search to find the latest news stories).
- model (optional): The model to use. If not specified, uses the primary model configured at server startup.
- mode (optional): Execution mode: 'plan' for structured analysis (default), 'build' for immediate execution, or a custom mode string.

brainstorm: Generate novel ideas with dynamic context gathering using creative frameworks (SCAMPER, Design Thinking, etc.), domain context integration, idea clustering, feasibility analysis, and iterative refinement.
- prompt (required): Primary brainstorming challenge or question to explore.
- methodology (optional): Brainstorming framework: 'divergent', 'convergent', 'scamper', 'design-thinking', 'lateral', or 'auto' (default).
- domain (optional): Domain context (e.g., 'software', 'business', 'creative', 'research').
- ideaCount (optional): Target number of ideas to generate (default: 12).
- includeAnalysis (optional): Include feasibility and impact analysis (default: true).

timeout-test: Test timeout prevention by running for a specified duration.
- duration (required): Duration in milliseconds for the test.

ping: Echo test tool that returns a message.
- prompt (optional): Message to echo back.

Help: Shows the OpenCode CLI help text.
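For MCP clients that speak the protocol directly, each tool above is invoked with a standard tools/call request. A sketch of what a client might send for ask-opencode (the argument values here are illustrative, not fixed):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask-opencode",
    "arguments": {
      "prompt": "@src/main.js explain this code",
      "mode": "plan"
    }
  }
}
```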
Slash Commands (for the User)
You can use these commands directly in Claude Code's interface (compatibility with other clients has not been tested).
- /plan: Execute OpenCode in plan mode for structured analysis and safer operations.
  - prompt (required): Analysis request (e.g., /plan prompt:Create and run a Python script that processes CSV data or /plan prompt:@script.py Analyze this script safely).
- /build: Execute OpenCode in immediate execution mode for direct implementation.
  - prompt (required): Implementation request for immediate code execution.
- /help: Displays the OpenCode CLI help information.
- /ping: Tests the connection to the server.
  - prompt (optional): A message to echo back.
Contributing
Contributions are welcome! Please see the contributing guidelines for details on how to submit pull requests, report issues, and contribute to the project.
License
This project is licensed under the MIT License. See the license file for details.
Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.