# Gemini MCP Server
MCP (Model Context Protocol) Server with Google Gemini CLI integration for AI second opinions and validation in development workflows.
## Features
- 🤖 AI Second Opinions: Get alternative perspectives from Google's Gemini AI
- 🔍 Automatic Uncertainty Detection: Triggers consultations when uncertainty is detected
- ⚡ Rate Limiting: Built-in rate limiting to prevent API abuse
- 🛡️ Configurable Models: Support for different Gemini models
- 📊 Consultation Logging: Track all AI consultations with timestamps
## Quick Start

### Using uvx (Recommended)

Run directly without installation:

```bash
# Run in current directory
uvx gemini-mcp-server

# Run with specific project root
uvx gemini-mcp-server --project-root /path/to/project
```
### Local Development

```bash
# Clone the repository
git clone https://github.com/taehun-kmu/gemini-mcp-server
cd gemini-mcp-server

# Install with uv
uv pip install -e .

# Run the server
gemini-mcp-server
```
## Prerequisites
- Node.js 18+ - Required for Gemini CLI
- Python 3.8+ - Required for MCP server
- Gemini CLI - Install and authenticate:
```bash
# Install Gemini CLI
npm install -g @google/gemini-cli

# Authenticate (run once)
gemini
```
## Configuration

Create a `gemini-config.json` file in your project root:
```json
{
  "enabled": true,
  "auto_consult": true,
  "cli_command": "gemini",
  "timeout": 60,
  "rate_limit_delay": 2.0,
  "model": "gemini-2.5-flash",
  "log_consultations": true
}
```
### Environment Variables
Override configuration with environment variables:
- `GEMINI_ENABLED`: Enable/disable integration
- `GEMINI_AUTO_CONSULT`: Enable/disable automatic consultation
- `GEMINI_CLI_COMMAND`: CLI command (default: `"gemini"`)
- `GEMINI_TIMEOUT`: Command timeout in seconds
- `GEMINI_RATE_LIMIT`: Delay between consultations
- `GEMINI_MODEL`: Model to use (default: `"gemini-2.5-flash"`)
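When the server is launched by an MCP client, these variables are typically set in the client configuration's `env` block (see the Claude Code Integration section below), where every value is written as a string. A minimal sketch of such a block, with illustrative values only:

```json
{
  "GEMINI_ENABLED": "true",
  "GEMINI_MODEL": "gemini-2.5-flash",
  "GEMINI_TIMEOUT": "120",
  "GEMINI_RATE_LIMIT": "2.0"
}
```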
## MCP Tools
The server exposes three MCP tools:
- `consult_gemini`: Get second opinions from Gemini
  - `query`: The question or topic
  - `context`: Additional context
  - `comparison_mode`: Request structured comparison format
- `gemini_status`: Check integration status and statistics
- `toggle_gemini_auto_consult`: Enable/disable automatic consultation
  - `enable`: true/false, or omit to toggle
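Under the hood these tools are invoked with standard MCP `tools/call` requests (JSON-RPC 2.0). A sketch of what a `consult_gemini` call might look like — the query and context are made up, `comparison_mode` is assumed to be a boolean flag, and in practice Claude Code issues this request for you:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_gemini",
    "arguments": {
      "query": "Is this retry logic safe for concurrent use?",
      "context": "Python asyncio worker that retries failed API calls",
      "comparison_mode": true
    }
  }
}
```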
## Claude Code Integration
Configure Claude Code to use this MCP server by adding to your MCP configuration:
```json
{
  "mcpServers": {
    "gemini": {
      "command": "uvx",
      "args": ["gemini-mcp-server"],
      "env": {
        "GEMINI_ENABLED": "true"
      }
    }
  }
}
```
**Note:** Add `"--project-root", "/path/to/project"` to `args` if you need to specify a different project directory, as shown below.
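For example, the same entry with an explicit project root:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "uvx",
      "args": ["gemini-mcp-server", "--project-root", "/path/to/project"],
      "env": {
        "GEMINI_ENABLED": "true"
      }
    }
  }
}
```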
## Development

### Project Structure
```
gemini-mcp-server/
├── gemini_mcp/
│   ├── __init__.py
│   ├── __main__.py              # CLI entry point
│   ├── gemini_integration.py    # Gemini integration logic
│   └── server.py                # MCP server implementation
├── pyproject.toml               # Package configuration
├── README.md                    # This file
├── LICENSE                      # MIT License
├── setup-gemini-integration.sh  # Setup script
└── gemini-config.json           # Example configuration
```
### Running Tests
```bash
# Install dev dependencies
uv pip install -e ".[dev]"

# Run tests
pytest

# Format code
ruff format gemini_mcp
ruff check gemini_mcp
```
## License
MIT License - see LICENSE file for details.