Gemini MCP Server
A Model Context Protocol (MCP) server that enables Claude to collaborate with Google's Gemini AI models.
Features
- 🤖 Multiple Gemini Tools: Ask questions, review code, brainstorm ideas, generate tests, and get explanations
- 🔄 Dual-Model Support: Automatic fallback from experimental to stable models
- ⚡ Configurable Models: Easy switching between different Gemini variants
- 🛡️ Reliable: Automatic model fallback keeps the tools working when the primary model is unavailable
- 📊 Transparent: Shows which model was used for each response
Quick Start
1. Prerequisites
- Python 3.9+
- Claude Desktop
- Google AI API Key
2. Installation
# Clone the repository
git clone https://github.com/lbds137/gemini-mcp-server.git
cd gemini-mcp-server
# Install dependencies
pip install -r requirements.txt
# Copy and configure environment
cp .env.example .env
# Edit .env and add your GEMINI_API_KEY
3. Configuration
Edit .env to configure your models:
# Your Gemini API key (required)
GEMINI_API_KEY=your_api_key_here
# Model configuration (optional - defaults shown)
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
GEMINI_MODEL_TIMEOUT=10000
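For reference, the sketch below shows how these variables could be consumed in Python. The variable names and defaults come from the .env example above; the load_gemini_config helper itself is illustrative and is not part of the server's code.
# Illustrative only - the server's own settings loader may differ.
import os

def load_gemini_config() -> dict:
    """Read the documented .env settings, applying the documented defaults."""
    return {
        "api_key": os.environ["GEMINI_API_KEY"],  # required; raises KeyError if unset
        "primary": os.getenv("GEMINI_MODEL_PRIMARY", "gemini-2.5-pro-preview-06-05"),
        "fallback": os.getenv("GEMINI_MODEL_FALLBACK", "gemini-1.5-pro"),
        "timeout_ms": int(os.getenv("GEMINI_MODEL_TIMEOUT", "10000")),
    }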
4. Development Setup
For development with PyCharm or other IDEs:
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install in development mode
pip install -e .
# Run tests
python -m pytest
5. Register with Claude
# Install to MCP location
./scripts/install.sh
# Or manually register
claude mcp add gemini-collab python3 ~/.claude-mcp-servers/gemini-collab/server.py
Available Tools
ask_gemini
General questions and problem-solving assistance.
gemini_code_review
Get code review feedback focusing on security, performance, and best practices.
gemini_brainstorm
Collaborative brainstorming for architecture and design decisions.
gemini_test_cases
Generate comprehensive test scenarios for your code.
gemini_explain
Get clear explanations of complex code or concepts.
server_info
Check server status and model configuration.
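Each tool is essentially a prompt wrapper around a Gemini model. As a rough illustration only (the function name and prompt wording below are invented for this example, not taken from server.py), a code-review tool could be implemented along these lines:
# Hypothetical sketch - not the actual server.py implementation.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")

def review_code(code: str) -> str:
    """Ask Gemini for review feedback on a snippet of code."""
    prompt = (
        "Review the following code for security, performance, and best practices:\n\n"
        + code
    )
    return model.generate_content(prompt).text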
Model Configurations
Best Quality (Default)
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
Best Performance
GEMINI_MODEL_PRIMARY=gemini-2.5-flash-preview-05-20
GEMINI_MODEL_FALLBACK=gemini-2.0-flash
Most Cost-Effective
GEMINI_MODEL_PRIMARY=gemini-2.0-flash
GEMINI_MODEL_FALLBACK=gemini-2.0-flash-lite
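Whichever pair you choose, the fallback behaviour is the same idea: try the primary model first and switch to the fallback if the call fails. Below is a minimal sketch of that pattern; the real DualModelManager in src/gemini_mcp/server.py may differ in its details.
# Minimal illustration of the primary/fallback pattern.
import google.generativeai as genai

def generate_with_fallback(prompt: str, primary: str, fallback: str) -> tuple[str, str]:
    """Return (response_text, model_name_used)."""
    last_error = None
    for model_name in (primary, fallback):
        try:
            model = genai.GenerativeModel(model_name)
            return model.generate_content(prompt).text, model_name
        except Exception as exc:  # quota, regional availability, timeout, ...
            last_error = exc
    raise RuntimeError(f"Both models failed: {last_error}")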
Development
Project Structure
gemini-mcp-server/
├── src/
│ └── gemini_mcp/
│ ├── __init__.py
│ └── server.py # Main server with DualModelManager
├── tests/
│ └── test_server.py
├── scripts/
│ ├── install.sh # Install/update script (handles both fresh installs and updates)
│ └── dev-link.sh # Development symlink script
├── docs/
│ ├── BUILD_YOUR_OWN_MCP_SERVER.md
│ ├── DUAL_MODEL_CONFIGURATION.md # Dual-model setup guide
│ ├── PYCHARM_SETUP.md
│ └── TESTING.md
├── .claude/
│ └── settings.json # Claude Code permissions
├── .env # Your configuration (git-ignored)
├── .env.example # Example configuration
├── .gitignore
├── CLAUDE.md # Instructions for Claude Code
├── LICENSE
├── README.md # This file
├── requirements.txt
├── setup.py
├── package.json # MCP registration metadata
└── package-lock.json
Running Tests
python -m pytest tests/ -v
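Tests that talk to Gemini are usually easiest to keep offline by mocking the model objects. The test below is a hypothetical example of that pattern; it exercises a simplified fallback helper defined inline, not a symbol exported by the package.
# test_fallback_example.py - hypothetical; the real tests live in tests/test_server.py.
from unittest.mock import MagicMock

def generate_with_fallback(prompt, primary_model, fallback_model):
    """Simplified fallback helper, taking model objects instead of names."""
    for model in (primary_model, fallback_model):
        try:
            return model.generate_content(prompt).text, model
        except Exception:
            continue
    raise RuntimeError("Both models failed")

def test_falls_back_when_primary_raises():
    primary = MagicMock()
    primary.generate_content.side_effect = RuntimeError("quota exceeded")
    fallback = MagicMock()
    fallback.generate_content.return_value.text = "ok"
    text, used = generate_with_fallback("hello", primary, fallback)
    assert text == "ok"
    assert used is fallback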
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Updating
To update your local MCP installation after making changes:
./scripts/install.sh
The script handles both first-time installation and updates.
Troubleshooting
Server not found
# Check registration
claude mcp list
# Re-register if needed
./scripts/install.sh
API Key Issues
# Verify environment variable
echo $GEMINI_API_KEY
# Test directly
python -c "import google.generativeai as genai; genai.configure(api_key='$GEMINI_API_KEY'); print('✅ API key works')"
Model Availability
Some models may not be available in all regions. If the primary model fails consistently, check the logs to confirm the fallback model is being used.
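You can also list the models your API key can actually access with the google.generativeai client, for example:
# List models available to your key
python -c "import google.generativeai as genai; import os; genai.configure(api_key=os.environ['GEMINI_API_KEY']); [print(m.name) for m in genai.list_models()]"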
License
MIT License - see the LICENSE file for details.
Acknowledgments
- Built for Claude using the Model Context Protocol
- Powered by Google's Gemini AI