Tools provided by this server:
- gemini_chat - Chat with Gemini models with customizable parameters.
- gemini_analyze_image - Analyze images using vision-capable models.
- gemini_list_models - List all available models and their specifications.
- gemini_count_tokens - Count tokens in text before sending.
- gemini_generate_content - Generate creative content with optimized settings.
Gemini MCP Server
A Model Context Protocol (MCP) server that provides seamless integration with Google's Gemini AI models. This server allows AI assistants like Claude to interact with Gemini models for text generation, image analysis, and more.
Features
- 💬 Chat with Gemini Models - Send messages to various Gemini models with customizable parameters
- 📝 List Available Models - Get detailed information about all available Gemini models
- 🖼️ Image Analysis - Analyze images using Gemini's vision capabilities
- 🔧 Configurable Parameters - Control temperature, max tokens, and system prompts
- 🛡️ Enhanced Error Handling - Robust server version with better content filtering management
- 🚀 Easy Setup - Simple installation and configuration process
Installation
Prerequisites
- Python 3.10 or higher (required by the mcp package)
- A Google AI Studio API key (Get one here)
Setup
- Clone the repository:
git clone https://github.com/richardbaxterseo/gemini-mcp.git
cd gemini-mcp
- Install dependencies:
pip install -r requirements.txt
- Set up your API key:
# Windows
set GEMINI_API_KEY=your-api-key-here
# macOS/Linux
export GEMINI_API_KEY=your-api-key-here
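To confirm the key is actually visible to Python before wiring up Claude Desktop, you can run a quick check like the one below. This mirrors the server's own startup error ("GEMINI_API_KEY environment variable not set"); the helper name is illustrative, not part of the repository.

```python
import os

def require_api_key(env_var: str = "GEMINI_API_KEY") -> str:
    """Return the API key, or raise a descriptive error if it is missing."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} environment variable not set - "
            "export it in your shell or add it to the Claude Desktop config"
        )
    return key
```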
Configuration
For Claude Desktop
Add the following to your Claude Desktop configuration file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "gemini": {
      "command": "python",
      "args": ["C:\\path\\to\\gemini-mcp\\server.py"],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
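A malformed config file is a common reason the server never appears in Claude. A small sanity check like this one can catch JSON errors and missing fields before you restart Claude Desktop; the function name and the checks it performs are illustrative assumptions, not part of this repository.

```python
import json

def check_mcp_config(text: str, name: str = "gemini") -> dict:
    """Parse a claude_desktop_config.json payload and return the named
    server entry, asserting the fields Claude Desktop needs are present."""
    config = json.loads(text)  # raises ValueError on malformed JSON
    server = config["mcpServers"][name]
    assert "command" in server and "args" in server, "command/args required"
    return server

example = '''{"mcpServers": {"gemini": {
    "command": "python",
    "args": ["C:\\\\path\\\\to\\\\gemini-mcp\\\\server.py"],
    "env": {"GEMINI_API_KEY": "your-api-key-here"}}}}'''
server = check_mcp_config(example)
```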
Using the Robust Server (Recommended)
For better error handling and content filtering management, use server_robust.py:
{
  "mcpServers": {
    "gemini": {
      "command": "python",
      "args": ["C:\\path\\to\\gemini-mcp\\server_robust.py"],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
The robust server provides:
- Enhanced error messages when content is filtered
- Relaxed safety settings to minimize false positives
- Better handling of various Gemini API response formats
- Actionable suggestions when responses are blocked
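The "actionable suggestions" behaviour can be sketched as a lookup from the Gemini API's finish reasons to user-facing advice. The reason names below follow the Gemini API's documented finish_reason values, but the mapping and function are illustrative, not the robust server's actual code.

```python
# Illustrative mapping from Gemini finish reasons to actionable advice.
SUGGESTIONS = {
    "SAFETY": "Content was filtered by safety settings - try rephrasing, "
              "or break the request into smaller, more specific questions.",
    "MAX_TOKENS": "Response hit the token limit - raise max_tokens or ask "
                  "for a shorter answer.",
    "RECITATION": "Response was blocked for recitation - ask for a summary "
                  "in your own words rather than verbatim text.",
}

def explain_block(reason: str) -> str:
    """Return a helpful message for a blocked or truncated response."""
    return SUGGESTIONS.get(
        reason.upper(),
        f"Generation stopped ({reason}); no suggestion available.",
    )
```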
Available Tools
gemini_chat
Chat with Gemini models to generate text responses.
Parameters:
- message (required): The message to send to Gemini
- model (optional): Model to use (default: "gemini-2.5-flash")
- temperature (optional): Controls randomness 0-2 (default: 0.7)
- max_tokens (optional): Maximum tokens in response (default: 2048)
- system_prompt (optional): System instruction to guide the model
Example:
"Can you explain quantum computing in simple terms?"
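The defaults and ranges above can be captured in a small request builder. This is a hedged sketch of how the tool's arguments fit together (the function name and validation logic are assumptions, not the server's code), which makes the documented defaults and the 0-2 temperature range concrete.

```python
def build_chat_request(message, model="gemini-2.5-flash",
                       temperature=0.7, max_tokens=2048, system_prompt=None):
    """Assemble a gemini_chat request dict, applying documented defaults."""
    if not message:
        raise ValueError("message is required")
    if not 0 <= temperature <= 2:
        raise ValueError("temperature must be between 0 and 2")
    request = {"model": model, "message": message,
               "temperature": temperature, "max_tokens": max_tokens}
    if system_prompt:
        request["system_prompt"] = system_prompt
    return request
```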
gemini_list_models
List all available Gemini models and their capabilities.
Example:
"Show me all available Gemini models"
gemini_analyze_image
Analyze images using Gemini's vision capabilities.
Parameters:
- image_url (required): URL of the image to analyze
- prompt (optional): Question about the image (default: "What's in this image?")
- model (optional): Model to use (default: "gemini-2.5-flash")
Example:
"Analyze this image: https://example.com/image.jpg"
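Before an image URL is handed to a vision model, its MIME type typically has to be determined. A minimal sketch of that step, guessing the type from the URL path, is shown below; a real server would likely also inspect the HTTP Content-Type header, and this helper is illustrative rather than the repository's implementation.

```python
import mimetypes
from urllib.parse import urlparse

def image_mime_type(image_url: str, default: str = "image/jpeg") -> str:
    """Guess an image MIME type from a URL, falling back to a default."""
    path = urlparse(image_url).path
    mime, _ = mimetypes.guess_type(path)
    if mime and mime.startswith("image/"):
        return mime
    return default
```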
Troubleshooting
Common Issues
- "GEMINI_API_KEY environment variable not set"
  - Make sure you've set your API key in the environment or in the Claude configuration
- "No module named 'mcp'"
  - Run pip install -r requirements.txt to install all dependencies
- Server not appearing in Claude
  - Restart Claude Desktop after updating the configuration
  - Check that the path in your configuration is correct
  - Ensure Python is in your system PATH
- "Content was filtered" errors
  - Use server_robust.py for better handling of content filtering
  - Try rephrasing your query or using different parameters
  - Break complex queries into smaller, more specific questions
Debug Mode
To see detailed logs, you can run the server manually:
# Windows
set GEMINI_API_KEY=your-api-key-here
python server.py

# macOS/Linux
export GEMINI_API_KEY=your-api-key-here
python server.py
Testing the Server
To ensure the server is working correctly, try these test queries:
- Basic conversation: "What is machine learning?"
- Complex technical query: "Explain the differences between transformer and LSTM architectures"
- Creative writing: "Write a short story about a robot learning to paint"
- Code generation: "Create a Python function to calculate fibonacci numbers"
- Content filtering test: If you encounter filtered content, the robust server will provide helpful feedback
Development
To contribute or modify the server:
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
License
MIT License - see LICENSE file for details
Acknowledgments
- Built on the Model Context Protocol
- Uses Google's Gemini API
- Inspired by the MCP ecosystem
Support
For issues, questions, or contributions:
- Open an issue on GitHub
- Check existing issues for solutions
- Contribute improvements via pull requests