gemini-mcp-server

vytautas-bunevicius/gemini-mcp-server

A Model Context Protocol (MCP) server for interacting with Google's Gemini AI models.

The Gemini MCP Server implements the Model Context Protocol specification, acting as a bridge between MCP clients (such as Claude for Desktop) and Google's Gemini AI models. It translates requests from MCP clients into Gemini API calls and returns the responses in MCP format, letting users of any MCP-compatible client leverage Gemini's capabilities, including model selection and function calling, through a standardized interface.
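A server like this is typically wired into a desktop client through its MCP configuration file. The sketch below shows the general shape of such an entry for Claude for Desktop; the command, entry-point module, and environment variable name are assumptions and depend on how this particular server is packaged.

```json
{
  "mcpServers": {
    "gemini": {
      "command": "python",
      "args": ["-m", "gemini_mcp_server"],
      "env": {
        "GEMINI_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

With an entry like this, the client launches the server as a local subprocess and communicates with it over stdio.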

Features

  • Standard MCP Implementation: Full support for the MCP specification, including tools and resources.
  • Local Transport: Primarily designed for `stdio` transport for secure, local connections (e.g., with desktop applications). Network transport via Docker is also supported.
  • Latest Gemini API Integration: Supports recent Gemini models, including the Gemini 1.5 series (as available via the API).
  • Function Calling Support: Implements Gemini's function calling capability for models that support it.
  • Resilience: Includes retry logic with exponential backoff for handling transient API errors.
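The retry behavior described in the last bullet can be sketched as follows. This is an illustrative example, not the server's actual code; the helper name and parameters are assumptions.

```python
# Sketch of retry logic with exponential backoff for transient API errors.
import random
import time


def with_retries(call, max_attempts=4, base_delay=1.0):
    """Invoke `call`, retrying on failure with exponentially growing delays."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            # Give up after the final attempt.
            if attempt == max_attempts - 1:
                raise
            # Backoff doubles each attempt (1s, 2s, 4s, ...) plus jitter
            # to avoid synchronized retries across clients.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
```

In practice a real implementation would catch only the API's transient error types (rate limits, timeouts) rather than every exception.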

Tools

  1. ask_gemini

    Sends a single question (prompt) to a specified Gemini model and returns the response.

  2. chat_with_gemini

    Manages a multi-turn conversation with a specified Gemini model.

  3. gemini_function_call

    Leverages Gemini's function calling feature (available only on models that support it).
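Over stdio, an MCP client invokes one of these tools with a standard `tools/call` JSON-RPC request. The message shape below follows the MCP specification; the argument names (`prompt`, `model`) are assumptions about this server's tool schema, which the client would discover via `tools/list`.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_gemini",
    "arguments": {
      "prompt": "Summarize the Model Context Protocol in one sentence.",
      "model": "gemini-1.5-pro"
    }
  }
}
```

The server forwards the prompt to the Gemini API and returns the model's reply as MCP tool-result content.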