gemini-mcp-server

The Gemini MCP Server is a Model Context Protocol server that provides access to Google's Gemini API, enabling LLMs to perform intelligent web searches, generate content, and access other Gemini features.

The Gemini MCP Server interfaces with Google's Gemini API, giving MCP clients access to Gemini's large language models for web search, content generation, and other Gemini features. It supports both STDIO and streamable-http transport modes, so it can run locally alongside a client or be deployed remotely, for example on Google Cloud Run, where it is reachable by clients over the network. Access is authenticated with an AI Studio API key. The server is aimed at developers and researchers who want to integrate Gemini's capabilities into their own applications.
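As a rough illustration of the STDIO mode, the sketch below connects to a locally launched instance with the official `mcp` Python SDK and lists the available tools. The launch command (`gemini-mcp-server`) and the `GEMINI_API_KEY` environment variable name are assumptions; check the server's own documentation for the exact invocation and variable it expects.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command and env variable name -- adjust to match
    # how the server is actually installed and authenticated.
    server = StdioServerParameters(
        command="gemini-mcp-server",
        env={"GEMINI_API_KEY": os.environ["GEMINI_API_KEY"]},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```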

Features

  • Supports STDIO and streamable-http transport modes for flexible deployment.
  • Enables intelligent web searches and content generation using Gemini API.
  • Can be run locally or deployed remotely on Google Cloud Run (a remote-connection sketch follows this list).
  • Authenticated access via AI Studio API key for secure operations.
  • Lets MCP clients delegate tasks to Gemini 2.5 Pro and Flash models.
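For a remote deployment such as one on Cloud Run, a client connects over streamable-http instead of STDIO. The sketch below assumes the SDK's `streamablehttp_client` helper, a placeholder service URL, and an `/mcp` endpoint path; substitute the address of your own deployment.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Placeholder Cloud Run URL and an assumed "/mcp" endpoint path.
    url = "https://gemini-mcp-server-example-uc.a.run.app/mcp"
    async with streamablehttp_client(url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```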

Tools

  1. web_search

    Performs a web search using Gemini and returns synthesized results with citations.

  2. use_gemini

    Delegates a task to a specified Gemini 2.5 model (Pro or Flash). An invocation sketch for both tools follows this list.
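Once a session is open (see the connection sketches above), either tool is invoked with `call_tool`. The argument names below (`query`, `model`, `prompt`) are assumptions for illustration; inspect the input schemas returned by `list_tools()` for the real parameter names.

```python
from mcp import ClientSession


async def run_tools(session: ClientSession) -> None:
    # web_search: synthesized results with citations (assumed "query" argument).
    search = await session.call_tool(
        "web_search",
        arguments={"query": "latest updates to the Model Context Protocol"},
    )
    print(search.content)

    # use_gemini: delegate a task to a specific Gemini 2.5 model
    # (assumed "model" and "prompt" arguments).
    answer = await session.call_tool(
        "use_gemini",
        arguments={
            "model": "gemini-2.5-flash",
            "prompt": "Summarize the Model Context Protocol in two sentences.",
        },
    )
    print(answer.content)
```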