mcp-gemini-server
MCP Gemini Server is a dedicated MCP server that integrates with Google's Gemini model, providing a consistent interface for LLMs and MCP-compatible systems.
The MCP Gemini Server facilitates the integration of Google's Gemini models into other applications by providing a standardized MCP interface. It leverages the `@google/genai` SDK to expose Gemini's capabilities as MCP tools, so other LLMs and MCP-compatible systems can use Gemini as a backend without writing Gemini-specific integration code.
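For example, an MCP client can discover and call this server's tools the same way it would any other MCP server. The sketch below uses the `@modelcontextprotocol/sdk` TypeScript client over stdio; the server entry point (`dist/server.js`) and the `prompt` argument name are assumptions for illustration, not values taken from this server's documentation.

```typescript
// Sketch of an MCP client calling this server's gemini_generateContent tool.
// Assumptions: the server is started with `node dist/server.js`, and the tool
// accepts a `prompt` argument; check the server's tool schema for real names.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main(): Promise<void> {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/server.js"], // assumed entry point
  });
  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover the Gemini tools the server exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call the non-streaming text generation tool.
  const result = await client.callTool({
    name: "gemini_generateContent",
    arguments: { prompt: "Explain the Model Context Protocol in one sentence." },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```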
Features
- Core Generation: Provides standard and streaming text generation using Gemini models (see the SDK sketch after this list).
- Function Calling: Allows Gemini models to execute client-defined functions.
- Stateful Chat: Manages conversational context across multiple interactions.
- File Handling: Supports file upload, retrieval, and management using the Gemini API.
- Caching: Enables caching of content to optimize prompt processing.
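As a rough illustration of what the Core Generation feature wraps, the sketch below shows the standard and streaming calls in the `@google/genai` SDK. The model name and API-key environment variable are assumptions, not values taken from this server's configuration.

```typescript
// Minimal sketch of the @google/genai calls that tools such as
// gemini_generateContent and gemini_generateContentStream likely wrap.
// Assumptions: model name and GOOGLE_API_KEY environment variable.
import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({ apiKey: process.env.GOOGLE_API_KEY });

async function generate(prompt: string): Promise<string> {
  // Non-streaming generation: one request, one complete response.
  const response = await ai.models.generateContent({
    model: "gemini-2.0-flash", // assumed model name for illustration
    contents: prompt,
  });
  return response.text ?? "";
}

async function generateStream(prompt: string): Promise<void> {
  // Streaming generation: chunks are printed as they arrive.
  const stream = await ai.models.generateContentStream({
    model: "gemini-2.0-flash",
    contents: prompt,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.text ?? "");
  }
}
```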
Tools
- `gemini_generateContent`: Generates non-streaming text content from a prompt.
- `gemini_generateContentStream`: Generates text content via streaming.
- `gemini_functionCall`: Executes a function call based on a prompt and function declarations.
- `gemini_startChat`: Initiates a new stateful chat session.
- `gemini_sendMessage`: Sends a message within an existing chat session (a chat-flow sketch follows this list).
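To show how the chat tools fit together, here is a hedged sketch of one possible client-side flow: start a session, then send messages against it. The `modelName`, `sessionId`, and `message` argument names and the response shape are assumptions; consult the server's tool schemas for the actual contract.

```typescript
// Hypothetical chat flow using the MCP TypeScript client; argument names and
// the returned content shape are assumptions, not the server's documented API.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function runChat(client: Client): Promise<void> {
  // 1. Start a stateful session; assume the session id comes back as text.
  const started = (await client.callTool({
    name: "gemini_startChat",
    arguments: { modelName: "gemini-2.0-flash" }, // assumed parameter
  })) as { content: Array<{ type: string; text: string }> };
  const sessionId = started.content[0].text; // assumed response shape

  // 2. Send messages within that session; the server keeps the history.
  for (const message of ["Hello!", "What did I just ask you?"]) {
    const reply = await client.callTool({
      name: "gemini_sendMessage",
      arguments: { sessionId, message }, // assumed parameters
    });
    console.log(reply);
  }
}
```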