namikmesic/gemini-mcp-server

Gemini MCP Server

A Model Context Protocol (MCP) server that provides seamless access to Google's Gemini AI models, designed for easy integration with MCP-compatible clients and tools.

Core MCP Features:

  • callGemini Tool: Send prompts to the Gemini API and receive AI-generated responses.
  • Flexible Model Support: Works with various Gemini models (e.g., gemini-1.5-flash-latest, gemini-pro).
  • Customizable Generation: Adjust parameters such as temperature, maxOutputTokens, topK, and topP.
  • Multiple Transport Options: Supports stdio (ideal for desktop clients) and HTTP for MCP communication.
  • Simple Configuration: Primarily requires a Gemini API key to get started.

Quick Start for MCP

Get your Gemini MCP server running in minutes with these methods:

Using NPX

The easiest way to run the server without installing anything locally. Ideal for quick setup with clients like Claude Desktop or VS Code.

npx -y @yourusername/gemini-mcp-server

Ensure you have Node.js 18+ installed. Your GEMINI_API_KEY will need to be set as an environment variable or directly in your client configuration.

Using Docker (Stdio for Desktop Clients)

Run the server in a container, perfect for consistent environments and use with desktop MCP clients.

docker run -i --rm \
  -e GEMINI_API_KEY=YOUR_GEMINI_API_KEY_HERE \
  yourusername/gemini-mcp-server:latest

Replace YOUR_GEMINI_API_KEY_HERE with your actual Gemini API key. Docker must be installed and running.

Configuration

The server is configured primarily through environment variables:

  • GEMINI_API_KEY (Required): Your Google Gemini API key. Obtain it from Google AI Studio. This is essential for the server to communicate with the Gemini API.
  • LOG_LEVEL (Optional): Set the logging level (e.g., error, warn, info, debug). Defaults to info.
  • PORT (Optional): Port for the HTTP transport, if used. Defaults to 3000.
  • REQUEST_TIMEOUT_MS (Optional): Timeout for API requests in milliseconds. Defaults to 30000.
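As an illustration, the variables above could be read with their documented defaults applied like this. This is a minimal sketch only; the server's actual configuration loader may differ, and loadConfig is a hypothetical name.

```typescript
// Sketch of loading the server's configuration from environment
// variables, applying the defaults documented above. Illustrative
// only; the real server may structure this differently.
interface ServerConfig {
  geminiApiKey: string;
  logLevel: string;
  port: number;
  requestTimeoutMs: number;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const apiKey = env.GEMINI_API_KEY;
  if (!apiKey) {
    // GEMINI_API_KEY is the only required variable.
    throw new Error('GEMINI_API_KEY is required');
  }
  return {
    geminiApiKey: apiKey,
    logLevel: env.LOG_LEVEL ?? 'info',
    port: Number(env.PORT ?? 3000),
    requestTimeoutMs: Number(env.REQUEST_TIMEOUT_MS ?? 30000),
  };
}
```

A client would typically pass `process.env` here; explicitly failing fast on a missing API key surfaces misconfiguration before any request reaches the Gemini API.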

Integrating with MCP Clients

Claude Desktop

To integrate this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your claude_desktop_config.json:

NPX Method
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": [
        "-y",
        "@yourusername/gemini-mcp-server"
      ],
      "env": {
        "GEMINI_API_KEY": "YOUR_GEMINI_API_KEY_HERE"
      }
    }
  }
}
Docker Method
{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e", "GEMINI_API_KEY=YOUR_GEMINI_API_KEY_HERE",
        "yourusername/gemini-mcp-server:latest"
      ]
    }
  }
}

Replace YOUR_GEMINI_API_KEY_HERE with your actual Gemini API key.

VS Code

Add the following JSON block to your User Settings (JSON) file (Preferences: Open User Settings (JSON)) or to a .vscode/mcp.json file in your workspace.

NPX Method
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "gemini_api_key",
        "description": "Your Google Gemini API Key",
        "password": true
      }
    ],
    "servers": {
      "gemini": {
        "command": "npx",
        "args": [
          "-y",
          "@yourusername/gemini-mcp-server"
        ],
        "env": {
          "GEMINI_API_KEY": "${input:gemini_api_key}"
        }
      }
    }
  }
}
Docker Method
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "gemini_api_key",
        "description": "Your Google Gemini API Key",
        "password": true
      }
    ],
    "servers": {
      "gemini": {
        "command": "docker",
        "args": [
          "run", 
          "-i", 
          "--rm",
          "yourusername/gemini-mcp-server:latest"
        ],
        "env": {
          "GEMINI_API_KEY": "${input:gemini_api_key}"
        }
      }
    }
  }
}

Programmatic Usage (SDK)

You can also interact with the Gemini MCP server programmatically using the MCP SDK. This is useful for building custom applications or scripts that leverage Gemini models.

Example (using NPX server):

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

async function main() {
  const client = new Client({ name: 'MyClient', version: '1.0.0' });

  const transport = new StdioClientTransport({
    command: 'npx',
    args: ['-y', '@yourusername/gemini-mcp-server'],
    env: {
      'GEMINI_API_KEY': 'YOUR_GEMINI_API_KEY_HERE' // Replace with your actual key
    }
  });

  await client.connect(transport);

  try {
    const result = await client.callTool({
      name: 'callGemini',
      arguments: {
        model: 'gemini-1.5-flash-latest',
        prompt: 'Explain what MCP is in simple terms.',
        temperature: 0.3,
      }
    });
    console.log(result.content[0].text);
  } catch (error) {
    console.error('Error calling tool:', error);
  } finally {
    await client.close();
  }
}

main().catch(console.error);

Ensure your GEMINI_API_KEY is correctly set in the environment or client configuration. This example uses NPX; similar approaches apply for Docker by adjusting the transport configuration.
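For the Docker case, the adjustment amounts to launching docker run instead of npx in the transport options. A sketch of the changed StdioClientTransport options, mirroring the flags from the Quick Start command (the variable name here is just illustrative):

```typescript
// Adjusted transport options for the Docker method. The API key is
// passed into the container via -e rather than the transport's env.
const dockerTransportOptions = {
  command: 'docker',
  args: [
    'run', '-i', '--rm',
    '-e', 'GEMINI_API_KEY=YOUR_GEMINI_API_KEY_HERE', // replace with your actual key
    'yourusername/gemini-mcp-server:latest',
  ],
};
```

These options would be passed to `new StdioClientTransport(dockerTransportOptions)` in place of the npx configuration shown above; the rest of the example is unchanged.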

Using the Gemini Tool

Tool Parameters

| Parameter | Type | Description | Default | Required |
|---|---|---|---|---|
| model | string | The Gemini model to use | gemini-1.5-flash-latest | No |
| prompt | string | The text prompt to send to Gemini | - | Yes |
| temperature | number | Controls randomness (0.0-1.0) | 0.7 | No |
| maxOutputTokens | number | Maximum tokens to generate | 2048 | No |
| topK | number | Consider only the top-K tokens | - | No |
| topP | number | Consider tokens within the top-P probability mass | - | No |
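For example, an arguments object for a callGemini call could be assembled from these parameters as follows. This is a hedged sketch: buildGeminiArgs is a hypothetical client-side helper, not part of the server, and it simply applies the defaults and the temperature range from the table above.

```typescript
// Hypothetical helper that fills in the documented defaults and
// range-checks temperature before a callGemini tool call.
interface GeminiArgs {
  model: string;
  prompt: string;
  temperature: number;
  maxOutputTokens: number;
  topK?: number;
  topP?: number;
}

function buildGeminiArgs(opts: Partial<GeminiArgs> & { prompt: string }): GeminiArgs {
  const temperature = opts.temperature ?? 0.7;
  if (temperature < 0 || temperature > 1) {
    throw new Error('temperature must be between 0.0 and 1.0');
  }
  return {
    model: opts.model ?? 'gemini-1.5-flash-latest',
    prompt: opts.prompt,
    temperature,
    maxOutputTokens: opts.maxOutputTokens ?? 2048,
    // Optional sampling parameters are included only when provided.
    ...(opts.topK !== undefined && { topK: opts.topK }),
    ...(opts.topP !== undefined && { topP: opts.topP }),
  };
}
```

The resulting object is what you would pass as `arguments` in a `client.callTool({ name: 'callGemini', arguments: ... })` call, as in the SDK example above.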

Advanced Installation & Development

Alternative Installation Methods

  • Clone the repository: git clone https://github.com/yourusername/gemini-mcp-server.git; cd gemini-mcp-server; npm install; npm run build
  • Build Docker image locally: docker build -t gemini-mcp-server . (after cloning)
  • Run HTTP server with Docker: docker run -p 3000:3000 --rm -e GEMINI_API_KEY=YOUR_GEMINI_API_KEY_HERE yourusername/gemini-mcp-server:latest node dist/http-server.js
  • Use Docker Compose: Create a .env file with GEMINI_API_KEY=YOUR_GEMINI_API_KEY_HERE, then run docker-compose up gemini-mcp-http (for HTTP) or docker-compose up gemini-mcp-stdio (for stdio). This is useful for easily running pre-configured HTTP or stdio versions.

Development

Key scripts for development:

  • npm run build: Compile TypeScript to JavaScript.
  • npm start: Run the stdio transport server (after building).
  • npm run start:http: Run the HTTP transport server (after building).
  • npm test: Run the test client.

Adding New Tools

To extend the server with new tools, see the development documentation included in the repository.

Troubleshooting

  • API Key Issues:
    • Ensure your GEMINI_API_KEY is valid and correctly set as an environment variable or in your client's configuration.
    • Verify the key has access to the Gemini API.
  • Server Not Running/Connection Errors:
    • NPX/Docker: Make sure the npx or docker run command was successful and the server is running before attempting to connect from a client. Check for error messages in the terminal where you started the server.
    • Claude Desktop/VS Code: Double-check the command and arguments in your client configuration match one of the Quick Start methods.
  • Incorrect Model Name: Ensure the model parameter in your tool call (e.g., gemini-1.5-flash-latest) is a valid and available model.
  • Docker Issues: Ensure Docker Desktop is running. If using docker run, check that the image yourusername/gemini-mcp-server:latest is available locally or can be pulled.
  • For more detailed logs, set the LOG_LEVEL environment variable to debug.

License

MIT

Contributing

Contributions are welcome! Please follow our coding standards and submit pull requests for new features or bug fixes.