# vertex-ai-mcp-server
This project implements a Model Context Protocol (MCP) server that exposes a suite of tools for interacting with Google Cloud's Vertex AI Gemini models, with a focus on coding assistance and general query answering.

The server supports both web search grounding and direct knowledge answering. Model parameters such as the model ID, temperature, and streaming behavior are configurable through environment variables. By default it uses the streaming API for better responsiveness, includes basic retry logic to handle transient API errors, and applies minimal safety filters to reduce the chance of responses being blocked.
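As a sketch, an MCP client could pass these parameters to the server through environment variables in its configuration file. The variable names and paths below are illustrative assumptions, not taken from this server's documentation:

```json
{
  "mcpServers": {
    "vertex-ai": {
      "command": "node",
      "args": ["build/index.js"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "my-gcp-project",
        "GOOGLE_CLOUD_LOCATION": "us-central1",
        "VERTEX_MODEL_ID": "gemini-1.5-pro",
        "VERTEX_TEMPERATURE": "0.2",
        "VERTEX_USE_STREAMING": "true"
      }
    }
  }
}
```

Consult the server's own README for the exact variable names it reads; the pattern of supplying model ID, temperature, and streaming flags via `env` is the common MCP convention.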
## Features
- Access to Vertex AI Gemini models via numerous MCP tools.
- Supports web search grounding and direct knowledge answering.
- Configurable model parameters via environment variables.
- Uses streaming API by default for better responsiveness.
- Includes basic retry logic for transient API errors.
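The retry behavior mentioned above can be sketched as a small wrapper with exponential backoff. This is a minimal illustration of the general technique, not the server's actual implementation; the function name and parameters are hypothetical:

```typescript
// Retry an async operation on failure, backing off exponentially between
// attempts. Suitable for transient API errors (e.g. 429 or 503 responses).
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts: number = 3,
  baseDelayMs: number = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Exponential backoff: baseDelayMs, 2x, 4x, ...
        const delay = baseDelayMs * 2 ** (attempt - 1);
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }
  }
  // All attempts failed; surface the last error to the caller.
  throw lastError;
}
```

A real implementation would typically retry only on error codes known to be transient, rather than on every failure as this sketch does.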