inardini/vertex-memory-bank-mcp
Vertex AI Memory Bank MCP Server is a clean, simple server that enables LLMs to generate and retrieve long-term memories using Vertex AI's Memory Bank.
This is a personal project by Ivan Nardini to explore how to build a Model Context Protocol (MCP) server for Vertex AI Memory Bank.
The Vertex AI Memory Bank MCP server is not a Google product and is not officially supported.
Vertex AI Memory Bank MCP Server
A simple MCP (Model Context Protocol) server that enables LLMs to generate and retrieve long-term memories using Vertex AI's Memory Bank.
Why This Project?
This server demonstrates how to build an MCP server with Vertex AI Memory Bank. It was inspired by a developer request and is released as a reference for other developers.
Prerequisites
- Python 3.11 or higher
- Google Cloud account with Vertex AI API enabled
- Basic understanding of async Python (helpful but not required)
Quick Start
Setup Google Cloud
# Install gcloud CLI (if not already installed)
# https://cloud.google.com/sdk/docs/install
# Authenticate
gcloud auth application-default login
# Set your project
gcloud config set project YOUR_PROJECT_ID
# Enable Vertex AI API
gcloud services enable aiplatform.googleapis.com
Install
# Clone the repository
git clone https://github.com/yourusername/vertex-ai-memory-bank-mcp.git
cd vertex-ai-memory-bank-mcp
# Install with pip
pip install -r requirements.txt
# OR install with uv (faster, recommended)
uv sync
# For running examples (optional)
pip install -e ".[examples]"
# OR with uv
uv sync --extra examples
Configure
# Copy the example environment file
cp .env.example .env
# Edit .env with your project details
GOOGLE_CLOUD_PROJECT=your-project-id
GOOGLE_CLOUD_LOCATION=us-central1
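To sanity-check that these values are picked up, you can load the file yourself. A minimal sketch, assuming python-dotenv is installed (the server's own configuration lives in src/config.py; this is only a quick check):
# Quick check that .env is readable (assumes python-dotenv is installed)
from dotenv import load_dotenv
import os

load_dotenv()  # reads .env from the current directory
print(os.environ["GOOGLE_CLOUD_PROJECT"])   # your-project-id
print(os.environ["GOOGLE_CLOUD_LOCATION"])  # us-central1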
Run Your First Example
Interactive Tutorial (Recommended): Open get_started_with_memory_bank_mcp.ipynb in Jupyter.
Or try the command-line examples:
# Basic MCP Client Usage
python examples/basic_usage.py
# Gemini Agent with Memory
python examples/gemini_memory_agent.py
# Automatic Tool Calling with Gemini
python examples/automatic_tool_calling.py
Use with Claude Desktop
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
  "mcpServers": {
    "memory-bank": {
      "command": "python",
      "args": ["/path/to/memory_bank_server.py"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "your-project-id",
        "GOOGLE_CLOUD_LOCATION": "us-central1"
      }
    }
  }
}
Key Concepts
Memory Scope
Memories are scoped to users or contexts:
scope = {"user_id": "alice123"}
Memory Topics
Categorize what to remember:
topics = ["USER_PREFERENCES", "USER_PERSONAL_INFO"]
Semantic Search
Find relevant memories with similarity search:
search_query = "programming preferences"
top_k = 5
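These concepts come together in a single retrieval call: the scope picks whose memories to search, and the query plus top_k drive the semantic search. A minimal sketch using the same parameters as Pattern 3 below (session is an already-connected MCP client session):
# Scope + semantic search combined in one retrieve_memories call
scope = {"user_id": "alice123"}

memories = await session.call_tool(
    "retrieve_memories",
    {
        "scope": scope,
        "search_query": "programming preferences",
        "top_k": 5
    }
)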
Available Tools
| Tool | Purpose | Example Use Case |
|---|---|---|
| initialize_memory_bank | Set up connection to Vertex AI | First-time setup |
| generate_memories | Extract memories from conversations | After chat sessions |
| retrieve_memories | Fetch relevant memories | Personalize responses |
| create_memory | Manually add a memory | Store user preferences |
| delete_memory | Remove specific memory | User requests deletion |
| list_memories | View all stored memories | Debugging/inspection |
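These tools are reached through any MCP client. The sketch below uses the official Python mcp SDK over stdio; the initialize_memory_bank argument names shown are assumptions, so check the schema returned by list_tools for the exact parameters:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server as a subprocess and talk to it over stdio
    params = StdioServerParameters(command="python", args=["memory_bank_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools from the table above
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # First-time setup (argument names here are assumptions)
            await session.call_tool(
                "initialize_memory_bank",
                {"project_id": "your-project-id", "location": "us-central1"}
            )

asyncio.run(main())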
Common Patterns
Pattern 1: Conversation Memory
# After each conversation turn
await session.call_tool(
    "generate_memories",
    {
        "conversation": conversation_history,
        "scope": {"user_id": user_id},
        "wait_for_completion": True
    }
)
Pattern 2: Explicit Memory
# Store specific facts
await session.call_tool(
    "create_memory",
    {
        "fact": "User prefers dark mode",
        "scope": {"user_id": user_id}
    }
)
Pattern 3: Context Retrieval
# Get relevant context before responding
memories = await session.call_tool(
    "retrieve_memories",
    {
        "scope": {"user_id": user_id},
        "search_query": user_message,
        "top_k": 5
    }
)
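In practice the three patterns chain into one agent turn: retrieve context, respond, then persist what was said. A minimal sketch (generate_reply stands in for your own LLM call, and the message dict format is an assumption):
# One agent turn: retrieve -> respond -> remember
async def handle_turn(session, user_id, user_message, conversation_history):
    # 1. Pull relevant memories for this user (Pattern 3)
    memories = await session.call_tool(
        "retrieve_memories",
        {"scope": {"user_id": user_id}, "search_query": user_message, "top_k": 5}
    )

    # 2. Answer using the retrieved context (generate_reply is your own LLM call)
    reply = await generate_reply(user_message, memories, conversation_history)

    # 3. Persist anything worth remembering from this exchange (Pattern 1)
    conversation_history.append({"role": "user", "content": user_message})
    conversation_history.append({"role": "assistant", "content": reply})
    await session.call_tool(
        "generate_memories",
        {
            "conversation": conversation_history,
            "scope": {"user_id": user_id},
            "wait_for_completion": True
        }
    )
    return reply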
Project Structure
vertex-ai-memory-bank-mcp/
├── memory_bank_server.py                   # Main entry point
├── src/                                    # Modular source code
│   ├── __init__.py
│   ├── server.py                           # Server orchestration
│   ├── tools.py                            # MCP tool implementations
│   ├── config.py                           # Configuration management
│   ├── app_state.py                        # Application state
│   ├── validators.py                       # Input validation
│   └── formatters.py                       # Data formatting
├── examples/                               # Usage examples
│   ├── basic_usage.py                      # Basic MCP client usage
│   ├── automatic_tool_calling.py           # Automatic function calling
│   └── claude_config.json                  # Claude Desktop config
├── get_started_with_memory_bank_mcp.ipynb  # Getting started tutorial
├── pyproject.toml                          # Project config (pip & uv)
├── requirements.txt                        # Dependencies (pip)
├── uv.lock                                 # Lock file (uv)
├── .env.example                            # Environment template
├── .gitignore                              # Git ignore rules
├── .python-version                         # Python version
├── README.md                               # This file
└── LICENSE                                 # Apache 2.0 License
Troubleshooting
"Connection closed" error
Solution: Check that your MCP server is using stderr for logging, not stdout.
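With a stdio transport, stdout carries the JSON-RPC messages between client and server, so any print() or log line written to stdout breaks the protocol. A minimal sketch of stderr-only logging:
import logging
import sys

# Keep stdout reserved for MCP JSON-RPC; send all logs to stderr
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
logging.getLogger(__name__).info("memory bank server starting")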
"Not authenticated"
Solution: Run gcloud auth application-default login
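To verify that Application Default Credentials are in place after running the command, a quick check (assumes the google-auth package, which ships with the Google Cloud client libraries, is installed):
# Quick ADC check (assumes the google-auth package is installed)
import google.auth

credentials, project = google.auth.default()
print("Authenticated for project:", project)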
Contributing
This project is meant to inspire. Feel free to fork it, build your own version, and share your production implementations.
Resources
License
This project is licensed under the Apache 2.0 License.