vertex-memory-bank-mcp

inardini/vertex-memory-bank-mcp


Vertex AI Memory Bank MCP Server is a clean, simple server that enables LLMs to generate and retrieve long-term memories using Vertex AI's Memory Bank.


This is a personal project by Ivan Nardini to explore how to build a Model Context Protocol (MCP) server for Vertex AI Memory Bank.

Vertex AI Memory Bank MCP server is not a Google product and is not officially supported.


Vertex AI Memory Bank MCP Server

A simple MCP (Model Context Protocol) server that enables LLMs to generate and retrieve long-term memories using Vertex AI's Memory Bank.

Why This Project?

This server demonstrates how to build an MCP server on top of Vertex AI Memory Bank. It was inspired by a developer request and is released for the developer community.

Prerequisites

  • Python 3.11 or higher
  • Google Cloud account with Vertex AI API enabled
  • Basic understanding of async Python (helpful but not required)

Quick Start

Setup Google Cloud

# Install gcloud CLI (if not already installed)
# https://cloud.google.com/sdk/docs/install

# Authenticate
gcloud auth application-default login

# Set your project
gcloud config set project YOUR_PROJECT_ID

# Enable Vertex AI API
gcloud services enable aiplatform.googleapis.com

Install

# Clone the repository
git clone https://github.com/yourusername/vertex-ai-memory-bank-mcp.git
cd vertex-ai-memory-bank-mcp

# Install with pip
pip install -r requirements.txt

# OR install with uv (faster, recommended)
uv sync

# For running examples (optional)
pip install -e ".[examples]"
# OR with uv
uv sync --extra examples

Configure

# Copy the example environment file
cp .env.example .env

# Edit .env with your project details
GOOGLE_CLOUD_PROJECT=your-project-id
GOOGLE_CLOUD_LOCATION=us-central1

Run Your First Example

Interactive Tutorial (Recommended): Open get_started_with_memory_bank_mcp.ipynb in Jupyter

Or try the command-line examples:

# Basic MCP Client Usage
python examples/basic_usage.py

# Gemini Agent with Memory
python examples/gemini_memory_agent.py

# Automatic Tool Calling with Gemini
python examples/automatic_tool_calling.py

Use with Claude Desktop

Add to your Claude Desktop config (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "memory-bank": {
      "command": "python",
      "args": ["/path/to/memory_bank_server.py"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "your-project-id",
        "GOOGLE_CLOUD_LOCATION": "us-central1"
      }
    }
  }
}

Key Concepts

Memory Scope

Memories are scoped to users or contexts:

scope = {"user_id": "alice123"}

Memory Topics

Categorize what to remember:

topics = ["USER_PREFERENCES", "USER_PERSONAL_INFO"]

Semantic Search

Find relevant memories with similarity search:

search_query = "programming preferences"
top_k = 5
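
Putting the three concepts together: a retrieval request combines a scope, a search query, and a result limit. A minimal sketch of the argument dictionary such a request would carry (the parameter names mirror the retrieve_memories tool used in the patterns below):

```python
# Arguments for a retrieve_memories call: scope restricts whose
# memories are searched, search_query drives the similarity search,
# and top_k caps the number of results returned.
retrieve_args = {
    "scope": {"user_id": "alice123"},
    "search_query": "programming preferences",
    "top_k": 5,
}

# Basic sanity checks before sending the request.
assert retrieve_args["top_k"] > 0
assert "user_id" in retrieve_args["scope"]
```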

Available Tools

| Tool | Purpose | Example Use Case |
| --- | --- | --- |
| initialize_memory_bank | Set up connection to Vertex AI | First-time setup |
| generate_memories | Extract memories from conversations | After chat sessions |
| retrieve_memories | Fetch relevant memories | Personalize responses |
| create_memory | Manually add a memory | Store user preferences |
| delete_memory | Remove a specific memory | User requests deletion |
| list_memories | View all stored memories | Debugging/inspection |

Common Patterns

Pattern 1: Conversation Memory

# After each conversation turn
await session.call_tool(
    "generate_memories",
    {
        "conversation": conversation_history,
        "scope": {"user_id": user_id},
        "wait_for_completion": True
    }
)

Pattern 2: Explicit Memory

# Store specific facts
await session.call_tool(
    "create_memory",
    {
        "fact": "User prefers dark mode",
        "scope": {"user_id": user_id}
    }
)

Pattern 3: Context Retrieval

# Get relevant context before responding
memories = await session.call_tool(
    "retrieve_memories",
    {
        "scope": {"user_id": user_id},
        "search_query": user_message,
        "top_k": 5
    }
)

Project Structure

vertex-ai-memory-bank-mcp/
ā”œā”€ā”€ memory_bank_server.py                     # Main entry point
ā”œā”€ā”€ src/                                       # Modular source code
│   ā”œā”€ā”€ __init__.py
│   ā”œā”€ā”€ server.py                             # Server orchestration
│   ā”œā”€ā”€ tools.py                              # MCP tool implementations
│   ā”œā”€ā”€ config.py                             # Configuration management
│   ā”œā”€ā”€ app_state.py                          # Application state
│   ā”œā”€ā”€ validators.py                         # Input validation
│   └── formatters.py                         # Data formatting
ā”œā”€ā”€ examples/                                 # Usage examples
│   ā”œā”€ā”€ basic_usage.py                        # Basic MCP client usage
│   ā”œā”€ā”€ automatic_tool_calling.py             # Automatic function calling
│   └── claude_config.json                    # Claude Desktop config
ā”œā”€ā”€ get_started_with_memory_bank_mcp.ipynb    # Getting started tutorial
ā”œā”€ā”€ pyproject.toml                            # Project config (pip & uv)
ā”œā”€ā”€ requirements.txt                          # Dependencies (pip)
ā”œā”€ā”€ uv.lock                                   # Lock file (uv)
ā”œā”€ā”€ .env.example                              # Environment template
ā”œā”€ā”€ .gitignore                                # Git ignore rules
ā”œā”€ā”€ .python-version                           # Python version
ā”œā”€ā”€ README.md                                 # This file
└── LICENSE                                   # Apache 2.0 License

Troubleshooting

"Connection closed" error

Solution: Check that your MCP server is using stderr for logging, not stdout.

"Not authenticated"

Solution: Run gcloud auth application-default login

Contributing

This project is meant to inspire. Feel free to fork it, create your own version, and share your production implementations.

License

This project is licensed under the Apache 2.0 License.