Brain Server - MCP Knowledge Embedding Service

A powerful MCP (Model Context Protocol) server for managing knowledge embeddings and vector search.

Features

  • Vector Embeddings: Generate high-quality embeddings for knowledge content
  • Semantic Search: Find knowledge based on meaning, not just keywords
  • MCP Compliance: Follows Model Context Protocol for AI integration
  • Brain Management: Organize knowledge into domain-specific brains
  • Context-Aware Retrieval: Includes surrounding context for better understanding
  • Progress Tracking: Real-time monitoring of long-running operations

Embedding Models

The server uses embedding models to convert text into vector representations:

  • On first run, the server will automatically download the embedding models
  • By default, it uses Xenova/all-MiniLM-L6-v2 from HuggingFace
  • Models are cached locally after first download
  • For testing, a MockEmbeddingProvider is available that generates random vectors

You can configure which model to use in the .env file:

EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2

Supported models include:

  • Xenova/all-MiniLM-L6-v2 (default, 384 dimensions)
  • Xenova/bge-small-en-v1.5 (384 dimensions)
  • Xenova/bge-base-en-v1.5 (768 dimensions)
  • Xenova/e5-small-v2 (384 dimensions)
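
As a rough sketch of what model loading looks like under the hood, the snippet below uses the @xenova/transformers package (the library behind the Xenova/* model IDs) to produce a 384-dimensional vector with the default model; the server's actual embedding provider may be wired differently:

// embedding-sketch.ts — illustrative only; assumes @xenova/transformers is installed
import { pipeline } from "@xenova/transformers";

async function embed(text: string): Promise<number[]> {
  // The model is downloaded and cached locally on first use, as described above
  const extractor = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

  // Mean-pool and normalize to get a single sentence-level vector
  const output = await extractor(text, { pooling: "mean", normalize: true });
  return Array.from(output.data as Float32Array);
}

embed("The Model Context Protocol standardizes AI tool access.")
  .then((vector) => console.log(`Vector length: ${vector.length}`)); // 384 for all-MiniLM-L6-v2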

Quick Start with Docker

The easiest way to run Brain Server is using Docker and Docker Compose:

# Clone the repository
git clone https://github.com/patrickdeluca/mcp-brain-server.git
cd mcp-brain-server

# Start the server with Docker Compose
docker-compose up -d

# View logs
docker-compose logs -f

The server will be available at http://localhost:3000 with MongoDB running inside the same container.

Using Docker with Claude Desktop

To use the dockerized Brain Server with Claude Desktop, add this entry to the mcpServers section of your claude_desktop_config.json:

{
  "brain-server": {
    "command": "docker",
    "args": ["run", "--rm", "-p", "3000:3000", "patrickdeluca/mcp-brain-server:latest"],
    "env": {}
  }
}

Prerequisites for Local Installation

MongoDB Installation

If you're not using Docker, the Brain Server requires MongoDB (version 6.0 or later recommended for vector search):

Modern Installation (Recommended)

Ubuntu/Debian

# Import the MongoDB public key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | sudo tee /etc/apt/trusted.gpg.d/mongodb-6.0.asc

# Add the MongoDB repository
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-6.0.list

# Update the package database and install MongoDB
sudo apt-get update
sudo apt-get install -y mongodb-org

# Start the MongoDB service
sudo systemctl start mongod

macOS

# Using Homebrew
brew tap mongodb/brew
brew install mongodb-community@6.0
brew services start mongodb-community@6.0

Windows

  1. Download the MongoDB 6.0 installer from the MongoDB Download Center
  2. Run the installer and follow the setup wizard
  3. Start MongoDB from the Windows Services console

Verify MongoDB Installation

To verify that MongoDB is running properly:

mongosh --eval "db.version()"

Manual Installation

# Clone the repository
git clone https://github.com/patrickdeluca/mcp-brain-server.git
cd mcp-brain-server

# Install dependencies
npm install

# Configure environment 
cp .env.example .env
# Edit .env with your settings

# Build the project
npm run build

Configuration

Configure the server using environment variables in the .env file:

# Server Configuration
PORT=3000

# MongoDB Configuration
MONGODB_URI=mongodb://localhost:27017/brain_db

# Model Configuration
EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
MAX_CHUNK_SIZE=1024
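
As a minimal sketch (not the server's actual config module, which lives in src/config/), these variables can be read at startup with the dotenv package:

// config-sketch.ts — illustrative only; assumes the dotenv package is installed
import "dotenv/config"; // loads .env into process.env

const config = {
  port: Number(process.env.PORT ?? 3000),
  mongodbUri: process.env.MONGODB_URI ?? "mongodb://localhost:27017/brain_db",
  embeddingModel: process.env.EMBEDDING_MODEL ?? "Xenova/all-MiniLM-L6-v2",
  maxChunkSize: Number(process.env.MAX_CHUNK_SIZE ?? 1024),
};

console.log(`Using ${config.embeddingModel} on port ${config.port}`);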

Usage

Starting the Server

# Development mode
npm run dev

# Production mode
npm start

Using with Claude Desktop

Add the brain server to the mcpServers section of your claude_desktop_config.json file:

{
  "brain-server": {
    "command": "node",
    "args": ["path/to/mcp-brain-server/dist/index.js"],
    "env": {
      "MONGODB_URI": "mongodb://localhost:27017/brain_db"
    }
  }
}

Using with MCP Inspector

To debug or test the server, you can use the MCP Inspector:

npx @modelcontextprotocol/inspector node dist/index.js

Building Your Own Docker Image

If you want to build and run your own Docker image:

# Build the Docker image
docker build -t mcp-brain-server .

# Run the container
docker run -p 3000:3000 -d --name brain-server mcp-brain-server

The Docker image includes both the Brain Server and MongoDB for a self-contained deployment.

MCP Resources

The server exposes the following MCP resources:

  • embedding_config: Current embedding configuration
  • embedding_models: Available embedding models and their configurations
  • service_status: Current status of the embedding service

MCP Tool Usage

The server exposes the following MCP tools:

  • addKnowledge: Add new knowledge to the vector database
  • searchSimilar: Find semantically similar content
  • updateKnowledge: Update existing knowledge entries
  • deleteKnowledge: Remove knowledge entries
  • batchAddKnowledge: Add multiple knowledge entries in a batch
  • getEmbedding: Generate embeddings for text content
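
These tools, along with the resources above, can be discovered from any MCP client. The sketch below uses the official TypeScript SDK (@modelcontextprotocol/sdk) to launch the built server over stdio and list what it exposes; the path and environment values are placeholders to adjust for your setup:

// client-sketch.ts — illustrative only; assumes @modelcontextprotocol/sdk is installed
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server over stdio (adjust the path to your checkout)
  const transport = new StdioClientTransport({
    command: "node",
    args: ["path/to/mcp-brain-server/dist/index.js"],
    env: { MONGODB_URI: "mongodb://localhost:27017/brain_db" },
  });

  const client = new Client({ name: "brain-client-sketch", version: "0.0.1" }, { capabilities: {} });
  await client.connect(transport);

  // Discover the tools and resources described above
  const { tools } = await client.listTools();
  const { resources } = await client.listResources();
  console.log(tools.map((t) => t.name), resources.map((r) => r.name));

  await client.close();
}

main().catch(console.error);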

Development

Project Structure

src/
├── config/          # Configuration settings
├── controllers/     # Route controllers
├── errors/          # Error definitions
├── middleware/      # Express middleware
├── models/          # Data models and types
├── services/        # Business logic
│   ├── embeddings/  # Embedding providers
│   ├── ingestion/   # Knowledge ingestion
│   ├── processing/  # Knowledge processing
│   └── storage/     # Storage services
├── tools/           # MCP tool definitions
├── types/           # TypeScript type definitions
├── utils/           # Utility functions
├── server.ts        # MCP server setup
└── index.ts         # Application entry point

Tool Schema Examples

Here's an example of using the addKnowledge tool:

{
  "content": "The Model Context Protocol (MCP) is a standardized interface for AI models to interact with external systems.",
  "metadata": {
    "brainId": "tech-knowledge",
    "userId": "user123",
    "source": "documentation",
    "type": "definition"
  }
}

And the searchSimilar tool:

{
  "query": "What is MCP?",
  "options": {
    "limit": 5,
    "minConfidence": 0.7,
    "filters": {
      "metadata.brainId": "tech-knowledge"
    }
  }
}
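
For a programmatic call (rather than via Claude Desktop), the same TypeScript SDK setup as the sketch above can invoke searchSimilar with this payload; the argument shape is taken from the example here, but verify it against the server's live tool schema:

// search-sketch.ts — illustrative only; mirrors the client setup shown earlier
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["path/to/mcp-brain-server/dist/index.js"],
    env: { MONGODB_URI: "mongodb://localhost:27017/brain_db" },
  });
  const client = new Client({ name: "brain-search-sketch", version: "0.0.1" }, { capabilities: {} });
  await client.connect(transport);

  // Invoke searchSimilar with the same arguments as the JSON example above
  const result = await client.callTool({
    name: "searchSimilar",
    arguments: {
      query: "What is MCP?",
      options: {
        limit: 5,
        minConfidence: 0.7,
        filters: { "metadata.brainId": "tech-knowledge" },
      },
    },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch(console.error);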

Troubleshooting

Docker Issues

  • Check container logs: docker logs brain-server
  • Ensure ports are correctly mapped: docker ps
  • Verify MongoDB is running in the container: docker exec brain-server ps aux | grep mongod

MongoDB Connection Issues

  • Verify MongoDB is running: ps aux | grep mongod
  • Check MongoDB logs: sudo cat /var/log/mongodb/mongod.log
  • Ensure your firewall allows connections to MongoDB (default port 27017)
  • Verify your connection string in .env: MONGODB_URI=mongodb://localhost:27017/brain_db (a quick connectivity check is sketched below)
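
To isolate connection problems from application issues, a quick ping from Node can help; this sketch assumes the official mongodb driver is installed (npm install mongodb):

// mongo-ping.ts — illustrative connectivity check; assumes the mongodb package is installed
import { MongoClient } from "mongodb";

async function ping() {
  const uri = process.env.MONGODB_URI ?? "mongodb://localhost:27017/brain_db";
  const client = new MongoClient(uri);
  try {
    await client.connect();
    // An explicit ping confirms the server is reachable and accepting commands
    const result = await client.db().command({ ping: 1 });
    console.log("MongoDB reachable:", result); // { ok: 1 }
  } finally {
    await client.close();
  }
}

ping().catch((err) => {
  console.error("MongoDB connection failed:", err.message);
  process.exit(1);
});
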
Missing Vector Index Capabilities

If you encounter errors related to vector index capabilities:

  • Ensure you're using MongoDB 6.0+ for optimal vector search support
  • For older MongoDB versions, the server will fall back to approximate nearest neighbors search

Available Scripts

  • npm run build: Build the TypeScript project
  • npm start: Run the built application
  • npm run dev: Run in development mode with hot reloading
  • npm test: Run tests
  • npm run lint: Run linter

License

This project is licensed under the ISC License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.