cognee-mcp - Run cognee's memory engine as a Model Context Protocol server
Demo · Learn more · Join Discord · Join r/AIMemory
Build memory for Agents and query from any client that speaks MCP, in your terminal or IDE.
✨ Features
- Multiple transports: choose Streamable HTTP --transport http (recommended for web deployments), SSE --transport sse (real-time streaming), or stdio (classic pipe, default)
- API Mode: connect to an already running Cognee FastAPI server instead of using cognee directly (see API Mode below)
- Integrated logging: all actions written to a rotating file (see get_log_file_location()) and mirrored to console in dev
- Local file ingestion: feed .md, source files, Cursor rule-sets, etc. straight from disk
- Background pipelines: long-running cognify & codify jobs spawn off-thread; check progress with the status tools
- Developer rules bootstrap: one call indexes .cursorrules, .cursor/rules, AGENT.md, and friends into the developer_rules nodeset
- Prune & reset: wipe memory clean with a single prune call when you want to start fresh
Please refer to our documentation here for further information.
🚀 Quick Start
- Clone the cognee repo
git clone https://github.com/topoteretes/cognee.git
- Navigate to the cognee-mcp subdirectory
cd cognee/cognee-mcp
- Install uv if you don't have it
pip install uv
- Install all the dependencies you need for the cognee MCP server with uv
uv sync --dev --all-extras --reinstall
- Activate the virtual environment in the cognee-mcp directory
source .venv/bin/activate
- Set up your OpenAI API key in .env for a quick setup with the default cognee configurations
LLM_API_KEY="YOUR_OPENAI_API_KEY"
- Run the cognee MCP server with stdio (default)
python src/server.py
or stream responses over SSE
python src/server.py --transport sse
or run with Streamable HTTP transport (recommended for web deployments)
python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
You can set up more advanced configurations by creating a .env file from our template. To use different LLM providers or database configurations, and for more info, check out our documentation.
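For illustration, a minimal .env for the default OpenAI setup might look like the sketch below. Only LLM_API_KEY appears in this guide; the other variable names are assumptions based on the template and may differ between cognee versions, so check the template for the exact names.
# Required: your LLM provider key (OpenAI by default)
LLM_API_KEY="YOUR_OPENAI_API_KEY"
# Assumed template variables for switching the LLM provider/model
LLM_PROVIDER="openai"
LLM_MODEL="gpt-4o-mini"
# Assumed template variables for choosing relational, vector, and graph stores
DB_PROVIDER="sqlite"
VECTOR_DB_PROVIDER="lancedb"
GRAPH_DATABASE_PROVIDER="kuzu"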
🐳 Docker Usage
If you'd rather run cognee-mcp in a container, you have two options:
1. Build locally

- Make sure you are in the /cognee root directory and have a fresh .env containing only your LLM_API_KEY (and your chosen settings).
- Remove any old image and rebuild:
docker rmi cognee/cognee-mcp:main || true
docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
- Run it:
# For HTTP transport (recommended for web deployments)
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
# For SSE transport
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
# For stdio transport (default)
docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main

Installing optional dependencies at runtime:
You can install optional dependencies when running the container by setting the EXTRAS environment variable:
# Install a single optional dependency group at runtime
docker run \
  -e TRANSPORT_MODE=http \
  -e EXTRAS=aws \
  --env-file ./.env \
  -p 8000:8000 \
  --rm -it cognee/cognee-mcp:main
# Install multiple optional dependency groups at runtime (comma-separated)
docker run \
  -e TRANSPORT_MODE=sse \
  -e EXTRAS=aws,postgres,neo4j \
  --env-file ./.env \
  -p 8000:8000 \
  --rm -it cognee/cognee-mcp:main

Available optional dependency groups:
- aws - S3 storage support
- postgres / postgres-binary - PostgreSQL database support
- neo4j - Neo4j graph database support
- neptune - AWS Neptune support
- chromadb - ChromaDB vector store support
- scraping - Web scraping capabilities
- distributed - Modal distributed execution
- langchain - LangChain integration
- llama-index - LlamaIndex integration
- anthropic - Anthropic models
- groq - Groq models
- mistral - Mistral models
- ollama / huggingface - Local model support
- docs - Document processing
- codegraph - Code analysis
- monitoring - Sentry & Langfuse monitoring
- redis - Redis support
- And more (see pyproject.toml for the full list)
2. Pull from Docker Hub (no build required):

# With HTTP transport (recommended for web deployments)
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
# With SSE transport
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
# With stdio transport (default)
docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main

With runtime installation of optional dependencies:
# Install optional dependencies from the Docker Hub image
docker run \
  -e TRANSPORT_MODE=http \
  -e EXTRAS=aws,postgres \
  --env-file ./.env \
  -p 8000:8000 \
  --rm -it cognee/cognee-mcp:main
Important: Docker vs Direct Usage
Docker uses environment variables, not command line arguments:
- ✅ Docker: -e TRANSPORT_MODE=http
- ❌ Docker: --transport http (won't work)
Direct Python usage uses command line arguments:
- ✅ Direct: python src/server.py --transport http
- ❌ Direct: -e TRANSPORT_MODE=http (won't work)
Docker API Mode
To connect the MCP Docker container to a Cognee API server running on your host machine:
Simple Usage (Automatic localhost handling):
# Start your Cognee API server on the host
python -m cognee.api.client
# Run MCP container in API mode - localhost is automatically converted!
docker run \
-e TRANSPORT_MODE=sse \
-e API_URL=http://localhost:8000 \
-e API_TOKEN=your_auth_token \
-p 8001:8000 \
--rm -it cognee/cognee-mcp:main
Note: The container will automatically convert localhost to host.docker.internal on Mac/Windows/Docker Desktop. You'll see a message in the logs showing the conversion.
Explicit host.docker.internal (Mac/Windows):
# Or explicitly use host.docker.internal
docker run \
-e TRANSPORT_MODE=sse \
-e API_URL=http://host.docker.internal:8000 \
-e API_TOKEN=your_auth_token \
-p 8001:8000 \
--rm -it cognee/cognee-mcp:main
On Linux (use host network or container IP):
# Option 1: Use host network (simplest)
docker run \
--network host \
-e TRANSPORT_MODE=sse \
-e API_URL=http://localhost:8000 \
-e API_TOKEN=your_auth_token \
--rm -it cognee/cognee-mcp:main
# Option 2: Use host IP address
# First, get your host IP: ip addr show docker0
docker run \
-e TRANSPORT_MODE=sse \
-e API_URL=http://172.17.0.1:8000 \
-e API_TOKEN=your_auth_token \
-p 8001:8000 \
--rm -it cognee/cognee-mcp:main
Environment variables for API mode:
- API_URL: URL of the running Cognee API server
- API_TOKEN: Authentication token (optional, required if the API has authentication enabled)
Note: When running in API mode:
- Database migrations are automatically skipped (API server handles its own DB)
- Some features are limited (see API Mode Limitations)
MCP Client Configuration
After starting your Cognee MCP server with Docker, you need to configure your MCP client to connect to it.
SSE Transport Configuration (Recommended)
Start the server with SSE transport:
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
Configure your MCP client:
Claude CLI (Easiest)
claude mcp add cognee-sse -t sse http://localhost:8000/sse
Verify the connection:
claude mcp list
You should see your server connected:
Checking MCP server health...
cognee-sse: http://localhost:8000/sse (SSE) - ✓ Connected
Manual Configuration
Claude (~/.claude.json)
{
"mcpServers": {
"cognee": {
"type": "sse",
"url": "http://localhost:8000/sse"
}
}
}
Cursor (~/.cursor/mcp.json)
{
"mcpServers": {
"cognee-sse": {
"url": "http://localhost:8000/sse"
}
}
}
HTTP Transport Configuration (Alternative)
Start the server with HTTP transport:
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
Configure your MCP client:
Claude CLI (Easiest)
claude mcp add cognee-http -t http http://localhost:8000/mcp
Verify the connection:
claude mcp list
You should see your server connected:
Checking MCP server health...
cognee-http: http://localhost:8000/mcp (HTTP) - ✓ Connected
Manual Configuration
Claude (~/.claude.json)
{
"mcpServers": {
"cognee": {
"type": "http",
"url": "http://localhost:8000/mcp"
}
}
}
Cursor (~/.cursor/mcp.json)
{
"mcpServers": {
"cognee-http": {
"url": "http://localhost:8000/mcp"
}
}
}
Dual Configuration Example
You can configure both transports simultaneously for testing:
{
"mcpServers": {
"cognee-sse": {
"type": "sse",
"url": "http://localhost:8000/sse"
},
"cognee-http": {
"type": "http",
"url": "http://localhost:8000/mcp"
}
}
}
Note: Only enable the server you're actually running to avoid connection errors.
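If you run the server directly with stdio (outside Docker), MCP clients launch the process themselves via a command-style entry instead of a URL. A hedged sketch for Claude (~/.claude.json), assuming the repository path and uv-based invocation from the Quick Start (adjust both to your setup):
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["--directory", "/path/to/cognee/cognee-mcp", "run", "python", "src/server.py"],
      "env": {
        "LLM_API_KEY": "YOUR_OPENAI_API_KEY"
      }
    }
  }
}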
API Mode
The MCP server can operate in two modes:
Direct Mode (Default)
The MCP server directly imports and uses the cognee library. This is the default mode with full feature support.
API Mode
The MCP server connects to an already running Cognee FastAPI server via HTTP requests. This is useful when:
- You have a centralized Cognee API server running
- You want to separate the MCP server from the knowledge graph backend
- You need multiple MCP servers to share the same knowledge graph
Starting the MCP server in API mode:
# Start your Cognee FastAPI server first (default port 8000)
cd /path/to/cognee
python -m cognee.api.client
# Then start the MCP server in API mode
cd cognee-mcp
python src/server.py --api-url http://localhost:8000 --api-token YOUR_AUTH_TOKEN
API Mode with different transports:
# With SSE transport
python src/server.py --transport sse --api-url http://localhost:8000 --api-token YOUR_TOKEN
# With HTTP transport
python src/server.py --transport http --api-url http://localhost:8000 --api-token YOUR_TOKEN
API Mode with Docker:
# On Mac/Windows (use host.docker.internal to access host)
docker run \
-e TRANSPORT_MODE=sse \
-e API_URL=http://host.docker.internal:8000 \
-e API_TOKEN=YOUR_TOKEN \
-p 8001:8000 \
--rm -it cognee/cognee-mcp:main
# On Linux (use host network)
docker run \
--network host \
-e TRANSPORT_MODE=sse \
-e API_URL=http://localhost:8000 \
-e API_TOKEN=YOUR_TOKEN \
--rm -it cognee/cognee-mcp:main
Command-line arguments for API mode:
- --api-url: Base URL of the running Cognee FastAPI server (e.g., http://localhost:8000)
- --api-token: Authentication token for the API (optional, required if the API has authentication enabled)
Docker environment variables for API mode:
- API_URL: Base URL of the running Cognee FastAPI server
- API_TOKEN: Authentication token (optional, required if the API has authentication enabled)
API Mode limitations: Some features are only available in direct mode:
- codify (code graph pipeline)
- cognify_status / codify_status (pipeline status tracking)
- prune (data reset)
- get_developer_rules (developer rules retrieval)
- list_data with a specific dataset_id (detailed data listing)
Basic operations like cognify, search, delete, and list_data (all datasets) work in both modes.
💻 Basic Usage
The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).
Available Tools
- cognify: Turns your data into a structured knowledge graph and stores it in memory
- codify: Analyses a code repository, builds a code graph, and stores it in memory
- search: Query memory; supports GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS
- list_data: List all datasets and their data items with IDs for deletion operations
- delete: Delete specific data from a dataset (supports soft/hard deletion modes)
- prune: Reset cognee for a fresh start (removes all data)
- cognify_status / codify_status: Track pipeline progress
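Memory and search examples (a hedged sketch: the argument names shown are illustrative and may differ slightly between versions; the calling style mirrors the data management examples below):
# Turn text (or a local file path) into a knowledge graph
cognify(data="Cognee turns your documents into a queryable memory graph.")
# Build a code graph for a repository
codify(repo_path="/path/to/your/repo")
# Query memory with a graph-grounded completion
search(search_query="What does cognee do?", search_type="GRAPH_COMPLETION")
# Check background pipeline progress
cognify_status()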
Data Management Examples:
# List all available datasets and data items
list_data()
# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")
# Delete specific data (soft deletion - safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")
# Delete specific data (hard deletion - removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")
Development and Debugging
Debugging
To use the debugger, run:
mcp dev src/server.py
Open the inspector with the timeout passed:
http://localhost:5173?timeout=120000
To apply new changes while developing cognee, you need to:
- Update dependencies in the cognee folder if needed:
uv sync --dev --all-extras --reinstall
- Restart the MCP dev server:
mcp dev src/server.py
Development
In order to use local cognee:
- Uncomment the following line in the cognee-mcp pyproject.toml and set the cognee root path:
#"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users/<username>/Desktop/cognee"
Remember to replace file:/Users/<username>/Desktop/cognee with your actual cognee root path.
- Install dependencies with uv in the mcp folder:
uv sync --reinstall
Code of Conduct
We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.