Nonymaus/cursor-kg
A local-first Knowledge Graph MCP Server that enhances coding efficiency by integrating with Cursor IDE.
Knowledge Graph MCP Server
A blazingly fast, local-first Knowledge Graph server that connects to Cursor IDE. Think of it as your personal AI memory that gets smarter as you code, without sending your data anywhere.
What This Does
- 10-40x Faster than comparable Python-based knowledge graph servers
- 100% Local - your code never leaves your machine (no API keys needed!)
- Smart Memory - remembers your conversations, decisions, and code patterns
- Real-time - syncs instantly with Cursor IDE as you work
- Powerful Search - find anything across your entire codebase and conversations
- Secure - built-in authentication and input validation
Quick Start (10 Minutes to Running)
Prerequisites
- Rust (we'll install this for you)
- macOS, Linux, or Windows
- Cursor IDE (recommended) or any MCP-compatible editor
Step 1: Get the Code
git clone https://github.com/Nonymaus/cursor-kg.git
cd cursor-kg
Step 2: Install Rust (if you don't have it)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source ~/.cargo/env
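Before building, it's worth confirming the toolchain is actually on your PATH; freshly opened shells sometimes miss the rustup environment. A small sketch:

```shell
# Confirm the Rust toolchain is reachable before building
if command -v cargo >/dev/null 2>&1; then
  cargo --version
else
  echo "cargo not found: run 'source ~/.cargo/env' or reinstall rustup"
fi
```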
Step 3: Build and Run
# Build the server (takes 2-3 minutes first time)
cargo build --release
# Start the server
cargo run --release
That's it! The server is now running on http://localhost:8360
Connect to Cursor IDE
Automatic Setup (Recommended)
# This creates the config file for you
mkdir -p ~/.cursor
echo '{
"mcpServers": {
"cursor-kg": {
"url": "http://localhost:8360/sse"
}
}
}' > ~/.cursor/mcp.json
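If you want to sanity-check the file before Cursor reads it, a sketch like this validates the JSON. It writes to a temp path for illustration; point it at ~/.cursor/mcp.json for the real file:

```shell
# Write the config to a temp file and confirm it parses as JSON
CONFIG="$(mktemp)"
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "cursor-kg": { "url": "http://localhost:8360/sse" }
  }
}
EOF
python3 -m json.tool "$CONFIG" >/dev/null && echo "config OK"
```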
Manual Setup
- Open Cursor IDE
- Go to Settings → MCP Servers
- Add this configuration:
{ "mcpServers": { "cursor-kg": { "url": "http://localhost:8360/sse" } } }
Test the Connection
In Cursor, try asking: "What's in my knowledge graph?"
If it works, you'll see a response from the server!
Configuration Options
All settings are in config.toml. Here are the most important ones:
# Basic Settings
[database]
filename = "knowledge_graph.db" # Where your data is stored
[embeddings]
model_name = "nomic-embed-text-v1.5" # AI model for understanding text
batch_size = 16 # How many texts to process at once
# Security (optional)
[security]
enable_authentication = false # Set to true for API key protection
api_key = "" # Your secret key (if auth enabled)
rate_limit_requests_per_minute = 60 # Prevent spam
# Performance
[memory]
max_cache_size_mb = 128 # How much RAM to use for caching
Tip: The defaults work great for most people. Only change these if you know what you're doing!
How to Use It
Basic Commands
# Start the server
cargo run --release
# Start with debug info (if something's wrong)
RUST_LOG=debug cargo run --release
# Run on a different port
MCP_PORT=9000 cargo run --release
# Check if it's working
curl http://localhost:8360/health
# Should return: {"status":"ok"}
What You Can Do
Once connected to Cursor, you can:
Ask Questions
- "What did we discuss about the authentication system?"
- "Show me all the functions related to database queries"
- "What are the main components of this project?"
Add Information
- "Remember that we decided to use SQLite for the database"
- "Add this code pattern to the knowledge graph"
- "Store this meeting summary"
Search & Analyze
- "Find similar code patterns"
- "What are the dependencies between these files?"
- "Show me the project structure"
Advanced Usage
Enable Security (for production):
# In config.toml
[security]
enable_authentication = true
api_key = "your-secret-key-here"
Run with Docker:
docker build -t cursor-kg .
docker run -p 8360:8360 cursor-kg
Monitor Performance:
# Check server stats
curl http://localhost:8360/metrics
Troubleshooting
Server Won't Start
# Check if port is already in use
lsof -i :8360
# Try a different port
MCP_PORT=8361 cargo run --release
# Check for errors
RUST_LOG=debug cargo run --release
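The first two checks can be combined into a small helper that walks upward from 8360 until it finds a free port. This is a sketch, not part of cursor-kg, and it assumes lsof is available:

```shell
# Find the first free port at or above 8360, then start the server on it
PORT=8360
while lsof -i ":$PORT" >/dev/null 2>&1; do
  PORT=$((PORT + 1))
done
echo "free port: $PORT"
# MCP_PORT=$PORT cargo run --release
```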
Cursor Can't Connect
- Check the server is running: Visit http://localhost:8360/health
- Verify Cursor config: Make sure ~/.cursor/mcp.json has the right URL
- Restart Cursor: Sometimes it needs a restart to pick up new MCP servers
- Check the logs: Look for error messages in the terminal where you started the server
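The first of those checks can be scripted. This sketch polls the health endpoint a few times (assuming it behaves as described under Basic Commands) and reports either way:

```shell
# Poll the health endpoint a few times before giving up
URL="http://localhost:8360/health"
up=false
for i in 1 2 3; do
  if curl -fsS --max-time 2 "$URL" >/dev/null 2>&1; then
    up=true
    break
  fi
  sleep 1
done
if $up; then echo "server is up"; else echo "server not reachable at $URL"; fi
```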
Performance Issues
# Check database size
ls -lh knowledge_graph.db
# Clear cache and restart
rm -rf ~/.cache/cursor-kg/
cargo run --release
# Reduce memory usage in config.toml
[memory]
max_cache_size_mb = 64 # Default is 128
Common Errors
"Failed to bind to address" ā Port 8360 is already in use. Try a different port or kill the other process.
"Database is locked"
ā Another instance might be running. Check with ps aux | grep cursor-kg
"Model not found" ā The AI model is downloading. Wait a few minutes and try again.
Architecture (For Developers)
+-------------+     +--------------+     +-------------+
| Cursor IDE  |---->| MCP Protocol |---->|  cursor-kg  |
+-------------+     +--------------+     +------+------+
                                                |
        +---------------------------------------+
        |
        v
+---------------------------------------+
|             Graph Engine              |
|  +--------------+------------------+  |
|  | Episodes     | Relationships    |  |
|  | Entities     | Embeddings       |  |
|  +--------------+------------------+  |
+-------------------+-------------------+
                    |
                    v
+---------------------------------------+
|             Storage Layer             |
|  +--------------+------------------+  |
|  | SQLite       | Cache            |  |
|  | FTS5         | In-Memory        |  |
|  +--------------+------------------+  |
+---------------------------------------+
Tech Stack:
- Rust - Fast, safe systems programming
- SQLite + FTS5 - Local database with full-text search
- ONNX Runtime - Local AI models (no internet required)
- MCP Protocol - Standard way to connect to editors
Performance
This thing is fast. Here's why:
- Written in Rust - Compiled, not interpreted
- Local everything - No network calls to AI APIs
- Smart caching - Frequently used data stays in memory
- Efficient storage - SQLite with full-text search built-in
Real numbers:
- Memory: ~50MB baseline (grows with your data)
- Storage: ~2MB per 1000 conversations/episodes
- Speed: 10-40x faster than Python-based alternatives
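Those storage numbers make capacity planning easy. A back-of-envelope sketch using the ~2 MB per 1000 episodes figure above (the episode count here is a made-up example):

```shell
# Estimate on-disk size from the ~2 MB per 1000 episodes figure
EPISODES=50000
echo "estimated DB size: $(( EPISODES * 2 / 1000 )) MB"
# prints "estimated DB size: 100 MB"
```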
Development
Want to contribute or modify the code? Here's how:
Project Structure
cursor-kg/
├── src/
│   ├── main.rs        # Server entry point
│   ├── mcp/           # MCP protocol handling
│   ├── graph/         # Knowledge graph logic
│   ├── embeddings/    # AI model integration
│   ├── search/        # Search functionality
│   └── security/      # Authentication & validation
├── config.toml        # Configuration
├── tests/             # Test files
└── README.md          # This file
Running Tests
# Run all tests
cargo test
# Run with output
cargo test -- --nocapture
# Run specific test
cargo test test_name
Making Changes
- Fork the repo on GitHub
- Make your changes in a new branch
- Test everything with cargo test
- Submit a pull request
Adding Features
- New MCP tools: add to src/mcp/handlers.rs
- Database changes: modify src/graph/storage.rs
- Configuration options: update config.toml and src/config/mod.rs
Docker (Optional)
If you prefer containers:
# Build and run
docker build -t cursor-kg .
docker run -p 8360:8360 cursor-kg
# Or use docker-compose
docker-compose up -d
More Information
- Security: see the repository's security documentation for details on authentication, rate limiting, and input validation
- Configuration: see the configuration documentation for detailed config options
- Development: check out the other .md files in the repo for implementation details
Contributing
Found a bug? Want to add a feature? Contributions are welcome!
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Built with Rust for performance and safety
- Uses SQLite for reliable local storage
- Integrates with Cursor IDE via the MCP protocol
- AI embeddings powered by ONNX Runtime
Questions? Open an issue on GitHub or check the troubleshooting section above!