# ObsidianRAG MCP Server

A high-performance, Docker-based RAG (Retrieval-Augmented Generation) server designed to turn your local Obsidian vault into an intelligent, queryable knowledge base.
This server acts as a long-term memory, allowing you to perform semantic searches and ask complex questions about your own notes. It's built to be used as a secure, local MCP (Model Context Protocol) tool with the Gemini CLI.
## Features
- 🧠 **Intelligent Search**: Uses Google's state-of-the-art `text-embedding-004` model to understand the meaning of your notes, not just keywords.
- 🔒 **Private & Secure**: Your notes are indexed and stored in a local vector database. Only the text for embedding generation is sent to the Google API, and your API key is stored locally.
- 🚀 **Optimized for Apple Silicon**: The Docker container is explicitly built for the `linux/arm64` architecture to ensure maximum performance on M-series Macs.
- 📦 **Containerized & Simple**: The entire application is managed via Docker Compose, making setup and teardown a breeze.
- 💾 **Persistent Knowledge**: The vector database is stored in a persistent Docker volume, so your knowledge base survives container restarts.
- 🔧 **Dynamic Configuration**: Easily configure the path to your Obsidian vault via an environment file.
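The retrieval idea behind these features can be sketched as follows. This is an illustrative toy, not the server's actual code: the hard-coded 3-dimensional vectors stand in for real `text-embedding-004` embeddings, and the `notes` dict stands in for the persistent vector database.

```python
import math

# Toy stand-ins for embeddings; the real server would call Google's
# text-embedding-004 model and persist the vectors in a local database.
notes = {
    "projects/ai-ideas.md": [0.9, 0.1, 0.0],
    "journal/2024-01-02.md": [0.1, 0.8, 0.1],
    "books/deep-work.md": [0.2, 0.2, 0.9],
}

def cosine_similarity(a, b):
    # Semantic search ranks by angle between vectors, not keyword overlap.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query_vector, top_k=2):
    # Rank every indexed note by similarity to the query embedding.
    ranked = sorted(
        notes.items(),
        key=lambda item: cosine_similarity(query_vector, item[1]),
        reverse=True,
    )
    return [path for path, _ in ranked[:top_k]]

print(search([1.0, 0.0, 0.0]))  # → ['projects/ai-ideas.md', 'books/deep-work.md']
```

In the real pipeline the query text is embedded with the same model as the notes, so "meaning, not keywords" falls out of the geometry: paraphrases land near each other in embedding space.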
## Getting Started
These instructions will get you a copy of the project up and running on your local machine.
### Prerequisites
- Docker (or an alternative like OrbStack)
- An active Google AI API Key. You can get one from Google AI Studio.
### Installation
1. **Clone the repository:**

   ```bash
   git clone https://github.com/your-username/ObsidianRAG.git
   cd ObsidianRAG
   ```

2. **Configure your environment:**

   - Rename the example environment file:

     ```bash
     mv .env.example .env
     ```

   - Open the `.env` file with a text editor and add your Google API Key and the absolute path to your Obsidian vault:

     ```bash
     # Your Google AI API Key for generating embeddings
     GOOGLE_API_KEY="AIzaSy..."

     # The absolute path to your local Obsidian Vault directory
     OBSIDIAN_VAULT_PATH="/path/to/your/vault"
     ```

3. **Build and run the server:**

   - From the project's root directory, run:

     ```bash
     docker-compose up --build -d
     ```

   - The first time you run this, Docker will build the image and download all dependencies. The server then starts in the background.
   - On the first launch, the server will begin indexing your entire vault. This may take a few minutes depending on the size of your vault. You can monitor the progress with:

     ```bash
     docker-compose logs -f
     ```
## Usage
This server is designed to be used as a tool within the Gemini CLI.
1. Open your Gemini CLI `settings.json` file.

2. Add the following configuration to the `"tools"` array:

   ```json
   {
     "tool_type": "MCP",
     "name": "obsidian_vault",
     "display_name": "Obsidian Vault",
     "description": "Queries my personal knowledge management system (Obsidian) for notes, concepts, and connections. Useful for complex questions based on my personal knowledge.",
     "url": "http://localhost:8000/mcp",
     "is_enabled": true
   }
   ```

3. Restart the Gemini CLI to apply the changes.

4. Query your knowledge base! You can now ask questions directly from the CLI:

   ```
   /tool obsidian_vault What are my most important notes on artificial intelligence?
   /tool obsidian_vault Summarize my thoughts on project management.
   ```
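Under the hood, the Gemini CLI talks to the endpoint at `http://localhost:8000/mcp` using MCP's JSON-RPC 2.0 interface. If you want to poke at the server directly, a tool invocation might be shaped like the sketch below. Note the `query` argument name is an assumption for illustration; inspect the server's actual tool schema (via a `tools/list` request) for the real parameter name.

```python
import json
import urllib.request

def build_tool_call(question, request_id=1):
    # MCP tool invocations are JSON-RPC 2.0 "tools/call" requests.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "obsidian_vault",
            # "query" is a guessed argument name; check the server's
            # tool schema for the actual one.
            "arguments": {"query": question},
        },
    }

payload = build_tool_call("Summarize my thoughts on project management.")
req = urllib.request.Request(
    "http://localhost:8000/mcp",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would send the request once the container is up.
print(json.dumps(payload, indent=2))
```

This is only a diagnostic convenience; in normal use the Gemini CLI handles the protocol for you.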
## License
This project is licensed under the MIT License; see the `LICENSE` file for details.