sethdavis512/obsidian-mcp-server
Obsidian MCP Server
A comprehensive Model Context Protocol (MCP) server that integrates your Obsidian vault with OpenAI's API. Client-agnostic design works with Windsurf, Claude Desktop, Cursor, VS Code, and any MCP-compatible client.
Related Project: For enhanced Obsidian integration, see the companion Obsidian MCP Plugin that provides direct plugin API access.
Features
- Full Vault Access: Read, write, search, and manage your Obsidian notes
- AI-Powered Operations: Summarize notes, generate content, extract tasks, and answer questions using OpenAI
- Flexible Transport: Supports both stdio (local) and HTTP/SSE (remote) connections
- Rich Search: Search by content, tags, paths with intelligent scoring
- Metadata Support: Handle YAML frontmatter and inline tags
- Bulk Operations: Process multiple notes efficiently
- Production Ready: Comprehensive error handling, logging, and configuration
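The metadata support above covers YAML frontmatter and inline tags. As a rough illustration of what that parsing involves (a minimal sketch with hypothetical names — the actual vault.ts implementation may differ):

```typescript
// Hypothetical sketch of frontmatter/inline-tag parsing; names are illustrative.
interface NoteMetadata {
  frontmatter: Record<string, string>;
  inlineTags: string[];
}

function parseNote(raw: string): NoteMetadata {
  const frontmatter: Record<string, string> = {};
  let body = raw;
  // YAML frontmatter sits between leading "---" fences.
  const match = raw.match(/^---\n([\s\S]*?)\n---\n?/);
  if (match) {
    for (const line of match[1].split("\n")) {
      const idx = line.indexOf(":");
      if (idx > 0) {
        frontmatter[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
      }
    }
    body = raw.slice(match[0].length);
  }
  // Inline tags look like #project or #area/sub.
  const inlineTags = [...body.matchAll(/(^|\s)#([\w/-]+)/g)].map((m) => m[2]);
  return { frontmatter, inlineTags };
}
```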
Quick Start
1. Installation
# Clone or download the project
cd obsidian-mcp-server
npm install
npm run build
2. Configuration
Copy the environment template and configure:
cp .env.example .env
Edit .env with your settings:
OPENAI_API_KEY=your_openai_api_key_here
OBSIDIAN_VAULT_PATH=/path/to/your/obsidian/vault
3. Test the Server
# Test stdio mode
npm run dev
# Test HTTP mode (in another terminal)
node dist/http-index.js
4. MCP Client Integration
Multi-Client Support: This server works with any MCP-compatible client. See the config/ directory for configuration examples for Claude Desktop, Cursor, VS Code, and more.
Windsurf Configuration
Option A: Local (stdio) Mode
Add to your Windsurf MCP configuration:
{
  "mcpServers": {
    "obsidian-mcp-server": {
      "command": "node",
      "args": ["/absolute/path/to/obsidian-mcp-server/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "OBSIDIAN_VAULT_PATH": "/path/to/your/obsidian/vault"
      }
    }
  }
}
Option B: Remote (HTTP/SSE) Mode
- Start the HTTP server:
npm run start:http
- Add to your Windsurf MCP configuration:
{
  "mcpServers": {
    "obsidian-mcp-server-remote": {
      "url": "http://localhost:3000/sse",
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "OBSIDIAN_VAULT_PATH": "/path/to/your/obsidian/vault"
      }
    }
  }
}
Available Tools
Vault Operations
- list_notes - List all notes in the vault
- read_note - Read a specific note
- write_note - Create or update a note
- delete_note - Delete a note
- search_notes - Search notes by content, tags, or path
- vault_stats - Get vault statistics
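The README promises "intelligent scoring" for search_notes but doesn't specify the weighting. A minimal sketch of how tag, path, and content matches might be combined — the weights and the function name here are assumptions, not the shipped implementation:

```typescript
// Hypothetical relevance scoring for search_notes; actual weights may differ.
interface Note {
  path: string;
  content: string;
  tags: string[];
}

function scoreNote(note: Note, query: string): number {
  const q = query.toLowerCase();
  let score = 0;
  if (note.tags.some((t) => t.toLowerCase() === q)) score += 3; // exact tag match weighs most
  if (note.path.toLowerCase().includes(q)) score += 2;          // path match next
  // Count content occurrences, capped so long notes don't dominate.
  const hits = note.content.toLowerCase().split(q).length - 1;
  score += Math.min(hits, 5);
  return score;
}
```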
AI-Powered Operations
- summarize_note - Generate an AI summary of a note
- summarize_notes - Generate an AI summary of multiple notes
- generate_content - Generate new content using AI
- reformat_note - Reformat a note using AI
- extract_tasks - Extract tasks and to-dos from notes
- weekly_digest - Generate a weekly digest of recent notes
- answer_question - Answer questions based on vault content
- generate_tags - Generate suggested tags for a note
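For a sense of the kind of structure a tool like extract_tasks works with, here is a sketch that collects Markdown checkbox items; this is an illustrative assumption, not the server's actual logic (which delegates to OpenAI):

```typescript
// Hypothetical checkbox-style task collection, e.g. "- [ ] Buy milk".
interface Task {
  text: string;
  done: boolean;
}

function extractTasks(markdown: string): Task[] {
  const tasks: Task[] = [];
  for (const line of markdown.split("\n")) {
    // Match "- [ ] text" or "* [x] text", case-insensitive on the x.
    const m = line.match(/^\s*[-*]\s+\[([ xX])\]\s+(.*)$/);
    if (m) tasks.push({ text: m[2], done: m[1].toLowerCase() === "x" });
  }
  return tasks;
}
```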
Example Usage in Windsurf
Once configured, you can use natural language commands in Windsurf:
"Summarize all notes tagged #project"
"Create a weekly digest of my recent notes"
"Extract all tasks from my meeting notes"
"Answer: What are the key themes in my research notes?"
"Generate a summary of notes containing 'machine learning'"
Configuration Options
Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | Yes | - | Your OpenAI API key |
| OBSIDIAN_VAULT_PATH | Yes | - | Absolute path to your Obsidian vault |
| OPENAI_MODEL | No | gpt-4 | OpenAI model to use |
| OPENAI_MAX_TOKENS | No | 2000 | Maximum tokens per request |
| OPENAI_TEMPERATURE | No | 0.7 | Temperature for AI responses |
| MCP_SERVER_NAME | No | obsidian-mcp-server | Server name |
| MCP_SERVER_VERSION | No | 1.0.0 | Server version |
| DEBUG | No | false | Enable debug logging |
| SERVER_HOST | No | localhost | HTTP server host |
| SERVER_PORT | No | 3000 | HTTP server port |
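The defaults in the table can be applied when reading the environment. A minimal sketch of such a loader (the function and interface names are illustrative, not the server's actual code):

```typescript
// Sketch of loading the documented variables with their stated defaults.
interface ServerConfig {
  openaiModel: string;
  maxTokens: number;
  temperature: number;
  serverPort: number;
  debug: boolean;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  return {
    openaiModel: env.OPENAI_MODEL ?? "gpt-4",
    maxTokens: Number(env.OPENAI_MAX_TOKENS ?? 2000),
    temperature: Number(env.OPENAI_TEMPERATURE ?? 0.7),
    serverPort: Number(env.SERVER_PORT ?? 3000),
    debug: env.DEBUG === "true",
  };
}
```

In the real server this would run against process.env at startup, with the two required variables (OPENAI_API_KEY, OBSIDIAN_VAULT_PATH) validated separately since they have no defaults.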
Windsurf MCP Configuration
The server supports both local and remote configurations. See the config/ directory for complete examples:
- windsurf-mcp-config.json - Local stdio configuration
- windsurf-mcp-config-remote.json - Remote HTTP/SSE configuration
Related Projects
- Obsidian MCP Plugin - Direct Obsidian plugin for enhanced integration with plugin API access
- Obsidian MCP Server (this repository) - Universal MCP server for file-based vault access
Architecture
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│    MCP Client    │     │    MCP Server    │     │  Obsidian Vault  │
│ (Windsurf/Claude │◄───►│                  │◄───►│    (Markdown)    │
│  /Cursor/etc.)   │     │                  │     │                  │
└──────────────────┘     └──────────────────┘     └──────────────────┘
                                  │
                                  ▼
                         ┌──────────────────┐
                         │    OpenAI API    │
                         │   (GPT-4/3.5)    │
                         └──────────────────┘
Core Components
- ObsidianVault: File system operations and note management
- OpenAIClient: AI-powered content processing
- ObsidianMCPServer: MCP protocol implementation
- ObsidianHTTPServer: HTTP/SSE transport layer
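How these components fit together can be sketched roughly as follows; the interfaces and method names here are hypothetical simplifications, not the actual class APIs:

```typescript
// Illustrative composition of the core components; real signatures may differ.
interface ObsidianVault {
  readNote(path: string): string; // file system operations
}
interface OpenAIClient {
  complete(prompt: string): string; // AI-powered content processing
}

class ObsidianMCPServer {
  constructor(
    private vault: ObsidianVault,
    private ai: OpenAIClient
  ) {}

  // A tool like summarize_note reads from the vault, then delegates to OpenAI.
  summarizeNote(path: string): string {
    return this.ai.complete(`Summarize:\n${this.vault.readNote(path)}`);
  }
}
```

The HTTP/SSE transport layer (ObsidianHTTPServer) would wrap the same server logic behind network endpoints instead of stdio.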
Development
Project Structure
obsidian-mcp-server/
├── src/
│   ├── index.ts            # Main entry point (stdio)
│   ├── http-index.ts       # HTTP server entry point
│   ├── http-server.ts      # HTTP server implementation
│   ├── mcp-server.ts       # MCP server implementation
│   ├── vault.ts            # Obsidian vault operations
│   ├── openai-client.ts    # OpenAI integration
│   └── types.ts            # TypeScript definitions
├── config/
│   ├── windsurf-mcp-config.json         # Local config example
│   └── windsurf-mcp-config-remote.json  # Remote config example
├── test/
│   └── test.js             # Test script
├── dist/                   # Compiled JavaScript
├── package.json
├── tsconfig.json
├── .env.example
└── README.md
Building
npm run build # Compile TypeScript to dist/
npm run dev # Run MCP server in development mode (stdio)
npm run dev:http # Run HTTP server in development mode
npm start # Run compiled MCP server (stdio)
npm run start:http # Run compiled HTTP server
npm test # Run comprehensive test suite
Testing
# Run the test script
npm test
# Manual testing with curl (HTTP mode)
curl http://localhost:3000/health
curl http://localhost:3000/info
Security Considerations
- API Keys: Store OpenAI API keys securely, never commit to version control
- Vault Access: The server has full read/write access to your Obsidian vault
- Network: HTTP mode exposes the server on the network - use appropriate firewall rules
- Environment: Use environment variables for sensitive configuration
Troubleshooting
Common Issues
"Vault path does not exist"
- Ensure OBSIDIAN_VAULT_PATH points to a valid directory
- Use absolute paths only

"OpenAI API key not found"
- Set OPENAI_API_KEY in your environment or .env file
- Verify the API key is valid and has sufficient credits

"Cannot connect to MCP server"
- Check that the server is running
- Verify the configuration in Windsurf matches your setup
- Check file permissions and paths

"Module not found" errors
- Run npm install to install dependencies
- Run npm run build to compile TypeScript
Debug Mode
Enable debug logging:
DEBUG=true npm run dev
Logs
- Stdio mode: Logs go to stderr
- HTTP mode: Logs go to console and can be redirected
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
License
MIT License - see LICENSE file for details
Support
- Check the troubleshooting section above
- Review the Windsurf MCP documentation
- Open an issue for bugs or feature requests
Note: This server provides full access to your Obsidian vault and uses OpenAI's API. Ensure you understand the security implications and costs before deploying in production environments.