Obsidian MCP Server

A comprehensive Model Context Protocol (MCP) server that integrates your Obsidian vault with OpenAI's API. Client-agnostic design works with Windsurf, Claude Desktop, Cursor, VS Code, and any MCP-compatible client.

🔌 Related Project: For enhanced Obsidian integration, see the companion Obsidian MCP Plugin that provides direct plugin API access.

Features

  • Full Vault Access: Read, write, search, and manage your Obsidian notes
  • AI-Powered Operations: Summarize notes, generate content, extract tasks, and answer questions using OpenAI
  • Flexible Transport: Supports both stdio (local) and HTTP/SSE (remote) connections
  • Rich Search: Search by content, tags, or path with intelligent scoring
  • Metadata Support: Handle YAML frontmatter and inline tags
  • Bulk Operations: Process multiple notes efficiently
  • Production Ready: Comprehensive error handling, logging, and configuration

Quick Start

1. Installation

# Clone or download the project
cd obsidian-mcp-server
npm install
npm run build

2. Configuration

Copy the environment template and configure:

cp .env.example .env

Edit .env with your settings:

OPENAI_API_KEY=your_openai_api_key_here
OBSIDIAN_VAULT_PATH=/path/to/your/obsidian/vault

3. Test the Server

# Test stdio mode
npm run dev

# Test HTTP mode (in another terminal)
node dist/http-index.js

4. MCP Client Integration

📋 Multi-Client Support: This server works with any MCP-compatible client, including Claude Desktop, Cursor, and VS Code; the Windsurf examples below can be adapted for those clients.

Windsurf Configuration
Option A: Local (stdio) Mode

Add to your Windsurf MCP configuration:

{
  "mcpServers": {
    "obsidian-mcp-server": {
      "command": "node",
      "args": ["/absolute/path/to/obsidian-mcp-server/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "OBSIDIAN_VAULT_PATH": "/path/to/your/obsidian/vault"
      }
    }
  }
}
Option B: Remote (HTTP/SSE) Mode
  1. Start the HTTP server:
npm run start:http
  2. Add to your Windsurf MCP configuration:
{
  "mcpServers": {
    "obsidian-mcp-server-remote": {
      "url": "http://localhost:3000/sse",
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "OBSIDIAN_VAULT_PATH": "/path/to/your/obsidian/vault"
      }
    }
  }
}
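
A quick way to exercise the HTTP/SSE endpoint outside of Windsurf is a small script built on the official MCP TypeScript SDK (@modelcontextprotocol/sdk). The sketch below is illustrative only: the /sse URL comes from the configuration above, but the client name and the assumption that default connection options suffice are not taken from this project.

// sse-check.ts - minimal sketch; assumes @modelcontextprotocol/sdk is installed
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Connect to the SSE endpoint exposed by `npm run start:http`
  const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));
  const client = new Client({ name: "sse-check", version: "0.0.1" });

  await client.connect(transport);

  // List the tools the server advertises (see "Available Tools" below)
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);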

Available Tools

Vault Operations

  • list_notes - List all notes in the vault
  • read_note - Read a specific note
  • write_note - Create or update a note
  • delete_note - Delete a note
  • search_notes - Search notes by content, tags, or path
  • vault_stats - Get vault statistics

AI-Powered Operations

  • summarize_note - Generate AI summary of a note
  • summarize_notes - Generate AI summary of multiple notes
  • generate_content - Generate new content using AI
  • reformat_note - Reformat a note using AI
  • extract_tasks - Extract tasks and to-dos from notes
  • weekly_digest - Generate a weekly digest of recent notes
  • answer_question - Answer questions based on vault content
  • generate_tags - Generate suggested tags for a note
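
Clients can also invoke these tools programmatically. The sketch below uses the official MCP TypeScript SDK over stdio; the tool names come from the list above, but the argument shapes (a query field for search_notes, a path field for summarize_note) are assumptions - inspect the schemas returned by listTools for the authoritative input format.

// call-tools.ts - illustrative sketch; tool argument names are assumptions
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the compiled stdio server, mirroring the Windsurf config above
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/absolute/path/to/obsidian-mcp-server/dist/index.js"],
    env: {
      OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
      OBSIDIAN_VAULT_PATH: "/path/to/your/obsidian/vault",
    },
  });
  const client = new Client({ name: "tool-demo", version: "0.0.1" });
  await client.connect(transport);

  // Vault operation: search notes by content (argument shape assumed)
  const results = await client.callTool({
    name: "search_notes",
    arguments: { query: "machine learning" },
  });
  console.log(results);

  // AI-powered operation: summarize a single note (argument shape assumed)
  const summary = await client.callTool({
    name: "summarize_note",
    arguments: { path: "Research/ideas.md" },
  });
  console.log(summary);

  await client.close();
}

main().catch(console.error);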

Example Usage in Windsurf

Once configured, you can use natural language commands in Windsurf:

"Summarize all notes tagged #project"
"Create a weekly digest of my recent notes"
"Extract all tasks from my meeting notes"
"Answer: What are the key themes in my research notes?"
"Generate a summary of notes containing 'machine learning'"

Configuration Options

Environment Variables

Variable             Required  Default              Description
OPENAI_API_KEY       Yes       -                    Your OpenAI API key
OBSIDIAN_VAULT_PATH  Yes       -                    Absolute path to your Obsidian vault
OPENAI_MODEL         No        gpt-4                OpenAI model to use
OPENAI_MAX_TOKENS    No        2000                 Maximum tokens per request
OPENAI_TEMPERATURE   No        0.7                  Temperature for AI responses
MCP_SERVER_NAME      No        obsidian-mcp-server  Server name
MCP_SERVER_VERSION   No        1.0.0                Server version
DEBUG                No        false                Enable debug logging
SERVER_HOST          No        localhost            HTTP server host
SERVER_PORT          No        3000                 HTTP server port
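
To make the defaults concrete, here is a hypothetical configuration loader that mirrors the table above; the server's actual implementation may read and validate these variables differently.

// Hypothetical config loader illustrating the defaults above; not the project's actual code
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
}

const config = {
  openaiApiKey: requireEnv("OPENAI_API_KEY"),
  vaultPath: requireEnv("OBSIDIAN_VAULT_PATH"),
  model: process.env.OPENAI_MODEL ?? "gpt-4",
  maxTokens: Number(process.env.OPENAI_MAX_TOKENS ?? 2000),
  temperature: Number(process.env.OPENAI_TEMPERATURE ?? 0.7),
  serverName: process.env.MCP_SERVER_NAME ?? "obsidian-mcp-server",
  serverVersion: process.env.MCP_SERVER_VERSION ?? "1.0.0",
  debug: process.env.DEBUG === "true",
  host: process.env.SERVER_HOST ?? "localhost",
  port: Number(process.env.SERVER_PORT ?? 3000),
};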

Windsurf MCP Configuration

The server supports both local and remote configurations. See the config/ directory for complete examples:

  • windsurf-mcp-config.json - Local stdio configuration
  • windsurf-mcp-config-remote.json - Remote HTTP/SSE configuration

Related Projects

  • Obsidian MCP Plugin - Companion plugin that provides direct plugin API access (see the note near the top of this README)

Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Client    │    │   MCP Server    │    │  Obsidian Vault │
│ (Windsurf/Claude│◄──►│                 │◄──►│   (Markdown)    │
│  /Cursor/etc.)  │    │                 │    │                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌─────────────────┐
                       │   OpenAI API    │
                       │   (GPT-4/3.5)   │
                       └─────────────────┘

Core Components

  • ObsidianVault: File system operations and note management
  • OpenAIClient: AI-powered content processing
  • ObsidianMCPServer: MCP protocol implementation
  • ObsidianHTTPServer: HTTP/SSE transport layer
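
For orientation, a rough sketch of how these components might be wired together for the stdio entry point is shown below. The class names come from the list above, but every constructor and method signature here is an assumption rather than the project's actual API - see src/index.ts for the real wiring.

// Hypothetical wiring of the core components; signatures are assumptions
import { ObsidianVault } from "./vault.js";
import { OpenAIClient } from "./openai-client.js";
import { ObsidianMCPServer } from "./mcp-server.js";

async function main() {
  // File system operations and note management (assumed constructor argument)
  const vault = new ObsidianVault(process.env.OBSIDIAN_VAULT_PATH!);

  // AI-powered content processing (assumed constructor argument)
  const openai = new OpenAIClient(process.env.OPENAI_API_KEY!);

  // MCP protocol implementation served over stdio (assumed method name)
  const server = new ObsidianMCPServer(vault, openai);
  await server.start();
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});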

Development

Project Structure

obsidian-mcp-server/
├── src/
│   ├── index.ts           # Main entry point (stdio)
│   ├── http-index.ts      # HTTP server entry point
│   ├── http-server.ts     # HTTP server implementation
│   ├── mcp-server.ts      # MCP server implementation
│   ├── vault.ts           # Obsidian vault operations
│   ├── openai-client.ts   # OpenAI integration
│   └── types.ts           # TypeScript definitions
├── config/
│   ├── windsurf-mcp-config.json        # Local config example
│   └── windsurf-mcp-config-remote.json # Remote config example
├── test/
│   └── test.js            # Test script
├── dist/                  # Compiled JavaScript
├── package.json
├── tsconfig.json
├── .env.example
└── README.md

Building

npm run build      # Compile TypeScript to dist/
npm run dev        # Run MCP server in development mode (stdio)
npm run dev:http   # Run HTTP server in development mode  
npm start          # Run compiled MCP server (stdio)
npm run start:http # Run compiled HTTP server
npm test           # Run comprehensive test suite

Testing

# Run the test script
npm test

# Manual testing with curl (HTTP mode)
curl http://localhost:3000/health
curl http://localhost:3000/info

Security Considerations

  • API Keys: Store OpenAI API keys securely, never commit to version control
  • Vault Access: The server has full read/write access to your Obsidian vault
  • Network: HTTP mode exposes the server on the network - use appropriate firewall rules
  • Environment: Use environment variables for sensitive configuration

Troubleshooting

Common Issues

  1. "Vault path does not exist"

    • Ensure OBSIDIAN_VAULT_PATH points to a valid directory
    • Use absolute paths only
  2. "OpenAI API key not found"

    • Set OPENAI_API_KEY in your environment or .env file
    • Verify the API key is valid and has sufficient credits
  3. "Cannot connect to MCP server"

    • Check that the server is running
    • Verify the configuration in Windsurf matches your setup
    • Check file permissions and paths
  4. "Module not found" errors

    • Run npm install to install dependencies
    • Run npm run build to compile TypeScript

Debug Mode

Enable debug logging:

DEBUG=true npm run dev

Logs

  • Stdio mode: Logs go to stderr
  • HTTP mode: Logs go to console and can be redirected

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

License

MIT License - see LICENSE file for details

Support

  • Check the troubleshooting section above
  • Review the Windsurf MCP documentation
  • Open an issue for bugs or feature requests

Note: This server provides full access to your Obsidian vault and uses OpenAI's API. Ensure you understand the security implications and costs before deploying in production environments.