
# Autowriter MCP Server

A token-saving Model Context Protocol (MCP) server that coordinates automated writing workflows between Obsidian vaults and LMStudio. This server enables AI-powered content generation while dramatically reducing Claude token usage by generating content locally via LMStudio and saving directly to Obsidian vaults.

## 🚀 Key Features

- **Token-Saving Architecture**: Save 80-90% of Claude tokens by generating content locally via LMStudio
- **Direct Obsidian Integration**: Reads and writes markdown files directly in your Obsidian vault
- **Automated Book Writing**: Analyze book structure and generate missing sections automatically
- **Batch Content Generation**: Generate multiple sections efficiently in one operation
- **Queue Management**: Queue writing tasks and process them automatically
- **Index Link Management**: Automatically update Obsidian index files with links to generated content

## 🛠 Prerequisites

- **Python 3.11+**: Required to run the MCP server
- **LMStudio**: Running locally at http://localhost:1234 (URL is configurable)
- **Claude Desktop**: With MCP support, to connect to the server
- **Obsidian**: For managing your writing vault (optional but recommended)

## 📦 Installation

### Method 1: Using uvx (Recommended)

```bash
# Install and run directly with uvx
uvx autowriter-mcp '/path/to/your/obsidian/vault'
```

### Method 2: Using pip

```bash
# Install from PyPI
pip install autowriter-mcp

# Run the server
autowriter-mcp '/path/to/your/obsidian/vault'
```

### Method 3: Local Development

```bash
# Clone the repository
git clone https://github.com/infinitimeless/autowriter-mcp.git
cd autowriter-mcp

# Create virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
pip install -e .

# Run the server
autowriter-mcp '/path/to/your/obsidian/vault'
```

## ⚙️ Configuration

### Command Line Options

```text
autowriter-mcp [vault_path] [options]

Arguments:
  vault_path              Path to your Obsidian vault directory (required)

Options:
  --index-file FILENAME   Index file name (default: book_index.md)
  --lmstudio-url URL      LMStudio server URL (default: http://localhost:1234)
  --version               Show version information
  --help                  Show help message
```

### Claude Desktop Configuration

Add to your Claude Desktop MCP configuration:

**For uvx installation:**

```json
{
  "mcpServers": {
    "autowriter-mcp": {
      "command": "uvx",
      "args": ["autowriter-mcp", "/path/to/your/obsidian/vault"]
    }
  }
}
```
**For local development:**

```json
{
  "mcpServers": {
    "autowriter-mcp": {
      "command": "/path/to/autowriter-mcp/.venv/bin/python",
      "args": ["-m", "autowriter_mcp.server", "/path/to/your/obsidian/vault"]
    }
  }
}
```

## 🎯 Usage

### 1. Prepare Your Obsidian Vault

Create an index file (default: book_index.md) in your vault with your book structure:

```markdown
# My Book Title

## Chapter 1: Introduction
## Chapter 2: Getting Started
## Chapter 3: Advanced Topics
## Chapter 4: Conclusion
```
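The index's `##` headings define the planned sections. As a minimal sketch of how missing sections can be detected from such an index (this is not the server's actual implementation, and the heading-to-filename rule below is an assumption for illustration):

```python
import re
from pathlib import Path

def find_missing_sections(vault_path: str, index_file: str = "book_index.md"):
    """Return index headings that have no matching note file in the vault.

    Illustrative sketch only -- the real analyze_book_structure tool may
    use different heading-to-filename rules.
    """
    vault = Path(vault_path)
    index_text = (vault / index_file).read_text(encoding="utf-8")
    # Treat every "## ..." heading as one planned section.
    headings = re.findall(r"^## (.+)$", index_text, flags=re.MULTILINE)
    missing = []
    for heading in headings:
        # Assume each section lives in "<heading>.md", with colons stripped
        # so the name is a valid filename.
        note = vault / (heading.replace(":", "") + ".md")
        if not note.exists():
            missing.append(heading)
    return missing
```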

### 2. Available MCP Tools

#### analyze_book_structure

Analyze your vault structure and identify missing content sections.

#### generate_and_save_section

🚀 **Token-Saving**: Generate content locally via LMStudio and save it directly to the vault.

#### batch_generate_sections

🚀 **Ultra Token-Saving**: Generate multiple sections in one operation.

#### process_writing_queue

🚀 **Auto Token-Saving**: Process the entire writing queue automatically.

#### request_content_generation

Add content generation requests to the queue for later processing.

#### update_index_links

Update your index file with Obsidian links to newly created content.

#### get_writing_status

Get current progress and queue status.
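These tools are invoked by the MCP client (such as Claude Desktop) over JSON-RPC. As a rough sketch of the wire format, a `tools/call` request for `generate_and_save_section` might look like this; the argument name shown is illustrative, so consult the server's published tool schema for the real parameters:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_and_save_section",
    "arguments": {
      "section_title": "Chapter 2: Getting Started"
    }
  }
}
```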

## 🏗 Architecture

### Token-Saving Design

The server is specifically designed to minimize Claude token usage:

1. **Local Generation**: Content is generated by LMStudio, not Claude
2. **Direct File Writing**: Content is saved directly to vault files
3. **Metadata Only**: Claude receives only generation metadata, never the content itself
4. **Batch Processing**: Multiple sections can be generated in one operation
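The metadata-only pattern of steps 1-3 can be sketched as follows, under stated assumptions: the `generate` callable stands in for the LMStudio HTTP call (an OpenAI-compatible chat-completions request), and the filename rule and metadata fields are illustrative, not the server's actual API:

```python
from pathlib import Path
from typing import Callable

def generate_and_save(vault: str, title: str,
                      generate: Callable[[str], str]) -> dict:
    """Generate a section locally and save it, returning metadata only.

    `generate` stands in for the LMStudio call; the full body text is
    written to disk and never sent back over MCP.
    """
    body = generate(f"Write the book section titled: {title}")
    # Assumed filename rule for illustration: heading with colons stripped.
    note = Path(vault) / (title.replace(":", "") + ".md")
    note.write_text(f"# {title}\n\n{body}\n", encoding="utf-8")
    # Only lightweight metadata crosses the MCP boundary -- this is
    # where the Claude token saving comes from.
    return {"section": title, "file": note.name, "words": len(body.split())}
```

Because Claude sees only the returned dictionary, the token cost of a section is independent of its length.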

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments


🚀 **Save Claude tokens while accelerating your writing workflow with autowriter-mcp!**