llms-fetch-mcp

MCP server that fetches web content in LLM-friendly formats. Automatically discovers and uses llms.txt files when available, tries Markdown versions, and falls back to clean HTML-to-Markdown conversion.

Quick Start

Add to your MCP client configuration:

Claude Desktop / Claude Code

{
  "mcpServers": {
    "llms-fetch": {
      "command": "npx",
      "args": ["-y", "llms-fetch-mcp"]
    }
  }
}

Cursor IDE

{
  "mcp.servers": {
    "llms-fetch": {
      "command": "npx",
      "args": ["-y", "llms-fetch-mcp"]
    }
  }
}

How It Works

When you fetch a URL, the server requests several candidate sources in parallel and uses the highest-priority one that succeeds:

  1. https://example.com/llms-full.txt - Comprehensive LLM documentation
  2. https://example.com/llms.txt - Concise LLM documentation
  3. https://example.com.md - Markdown version (the page URL with .md appended)
  4. https://example.com/index.md - Directory Markdown
  5. https://example.com - Original URL (converts HTML to Markdown if needed)
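The priority list above can be sketched as a small function that derives the candidate URLs for a given page. This is a hypothetical helper for illustration (`candidate_urls` is not part of the server's actual code):

```rust
// Derive the candidate URLs that are fetched in parallel for a page URL,
// in the priority order listed above. Illustrative sketch only.
fn candidate_urls(url: &str) -> Vec<String> {
    let base = url.trim_end_matches('/');
    vec![
        format!("{base}/llms-full.txt"), // 1. comprehensive LLM docs
        format!("{base}/llms.txt"),      // 2. concise LLM docs
        format!("{base}.md"),            // 3. Markdown version of the page
        format!("{base}/index.md"),      // 4. directory Markdown
        base.to_string(),                // 5. original URL (HTML converted if needed)
    ]
}

fn main() {
    for u in candidate_urls("https://example.com") {
        println!("{u}");
    }
}
```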

Content is cached locally in .llms-fetch-mcp/ for quick access. The server automatically generates a table of contents for cached files to help navigate large documents.

Configuration

Table of Contents Settings

The server generates a table of contents for large documents, selecting the deepest heading levels that still fit within the size budget:

  • --toc-budget - Maximum ToC size in bytes (default: 4000)
  • --toc-threshold - Minimum document size in bytes to generate ToC (default: 8000)

With npx:

{
  "mcpServers": {
    "llms-fetch": {
      "command": "npx",
      "args": ["-y", "llms-fetch-mcp", "--toc-budget", "2000", "--toc-threshold", "4000"]
    }
  }
}

With installed binary:

{
  "mcpServers": {
    "llms-fetch": {
      "command": "llms-fetch-mcp",
      "args": ["--toc-budget", "2000", "--toc-threshold", "4000"]
    }
  }
}
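The budgeted heading-level selection described above can be sketched as follows. This is an assumed implementation for illustration (`toc_within_budget` is a hypothetical name, not the server's actual code): render the ToC at the deepest heading level whose output still fits the byte budget, falling back to shallower levels otherwise.

```rust
// Sketch: build a ToC from ATX headings, keeping as many heading
// levels as fit within `budget` bytes. Assumed logic, for illustration.
fn toc_within_budget(markdown: &str, budget: usize) -> String {
    // Collect (level, text) for every "#"-style heading line.
    let headings: Vec<(usize, &str)> = markdown
        .lines()
        .filter_map(|l| {
            let hashes = l.chars().take_while(|&c| c == '#').count();
            if (1..=6).contains(&hashes) && l.chars().nth(hashes) == Some(' ') {
                Some((hashes, l[hashes + 1..].trim()))
            } else {
                None
            }
        })
        .collect();

    // Try the deepest maximum level first; shrink until the ToC fits.
    for max_level in (1..=6usize).rev() {
        let toc: String = headings
            .iter()
            .filter(|(lvl, _)| *lvl <= max_level)
            .map(|(lvl, text)| format!("{}- {text}\n", "  ".repeat(lvl - 1)))
            .collect();
        if toc.len() <= budget {
            return toc;
        }
    }
    String::new() // even level-1 headings alone exceed the budget
}

fn main() {
    let doc = "# Guide\n## Install\n## Usage\n### Flags\n";
    print!("{}", toc_within_budget(doc, 4000));
}
```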

Custom Cache Directory

With npx:

{
  "mcpServers": {
    "llms-fetch": {
      "command": "npx",
      "args": ["-y", "llms-fetch-mcp", "/path/to/custom/cache"]
    }
  }
}

With installed binary:

{
  "mcpServers": {
    "llms-fetch": {
      "command": "llms-fetch-mcp",
      "args": ["/path/to/custom/cache"]
    }
  }
}

Why llms.txt?

llms.txt is an emerging standard for websites to provide LLM-optimized documentation. Sites like FastHTML, Anthropic Docs, and others are adopting it. This server automatically discovers and uses these files when available, giving you cleaner, more concise content than HTML scraping.

Installation

If you prefer to install the binary instead of running it via npx:

Shell (macOS/Linux)

curl --proto '=https' --tlsv1.2 -LsSf https://github.com/Crazytieguy/llms-fetch-mcp/releases/latest/download/llms-fetch-mcp-installer.sh | sh

PowerShell (Windows)

irm https://github.com/Crazytieguy/llms-fetch-mcp/releases/latest/download/llms-fetch-mcp-installer.ps1 | iex

Homebrew

brew install Crazytieguy/tap/llms-fetch-mcp

npm

npm install -g llms-fetch-mcp

Cargo

cargo install llms-fetch-mcp

License

MIT