docs-mcp-server

A lightweight MCP server that searches and retrieves relevant documentation content from popular AI libraries like LangChain, LlamaIndex, and OpenAI using a combination of web search and content parsing.

The MCP Docs Search Server enables dynamic querying and fetching of up-to-date documentation content for large language models (LLMs), acting as a bridge between LLMs and external documentation sources. The Model Context Protocol (MCP) is an open standard for creating secure, two-way connections between data sources and AI-powered tools, letting LLMs interact with external tools and services in a standardized, modular, and scalable way.

This server queries Google through a web search API to retrieve top documentation pages, parses the HTML to extract clean text, and exposes a structured tool that LLM agents can call to query specific libraries in real time.

Features

  • Web Search Integration: Uses the Serper API to query Google and retrieve the top documentation pages related to a given search query.
  • Clean Content Extraction: Parses HTML content using BeautifulSoup to extract clean, human-readable text, stripping away unnecessary tags, ads, or navigation content.
  • Seamless LLM Tooling: Exposes a structured get_docs tool that can be used within LLM agents (e.g., Claude, GPT) to query specific libraries in real-time.
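The first two features can be sketched as a pair of helper functions: one that calls the Serper API and one that cleans fetched HTML with BeautifulSoup. This is a minimal illustration, not the server's exact code; it assumes a `SERPER_API_KEY` environment variable and uses the public Serper endpoint.

```python
import os
import json
import requests
from bs4 import BeautifulSoup

SERPER_URL = "https://google.serper.dev/search"


def search_web(query: str, num_results: int = 2) -> list[str]:
    """Query Google via the Serper API and return the top result URLs."""
    response = requests.post(
        SERPER_URL,
        headers={
            "X-API-KEY": os.environ["SERPER_API_KEY"],
            "Content-Type": "application/json",
        },
        data=json.dumps({"q": query, "num": num_results}),
        timeout=30,
    )
    response.raise_for_status()
    # Serper returns organic results under the "organic" key.
    return [item["link"] for item in response.json().get("organic", [])]


def extract_clean_text(html: str) -> str:
    """Strip scripts, styles, and navigation chrome; return readable text."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    # Collapse runs of whitespace into single spaces.
    return " ".join(soup.get_text(separator=" ").split())
```

In practice the server would fetch each URL returned by `search_web` and pass the response body through `extract_clean_text` before handing the result to the LLM.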

Tools

  1. get_docs

    The core tool provided by the MCP server. It accepts a search term or phrase and a library name (langchain, llama-index, or openai) to search for relevant documentation pages, fetch and parse clean text content, and send the result back to the LLM.
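One way get_docs can restrict results to a single library's documentation is by mapping each supported library name to its docs domain and adding a `site:` filter to the search query. The domain mapping and function name below are illustrative assumptions, not the server's actual implementation:

```python
# Hypothetical mapping of supported library names to their docs domains.
SUPPORTED_LIBRARIES = {
    "langchain": "python.langchain.com",
    "llama-index": "docs.llamaindex.ai",
    "openai": "platform.openai.com/docs",
}


def build_docs_query(query: str, library: str) -> str:
    """Combine the user's query with a site: filter for the library's docs."""
    if library not in SUPPORTED_LIBRARIES:
        raise ValueError(f"Unsupported library: {library}")
    return f"site:{SUPPORTED_LIBRARIES[library]} {query}"
```

For example, `build_docs_query("text splitters", "langchain")` yields `site:python.langchain.com text splitters`, which the server would then send to the web search step before fetching and parsing the resulting pages.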