Mcp-Server


This project implements a Model Context Protocol (MCP) server in Python to provide Large Language Models (LLMs) with real-time access to the latest documentation for specified Python libraries.

The MCP Server for Up-to-Date Library Documentation is designed to address the challenge of outdated code suggestions from LLMs by providing them with the most current documentation snippets. It integrates with LLMs like Anthropic's Claude, allowing them to fetch and incorporate up-to-date information before generating code suggestions. The server uses the Serper API for site-specific searches and retrieves content using `httpx` and `BeautifulSoup`. Built with Python 3.11+, it leverages modern tools like `asyncio`, `FastMCP`, and the `uv` package manager.
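The project's exact layout isn't reproduced here, but a minimal FastMCP skeleton along these lines would match the description above. The server name, the documentation-site mapping, and the transport choice are illustrative assumptions, not the project's published configuration.

```python
# Minimal sketch of the server wiring, assuming the official MCP Python SDK's FastMCP.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs")  # server name is an assumption

# Documentation sites the server could restrict its searches to (assumed mapping).
DOCS_URLS = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
}

if __name__ == "__main__":
    # Serve over stdio so MCP clients such as Claude Desktop can launch the server directly.
    mcp.run(transport="stdio")
```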

Features

  • MCP Standard: Implements the Model Context Protocol for seamless integration with compatible clients.
  • get_docs Tool: Exposes a tool that searches official documentation sites for the latest information.
  • Targeted Search: Uses the Serper API to perform site-specific Google searches for Langchain, LlamaIndex, and OpenAI.
  • Content Fetching: Retrieves and parses text content from top search results using `httpx` and `BeautifulSoup` (see the sketch after this list).
  • Modern Tooling: Built with Python 3.11+, `asyncio`, `FastMCP`, and managed using the `uv` package manager.
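
The search and fetch steps described above could look roughly like the following sketch. It assumes the Serper API key is read from a `SERPER_API_KEY` environment variable, and the helper names (`search_web`, `fetch_url`) are placeholders rather than the project's actual identifiers.

```python
import os

import httpx
from bs4 import BeautifulSoup

SERPER_URL = "https://google.serper.dev/search"


async def search_web(query: str) -> dict:
    """Run a Google search through the Serper API and return the JSON response."""
    payload = {"q": query, "num": 2}
    headers = {
        "X-API-KEY": os.environ["SERPER_API_KEY"],
        "Content-Type": "application/json",
    }
    async with httpx.AsyncClient() as client:
        response = await client.post(SERPER_URL, json=payload, headers=headers, timeout=30.0)
        response.raise_for_status()
        return response.json()


async def fetch_url(url: str) -> str:
    """Download a result page and strip it down to readable text."""
    async with httpx.AsyncClient() as client:
        response = await client.get(url, timeout=30.0)
        soup = BeautifulSoup(response.text, "html.parser")
        return soup.get_text()
```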

Tools

  1. get_docs

    A tool that searches and retrieves documentation snippets for specified libraries.
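
    A plausible shape for this tool, built on the helpers sketched earlier, is shown below. The parameter names, docstring, and error handling are assumptions rather than the project's published interface.

```python
@mcp.tool()
async def get_docs(query: str, library: str) -> str:
    """Search the latest documentation for a query within a given library.

    Supports the libraries in DOCS_URLS (e.g. langchain, llama-index, openai).
    """
    if library not in DOCS_URLS:
        raise ValueError(f"Library {library} is not supported")

    # Restrict the Serper search to the library's official documentation site.
    results = await search_web(f"site:{DOCS_URLS[library]} {query}")
    if not results.get("organic"):
        return "No results found"

    # Concatenate the text of the top results into one snippet for the LLM.
    text = ""
    for result in results["organic"]:
        text += await fetch_url(result["link"])
    return text
```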