langchain-ai/mcpdoc
mcpdoc is hosted online, so all tools can be tested directly either in the Inspector tab or in the Online Client.
The MCP LLMS-TXT Documentation Server provides a structured way to manage and retrieve LLM documentation, in the form of llms.txt files, using the Model Context Protocol.
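As a rough sketch of how a client might connect to the server and enumerate its tools, assuming the official MCP Python SDK (the `mcp` package) and a local launch via uvx; the launch flags and llms.txt URL below are illustrative, not prescribed by mcpdoc:

```python
# A minimal sketch, assuming the MCP Python SDK ("mcp" package) is installed
# and the server is started locally via uvx. Flags and the llms.txt URL are
# illustrative; see the langchain-ai/mcpdoc README for the exact invocation.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=[
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio",
    ],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Expected to include list_doc_sources and fetch_docs.
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```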
Tools
Functions exposed to the LLM so it can take actions
list_doc_sources
List all available documentation sources.
This is the first tool you should call in the documentation workflow.
It provides URLs to llms.txt files or local file paths that the user has made available.
Returns:
A string containing a formatted list of documentation sources with their URLs or file paths
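A minimal, hedged example of calling list_doc_sources with the MCP Python SDK follows; the launch command is illustrative (as in the sketch above), and the tool itself takes no arguments:

```python
# A minimal sketch of calling list_doc_sources (MCP Python SDK assumed;
# the uvx launch command is illustrative, repeated here for self-containment).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["--from", "mcpdoc", "mcpdoc",
          "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
          "--transport", "stdio"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # list_doc_sources takes no arguments and returns a formatted string
            # listing the configured llms.txt URLs or local file paths.
            result = await session.call_tool("list_doc_sources", arguments={})
            for block in result.content:
                if block.type == "text":
                    print(block.text)

asyncio.run(main())
```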
fetch_docs
Fetch and parse documentation from a given URL or local file.
Use this tool after list_doc_sources to:
- First fetch the llms.txt file from a documentation source
- Analyze the URLs listed in the llms.txt file
- Then fetch specific documentation pages relevant to the user's question
Args:
url: The URL to fetch documentation from.
Returns:
The fetched documentation content converted to markdown, or an error message if the request fails or the URL is not from an allowed domain.
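The workflow above can be sketched end to end as follows, again assuming the MCP Python SDK and an illustrative launch command; the specific documentation page URL in step 3 is hypothetical and would normally be taken from the llms.txt index fetched in step 2:

```python
# A sketch of the llms.txt workflow described above (MCP Python SDK assumed).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["--from", "mcpdoc", "mcpdoc",
          "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
          "--transport", "stdio"],
)

def text_of(result) -> str:
    """Join the text content blocks of a tool call result."""
    return "\n".join(b.text for b in result.content if b.type == "text")

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # 1. Discover the configured documentation sources.
            sources = await session.call_tool("list_doc_sources", arguments={})
            print(text_of(sources))
            # 2. Fetch the llms.txt index for one of those sources.
            index = await session.call_tool(
                "fetch_docs",
                arguments={"url": "https://langchain-ai.github.io/langgraph/llms.txt"},
            )
            print(text_of(index)[:300])
            # 3. Fetch a specific page listed in the index (hypothetical URL).
            page = await session.call_tool(
                "fetch_docs",
                arguments={"url": "https://langchain-ai.github.io/langgraph/concepts/"},
            )
            print(text_of(page)[:300])

asyncio.run(main())
```

Note the error case in the Returns description: fetch_docs only retrieves URLs from allowed domains, so a page outside the configured sources may come back as an error message rather than markdown.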
Prompts
Interactive templates invoked by user choice
No prompts
Resources
Contextual data attached and managed by the client