
MCP Web Search Crawler

A Model Context Protocol (MCP) server that provides web search and crawling capabilities. This server enables AI assistants and other MCP clients to search the web using DuckDuckGo and crawl specific URLs to extract content in markdown format.

Features

  • Web Search: Search the web using DuckDuckGo and return links, titles, and body snippets
  • URL Crawling: Crawl specific URLs and return their content as markdown

Installation

First clone the repository locally:

git clone https://github.com/tomas-hanzlik/mcp-web-search-crawl
cd mcp-web-search-crawl

Configuration

Configure the server in your MCP client:

{
    "mcpServers": {
        "mcp-web-search-crawl": {
            "command": "uv",
            "args": [
                "--directory",
                "/ABSOLUTE/PATH/TO/PARENT/FOLDER/src/mcp_web_search_crawl",
                "run",
                "mcpsearchcrawl"
            ]
        }
    }
}

Alternatively, you can skip the local clone and run the server directly from the Git repository with uvx:

uvx --from git+https://github.com/tomas-hanzlik/mcp-web-search-crawl mcpsearchcrawl
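If your MCP client launches servers from a JSON configuration, the same uvx invocation can be expressed there as well. The sketch below simply mirrors the command above and assumes uvx is available on your PATH:

{
    "mcpServers": {
        "mcp-web-search-crawl": {
            "command": "uvx",
            "args": [
                "--from",
                "git+https://github.com/tomas-hanzlik/mcp-web-search-crawl",
                "mcpsearchcrawl"
            ]
        }
    }
}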

Testing Configurations Programmatically

Test the server programmatically using a Python client:

from fastmcp import Client

config = {
    "mcpServers": {
        "mcp-web-search-crawl": {
            "command": "uv",
            "args": [
                "--directory",
                "/ABSOLUTE/PATH/TO/PARENT/FOLDER/src/mcp_web_search_crawl",
                "run",
                "mcpsearchcrawl"
            ]
        }
    }
}

# Create a client that connects to all servers
client = Client(config)

async def main():
    async with client:
        # Access tools and resources with server prefixes
        answer = await client.call_tool("search_links", {"query": "What is MCP?"})
        print("Search Links Result:", answer)
        
if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
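The crawling tool can be exercised the same way. Its exact name is not documented here, so the sketch below first lists the tools the server exposes and then calls a crawl tool under the assumed name crawl_url with an assumed url parameter; adjust both to whatever list_tools reports:

async def crawl_example():
    async with client:
        # Discover the tool names the server actually exposes
        tools = await client.list_tools()
        print("Available tools:", [tool.name for tool in tools])

        # Hypothetical call: "crawl_url" and its "url" parameter are assumptions
        page = await client.call_tool("crawl_url", {"url": "https://example.com"})
        print("Crawl Result:", page)

# Run it the same way as main(), e.g. asyncio.run(crawl_example())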

Development Setup

Set up the development environment:

uv venv
uv run mcpsearchcrawl --transport sse # or stdio as alternative
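With the SSE transport the server listens over HTTP instead of stdio, so a client connects by URL rather than by spawning a process. This sketch assumes FastMCP's defaults of host 127.0.0.1, port 8000, and the /sse path; check the server's startup log for the actual address:

from fastmcp import Client

async def ping_sse():
    # URL is an assumption based on FastMCP defaults; adjust to the logged address
    async with Client("http://127.0.0.1:8000/sse") as sse_client:
        result = await sse_client.call_tool("search_links", {"query": "What is MCP?"})
        print(result)

# asyncio.run(ping_sse())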

Debugging with MCP Inspector

Debug the server using MCP Inspector:

uv run fastmcp dev src/mcp_web_search_crawl/server.py:mcp --with-editable .