mcp-server-ollama-deep-researcher

Ollama Deep Researcher is a Model Context Protocol (MCP) server that provides deep research capabilities using local LLMs via Ollama.

Ollama Deep Researcher is an MCP server adaptation of LangChain's Ollama Deep Researcher, bringing its deep research workflow into the Model Context Protocol ecosystem. It lets AI assistants perform in-depth research on a topic using local LLMs hosted by Ollama: the server generates web search queries, gathers results via the Tavily or Perplexity API, summarizes the findings, identifies knowledge gaps, and iteratively refines the summary over multiple research cycles. The final output is a markdown summary listing all sources used. The server requires Node.js, Python 3.10 or higher, and API keys for Tavily, Perplexity, and LangSmith. It can be installed either directly or via Docker, and it integrates with LangSmith for tracing and monitoring of each research run.
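As an illustration of how a client might drive the server, here is a minimal sketch using the official MCP Python SDK. The launch command, script path, environment variable names, and tool arguments are assumptions made for the example, not values taken from the project's documentation:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Minimal sketch: connect to the server over stdio and run one research task.
# The command, script path, and environment variable names below are
# assumptions for illustration; consult the project's README for real values.
server = StdioServerParameters(
    command="node",                        # hypothetical launch command
    args=["build/index.js"],               # hypothetical path to the built server
    env={
        "TAVILY_API_KEY": "tvly-...",      # assumed variable names
        "PERPLEXITY_API_KEY": "pplx-...",
        "LANGSMITH_API_KEY": "lsv2_...",
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "research",                # tool names listed under Tools below
                arguments={"topic": "quantum error correction"},  # assumed parameter
            )
            print(result.content)          # markdown summary with sources

asyncio.run(main())
```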

Features

  • Iterative Research Process: Generates queries, gathers results, summarizes, identifies gaps, and iteratively improves summaries (sketched after this list).
  • Local LLM Integration: Uses local LLMs hosted by Ollama for research tasks.
  • API Integration: Supports Tavily and Perplexity APIs for web search.
  • LangSmith Integration: Provides tracing and monitoring of research processes.
  • Docker Support: Offers Docker installation for simplified setup.
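
Conceptually, the iterative research process boils down to a simple loop. The sketch below is an illustration only; the helper functions are stubs standing in for the server's LLM and web-search calls, not its actual API:

```python
# Conceptual sketch of the iterative research loop. The helpers are stubs
# standing in for LLM and search calls; all names here are illustrative.
def generate_query(topic: str, summary: str) -> str:
    return f"{topic} key facts"           # an LLM would draft this query

def web_search(query: str) -> list[str]:
    return [f"(result for: {query})"]     # Tavily/Perplexity would run this

def summarize(summary: str, results: list[str]) -> str:
    return summary + "\n" + "\n".join(results)  # LLM folds new results in

def find_knowledge_gap(summary: str) -> str | None:
    return None                           # LLM names what's missing, or None

def deep_research(topic: str, max_loops: int = 3) -> str:
    summary = ""
    for _ in range(max_loops):
        query = generate_query(topic, summary)
        results = web_search(query)
        summary = summarize(summary, results)
        gap = find_knowledge_gap(summary)
        if gap is None:                   # no gap left: summary is complete
            break
        topic = gap                       # next cycle targets the gap
    return summary                        # final markdown summary with sources

print(deep_research("post-quantum cryptography"))
```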

Tools

  1. configure
     Configure research parameters such as maxLoops, llmModel, and searchApi.

  2. research
     Research any topic using web search and LLM synthesis.

  3. get_status
     Get the current status of ongoing research.
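
Putting the three tools together, a typical sequence from an MCP client might look like the following. It assumes a session like the one sketched earlier; the argument values, the research tool's topic parameter, and the shape of the status result are illustrative assumptions:

```python
from mcp import ClientSession

# Typical tool sequence, meant to run inside an initialized ClientSession
# like the one sketched earlier. Argument values are illustrative; only the
# parameter names maxLoops, llmModel, and searchApi come from the list above.
async def run_research(session: ClientSession) -> None:
    await session.call_tool("configure", arguments={
        "maxLoops": 3,              # research cycles before stopping
        "llmModel": "llama3.1:8b",  # assumes this model is pulled in Ollama
        "searchApi": "tavily",      # or "perplexity"
    })
    await session.call_tool(
        "research",
        arguments={"topic": "RISC-V vector extensions"},  # assumed parameter name
    )
    status = await session.call_tool("get_status", arguments={})
    print(status.content)           # progress of the ongoing research
```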