Cam10001110101_mcp-server-ollama-deep-researcher

Ollama Deep Researcher is a Model Context Protocol (MCP) server that provides deep research capabilities using local LLMs via Ollama.

The Ollama Deep Researcher MCP server is an adaptation of LangChain's Ollama Deep Researcher, designed to provide in-depth research capabilities within the Model Context Protocol (MCP) ecosystem. It leverages local LLMs hosted by Ollama to perform comprehensive research on a given topic: the server generates web search queries, gathers results through the Tavily or Perplexity API, summarizes the findings, identifies knowledge gaps, and iteratively refines the output. The final result is a markdown summary with citations to all sources used. The server requires Node.js, Python 3.10 or higher, and a capable compute environment. It integrates with LangSmith for tracing and monitoring, and stores research results as MCP resources for persistent access.
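The iterative loop described above (generate query, search, summarize, find gaps, refine) can be sketched as follows. This is an illustrative outline, not the server's actual implementation: the function names are assumptions, and the LLM and search calls are stubbed where the real server would call Ollama and Tavily/Perplexity.

```python
# Hedged sketch of the iterative research loop. Stubs stand in for the
# local LLM (Ollama) and the web search API (Tavily/Perplexity).

def generate_query(topic, summary):
    """Stub: a local LLM would turn the topic, plus any knowledge gaps
    found in the running summary, into the next web search query."""
    return f"{topic} overview" if not summary else f"{topic} details"

def web_search(query):
    """Stub: Tavily or Perplexity would return real search results here."""
    return [{"title": f"Result for {query}", "url": "https://example.com"}]

def summarize(summary, results):
    """Stub: the LLM would merge new results into the running summary."""
    return summary + [r["title"] for r in results]

def research(topic, max_loops=3):
    """Run the loop max_loops times, then emit markdown with citations."""
    summary, sources = [], []
    for _ in range(max_loops):
        query = generate_query(topic, summary)
        results = web_search(query)
        sources.extend(r["url"] for r in results)
        summary = summarize(summary, results)
    citations = "\n".join(f"- {u}" for u in sorted(set(sources)))
    return f"# {topic}\n\n" + "\n".join(summary) + f"\n\n## Sources\n{citations}"

print(research("quantum error correction", max_loops=2))
```

Each pass feeds the accumulated summary back into query generation, which is what lets later iterations target the gaps left by earlier ones.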

Features

  • Iterative Research Process: Generates queries, gathers results, summarizes, identifies gaps, and refines output.
  • Local LLM Integration: Utilizes local LLMs hosted by Ollama for research synthesis.
  • API Flexibility: Supports Tavily and Perplexity APIs for web search.
  • LangSmith Integration: Provides comprehensive tracing and monitoring of research processes.
  • MCP Resource Storage: Stores research results as MCP resources for persistent access.

Tools

  1. configure

    Configure research parameters such as maxLoops, llmModel, and searchApi.

  2. research

    Conduct research on a specified topic using web search and LLM synthesis.

  3. get_status

    Retrieve the current status of ongoing research.
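A client would invoke these tools through MCP's standard `tools/call` request (MCP speaks JSON-RPC 2.0). The tool names and parameter names below come from the list above; the argument values (the model name, the topic) are examples, not defaults documented by this server.

```python
import json

# Hedged sketch: building MCP tools/call requests for the three tools.
# Argument values are illustrative examples.

def tool_call(request_id, name, arguments):
    """Build a JSON-RPC 2.0 request for an MCP tool invocation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# 1. Configure research parameters.
configure = tool_call(1, "configure", {
    "maxLoops": 3,           # iterations of the research loop
    "llmModel": "llama3.2",  # example: any model pulled into Ollama
    "searchApi": "tavily",   # or "perplexity"
})

# 2. Conduct research on a topic.
research = tool_call(2, "research", {"topic": "quantum error correction"})

# 3. Retrieve the status of ongoing research.
status = tool_call(3, "get_status", {})

print(json.dumps(configure, indent=2))
```

The typical flow is configure once, call `research`, then poll `get_status` until the run completes and the markdown summary is available as an MCP resource.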