mcp-server-dify

A Model Context Protocol server for Dify AI, enabling LLMs to interact with Dify's chat completion capabilities.

mcp-server-dify is a Model Context Protocol server that connects large language models (LLMs) to Dify AI's chat completion API. By exposing Dify through the standardized MCP protocol, it lets MCP-capable clients send chat requests, maintain conversation context across turns, and receive streaming responses, which makes it suitable for applications that need dynamic, context-aware AI replies. The server is implemented in TypeScript and ships with a restaurant recommendation tool, 'meshi-doko', which uses Dify AI to suggest dining options based on location and budget.
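
At its core, a server like this wraps Dify's chat completion endpoint. The sketch below illustrates roughly what that call might look like in TypeScript; the `chat-messages` path, the `DIFY_API_KEY`/`DIFY_BASE_URL` environment variables, and the payload fields are assumptions drawn from Dify's public API, not necessarily this server's actual code.

```typescript
// Minimal sketch of forwarding a query to Dify's chat completion API.
// Assumes Dify's v1 `chat-messages` endpoint and DIFY_API_KEY / DIFY_BASE_URL
// environment variables; adjust to your Dify deployment.
const DIFY_BASE_URL = process.env.DIFY_BASE_URL ?? "https://api.dify.ai/v1";

interface DifyChatResponse {
  answer: string;
  conversation_id: string;
}

async function callDify(
  query: string,
  conversationId?: string
): Promise<DifyChatResponse> {
  const res = await fetch(`${DIFY_BASE_URL}/chat-messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DIFY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: {},
      query,
      response_mode: "blocking",             // use "streaming" for SSE responses
      conversation_id: conversationId ?? "", // reuse to keep conversation context
      user: "mcp-server-dify",
    }),
  });
  if (!res.ok) {
    throw new Error(`Dify request failed: ${res.status} ${res.statusText}`);
  }
  return (await res.json()) as DifyChatResponse;
}
```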

Features

  • Integration with Dify AI chat completion API
  • Restaurant recommendation tool (meshi-doko)
  • Support for conversation context
  • Streaming response support (see the sketch after this list)
  • TypeScript implementation
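
When streaming is enabled, Dify returns its answer incrementally. The sketch below shows one way to consume that stream, assuming the endpoint emits Server-Sent Events whose `data:` lines carry JSON chunks with an incremental `answer` field; the exact event shape is not confirmed by this listing and may vary between Dify versions.

```typescript
// Sketch of consuming Dify's streaming mode (response_mode: "streaming").
// Assumes an SSE body where each `data:` line is a JSON chunk containing
// an incremental `answer` string.
async function streamDify(
  query: string,
  onChunk: (text: string) => void
): Promise<void> {
  const baseUrl = process.env.DIFY_BASE_URL ?? "https://api.dify.ai/v1";
  const res = await fetch(`${baseUrl}/chat-messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DIFY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: {},
      query,
      response_mode: "streaming",
      user: "mcp-server-dify",
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Dify streaming request failed: ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by blank lines; keep any trailing partial event.
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";

    for (const event of events) {
      const dataLine = event.split("\n").find((l) => l.startsWith("data:"));
      if (!dataLine) continue;
      const payload = JSON.parse(dataLine.slice("data:".length).trim());
      if (typeof payload.answer === "string") onChunk(payload.answer);
    }
  }
}
```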

Tools

  1. meshi-doko

    A restaurant recommendation tool that queries Dify AI, accepting location, budget, and query parameters (see the registration sketch below).
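
A rough idea of how such a tool could be registered with the official MCP TypeScript SDK is sketched below. The parameter names, descriptions, placeholder version string, and the reuse of the hypothetical `callDify` helper from the earlier sketch are illustrative assumptions, not the server's actual schema.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "mcp-server-dify", version: "0.1.0" }, // placeholder version
  { capabilities: { tools: {} } }
);

// Advertise the meshi-doko tool and its (assumed) input schema.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "meshi-doko",
      description: "Restaurant recommendations via Dify AI",
      inputSchema: {
        type: "object",
        properties: {
          location: { type: "string", description: "Area to search in" },
          budget: { type: "string", description: "Budget per person" },
          query: { type: "string", description: "Free-form request" },
        },
        required: ["location", "budget", "query"],
      },
    },
  ],
}));

// Forward tool calls to Dify and return its answer as MCP text content.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name !== "meshi-doko") {
    throw new Error(`Unknown tool: ${request.params.name}`);
  }
  const { location, budget, query } = request.params.arguments as {
    location: string;
    budget: string;
    query: string;
  };
  // callDify is the hypothetical helper from the earlier sketch.
  const reply = await callDify(`${query} (location: ${location}, budget: ${budget})`);
  return { content: [{ type: "text", text: reply.answer }] };
});

// Serve over stdio, the usual transport for locally launched MCP servers.
const transport = new StdioServerTransport();
await server.connect(transport);
```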