MCP server for creating a Manim-powered GPT agent with Dify
Dify-manim-mcp-agent
📐 Dify Manim Agent Setup with Custom LLM
This guide walks you through setting up Dify, integrating a custom local LLM model, and creating a Manim Agent using the Dify platform for mathematical animation generation.
🛠️ Prerequisites
Ensure the following are available:
- Dify (v1.6.0 or later) installed and running via Docker. Do not run on localhost.
- A local or remote LLM endpoint (e.g., hosted via FastAPI)
- Access to the LLM model llm_en_v_1_3_1 or llm_zh_v_1_3_1 (or your preferred OpenAI-compatible model)
- A Python environment with Manim dependencies
🚀 Step-by-Step Setup
Step 1: Add LLM Model to Dify
- Click the user icon in the top-right corner of the interface.
- Select Settings from the dropdown menu.
- Navigate to the Model Provider tab.
- Install the OpenAI-API-compatible model provider if it is not already installed.
- In the OpenAI-API-compatible provider, click Add Model and configure:
  - Model Type: LLM
  - Model Name: llm_en_v_1_3_1 or llm_zh_v_1_3_1
  - Model display name: llm_en_v_1_3_1 or llm_zh_v_1_3_1
  - API endpoint URL: http://<your-server-ip>:5001/openai/v1
  - Model context size: 4096
  - API Key: leave blank if not needed
- Mark it as the default model if desired.
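Once the model is registered, a quick way to smoke-test it outside Dify is to send a chat-completions request to the same endpoint. This is an illustrative sketch, not part of the project: the model name and URL mirror the values above, and the prompt is arbitrary.

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for the configured model."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("http://<your-server-ip>:5001/openai/v1",
                   "llm_en_v_1_3_1", "What is the derivative of x^2?")
# Send with urllib.request.urlopen(req) once the placeholder IP is filled in.
```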
Step 2: Set up the MCP Server
- Start the MCP server by navigating to this project directory and running:
python server.py
- At the top of the Dify interface, navigate to the Tool tab.
- Click the MCP tab.
- Click Add MCP Server (HTTP) and configure:
  - Server URL: http://host.docker.internal:8000/mcp/
  - Name & Icon: Manim MCP Server
  - Server Identifier: manim-mcp-server
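Note that host.docker.internal is how Dify's Docker containers reach a server running on the host; from the host itself, the same MCP server answers on 127.0.0.1. A small sketch you can run on the host to check the server is listening before wiring it into Dify (the port matches the URL above):

```python
import urllib.error
import urllib.request

MCP_PORT = 8000  # matches the server URL configured above

def mcp_url(host: str) -> str:
    # Dify's containers use host.docker.internal; the host itself uses 127.0.0.1.
    return f"http://{host}:{MCP_PORT}/mcp/"

def listening(url: str, timeout: float = 2.0) -> bool:
    """True if anything answers at the URL, even with an HTTP error status."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True
    except (urllib.error.URLError, OSError):
        return False

# Expect True once server.py is running:
# listening(mcp_url("127.0.0.1"))
```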
Step 3: Create the MCP Agent in Dify
- On the main interface in Dify, click Import DSL file in the Create App block.
- Select Manim Agent.yml in this directory.
- Configure the agent workflow:
- Connect the LLM Node to use your configured model
- Map the MCP server tools to the appropriate workflow nodes
- Define system prompts for mathematical animation generation
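As an illustration of the last point, a starting system prompt might look like the following. The exact wording is an assumption, not something shipped with the project; adapt it to your model and tools.

```text
You are a Manim animation assistant. When the user describes a mathematical
concept, call the Manim MCP Server tools to generate and render a scene.
Keep scenes short, prefer MathTex for formulas, and return the path of the
rendered video. Ask for clarification if the request is ambiguous.
```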
✅ Final Notes
- Ensure the local LLM server is accessible to Dify.
- The MCP server must be running before using the Manim Agent.
- Debug logs can help confirm each stage is functioning as expected.
- Test with simple mathematical expressions before attempting complex animations.