Blender Evolving MCP generator
Model Context Protocol (MCP) server for integrating Blender-Ollama system with Cursor.
📚 Repository Index: see the repository index for the complete documentation listing and navigation guide.
Overview
This MCP server exposes the Blender-Ollama specialized agent system to Cursor, allowing you to:
- Maintain a constantly evolving workspace tailored to your own pipeline: use the Starter Pack deliverables as stable references while experimenting freely with learning workflows, AI generators, or new agent behaviors
- Create 3D scenes in Blender via natural language
- Query operation history from 11 specialized databases
- Access code patterns and performance metrics
- Use 14 specialized agents for different Blender domains
- Monitor all agent activities in real-time via a web viewport
Starter Pack (AI-ready)
Need a lightweight bundle for onboarding, demos, or source sharing? Regenerate the curated Starter Pack anytime:

```
python build_starter_pack.py
```

Use the `starter_pack/` folder as the upload artifact (Git, ZIP, etc.).
The Starter Pack keeps:
- Final/future-proof documentation (all `FINAL_*`, `COMPLETE_*`, and `READY*` files)
- AI generator entrypoints such as `finalize_vape_ad.py` and `render_final.py`
- Ready-to-run configs (`cursor_mcp_config_ready.json` and its Docker-ready equivalent)
Together these files showcase the full functionality: users can study the docs, run the generator scripts, or extend them with their own models without needing the rest of the experimental workspace.
Architecture
```
Cursor → MCP Server (stdio) → Agent Coordinator → Specialized Agents → Blender
                                      ↓
                            11 SQLite Databases
                                      ↓
              Agent Activity Tracker → Viewport Server (Web)
```
Agent Activity Viewport
Monitor all agent activities in real-time through a web-based dashboard:
- Start viewport: `start_viewport.bat` (Windows) or `./start_viewport.sh` (Linux/Mac)
- Access dashboard: http://localhost:5000
- See real-time agent status, operations, progress, and activity logs
- Full documentation:
Installation
Option 1: Direct Installation (Recommended for Development)
- Ensure the parent `blender-ollama` directory is accessible
- The server automatically imports from the parent directory
- Install dependencies: `pip install -r requirements.txt`
Option 2: Docker Installation (Recommended for Production)
- Prerequisites: Docker and Docker Compose installed
- Migrate from direct installation: `migrate-to-docker.bat` (Windows)
- Start services: `docker-start.bat` (Windows) or `./docker-start.sh` (Linux/Mac)
- Pull Ollama models: `docker-pull-models.bat`
- Check status: `docker-status.bat`
For detailed Docker setup, see:
- Migration guide (start here!)
- Quick start guide
- Complete documentation
Configuration
Cursor Configuration
Add to your Cursor settings (`.cursor/mcp.json` or the Cursor settings UI):

```json
{
  "mcpServers": {
    "blender-ollama": {
      "command": "python",
      "args": [
        "F:/mcp server/mcp_server.py"
      ],
      "env": {
        "OLLAMA_URL": "http://localhost:11434",
        "BLENDER_HOST": "localhost",
        "BLENDER_PORT": "9876"
      }
    }
  }
}
```
Available Tools
1. create_scene
Create a 3D scene in Blender from natural language description.
Parameters:
- `description` (required): Natural language description
- `field` (optional): Specialist agent to use
Example:
```json
{
  "name": "create_scene",
  "arguments": {
    "description": "Create a red cube on a blue plane",
    "field": "modeling"
  }
}
```
2. get_scene_info
Get current Blender scene information.
3. execute_blender_code
Execute Python code directly in Blender.
4. query_database
Query operation history, patterns, errors, or performance.
Parameters:
- `database`: Which database to query (or "all")
- `query_type`: "recent", "patterns", "errors", or "performance"
- `limit`: Maximum results
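Following the call convention shown for `create_scene`, a `query_database` request might look like this (the argument values are illustrative):

```json
{
  "name": "query_database",
  "arguments": {
    "database": "modeling",
    "query_type": "recent",
    "limit": 10
  }
}
```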
5. get_model_performance
Get LLM model performance metrics.
6. get_successful_patterns
Get successful code generation patterns.
7. list_specialists
List all available specialist agents.
8. get_development_proposals
Get development proposals based on current trends and innovations. Monitors trends in Blender, AI, video editing, fashion, furniture, TikTok, Instagram, gaming, and other project-relevant areas. Automatically adapts to your current project context.
Parameters:
- `focus_area` (optional): "general", "blender", "ai", "tech", "video", "fashion", "furniture", "tiktok", "instagram", "gaming", or "custom"
- `custom_topics` (optional): Array of custom topics for project-specific analysis
- `use_project_context` (optional): Use current project context to adapt proposals (default: true)
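A call with a custom focus might look like this (the topic values are illustrative):

```json
{
  "name": "get_development_proposals",
  "arguments": {
    "focus_area": "custom",
    "custom_topics": ["procedural furniture", "short-form video"],
    "use_project_context": true
  }
}
```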
9. set_project_context
Set your current project context so trend monitoring adapts to your specific project type.
Parameters:
- `project_type` (required): "fashion", "furniture", "video", "tiktok", "instagram", "gaming", "blender", "3d", "modeling", or "custom"
- `project_description` (optional): Description of your project
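For example, a `set_project_context` request could look like this (the values are illustrative):

```json
{
  "name": "set_project_context",
  "arguments": {
    "project_type": "furniture",
    "project_description": "Product renders for a chair catalog"
  }
}
```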
10. get_project_relevant_trends
Get trends automatically adapted to your current project context. No parameters are needed; the tool uses your previously set project context.
Available Resources
Resources provide read-only access to data:
- `blender://database/{field}/schema` - Database schema
- `blender://database/{field}/operations` - Recent operations
- `blender://database/{field}/patterns` - Code patterns
- `blender://database/{field}/errors` - Error patterns
- `blender://database/{field}/performance` - Performance metrics
- `blender://scene/current` - Current Blender scene
- `blender://agents/list` - Available agents
Available Prompts
- `create_modeling_scene` - Create modeling scene workflow
- `create_material_setup` - Material setup workflow
- `analyze_performance` - Performance analysis workflow
- `find_similar_operations` - Find similar operations
Testing
Using MCP Inspector
```
npx @modelcontextprotocol/inspector python "F:/mcp server/mcp_server.py"
```
Manual Testing
```
python "F:/mcp server/mcp_server.py"
```
Then send JSON-RPC requests via stdin.
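As a minimal sketch of such a request, the following builds a JSON-RPC 2.0 message asking the server to list its tools; `tools/list` is the standard method name from the MCP specification, and the `id` is arbitrary:

```python
import json

# JSON-RPC 2.0 request asking an MCP server to list its tools.
# "tools/list" is the standard MCP method name; the id is arbitrary.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

# The MCP stdio transport sends one JSON message per line on stdin.
line = json.dumps(request)
print(line)
```

Paste the printed line into the server's stdin and it should answer with a `tools` array on stdout.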
Specialized Agents
The server routes tasks to 10 specialized agents:
- Modeling - 3D modeling and mesh operations
- Shading - Materials and shaders (includes Sanctus Library procedural shaders support)
- Animation - Animation and keyframes
- VFX - Visual effects and simulations
- Motion Graphics - Text and motion graphics
- Rendering - Rendering and export
- Rigging - Armatures and rigging
- Sculpting - Digital sculpting
- Camera Operator - Camera operations
- Videography - Video editing
Databases
Each agent maintains its own SQLite database:
- `modeling_data.db`
- `shading_data.db`
- `animation_data.db`
- `vfx_data.db`
- `motiongraphics_data.db`
- `rendering_data.db`
- `rigging_data.db`
- `sculpting_data.db`
- `cameraoperator_data.db`
- `videography_data.db`
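Since these are plain SQLite files, you can also inspect one directly, outside the server. This sketch lists the tables in one database; the schema itself is not documented here, so discover it via `sqlite_master` or the `blender://database/{field}/schema` resource:

```python
import sqlite3

# Open one of the agent databases and list its tables.
# Note: sqlite3.connect creates an empty file if the database does not
# exist, so run this from the directory that actually holds the .db files.
conn = sqlite3.connect("modeling_data.db")
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
)]
print(tables)
conn.close()
```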
Troubleshooting
Import Errors
- Ensure the parent `blender-ollama` directory is accessible
- Check that `specialized_agents.py` and `data_collector.py` exist
- Verify the Python path includes the parent directory
Connection Errors
- Ensure Ollama is running on `localhost:11434`
- Ensure the Blender addon is running on `localhost:9876`
- Check firewall settings
Database Errors
- Ensure database files exist in the parent directory
- Check file permissions
- Verify database paths are correct
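A quick way to check the first and third points is a small script like the one below; the parent directory path and the subset of database names are assumptions, so adjust them to your layout:

```python
from pathlib import Path

# Hypothetical check: confirm expected agent databases exist in the
# parent blender-ollama directory. Adjust parent_dir to your layout.
parent_dir = Path("..")
expected = ["modeling_data.db", "shading_data.db", "rendering_data.db"]
missing = [name for name in expected if not (parent_dir / name).is_file()]
print("missing databases:", missing)
```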
Sanctus Library Integration
The Shading agent now supports the Sanctus Library procedural shaders collection, providing access to 690+ high-quality procedural materials.
Installation
1. Purchase and download Sanctus Library from: https://superhivemarket.com/products/sanctus-library-addon---procedural-shaders-collection-for-blender/
2. Install it in Blender:
   - Edit > Preferences > Add-ons
   - Click "Install..." and select the Sanctus Library .zip file
   - Enable the addon
3. Access materials through the Asset Browser (Shift+A) or use the Python API
Usage
Via MCP Server:
```json
{
  "name": "create_scene",
  "arguments": {
    "description": "Apply Sanctus Library metal material to cube",
    "field": "shading"
  }
}
```
Via Python Script:
```python
from sanctus_library_tools import apply_sanctus_material_to_object

# Apply a material to an object
result = apply_sanctus_material_to_object("Cube", "MetalMaterial")
```
Example Scripts:
- `use_sanctus_library.py` - Check installation and list materials
- `example_sanctus_materials.py` - Create a scene with Sanctus materials
Available Functions
The `sanctus_library_tools.py` module provides:
- `check_sanctus_installed()` - Check if the addon is installed
- `apply_sanctus_material_to_object()` - Apply a material to an object
- `get_sanctus_materials()` - List available materials
- `get_sanctus_material_categories()` - Get material categories
- Code generation functions for programmatic material application
References
Protocol & Integration
- Cursor MCP Server Guide - Building MCP servers for Cursor
- MCP Protocol Specification - Model Context Protocol specification
Blender Documentation
- Blender Python API - Official Blender Python API reference
- Blender Developer Documentation - Blender development handbook and architecture
- Blender Features Documentation - Design and implementation of Blender features
- Blender Projects - Blender source code, issues, and development platform
- Blender Developer Forum - Community forum for Blender developers
- Blender Release Notes - API changes and compatibility notes
License
Same as parent Blender-Ollama project.