MCP Task Management System
A complete Model Context Protocol (MCP) implementation with local LLM integration using .NET, MCPSharp, and Ollama. This system provides a conversational AI-powered task management application with a modern Blazor WASM UI that runs entirely locally.
Architecture
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│   Blazor WASM   │      │   Ollama LLM    │      │  Task Service   │
│   (Frontend)    │◄────►│   (Container)   │◄────►│   (Backend)     │
│   Port: 8080    │      │   Port: 11434   │      │   Port: 5000    │
│   MudBlazor UI  │      │  Local Models   │      │    REST API     │
└─────────────────┘      └─────────────────┘      └─────────────────┘
Quick Start with Docker (Recommended)
Prerequisites
- Docker Desktop installed and running
- Docker Compose (usually included with Docker Desktop)
- 8GB+ RAM (for LLM models)
- 10GB+ free disk space (for models and containers)
One-Command Setup
Windows:
start.bat
macOS/Linux:
./start.sh
Manual:
docker-compose up --build -d
Access Your Application
- Blazor UI: http://localhost:8080
- Dashboard: Task statistics and overview
- Task Management: Create, edit, delete, and filter tasks
- AI Chat: Natural language task management with Ollama LLM
- Dark Mode: Toggle between light and dark themes (dark mode default)
- Task Service API: http://localhost:5001
- Ollama API: http://localhost:11434 (quick connectivity checks below)
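To confirm the endpoints above are reachable, you can query them directly. The first command uses Ollama's standard /api/tags endpoint; the task-service route shown is a hypothetical path for illustration, so substitute whatever the API actually exposes:
# List the models installed in the Ollama container (standard Ollama endpoint)
curl http://localhost:11434/api/tags
# Probe the task service; /api/tasks is an assumed route, not confirmed by this README
curl http://localhost:5001/api/tasks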
First Run Notes
- Model Download: First run downloads the LLM model (~4GB)
- Setup Time: 5-10 minutes depending on internet speed
- Monitor Progress:
docker-compose logs -f ollama
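Once the download completes, you can confirm the model is installed inside the container (mcp-ollama is the container name used later in this README):
# List installed models inside the Ollama container
docker exec mcp-ollama ollama list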
Manual Setup (Alternative)
Prerequisites
- .NET 9.0 SDK
- Ollama (for local LLM models)
- Git
1. Install Ollama
macOS:
brew install ollama
Linux:
curl -fsSL https://ollama.ai/install.sh | sh
Windows: Download from ollama.ai
2. Start Ollama and Pull a Model
# Start Ollama service
ollama serve
# In another terminal, pull a model
ollama pull llama2:7b
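Before wiring up the application, it is worth sanity-checking the model with a direct completion request; this uses Ollama's standard /api/generate endpoint:
# Ask the model for a single completion (non-streaming)
curl http://localhost:11434/api/generate -d '{
  "model": "llama2:7b",
  "prompt": "Reply with one short sentence.",
  "stream": false
}'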
3. Clone and Build
git clone <your-repo>
cd McpServer
dotnet restore
dotnet build
4. Environment Configuration
Create a .env file based on env.example:
# Ollama model name (e.g., llama2:7b, mistral:7b, codellama:7b)
MCP_OLLAMA_MODEL=llama2:7b
# Path to your MCP server executable
MCP_SERVER_PATH=/path/to/McpServer.Server/bin/Debug/net9.0/McpServer.Server.exe
5. Set Environment Variables
export MCP_OLLAMA_MODEL="llama2:7b"
export MCP_SERVER_PATH="/path/to/McpServer.Server/bin/Debug/net9.0/McpServer.Server.exe"
6. Run the Application
# Start the MCP server
cd McpServer.Server
dotnet run
# In another terminal, run the client
cd McpServer.Client
dotnet run
How It Works
Task Management Flow
- User Input: Natural language task request via Blazor UI or API
- LLM Processing: Ollama receives the request and responds with structured JSON (see the sketch after this list)
- Task Operations: System performs CRUD operations on tasks
- Response: Results displayed in the modern UI
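As a rough sketch of the LLM-processing step, Ollama can be asked to answer in JSON via the format parameter of its /api/generate endpoint. The prompt this system actually sends is internal to the code, so the one below is purely illustrative:
# Request a structured-JSON reply from the model ("format": "json" is a standard Ollama option)
curl http://localhost:11434/api/generate -d '{
  "model": "llama2:7b",
  "prompt": "Convert this request into a task-operation JSON object: Add a task to buy groceries tomorrow",
  "format": "json",
  "stream": false
}'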
UI Features
- Dashboard: Real-time task statistics and overview
- Task Management: Full CRUD operations with filtering and sorting
- AI Chat: Natural language interface for task management
- Dark Mode: Beautiful dark theme (default) with a light-mode toggle
- Responsive: Works on desktop, tablet, and mobile devices
Example Interactions
User: "Add a task to buy groceries tomorrow"
LLM: { "operation": "create", "task": { "title": "Buy groceries", "dueDate": "tomorrow" } }
Result: Task created successfully and displayed in UI
User: "Show me all tasks"
LLM: { "operation": "read", "filter": "all" }
Result: Tasks displayed in table with filtering options
User: "Mark the grocery task as done"
LLM: { "operation": "update", "taskId": "123", "task": { "status": "completed" } }
Result: Task updated and UI refreshed
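Conceptually, the service then dispatches on the operation field of the returned JSON. As a toy illustration outside the application (jq is not part of this project):
# Extract the operation to decide which CRUD handler to invoke
echo '{ "operation": "create", "task": { "title": "Buy groceries" } }' | jq -r '.operation'
# prints: create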
Project Structure
McpServer/
├── McpServer.LLM/                 # LLM integration using Ollama HTTP API
├── McpServer.Client/              # Task management client and API
├── McpServer.Server/              # MCP server implementation
├── McpServer.Client.UI.Client/    # Blazor WASM UI with MudBlazor
│   ├── Pages/                     # Application pages (Dashboard, Tasks, Chat)
│   ├── Layout/                    # Main layout and navigation
│   ├── Components/                # Reusable UI components
│   ├── Models/                    # Data models and DTOs
│   └── wwwroot/                   # Static assets and configuration
├── docker-compose.yml             # Container orchestration
├── Dockerfile                     # Task service container
├── start.sh                       # Linux/macOS startup script
├── start.bat                      # Windows startup script
└── README.md                      # This file
Docker Commands
Management
# Start services
docker-compose up -d
# View logs
docker-compose logs -f
# Stop services
docker-compose down
# Restart services
docker-compose restart
# Rebuild and start
docker-compose up --build -d
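To see which containers are running and which ports they map (a standard Compose command, nothing project-specific):
# Show container status and port mappings
docker-compose ps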
Individual Services
# View Ollama logs
docker-compose logs -f ollama
# View task service logs
docker-compose logs -f task-service
# View UI logs
docker-compose logs -f blazor-ui
Troubleshooting
Docker Issues
- Docker not running: Start Docker Desktop
- Port conflicts: Check if ports 8080, 5000, 5001, or 11434 are in use
- Insufficient memory: Increase Docker memory limit (8GB+ recommended)
Ollama Issues
- Model not found: Run docker-compose logs ollama to check download progress
- Connection refused: Wait for Ollama to fully start
- Slow responses: The first request loads the model into memory
Application Issues
- UI not loading: Check if Blazor container is running
- API errors: Check task-service logs
- LLM errors: Verify Ollama is healthy (see the checks after this list)
- Dark mode not working: Clear browser cache and refresh
- Chat not responding: Check if Ollama model is loaded and responding
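For the LLM-related issues above, two quick checks against standard Ollama endpoints usually narrow things down:
# Is Ollama answering at all?
curl http://localhost:11434/api/version
# Is the expected model actually installed?
curl http://localhost:11434/api/tags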
Using Different Models
# Pull different models
docker exec mcp-ollama ollama pull mistral:7b
docker exec mcp-ollama ollama pull codellama:7b
# Update environment variable
export MCP_OLLAMA_MODEL="mistral:7b"
docker-compose restart task-service
Benefits of This Approach
Local & Private
- No cloud dependencies: Everything runs locally
- Data privacy: Your tasks never leave your machine
- Offline capable: Works without internet after setup
Performance
- Fast responses: Local LLM inference
- No API limits: Unlimited usage
- Customizable: Use any Ollama model
Developer Friendly
- Easy setup: One command with Docker
- Cross-platform: Works on Windows, Mac, Linux
- Extensible: Easy to add new features
Modern UI Experience
- Beautiful interface: Material Design with MudBlazor
- Dark mode: Easy on the eyes with toggle option
- Responsive design: Works on all device sizes
- Real-time updates: Instant feedback and auto-scrolling chat
Future Enhancements
- More LLM Models: Support for different model types
- Advanced Task Features: Recurring tasks, reminders, categories
- User Authentication: Multi-user support
- Mobile App: React Native or Flutter companion
- API Extensions: Webhook support, external integrations
- Task Automation: .NET Orleans integration for automated task processing
- Enhanced UI: More themes, customizations, and accessibility features
- Data Persistence: Database integration (SQLite, PostgreSQL)
- Export/Import: Task backup and sharing capabilities
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test with Docker
- Submit a pull request
Happy task managing with AI!