MCP Task Management System

A complete Model Context Protocol (MCP) implementation with local LLM integration using .NET, MCPSharp, and Ollama. This system provides a conversational AI-powered task management application with a modern Blazor WASM UI that runs entirely locally.

šŸ—ļø Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   Blazor WASM   │    │   Ollama LLM    │    │   Task Service  │
│   (Frontend)    │◄──►│   (Container)   │◄──►│   (Backend)     │
│   Port: 8080    │    │   Port: 11434   │    │   Port: 5000    │
│   MudBlazor UI  │    │   Local Models  │    │   REST API      │
└─────────────────┘    └─────────────────┘    └─────────────────┘
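All three boxes talk plain HTTP on localhost, so each piece can be exercised independently. As a rough illustration of a call from the Blazor WASM client to the Task Service (C#; the /api/tasks route and the TaskItem shape are hypothetical placeholders, not taken from this repo):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;

// From the Blazor WASM client, the Task Service is just an HTTP endpoint.
var http = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };

// Assumed route and shape -- adjust to the actual Task Service API.
var tasks = await http.GetFromJsonAsync<List<TaskItem>>("/api/tasks");
Console.WriteLine($"Loaded {tasks?.Count ?? 0} tasks");

// Hypothetical DTO; the real models live under McpServer.Client.UI.Client/Models.
public record TaskItem(string Id, string Title, string Status);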

🚀 Quick Start with Docker (Recommended)

Prerequisites

  • Docker Desktop installed and running
  • Docker Compose (usually included with Docker Desktop)
  • 8GB+ RAM (for LLM models)
  • 10GB+ free disk space (for models and containers)

One-Command Setup

Windows:

start.bat

macOS/Linux:

./start.sh

Manual:

docker-compose up --build -d

Access Your Application

  • 🌐 Blazor UI: http://localhost:8080
    • Dashboard: Task statistics and overview
    • Task Management: Create, edit, delete, and filter tasks
    • AI Chat: Natural language task management with Ollama LLM
    • Dark Mode: Toggle between light and dark themes (dark mode default)
  • 🔧 Task Service API: http://localhost:5001
  • 🤖 Ollama API: http://localhost:11434

First Run Notes

  • Model Download: First run downloads the LLM model (~4GB)
  • Setup Time: 5-10 minutes depending on internet speed
  • Monitor Progress: docker-compose logs -f ollama, or poll Ollama's API as sketched below
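
If you prefer not to tail logs, Ollama's /api/tags endpoint lists the models that have finished downloading, so you can poll it instead. A minimal C# sketch (the endpoint is part of Ollama's public HTTP API; the polling loop and model-name check are just illustrative):

using System;
using System.Net.Http;
using System.Threading.Tasks;

var http = new HttpClient();

// /api/tags returns the locally available models as JSON.
while (true)
{
    try
    {
        var json = await http.GetStringAsync("http://localhost:11434/api/tags");
        if (json.Contains("llama2"))      // crude check for the default model
        {
            Console.WriteLine("Model downloaded and available.");
            break;
        }
    }
    catch (HttpRequestException)
    {
        // Ollama container not accepting connections yet -- keep waiting.
    }
    await Task.Delay(TimeSpan.FromSeconds(10));
}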

🛠️ Manual Setup (Alternative)

Prerequisites

  • .NET 9.0 SDK
  • Ollama (for local LLM models)
  • Git

1. Install Ollama

macOS:

brew install ollama

Linux:

curl -fsSL https://ollama.ai/install.sh | sh

Windows: Download from ollama.ai

2. Start Ollama and Pull a Model

# Start Ollama service
ollama serve

# In another terminal, pull a model
ollama pull llama2:7b

3. Clone and Build

git clone <your-repo>
cd McpServer
dotnet restore
dotnet build

4. Environment Configuration

Create a .env file based on env.example:

# Ollama model name (e.g., llama2:7b, mistral:7b, codellama:7b)
MCP_OLLAMA_MODEL=llama2:7b

# Path to your MCP server executable (omit the .exe suffix on macOS/Linux)
MCP_SERVER_PATH=/path/to/McpServer.Server/bin/Debug/net9.0/McpServer.Server.exe

5. Set Environment Variables

export MCP_OLLAMA_MODEL="llama2:7b"
export MCP_SERVER_PATH="/path/to/McpServer.Server/bin/Debug/net9.0/McpServer.Server.exe"
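
The application reads these values from the process environment at startup. A minimal C# sketch of how that consumption might look (variable names match the .env above; the fallback default is illustrative):

using System;

// Model name, falling back to a default when unset (the fallback is illustrative).
var model = Environment.GetEnvironmentVariable("MCP_OLLAMA_MODEL") ?? "llama2:7b";

// The MCP server path has no meaningful default, so fail fast when it is missing.
var serverPath = Environment.GetEnvironmentVariable("MCP_SERVER_PATH")
    ?? throw new InvalidOperationException("MCP_SERVER_PATH is not set");

Console.WriteLine($"Using model '{model}', MCP server at '{serverPath}'");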

6. Run the Application

# Start the MCP server
cd McpServer.Server
dotnet run

# In another terminal, run the client
cd McpServer.Client
dotnet run

🎯 How It Works

Task Management Flow

  1. User Input: Natural language task request via Blazor UI or API
  2. LLM Processing: Ollama receives the request and responds with structured JSON (see the sketch after this list)
  3. Task Operations: System performs CRUD operations on tasks
  4. Response: Results displayed in the modern UI
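
Step 2 reduces to a single HTTP call. Ollama's /api/generate endpoint takes a model name and a prompt and, with "stream": false, returns one JSON object whose response field carries the model's text. A sketch in C# (the endpoint and fields are Ollama's documented API; the prompt wording is invented):

using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;

var http = new HttpClient();

// Ollama's generate endpoint; "stream": false yields a single JSON object.
var reply = await http.PostAsJsonAsync("http://localhost:11434/api/generate", new
{
    model = "llama2:7b",
    prompt = "Respond with JSON only. User request: add a task to buy groceries tomorrow",
    stream = false
});

using var doc = JsonDocument.Parse(await reply.Content.ReadAsStringAsync());

// The model's text (ideally the structured JSON operation) is in "response".
Console.WriteLine(doc.RootElement.GetProperty("response").GetString());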

UI Features

  • 📊 Dashboard: Real-time task statistics and overview
  • 📝 Task Management: Full CRUD operations with filtering and sorting
  • 💬 AI Chat: Natural language interface for task management
  • 🌙 Dark Mode: Beautiful dark theme (default) with light mode toggle
  • 📱 Responsive: Works on desktop, tablet, and mobile devices

Example Interactions

User: "Add a task to buy groceries tomorrow"
LLM: { "operation": "create", "task": { "title": "Buy groceries", "dueDate": "tomorrow" } }
Result: Task created successfully and displayed in UI

User: "Show me all tasks"
LLM: { "operation": "read", "filter": "all" }
Result: Tasks displayed in table with filtering options

User: "Mark the grocery task as done"
LLM: { "operation": "update", "taskId": "123", "task": { "status": "completed" } }
Result: Task updated and UI refreshed
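
The JSON shapes above map onto a small DTO that can be deserialized and dispatched on. A minimal C# sketch, assuming these exact field names (the real models in this repo may differ):

using System;
using System.Text.Json;

var json = """{ "operation": "update", "taskId": "123", "task": { "status": "completed" } }""";

var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
var op = JsonSerializer.Deserialize<TaskOperation>(json, options)!;

// Dispatch on the operation name, as the Task Service would for CRUD.
switch (op.Operation)
{
    case "create": Console.WriteLine($"Create '{op.Task?.Title}' due {op.Task?.DueDate}"); break;
    case "read":   Console.WriteLine($"List tasks (filter: {op.Filter})"); break;
    case "update": Console.WriteLine($"Set task {op.TaskId} to {op.Task?.Status}"); break;
    case "delete": Console.WriteLine($"Delete task {op.TaskId}"); break;
}

// Field names mirror the examples above; the actual DTOs may differ.
public record TaskPayload(string? Title, string? DueDate, string? Status);
public record TaskOperation(string Operation, string? TaskId, string? Filter, TaskPayload? Task);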

📁 Project Structure

McpServer/
├── McpServer.LLM/               # LLM integration using Ollama HTTP API
├── McpServer.Client/            # Task management client and API
├── McpServer.Server/            # MCP server implementation
├── McpServer.Client.UI.Client/  # Blazor WASM UI with MudBlazor
│   ├── Pages/                   # Application pages (Dashboard, Tasks, Chat)
│   ├── Layout/                  # Main layout and navigation
│   ├── Components/              # Reusable UI components
│   ├── Models/                  # Data models and DTOs
│   └── wwwroot/                 # Static assets and configuration
├── docker-compose.yml           # Container orchestration
├── Dockerfile                   # Task service container
├── start.sh                     # Linux/macOS startup script
├── start.bat                    # Windows startup script
└── README.md                    # This file

🔧 Docker Commands

Management

# Start services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

# Restart services
docker-compose restart

# Rebuild and start
docker-compose up --build -d

Individual Services

# View Ollama logs
docker-compose logs -f ollama

# View task service logs
docker-compose logs -f task-service

# View UI logs
docker-compose logs -f blazor-ui

šŸ› Troubleshooting

Docker Issues

  1. Docker not running: Start Docker Desktop
  2. Port conflicts: Check whether ports 8080, 5000/5001, or 11434 are already in use
  3. Insufficient memory: Increase Docker memory limit (8GB+ recommended)

Ollama Issues

  1. Model not found: run docker-compose logs ollama to check the download progress
  2. Connection refused: wait for Ollama to finish starting before sending requests
  3. Slow responses: the first request loads the model into memory, so it is the slowest

Application Issues

  1. UI not loading: Check if Blazor container is running
  2. API errors: Check task-service logs
  3. LLM errors: Verify Ollama is healthy
  4. Dark mode not working: Clear browser cache and refresh
  5. Chat not responding: Check if Ollama model is loaded and responding

Using Different Models

# Pull different models
docker exec mcp-ollama ollama pull mistral:7b
docker exec mcp-ollama ollama pull codellama:7b

# Update environment variable
export MCP_OLLAMA_MODEL="mistral:7b"
docker-compose restart task-service

🚀 Benefits of This Approach

Local & Private

  • No cloud dependencies: Everything runs locally
  • Data privacy: Your tasks never leave your machine
  • Offline capable: Works without internet after setup

Performance

  • Fast responses: Local LLM inference
  • No API limits: Unlimited usage
  • Customizable: Use any Ollama model

Developer Friendly

  • Easy setup: One command with Docker
  • Cross-platform: Works on Windows, macOS, and Linux
  • Extensible: Easy to add new features

Modern UI Experience

  • Beautiful interface: Material Design with MudBlazor
  • Dark mode: Easy on the eyes with toggle option
  • Responsive design: Works on all device sizes
  • Real-time updates: Instant feedback and auto-scrolling chat

🔮 Future Enhancements

  • More LLM Models: Support for different model types
  • Advanced Task Features: Recurring tasks, reminders, categories
  • User Authentication: Multi-user support
  • Mobile App: React Native or Flutter companion
  • API Extensions: Webhook support, external integrations
  • Task Automation: .NET Orleans integration for automated task processing
  • Enhanced UI: More themes, customizations, and accessibility features
  • Data Persistence: Database integration (SQLite, PostgreSQL)
  • Export/Import: Task backup and sharing capabilities

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test with Docker
  5. Submit a pull request

Happy task managing with AI! 🎯