# thiagarajanbe/mcp-server
This Model Context Protocol (MCP) server handles communication between clients and a set of tools and models, built on modern .NET and AI technologies.
## MCP Chat System - Quick Reference

### Prerequisites

- .NET 9 SDK
- Ollama (`brew install ollama`)
- The DeepSeek model for Ollama (`ollama pull deepseek-r1:1.5b`)
### 🚀 One-Click Startup

```bash
chmod +x *.sh
./quick-start.sh
```

### 📊 Check System Status

```bash
./check-status.sh
```

### 🛑 Stop All Services

```bash
./stop-stack.sh
```
### 🌐 Quick Access URLs
| Service | URL | Purpose |
|---|---|---|
| Chat Interface | http://localhost:5001 | Main React chat UI |
| API Docs | http://localhost:5001/swagger | REST API documentation |
| MCP Server | http://localhost:5000 | Tool execution server |
| Ollama API | http://localhost:11434 | AI model service |
### 💬 Example Chat Messages

Try these in the chat interface:

- "What time is it?"
- "Calculate 15 * 7"
- "What's the weather like?"
- "Tell me about this system"
- "Can you help me with 125 / 5?"
### 🔧 Troubleshooting

#### Service Not Starting?

```bash
# Check what's running
./check-status.sh

# Stop everything and restart
./stop-stack.sh
./quick-start.sh
```

#### Port Conflicts?

```bash
# Check what's using the ports
lsof -i :5000   # MCP Server
lsof -i :5001   # MCP Client API
lsof -i :11434  # Ollama
```

#### Model Issues?

```bash
# Re-pull the model
ollama pull deepseek-r1:1.5b

# List available models
ollama list
```
### 📁 Log Files

Logs are stored in the `./logs/` directory:

- `ollama_TIMESTAMP.log`
- `mcp_server_TIMESTAMP.log`
- `mcp_client_TIMESTAMP.log`
### 🏗️ Architecture

```
Browser → MCP Client API → MCP Server → Tools
                 ↘ Ollama API ↗
```

Each user gets an isolated session for secure multi-user conversations.
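The per-user isolation can be sketched as a session store keyed by a unique id. This is a minimal Python sketch of the idea only; the real client API implements it in .NET, and all names here are hypothetical:

```python
import uuid


class SessionStore:
    """Per-user conversation state, keyed by a unique session id (hypothetical sketch)."""

    def __init__(self):
        self._sessions = {}  # session_id -> list of chat messages

    def create_session(self):
        session_id = str(uuid.uuid4())
        self._sessions[session_id] = []
        return session_id

    def append(self, session_id, role, text):
        self._sessions[session_id].append({"role": role, "text": text})

    def history(self, session_id):
        # Each session only ever sees its own messages.
        return list(self._sessions[session_id])


store = SessionStore()
alice = store.create_session()
bob = store.create_session()
store.append(alice, "user", "What time is it?")
store.append(bob, "user", "Calculate 15 * 7")
```

Because every lookup goes through the session id, one user's history can never leak into another user's prompt context.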
### 🛠️ Manual Setup (Alternative)

If you prefer to start services individually:

```bash
# Manual startup sequence

# 1. Start Ollama
ollama serve

# 2. Pull the model (if needed)
ollama pull deepseek-r1:1.5b

# 3. Start the MCP Server
cd McpServer
dotnet run

# 4. Start the MCP Client API (in a new terminal)
cd McpClientApi
dotnet run

# 5. Start the frontend (in a new terminal)
cd Frontend
npm start
```
### Direct API Testing (Optional)

You can also test the server directly with HTTP requests:

```bash
# Check server health
curl http://localhost:5000/health

# Get available tools
curl -X POST http://localhost:5000/mcp/tools \
  -H "Content-Type: application/json"

# Execute the calculator tool
curl -X POST http://localhost:5000/mcp/call-tool \
  -H "Content-Type: application/json" \
  -d '{
    "name": "calculator",
    "arguments": {
      "operation": "multiply",
      "a": 15,
      "b": 7
    }
  }'
```
Expected responses:

```json
// Health check
{"status":"Healthy","timestamp":"2025-07-28T14:23:45.123Z"}

// Available tools
{
  "tools": [
    {
      "name": "calculator",
      "description": "Perform mathematical calculations",
      "inputSchema": {
        "type": "object",
        "properties": {
          "operation": {"type": "string", "enum": ["add", "subtract", "multiply", "divide"]},
          "a": {"type": "number"},
          "b": {"type": "number"}
        }
      }
    }
    // ... other tools
  ]
}

// Calculator execution
{
  "content": [
    {
      "type": "text",
      "text": "Result: 15 multiply 7 = 105"
    }
  ]
}
```
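The same interaction can be driven from Python instead of curl. This is a minimal sketch assuming the request and response shapes shown above; the HTTP call itself is commented out so the snippet stands alone:

```python
import json


def build_call_tool_payload(name, **arguments):
    """JSON body for POST /mcp/call-tool, matching the curl example above."""
    return json.dumps({"name": name, "arguments": arguments})


def parse_tool_result(body):
    """Pull the text out of the first content item of a call-tool response."""
    return json.loads(body)["content"][0]["text"]


payload = build_call_tool_payload("calculator", operation="multiply", a=15, b=7)
# With the stack running, send it with something like:
#   requests.post("http://localhost:5000/mcp/call-tool", data=payload,
#                 headers={"Content-Type": "application/json"})
sample = '{"content":[{"type":"text","text":"Result: 15 multiply 7 = 105"}]}'
print(parse_tool_result(sample))
```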
## Configuration

The server uses `appsettings.json` for configuration:

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning"
    }
  },
  "McpServer": {
    "Name": "ModernMcpServer",
    "Version": "1.0.0"
  },
  "Ollama": {
    "DefaultModel": "deepseek-r1:1.5b",
    "BaseUrl": "http://localhost:11434"
  }
}
```
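As an illustration of how these values get consumed, a startup script could read the `Ollama` section to decide which model to pull. A minimal Python sketch (the real server binds this file through .NET configuration; the inlined string stands in for reading the file):

```python
import json

# Inlined copy of the Ollama section from appsettings.json, for illustration.
appsettings = '''
{
  "Ollama": {
    "DefaultModel": "deepseek-r1:1.5b",
    "BaseUrl": "http://localhost:11434"
  }
}
'''

config = json.loads(appsettings)
model = config["Ollama"]["DefaultModel"]
base_url = config["Ollama"]["BaseUrl"]
print(f"pulling {model} from {base_url}")
```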
## Available Endpoints

- `GET /health` - Health check endpoint
- `POST /mcp/tools` - Get available tools
- `POST /mcp/call-tool` - Execute a specific tool
- `GET /metrics` - Prometheus metrics endpoint
## Development

### Adding New Tools

1. Create a new class in the `Tools/` directory
2. Implement tool methods with proper validation
3. Register the tool in `ToolRegistry.cs`
4. Add any configuration needed
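The registration pattern can be sketched in Python (hypothetical names throughout; the actual registry is the C# `ToolRegistry.cs` class):

```python
class ToolRegistry:
    """Minimal sketch of a name -> handler registry (hypothetical)."""

    def __init__(self):
        self._tools = {}

    def register(self, name, description, handler):
        self._tools[name] = {"description": description, "handler": handler}

    def list_tools(self):
        return [{"name": n, "description": t["description"]}
                for n, t in self._tools.items()]

    def call(self, name, **arguments):
        if name not in self._tools:
            # Descriptive error instead of an exception (see Error Handling below).
            return {"error": f"Unknown tool: {name}"}
        return self._tools[name]["handler"](**arguments)


def calculator(operation, a, b):
    ops = {"add": a + b, "subtract": a - b, "multiply": a * b,
           "divide": a / b if b != 0 else "division by zero"}
    return {"text": f"Result: {a} {operation} {b} = {ops[operation]}"}


registry = ToolRegistry()
registry.register("calculator", "Perform mathematical calculations", calculator)
```

A new tool only needs a handler and one `register` call; `list_tools` is what backs the `/mcp/tools` response.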
### Error Handling
- All tools return descriptive error messages rather than throwing exceptions
- JSON-RPC error responses follow MCP specification
- Logging uses stderr to avoid STDIO corruption
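The first rule above can be sketched as a wrapper that converts exceptions into descriptive tool results. A Python sketch of the idea (field names follow the tool-result shape shown earlier; the real implementation is in C#):

```python
def safe_execute(handler, **arguments):
    """Run a tool handler, converting any exception into a text result."""
    try:
        text = handler(**arguments)
        return {"content": [{"type": "text", "text": text}], "isError": False}
    except Exception as exc:
        # Descriptive message instead of a propagated exception.
        return {"content": [{"type": "text", "text": f"Error: {exc}"}], "isError": True}


def divide(a, b):
    return f"Result: {a} / {b} = {a / b}"


ok = safe_execute(divide, a=125, b=5)
failed = safe_execute(divide, a=1, b=0)
```

The client always gets a well-formed response it can show to the user, even when a tool misbehaves.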
## MCP Protocol Details
This server implements:
- Initialize: Handshake and capability exchange via HTTP
- Tools: Discovery and execution of available tools through REST endpoints
- Error Handling: Proper JSON-RPC error responses over HTTP
- Health Monitoring: Built-in health check endpoints
- Logging: Structured logging with configurable levels
## Support
This is a reference implementation demonstrating:
- Modern .NET patterns for MCP servers
- Enterprise-ready architecture with HTTP REST APIs
- MCP 2025-06-18 specification compliance
- Integration with standard MCP clients via HTTP
- Production-ready Ollama integration
The included Ollama MCP Client provides a complete bridge between this server and Ollama models using HTTP communication.
## Usage Examples

Once running, try these commands:

- `What time is it?` - Uses the Time tool
- `Calculate 15 * 7` - Uses the Calculator tool
- `What's the weather like?` - Uses the Weather tool
- `Tell me about this system` - Uses the SystemInfo tool
- `tools` - List available tools
- `models` - Switch between Ollama models
- `quit` - Exit the client
## Request Flow Diagram
Here's the detailed flow of how a user request is processed through the entire system:
```
┌────────────────────────────┐
│   COMPLETE REQUEST FLOW    │
└────────────────────────────┘

1. User Input
   ┌─────────────┐
   │    USER     │  "Calculate 15 * 7"
   └─────────────┘
          │
          ▼
2. Client Receives Input
   ┌─────────────┐
   │ MCP CLIENT  │  processes the user's message
   └─────────────┘
          │
          ▼
3. First LLM Call (Tool Decision)
   ┌─────────────┐   chat request    ┌─────────────┐
   │ MCP CLIENT  │──────────────────►│ OLLAMA LLM  │
   │             │◄──────────────────│ (deepseek)  │
   └─────────────┘   TOOL_CALL:      └─────────────┘
          │          {"tool": "calculator",
          │           "args": {...}}
          ▼
4. Tool Call Parsing
   The client extracts the tool name and arguments from the
   TOOL_CALL JSON in the LLM response.
          │
          ▼
5. MCP Server Request
   ┌─────────────┐  POST /mcp/call-tool   ┌─────────────┐
   │ MCP CLIENT  │───────────────────────►│ MCP SERVER  │
   └─────────────┘  {name: "calculator",  └─────────────┘
                     args: {operation:           │
                      "multiply",                ▼
                      a: 15, b: 7}}        tool execution:
                                            15 * 7 = 105
6. Tool Result Response
   ┌─────────────┐      HTTP response     ┌─────────────┐
   │ MCP CLIENT  │◄───────────────────────│ MCP SERVER  │
   └─────────────┘  {"content": [{"type": └─────────────┘
          │           "text", "text": "Result:
          │           15 multiply 7 = 105"}]}
          ▼
7. Second LLM Call (Natural Response)
   ┌─────────────┐ context + result  ┌─────────────┐
   │ MCP CLIENT  │──────────────────►│ OLLAMA LLM  │
   │             │◄──────────────────│ (deepseek)  │
   └─────────────┘ "15 * 7 = 105"    └─────────────┘
          │
          ▼
8. Final User Response
   ┌─────────────┐     display       ┌─────────────┐
   │    USER     │◄──────────────────│ MCP CLIENT  │
   └─────────────┘  "15 * 7 = 105"   └─────────────┘

FLOW SUMMARY
────────────
1. User Input  → MCP Client
2. MCP Client  → Ollama LLM (initial processing)
3. Ollama LLM  → Tool call instructions
4. MCP Client  → Parse tool call
5. MCP Client  → MCP Server (HTTP POST)
6. MCP Server  → Tool execution → Result
7. MCP Client  → Ollama LLM (result synthesis)
8. Ollama LLM  → Natural language response
9. MCP Client  → User (final output)
```
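The flow above can be condensed into one function. This is a Python sketch with the LLM and the MCP Server stubbed out; in the running system both stubs are HTTP calls, to Ollama and to `POST /mcp/call-tool` respectively:

```python
import json


def llm(prompt):
    """Stub for Ollama: the first call emits a TOOL_CALL, the second synthesizes."""
    if "Tool result:" in prompt:
        return "15 * 7 = 105"
    return 'TOOL_CALL: {"tool": "calculator", "args": {"operation": "multiply", "a": 15, "b": 7}}'


def call_mcp_tool(name, args):
    """Stub for the MCP Server's /mcp/call-tool endpoint."""
    if name == "calculator" and args["operation"] == "multiply":
        return f"Result: {args['a']} multiply {args['b']} = {args['a'] * args['b']}"
    return f"Error: unknown tool '{name}'"


def handle_user_message(message):
    reply = llm(message)                                    # steps 2-3: tool decision
    if reply.startswith("TOOL_CALL:"):
        call = json.loads(reply[len("TOOL_CALL:"):])        # step 4: parse tool call
        result = call_mcp_tool(call["tool"], call["args"])  # steps 5-6: execute tool
        reply = llm(f"{message}\nTool result: {result}")    # steps 7-8: synthesize
    return reply                                            # step 9: final output


print(handle_user_message("Calculate 15 * 7"))
```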