MCP Server - Customer & Order Analytics
A Model Context Protocol (MCP) server implementation that provides AI-powered tools for querying customer and order data. This FastAPI-based server enables LLM clients to access structured business data through three specialized tools for customer analytics and order management.
🚀 Overview
This MCP server acts as a bridge between AI language models and business data, allowing LLMs to:
- Count customer orders by month
- Retrieve recent customers by country
- Calculate customer lifetime value and spending patterns
The server follows the Model Context Protocol specification, making it compatible with various AI clients and frameworks that support MCP integration.
✨ Features
Core MCP Tools
- Order Count by Customer & Month - Track customer ordering patterns over time
- Recent Customers by Country - Discover new customers in specific markets
- Customer Total Spend - Analyze customer lifetime value and spending behavior
Technical Features
- FastAPI Framework - High-performance async API with automatic documentation
- Pydantic Validation - Type-safe data models with automatic validation
- JSON Data Backend - Simple file-based storage for customers and orders
- Comprehensive Logging - Full request/response logging for debugging
- Health Monitoring - Built-in health check endpoints
- CORS Support - Cross-origin requests enabled for web clients
🏗️ Architecture
mcp-server/
├── main.py # FastAPI application entry point
├── requirements.txt # Python dependencies
├── data/ # JSON data files
│ ├── customers.json # Customer records
│ └── orders.json # Order transactions
├── src/ # Source code
│ ├── models/ # Data models and schemas
│ │ ├── schemas.py # Pydantic models for validation
│ │ └── data_loader.py # Data loading utilities
│ └── tools/ # MCP tools implementation
│ └── mcp_tools.py # Tool implementations
├── tests/ # Test suite
├── llm_client/ # Example LLM client implementation
└── docs/ # Documentation
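Given the layout above, the Pydantic models in src/models/schemas.py might look roughly like this. This is a sketch, not the actual source; the field names are inferred from the tool outputs and the data-file descriptions later in this README:

```python
from datetime import datetime

from pydantic import BaseModel


class Customer(BaseModel):
    """A customer record as stored in data/customers.json."""

    id: int
    name: str
    country: str
    joinedAt: datetime


class Order(BaseModel):
    """An order record as stored in data/orders.json."""

    id: int
    customerId: int
    customerName: str
    date: datetime
    amount: float
```

Pydantic parses the ISO-8601 timestamp strings in the JSON files into `datetime` objects automatically, which is what makes the "type-safe data models with automatic validation" feature cheap to get.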
🛠️ Available Tools
1. get_order_count_by_customer_and_month
Count orders for a specific customer in a given calendar month.
Input:
{
"customerName": "John Doe",
"isoMonth": "2025-03"
}
Output:
{
"count": 2
}
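Under the hood this tool can be little more than a filtered count over orders.json. A minimal sketch of the matching logic (assuming order records carry a customerName and an ISO-8601 date, as described under Data Files below):

```python
def get_order_count_by_customer_and_month(orders, customer_name, iso_month):
    """Count orders for customer_name whose date falls in iso_month (YYYY-MM).

    Because the dates are ISO-8601 strings, a month match is a simple
    string-prefix check: "2025-03-20T12:00:00Z" starts with "2025-03".
    """
    return {
        "count": sum(
            1
            for order in orders
            if order["customerName"] == customer_name
            and order["date"].startswith(iso_month)
        )
    }
```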
2. list_recent_customers_by_country
Fetch the newest N customers from a specific country.
Input:
{
"country": "USA",
"limit": 5
}
Output:
{
"customers": [
{
"id": 1,
"name": "John Doe",
"country": "USA",
"joinedAt": "2025-01-15T10:30:00Z",
"totalSpend": 1250.50,
"orderCount": 3
}
]
}
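The core of this tool is a filter-and-sort over customers.json. A sketch of one plausible implementation (not the repository's actual code):

```python
def list_recent_customers_by_country(customers, country, limit):
    """Return the limit most recently joined customers from the given country."""
    matching = [c for c in customers if c["country"] == country]
    # ISO-8601 timestamps in a uniform format sort correctly as plain strings,
    # so no datetime parsing is needed here.
    matching.sort(key=lambda c: c["joinedAt"], reverse=True)
    return {"customers": matching[:limit]}
```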
3. get_customer_total_spend
Calculate total spending and order statistics for a customer.
Input:
{
"customerName": "John Doe"
}
Output:
{
"customerName": "John Doe",
"totalSpend": 1250.50,
"orderCount": 3,
"averageOrderValue": 416.83
}
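The aggregation behind this tool reduces to a sum, a count, and a rounded average, matching the sample output above. A sketch, assuming the same order-record shape as the other tools:

```python
def get_customer_total_spend(orders, customer_name):
    """Aggregate lifetime spend and order statistics for one customer."""
    amounts = [o["amount"] for o in orders if o["customerName"] == customer_name]
    total = sum(amounts)
    count = len(amounts)
    return {
        "customerName": customer_name,
        "totalSpend": round(total, 2),
        "orderCount": count,
        # Guard against division by zero for customers with no orders.
        "averageOrderValue": round(total / count, 2) if count else 0.0,
    }
```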
🚦 Quick Start
Prerequisites
- Python 3.9+
- pip or poetry for dependency management
Installation
1. Clone the repository:

   git clone <repository-url>
   cd mcp-server

2. Install dependencies:

   pip install -r requirements.txt

3. Start the server:

   python main.py
The server will start on http://localhost:8000
Production Deployment
uvicorn main:app --host 0.0.0.0 --port 8000
📡 API Endpoints
Health Check
- GET / - Basic health check
- GET /health - Detailed health status
MCP Protocol Endpoints
- GET /tools/list - List available MCP tools
- POST /tools/call - Execute a specific tool
Example Usage
List available tools:
curl http://localhost:8000/tools/list
Call a tool:
curl -X POST http://localhost:8000/tools/call \
-H "Content-Type: application/json" \
-d '{
"name": "get_order_count_by_customer_and_month",
"arguments": {
"customerName": "John Doe",
"isoMonth": "2025-03"
}
}'
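The same call can be made from Python with nothing beyond the standard library. The helper names below are hypothetical, shown only to illustrate the request shape the /tools/call endpoint expects:

```python
import json
import urllib.request


def build_call(name, arguments):
    """Build the JSON body the /tools/call endpoint expects."""
    return {"name": name, "arguments": arguments}


def call_tool(name, arguments, base_url="http://localhost:8000"):
    """POST a tool invocation to the MCP server and return the parsed JSON response."""
    request = urllib.request.Request(
        f"{base_url}/tools/call",
        data=json.dumps(build_call(name, arguments)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

With the server running, `call_tool("get_customer_total_spend", {"customerName": "John Doe"})` would return the same JSON shown in the tool examples above.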
🧪 Testing
Run the test suite:
pytest tests/
Run with coverage:
pytest tests/ --cov=src --cov-report=html
🔧 Configuration
Environment Variables
- HOST - Server host (default: 0.0.0.0)
- PORT - Server port (default: 8000)
- LOG_LEVEL - Logging level (default: INFO)
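These variables can be resolved with plain environment lookups; a small helper like the following (hypothetical, shown for illustration) keeps the defaults in one place:

```python
import os


def load_config(env=None):
    """Resolve server settings from environment variables, falling back to defaults."""
    if env is None:
        env = os.environ
    return {
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8000")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }
```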
Data Files
The server reads from JSON files in the data/ directory:
- customers.json - Customer records with id, name, country, joinedAt
- orders.json - Order records with id, customerId, customerName, date, amount
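Reading these files needs nothing beyond the json module; a sketch of what src/models/data_loader.py might do:

```python
import json
from pathlib import Path


def load_records(data_dir, filename):
    """Read a JSON array of records (customers or orders) from the data directory."""
    path = Path(data_dir) / filename
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    if not isinstance(records, list):
        raise ValueError(f"{path} should contain a JSON array of records")
    return records
```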
🤝 Integration
LLM Client Example
This repository includes an example LLM client (llm_client/) that demonstrates how to integrate with the MCP server using Mistral AI. The client:
- Connects to the MCP server
- Uses Mistral AI for natural language processing
- Automatically calls appropriate MCP tools based on user questions
- Returns AI-generated answers with real data
Using with Other AI Frameworks
The server is compatible with any system that supports the Model Context Protocol, including:
- Claude Desktop
- Custom AI applications
- Other MCP-compatible frameworks
📚 Documentation
- Detailed setup and usage instructions (see docs/)
- Technical architecture and design decisions (see docs/)
- API Documentation - Interactive API docs at http://localhost:8000/docs (when the server is running)
🛡️ Error Handling
The server provides comprehensive error handling:
- 400 Bad Request - Invalid input parameters
- 404 Not Found - Unknown tool names
- 500 Internal Server Error - Server-side errors
All errors follow the MCP protocol format with detailed error messages.
🔄 Development
Adding New Tools
- Define the tool schema in src/tools/mcp_tools.py
- Implement the tool logic
- Add input validation using Pydantic models
- Update the tool definitions list
- Add comprehensive tests
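The steps above could be wired together with a small registry in src/tools/mcp_tools.py. The decorator pattern and the example tool below are hypothetical, shown only to illustrate one way of keeping a tool's schema and handler together:

```python
# Registry mapping tool names to their JSON schemas and handlers.
TOOL_DEFINITIONS = []
TOOL_HANDLERS = {}


def register_tool(name, description, input_schema):
    """Decorator that adds a handler and its schema to the tool registry."""
    def decorator(func):
        TOOL_DEFINITIONS.append(
            {"name": name, "description": description, "inputSchema": input_schema}
        )
        TOOL_HANDLERS[name] = func
        return func
    return decorator


@register_tool(
    "get_order_total_by_country",  # hypothetical new tool, not in the repository
    "Sum order amounts for all customers in a country.",
    {
        "type": "object",
        "properties": {"country": {"type": "string"}},
        "required": ["country"],
    },
)
def get_order_total_by_country(orders, customers, country):
    customer_ids = {c["id"] for c in customers if c["country"] == country}
    total = sum(o["amount"] for o in orders if o["customerId"] in customer_ids)
    return {"total": round(total, 2)}
```

With this shape, /tools/list can serve TOOL_DEFINITIONS directly and /tools/call can dispatch through TOOL_HANDLERS.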
Code Style
- Follow PEP 8 guidelines
- Use type hints throughout
- Add docstrings for all public methods
- Maintain test coverage above 80%