HDX MCP Server
A Model Context Protocol (MCP) server that provides AI assistants with seamless access to the Humanitarian Data Exchange (HDX) API, exposing the wide array of humanitarian data hosted on HDX.
This server automatically generates MCP tools from the HDX OpenAPI specification and includes custom tools for enhanced functionality.
Features
- Automatic OpenAPI Integration: Automatically generates MCP tools from the HDX HAPI OpenAPI specification
- Custom Tools: Additional hand-crafted tools for common HDX operations
- Secure Authentication: Environment-based API key management with proper security practices
- Dual Transport Support: Both stdio and HTTP transports for local and remote usage
- Smart Categorization: Intelligent tagging and categorization of API endpoints
- Security Best Practices: Following MCP security guidelines with proper input validation
- Comprehensive Testing: Full test suite with unit and integration tests
- Extensible Design: Easy to add custom tools alongside auto-generated ones
Caveats
The HDX API offers a very rich source of humanitarian data, but it is complex. Data coverage differs per country, as shown in the coverage table here. The server prompt instructs calling LLMs to check data coverage, but note that in this release aggregate queries such as 'What's the population of country X?' or 'How many conflict events were there last year in country Y?' will be challenging for countries that only have data at a granular level (e.g. Afghanistan conflict events). The LLM would have to sum this data itself, which may be error prone. To resolve this in future work, either the API can provide aggregate values or the data needs to be extracted and processed.
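To make the aggregation caveat concrete, here is a small illustrative Python snippet; the field names are hypothetical and the real HAPI responses are paginated, which is exactly what makes LLM-side summation error prone:

# Hypothetical rows as they might be returned for a sub-national dataset;
# the real HDX HAPI field names and pagination are not shown here.
granular_rows = [
    {"admin1_name": "Region A", "events": 12},
    {"admin1_name": "Region B", "events": 7},
    {"admin1_name": "Region C", "events": 9},
]

# A national total only exists if every region (and every result page) was
# fetched and summed correctly - the step that is error prone for an LLM.
national_total = sum(row["events"] for row in granular_rows)
print(national_total)  # 28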
Quick Start
Get up and running in 2 minutes:
# 1. Install UV (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# 2. Clone and setup
git clone <repository-url>
cd hdx-mcp2
uv sync --all-extras --dev
# 3. Configure your HDX API key
cp env.example .env
nano .env # Edit HDX_API_KEY and HDX_APP_EMAIL
# 4. Run the server
uv run hdx-mcp-server
That's it! The server is now running and ready to serve HDX data via MCP.
Installation
Prerequisites
- Python 3.8 or newer (the CI pipeline tests 3.8-3.12)
- The UV package manager (installation covered below)
- A valid HDX API key
Setup Instructions
UV is a fast Python package manager that automatically handles virtual environments and dependency management. No need to manually create or activate virtual environments - uv does this automatically!
1. Install UV (if not already installed):
# On macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
2. Clone the repository:
git clone <repository-url>
cd hdx-mcp2
3. Install dependencies:
# Install all dependencies including development tools
uv sync --all-extras
# Or install only production dependencies
uv sync
4. Configure your environment:
# Copy environment template
cp env.example .env
# Edit .env with your HDX API key and settings
nano .env
Important Environment Variables:
- HDX_API_KEY - Your HDX API key (required)
- HDX_RATE_LIMIT_REQUESTS - Max requests per period (default: 10)
- HDX_RATE_LIMIT_PERIOD - Rate limit period in seconds (default: 60.0)
- HDX_TIMEOUT - Request timeout in seconds (default: 30.0)
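For illustration only, the two rate-limit variables describe a budget of at most HDX_RATE_LIMIT_REQUESTS calls per rolling window of HDX_RATE_LIMIT_PERIOD seconds. A minimal sliding-window sketch of that behaviour (not the server's actual implementation) might look like this:

import time
from collections import deque

class SlidingWindowLimiter:
    """Illustrative only: allow at most max_requests calls per period seconds."""

    def __init__(self, max_requests: int = 10, period: float = 60.0):
        self.max_requests = max_requests
        self.period = period
        self._timestamps = deque()

    def acquire(self) -> None:
        now = time.monotonic()
        # Drop request timestamps that have aged out of the rolling window.
        while self._timestamps and now - self._timestamps[0] > self.period:
            self._timestamps.popleft()
        if len(self._timestamps) >= self.max_requests:
            # Wait until the oldest request leaves the window, then retire it.
            time.sleep(max(0.0, self.period - (now - self._timestamps[0])))
            self._timestamps.popleft()
        self._timestamps.append(time.monotonic())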
Usage
Running the Server
Option 1: Using UV (Development)
# Stdio transport (default) - for local MCP clients
uv run hdx-mcp-server
# HTTP transport - for remote access
uv run hdx-mcp-server --transport http
# With verbose logging
uv run hdx-mcp-server --verbose
# HTTP with custom configuration
uv run hdx-mcp-server --transport http --host 0.0.0.0 --port 9000 --verbose
Option 2: Using Docker (Recommended for Claude Desktop)
# Build the Docker image
docker build -t hdx-mcp-server .
# Run with stdio transport (for MCP clients)
docker run --rm -i --env-file .env hdx-mcp-server
# Run with HTTP transport
docker run --rm -p 8000:8000 --env-file .env hdx-mcp-server --transport http --host 0.0.0.0
# Using docker-compose (includes proper volume mounts)
docker-compose up
Command Line Options
usage: uv run hdx-mcp-server [-h] [--transport {stdio,http}] [--host HOST] [--port PORT] [--verbose]
HDX MCP Server - Humanitarian Data Exchange API Server
options:
-h, --help show this help message and exit
--transport {stdio,http}
Transport method to use (default: stdio)
--host HOST Host to bind HTTP server to (default: localhost)
--port PORT Port to bind HTTP server to (default: 8000)
--verbose, -v Enable verbose logging
Examples:
uv run hdx-mcp-server # Run with stdio transport
uv run hdx-mcp-server --transport http # Run with HTTP transport
uv run hdx-mcp-server --transport http --port 9000 # HTTP on custom port
Using with Claude Desktop
To use this HDX MCP server with Claude Desktop, you need to configure it in Claude's MCP settings.
Setup Steps
1. Complete the basic setup above - ensure the server works with uv run hdx-mcp-server.
2. Get the absolute path to your project:
cd /path/to/your/hdx-mcp2
pwd  # Copy this path
3. Configure Claude Desktop:
Open Claude Desktop settings and add to your MCP servers configuration:
Option A: Using Docker (Recommended)
{
  "hdx-humanitarian-data": {
    "command": "docker",
    "args": ["run", "--rm", "-i", "--env-file", "/absolute/path/to/your/hdx-mcp2/.env", "hdx-mcp-server"]
  }
}
Option B: Using UV
{
  "hdx-humanitarian-data": {
    "command": "uv",
    "args": ["run", "hdx-mcp-server"],
    "cwd": "/absolute/path/to/your/hdx-mcp2"
  }
}
Replace /absolute/path/to/your/hdx-mcp2 with the actual path from step 2.
4. Ensure your .env file is configured with your HDX API key:
HDX_API_KEY=your_hdx_api_key_here
HDX_BASE_URL=https://hapi.humdata.org
5. Restart Claude Desktop to load the new MCP server.
Verification
Once configured, you should see the HDX server appear in Claude's MCP servers list. You can test it by asking Claude to:
- "List available HDX tools"
- "Get information about Afghanistan using HDX data"
- "Search for refugee data in Syria"
Troubleshooting
If Claude Desktop can't connect to the server:
- Check the path - ensure the cwd path is correct and absolute
- Verify uv is available - make sure uv is installed and in your PATH
- Test standalone - verify uv run hdx-mcp-server works from the project directory
- Check logs - Claude Desktop logs may show connection errors
- Verify environment - ensure your .env file is in the project root with a valid HDX API key
Alternative: HTTP Transport
If stdio transport doesn't work, you can also run the server in HTTP mode and configure Claude to connect via HTTP:
1. Start the server in HTTP mode:
uv run hdx-mcp-server --transport http --host 127.0.0.1 --port 9001
2. Configure Claude Desktop for HTTP:
{
  "hdx-humanitarian-data": {
    "url": "http://127.0.0.1:9001/mcp"
  }
}
Available Tools
The server provides tools automatically generated from the HDX HAPI OpenAPI specification, plus custom tools for enhanced functionality.
Auto-Generated Tools (Examples)
All HDX HAPI endpoints are automatically converted to MCP tools:
- Metadata Tools: get_locations, get_admin1, get_admin2, get_organizations, etc.
- Affected People Tools: get_humanitarian_needs, get_idps, get_refugees, etc.
- Climate Tools: get_rainfall
- Coordination Tools: get_conflict_events, get_funding, get_operational_presence, etc.
- Food Security Tools: get_food_prices, get_food_security, get_poverty_rate
- Geography Tools: get_baseline_population
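The exact set of generated tools depends on the OpenAPI specification loaded at startup. As a sketch (not part of this repository), you can inspect them with the FastMCP client, assuming the server was started separately with the HTTP transport on the default port:

import asyncio

from fastmcp import Client

async def main() -> None:
    # Assumes the server is already running:  uv run hdx-mcp-server --transport http
    async with Client("http://localhost:8000/mcp") as client:
        for tool in await client.list_tools():
            print(tool.name)

asyncio.run(main())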
Custom Tools
hdx_server_info
Get information about the HDX MCP server instance.
Returns:
{
"server_name": "HDX MCP Server",
"version": "1.0.0",
"base_url": "https://hapi.humdata.org/api/v2",
"total_endpoints": 25,
"available_tools": 28,
"description": "MCP server for Humanitarian Data Exchange API"
}
hdx_get_dataset_info
Get detailed information about a specific HDX dataset.
Parameters:
- dataset_hdx_id (string): The HDX dataset identifier
Example:
{
"dataset_hdx_id": "dataset-123",
"status": "success",
"dataset": {
"dataset_hdx_title": "Afghanistan - Baseline Population",
"hdx_provider_name": "OCHA",
"...": "..."
}
}
hdx_search_locations
Search for locations (countries) in the HDX system.
Parameters:
- name_pattern (optional string): Pattern to match location names
- has_hrp (optional boolean): Filter for locations with Humanitarian Response Plans
Example:
{
"status": "success",
"locations": [
{
"code": "AFG",
"name": "Afghanistan",
"has_hrp": true
}
],
"count": 1,
"filters_applied": {
"name_pattern": "Afghanistan",
"has_hrp": true
}
}
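As a sketch of how a client might invoke this tool with both filters set, again assuming the server is running with the HTTP transport on the default port:

import asyncio

from fastmcp import Client

async def main() -> None:
    async with Client("http://localhost:8000/mcp") as client:
        result = await client.call_tool(
            "hdx_search_locations",
            {"name_pattern": "Afghanistan", "has_hrp": True},
        )
        print(result)

asyncio.run(main())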
Tool Categories and Tags
Tools are automatically categorized with relevant tags:
- Metadata Tools: metadata, reference
- Affected People Tools: affected-people, humanitarian
- Climate Tools: climate, environmental
- Coordination Tools: coordination, humanitarian
- Food Security Tools: food-security, nutrition, poverty
- Geography Tools: geography, infrastructure, population
- Utility Tools: utility, system
All tools also include the global tags: hdx, humanitarian, data
Configuration
Environment Variables
The server is configured via environment variables. Copy env.example to .env and customize:
| Variable | Required | Default | Description |
|---|---|---|---|
| HDX_API_KEY | Yes | - | Your HDX API key |
| HDX_BASE_URL | No | https://hapi.humdata.org/api/v2 | HDX API base URL |
| HDX_OPENAPI_URL | No | https://hapi.humdata.org/openapi.json | OpenAPI spec URL |
| HDX_TIMEOUT | No | 30.0 | HTTP request timeout (seconds) |
| HDX_RATE_LIMIT_REQUESTS | No | 10 | Max requests per rate-limit period |
| HDX_RATE_LIMIT_PERIOD | No | 60.0 | Rate limit period (seconds) |
| HDX_APP_NAME | No | hdx-mcp-server | Application name for HDX |
| HDX_APP_EMAIL | No | assistant@example.com | Contact email for the HDX API user |
| MCP_HOST | No | localhost | Host for HTTP transport |
| MCP_PORT | No | 8000 | Port for HTTP transport |
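As an illustration of how these variables and defaults fit together, a hypothetical loader (not the server's actual configuration code) could read them like this:

import os

def load_hdx_settings() -> dict:
    # HDX_API_KEY is the only required variable; everything else has a default.
    api_key = os.environ.get("HDX_API_KEY")
    if not api_key:
        raise RuntimeError("Required environment variable HDX_API_KEY is not set")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("HDX_BASE_URL", "https://hapi.humdata.org/api/v2"),
        "openapi_url": os.environ.get("HDX_OPENAPI_URL", "https://hapi.humdata.org/openapi.json"),
        "timeout": float(os.environ.get("HDX_TIMEOUT", "30.0")),
        "app_name": os.environ.get("HDX_APP_NAME", "hdx-mcp-server"),
        "app_email": os.environ.get("HDX_APP_EMAIL", "assistant@example.com"),
        "mcp_host": os.environ.get("MCP_HOST", "localhost"),
        "mcp_port": int(os.environ.get("MCP_PORT", "8000")),
    }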
Complete Configuration Example
See env.example for a complete configuration template with detailed comments:
# View the configuration template
cat env.example
# Copy and edit for your environment
cp env.example .env
nano .env
Excluded Endpoints
The server automatically excludes the following endpoints:
- /api/v2/encode_app_identifier - Internal utility for generating app identifiers
Adding Custom Tools
To add your own tools alongside the auto-generated ones, modify the _register_custom_tools method in hdx_mcp_server.py:
# Dict and Any come from the typing module (from typing import Any, Dict)
def _register_custom_tools(self):
    """Register custom tools alongside the OpenAPI-derived ones."""

    @self.mcp.tool("my_custom_tool")
    async def my_custom_tool(param1: str, param2: int = 10) -> Dict[str, Any]:
        """
        Description of my custom tool.

        Args:
            param1: Description of parameter 1
            param2: Description of parameter 2 with default value

        Returns:
            Dictionary containing the results
        """
        # Your custom tool implementation
        return {"result": "success", "data": param1}
Development
Development Setup with UV
# Install all dependencies including development tools
uv sync --all-extras --dev
# Install production dependencies only
uv sync
# Install pre-commit hooks (recommended)
uv run pre-commit install
Development Commands
# Run tests
uv run pytest
# Run tests with coverage
uv run pytest --cov=src --cov-report=html --cov-report=term-missing
# Run specific test categories
uv run pytest tests/unit/ # Unit tests
uv run pytest tests/integration/ # Integration tests
uv run pytest tests/security/ # Security tests
# Code quality checks
uv run black src/ tests/ # Format code
uv run isort src/ tests/ # Sort imports
uv run flake8 src/ tests/ # Lint code
uv run mypy src/ # Type checking
uv run bandit -r src/ # Security scanning
# Run all quality checks
uv run pre-commit run --all-files
# Clean cache files
find . -type d -name "__pycache__" -exec rm -rf {} +
find . -type f -name "*.pyc" -delete
rm -rf .pytest_cache .mypy_cache .coverage htmlcov/
Project Structure
hdx-mcp-server/
├── src/hdx_mcp_server/    # Main source code
│   ├── __init__.py        # Package initialization
│   ├── __main__.py        # CLI entry point
│   └── server.py          # Main server implementation
├── tests/                 # Test suite
│   ├── unit/              # Unit tests
│   ├── integration/       # Integration tests
│   ├── security/          # Security tests
│   └── conftest.py        # Shared test fixtures
├── .github/workflows/     # GitHub Actions CI/CD
└── pyproject.toml         # Project configuration
Testing
Running Tests
# All tests
uv run pytest
# Specific test categories
uv run pytest tests/unit/ -v # Unit tests only
uv run pytest tests/integration/ -v # Integration tests only
uv run pytest tests/security/ -v # Security tests only
# With coverage reporting
uv run pytest --cov=src --cov-report=html
Test Categories
- Unit Tests (tests/unit/): Test individual components in isolation
- Integration Tests (tests/integration/): Test interactions with external APIs
- Security Tests (tests/security/): Test security features and protections
Test Coverage
The test suite includes:
- Unit Tests: Server initialization, configuration, authentication
- Integration Tests: Real API interaction (when API key available)
- Security Tests: API key protection, input validation, error handling
- Edge Case Tests: Malformed specs, network errors, timeouts
Manual Testing
Test the server manually with curl (HTTP transport):
# Start server
uv run hdx-mcp-server --transport http
# Test server info tool
curl -X POST http://localhost:8000/mcp \
-H "Content-Type: application/json" \
-d '{"method": "tools/call", "params": {"name": "hdx_server_info", "arguments": {}}}'
Security Considerations
This server follows MCP security best practices:
Authentication & Authorization
- ā API keys stored in environment variables, never hardcoded
- ā API keys not logged or exposed in error messages
- ā Proper HTTP client configuration with timeouts
- ā Base64 encoding for app identifiers as required by HDX
Input Validation
- ā FastMCP handles input schema validation automatically
- ā Custom tools include proper type hints and validation
- ā Error handling prevents information leakage
Network Security
- ā HTTPS-only communication with HDX API
- ā Configurable timeouts to prevent hanging connections
- ā Proper error handling for network failures
Data Handling
- ā No persistent data storage
- ā Proper cleanup of HTTP connections
- ā Graceful shutdown handling
Troubleshooting
Common Issues
1. "Required environment variable HDX_API_KEY is not set"
   - Ensure you have copied env.example to .env
   - Verify your HDX API key is correctly set in .env
2. "Failed to load OpenAPI specification"
   - Check your internet connection
   - Verify the HDX API is accessible: curl https://hapi.humdata.org/openapi.json
3. HTTP transport not accessible
   - Check if the port is already in use
   - Verify firewall settings if accessing remotely
   - Use --host 0.0.0.0 for external access
4. Authentication errors
   - Verify your HDX API key is valid
   - Check if your account has the necessary permissions
Logging
Enable verbose logging for debugging:
uv run hdx-mcp-server --verbose
This will show detailed information about:
- Server initialization
- OpenAPI spec loading
- Tool registration
- HTTP requests and responses
- Error details
Contributing
Development Setup
1. Install dependencies:
uv sync --all-extras               # Install all dependencies including dev tools
uv run pre-commit install          # Install pre-commit hooks
2. Development workflow:
uv run black src/ tests/           # Format code
uv run flake8 src/ tests/          # Lint code
uv run pytest                      # Run tests
uv run pre-commit run --all-files  # Run all checks
Code Quality & CI/CD
This project includes comprehensive quality controls:
Code Quality Tools
- Black: Code formatting
- isort: Import sorting
- flake8: Linting with security and complexity extensions
- mypy: Type checking
- bandit: Security vulnerability scanning
- pytest: Testing with coverage reporting
GitHub Actions
- CI Pipeline: Tests on Python 3.8-3.12
- Code Quality: Automated formatting, linting, and type checking
- Security Scanning: Vulnerability detection and security tests
- Integration Tests: Full API integration testing
Pre-commit Hooks
Automatically run quality checks before each commit:
uv run pre-commit install # One-time setup
git commit # Hooks run automatically
See the repository's development guidelines for more detail.
Adding New Endpoints
New HDX API endpoints are automatically included when the OpenAPI specification is updated. No code changes are required.
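For reference, FastMCP can build a server directly from an OpenAPI document, which is the general pattern behind this auto-generation. The sketch below is illustrative; the actual implementation in this repository may differ (for example in authentication, tagging, and endpoint exclusion):

import httpx
from fastmcp import FastMCP

# Fetch the HDX HAPI OpenAPI document and build an MCP server from it.
spec = httpx.get("https://hapi.humdata.org/openapi.json", timeout=30.0).json()
api_client = httpx.AsyncClient(base_url="https://hapi.humdata.org/api/v2")

# Every path in the spec becomes an MCP tool on this server instance.
# Note: the authentication parameters HDX requires are omitted in this sketch.
mcp = FastMCP.from_openapi(openapi_spec=spec, client=api_client, name="HDX HAPI (sketch)")

if __name__ == "__main__":
    mcp.run()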
Adding Custom Tools
- Add your tool function in the _register_custom_tools method
- Include proper type hints and documentation
- Add corresponding tests in test_hdx_mcp_server.py
- Update this README with documentation for your tool
License
This project is licensed under the MIT License - see the LICENSE file for details.
Related Resources
- HDX (Humanitarian Data Exchange)
- HDX HAPI Documentation
- Model Context Protocol Specification
- FastMCP Documentation
Support
For issues related to:
- This MCP server: Open an issue in this repository
- HDX API: Consult HDX HAPI documentation
- MCP protocol: See MCP specification
- FastMCP library: Check FastMCP documentation
Note: This server requires a valid HDX API key. Please ensure you comply with HDX's terms of service and rate limiting guidelines when using this server.