Hugging Face MCP Server
A FastAPI-based Model Context Protocol (MCP) server for Hugging Face Hub integration, providing AI model search, dataset access, and trending model discovery.
Features
- Model Search: Search for AI models by name, task, or criteria
- Dataset Access: Browse and search Hugging Face datasets
- Trending Models: Discover popular and trending models
- Model Information: Detailed metadata for models and datasets
- MCP Protocol: Compatible with Claude Code and other AI assistants
API Endpoints
- `GET /health` - Health check
- `GET /models/{model_id}` - Get model information
- `GET /models/trending` - List trending models
- `GET /models/search` - Search models
- `GET /datasets/{dataset_id}` - Get dataset information
- `GET /datasets/search` - Search datasets
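Assuming the server is running locally on the default port, a small Python helper can build request URLs for these endpoints. This is an illustrative sketch only: the `base_url` and the `query` search parameter are assumptions, not part of the documented API.

```python
from urllib.parse import urlencode, quote

BASE_URL = "http://localhost:8000"  # assumed local deployment; adjust for your host

def model_info_url(model_id: str) -> str:
    """Build the URL for GET /models/{model_id}."""
    # quote() preserves the "/" in namespaced model IDs (e.g. "google/flan-t5-base")
    return f"{BASE_URL}/models/{quote(model_id, safe='/')}"

def search_url(endpoint: str, **params) -> str:
    """Build a search URL; parameter names depend on the server's implementation."""
    return f"{BASE_URL}{endpoint}?{urlencode(params)}"

print(model_info_url("google/flan-t5-base"))
# http://localhost:8000/models/google/flan-t5-base
```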
Environment Variables
- `HF_TOKEN` - Hugging Face API token (required)
- `PORT` - Server port (default: 8000)
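A sketch of how the server might read these variables at startup, failing fast when the required token is missing (the `load_config` helper is illustrative; the actual `main.py` may handle this differently):

```python
import os

def load_config(env=os.environ) -> dict:
    """Read settings from environment variables; raises if HF_TOKEN is absent."""
    token = env.get("HF_TOKEN")
    if not token:
        # HF_TOKEN is required, so fail with a clear message rather than later
        raise RuntimeError("HF_TOKEN environment variable is required")
    # PORT falls back to the documented default of 8000
    return {"hf_token": token, "port": int(env.get("PORT", "8000"))}
```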
Deployment
This service is designed for deployment on Render, Vercel, or similar platforms with Python support.
Render Deployment
- Connect this repository to Render
- Set the environment variable `HF_TOKEN` to your Hugging Face API token
- Use the Python runtime with:
  - Build command: `pip install -r requirements.txt`
  - Start command: `uvicorn main:app --host 0.0.0.0 --port $PORT`
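Render can also pick these settings up from a `render.yaml` blueprint. A sketch matching the commands above, assuming this syntax version of Render blueprints (the service name is illustrative; `sync: false` prompts for the token value in the dashboard rather than committing it):

```yaml
services:
  - type: web
    name: huggingface-mcp-server
    runtime: python
    buildCommand: pip install -r requirements.txt
    startCommand: uvicorn main:app --host 0.0.0.0 --port $PORT
    envVars:
      - key: HF_TOKEN
        sync: false
```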
Local Development
```sh
pip install -r requirements.txt
export HF_TOKEN=your_token_here
uvicorn main:app --reload
```
Integration
This MCP server is part of the Covenant Automata AI automation platform, providing model selection and AI capabilities for social media content generation and analysis.