shekinahfire77/huggingface-mcp-server

Hugging Face MCP Server

A FastAPI-based Model Context Protocol (MCP) server for Hugging Face Hub integration, providing AI model search, dataset access, and trending model discovery.

Features

  • Model Search: Search for AI models by name, task, or other criteria
  • Dataset Access: Browse and search Hugging Face datasets
  • Trending Models: Discover popular and trending models
  • Model Information: Detailed metadata for models and datasets
  • MCP Protocol: Compatible with Claude Code and other AI assistants

API Endpoints

  • GET /health - Health check
  • GET /models/{model_id} - Get model information
  • GET /models/trending - List trending models
  • GET /models/search - Search models
  • GET /datasets/{dataset_id} - Get dataset information
  • GET /datasets/search - Search datasets

Environment Variables

  • HF_TOKEN - Hugging Face API token (required)
  • PORT - Server port (default: 8000)
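These variables could be read at startup along the following lines; load_config is an illustrative helper, not part of the actual codebase:

```python
import os

def load_config(env=None):
    """Read the server's settings from environment variables."""
    env = os.environ if env is None else env
    token = env.get("HF_TOKEN")
    if not token:
        # HF_TOKEN is required, so fail fast with a clear message
        raise RuntimeError("HF_TOKEN environment variable is required")
    # PORT is optional and defaults to 8000
    port = int(env.get("PORT", "8000"))
    return token, port
```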

Deployment

This service is designed for deployment on Render, Vercel, or similar platforms with Python support.

Render Deployment

  1. Connect this repository to Render
  2. Set the HF_TOKEN environment variable to your Hugging Face API token
  3. Use Python runtime with:
    • Build command: pip install -r requirements.txt
    • Start command: uvicorn main:app --host 0.0.0.0 --port $PORT
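The steps above can also be captured declaratively in a render.yaml blueprint, assuming Render's current blueprint schema; the service name below is illustrative, and sync: false tells Render to prompt for the secret instead of storing it in the repository:

```yaml
services:
  - type: web
    name: huggingface-mcp-server   # illustrative name
    runtime: python
    buildCommand: pip install -r requirements.txt
    startCommand: uvicorn main:app --host 0.0.0.0 --port $PORT
    envVars:
      - key: HF_TOKEN
        sync: false   # set the secret in the Render dashboard
```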

Local Development

pip install -r requirements.txt
export HF_TOKEN=your_token_here
uvicorn main:app --reload

Integration

This MCP server is part of the Covenant Automata AI automation platform, providing model selection and AI capabilities for social media content generation and analysis.