# MCP-Server
This project sets up a dual-mode Model Context Protocol (MCP) server that supports both:
- 🧩 Tool-based responses (e.g., current time, jokes) via FastMCP
- 💬 Prompt-response LLM completions via FastAPI
## Project Structure
| File | Purpose |
|---|---|
| server.py | Handles LLM loading, generation, and config |
| app.py | FastAPI app exposing the /generate completion endpoint |
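As a rough illustration of the server.py row above, a minimal sketch of LLM loading and generation with Hugging Face transformers might look like the following. The model name, function names, and defaults here are illustrative assumptions, not the repo's actual code.

```python
# Hypothetical sketch of server.py's responsibilities (model name and
# function names are assumptions, not the repo's actual code).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # assumption: any local causal LM would work here


def load_model():
    """Load the tokenizer and model once at startup."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
    return tokenizer, model


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Run a single prompt-response completion."""
    tokenizer, model = load_model()
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```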
## Features
- ✔️ Compatible with Claude Desktop, MCP Inspector, LangGraph, etc.
- FastMCP standard for tool registration and stdio communication
- Run local LLM completions from app.py using the /generate API (see the sketch below)
- Optional HTTP server mode for broader integrations
- 🧪 MCP tools return structured JSON responses
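A minimal sketch of how app.py's /generate endpoint could be wired up with FastAPI. The request schema and handler name are assumptions; the real project would call the LLM loaded by server.py instead of echoing the prompt.

```python
# Hypothetical app.py sketch: a FastAPI /generate endpoint (request
# schema and handler name are assumptions, not the repo's actual code).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 64


@app.post("/generate")
def generate_endpoint(req: GenerateRequest):
    # In the real project this would delegate to the loaded LLM;
    # the echo below keeps the sketch self-contained and runnable.
    completion = f"(completion for: {req.prompt})"
    return {"prompt": req.prompt, "completion": completion}
```

With uvicorn installed, a sketch like this can be served with `uvicorn app:app --reload`.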
## Installation

```bash
pip install fastapi uvicorn transformers torch
pip install requests
pip install bitsandbytes accelerate  # Only needed if using quantized models
```
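Once the dependencies are installed and the FastAPI app is running, you can sanity-check the /generate endpoint with requests. The host and port below assume uvicorn's defaults; the request body matches the hypothetical schema sketched earlier.

```python
# Quick smoke test against a locally running FastAPI app
# (assumes uvicorn's default address of 127.0.0.1:8000).
import requests

resp = requests.post(
    "http://127.0.0.1:8000/generate",
    json={"prompt": "Tell me a joke", "max_new_tokens": 64},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```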
## Registered MCP Tools

| Tool Name | What It Does |
|---|---|
| time_now | Returns the current UTC timestamp |
| dad_joke | Returns a random dad joke from an API |
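A sketch of how these two tools might be registered with FastMCP. The import path follows the official MCP Python SDK (`pip install mcp`); the function bodies and the icanhazdadjoke.com URL are assumptions based on the table above, not necessarily what the project uses.

```python
# Hypothetical registration of the two tools with FastMCP.
from datetime import datetime, timezone

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-server")


@mcp.tool()
def time_now() -> dict:
    """Return the current UTC timestamp as structured JSON."""
    return {"utc": datetime.now(timezone.utc).isoformat()}


@mcp.tool()
def dad_joke() -> dict:
    """Fetch a random dad joke from a public API (assumed URL)."""
    resp = requests.get(
        "https://icanhazdadjoke.com/",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    return {"joke": resp.json()["joke"]}


if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```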
## What is MCP?

The Model Context Protocol (MCP) allows custom tools and local models to be integrated into AI assistants such as Claude Desktop or LangGraph. You register Python functions as tools, and they become callable by the assistant when relevant.