
MCP-Server

This project sets up a dual-mode Model Context Protocol (MCP) server that supports both:

- 🧩 Tool-based responses (e.g., current time, jokes) via FastMCP
- 💬 Prompt-response LLM completions via FastAPI

๐Ÿ“ Project Structure

FilePurpose
server.pyHandles LLM loading, generation, config

🚀 Features

- ⚙️ Compatible with Claude Desktop, MCP Inspector, LangGraph, etc.
- 🔌 FastMCP standard for tool registration and stdio communication
- 🧠 Run local LLM completions from app.py using the /generate API
- 🌐 Optional HTTP server mode for broader integrations
- 🧪 MCP tools return structured JSON responses

🛠 Installation

```shell
pip install fastapi uvicorn transformers torch
pip install requests
pip install bitsandbytes accelerate  # Only needed if using quantized models
```

🛠 Registered MCP Tools

| Tool Name | What It Does |
| --- | --- |
| time_now | Returns the current UTC timestamp |
| dad_joke | Returns a random dad joke from an API |

💡 What is MCP?

Model Context Protocol (MCP) allows custom tools and local models to be integrated into AI assistants such as Claude Desktop or LangGraph. You register Python functions as tools, and they become callable by the assistant when relevant.
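For example, a stdio MCP server like this one is typically wired into Claude Desktop through its claude_desktop_config.json; the server name and path below are placeholders:

```json
{
  "mcpServers": {
    "mcp-server": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```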