MCP-Server-App-using-FAST-MCP

Duaa-fatimaa/MCP-Server-App-using-FAST-MCP


This document provides a structured overview of a Model Context Protocol (MCP) server integrated with FastAPI and Gemini CLI.

  • Tools: 1
  • Resources: 0
  • Prompts: 0

Simple FastAPI + MCP + Gemini CLI Demo

Video: https://drive.google.com/file/d/1Eu4Cm78XYP3vP1GNpWaKmjtq1DgRHtqV/view?usp=sharing

A minimal example showing FastAPI integration with MCP (Model Context Protocol) for Gemini CLI.

Files

  • app.py - Simple FastAPI app with a /hello endpoint
  • mcp_server.py - MCP server exposing a say_hello tool
  • .gemini/settings.json - Gemini CLI MCP configuration (see the sketch below)
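
The exact contents of .gemini/settings.json are not reproduced in this README. As a rough sketch, a Gemini CLI MCP configuration that launches the server as a local stdio process could look like the following; the server name hello-mcp is an illustrative placeholder, not taken from the repository:

{
  "mcpServers": {
    "hello-mcp": {
      "command": "python",
      "args": ["mcp_server.py"]
    }
  }
}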

Setup

  1. Install dependencies:
pip install fastapi uvicorn httpx mcp
  2. Install Gemini CLI:
npm install -g @google/gemini-cli

Usage

1. Start FastAPI app

python app.py

App runs on http://localhost:8000
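
The source of app.py is not included in this README; a minimal sketch consistent with the description above (a single /hello endpoint returning the greeting shown under Expected Output) might be:

# app.py - simple FastAPI app with a /hello endpoint (illustrative sketch)
from fastapi import FastAPI
import uvicorn

app = FastAPI()

@app.get("/hello")
def hello():
    # The greeting that the say_hello MCP tool relays
    return {"message": "Hello World"}

if __name__ == "__main__":
    # "python app.py" serves the app on http://localhost:8000
    uvicorn.run(app, host="0.0.0.0", port=8000)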

2. Start MCP server (in another terminal)

python mcp_server.py
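
Again as a hedged sketch (the actual mcp_server.py is not shown here), an MCP server exposing a say_hello tool via the FastMCP helper from the mcp package, forwarding the call to the FastAPI endpoint with httpx, could look roughly like this; the server name and URL are assumptions:

# mcp_server.py - MCP server with a say_hello tool (illustrative sketch)
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("hello-mcp")

@mcp.tool()
def say_hello() -> dict:
    """Call the FastAPI /hello endpoint and return its JSON response."""
    response = httpx.get("http://localhost:8000/hello")
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Run over stdio so Gemini CLI can spawn it as a subprocess
    mcp.run()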

3. Use Gemini CLI

gemini mcp list

4. Call the tool

gemini mcp call say_hello

Test the FastAPI endpoint directly

curl http://localhost:8000/hello

Expected Output

The say_hello tool should return:

{
  "message": "Hello World"
}