learning-mcp-servers

This project demonstrates the creation and integration of a Model Context Protocol (MCP) server with an LLM tool bridge, showcasing the evolution from basic MCP server setup to real-world API integration.

MCP Server + LLM Tool Bridge

This project contains two phases:


Phase 1: Basic MCP Server

We built a very simple MCP-style server using FastAPI. This server exposes three tools:

  • get_weather(city)
  • summarize_text(text)
  • convert_currency(amount, from_currency, to_currency)

We test these tools using Postman, curl, or Python requests.

The server exposes a single endpoint:

POST /call_tool

where we pass:

{
  "tool": "get_weather",
  "params": { "city": "Delhi" }
}

The server receives the request → calls the matching function → returns the response.
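A minimal sketch of what that dispatch could look like is below. This is not the repo's actual code: the tool bodies are stubbed with placeholder data, and the module name main.py is only assumed from the run command later in this README.

# main.py — illustrative sketch of the Phase 1 tool dispatcher (tool bodies are stubs)
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class ToolCall(BaseModel):
    tool: str
    params: dict = {}

def get_weather(city: str) -> dict:
    return {"city": city, "forecast": "sunny", "temp_c": 31}      # placeholder data

def summarize_text(text: str) -> dict:
    return {"summary": text[:100]}                                # naive placeholder "summary"

def convert_currency(amount: float, from_currency: str, to_currency: str) -> dict:
    return {"amount": amount, "from": from_currency, "to": to_currency,
            "converted": round(amount * 83.0, 2)}                 # dummy fixed rate

TOOLS = {
    "get_weather": get_weather,
    "summarize_text": summarize_text,
    "convert_currency": convert_currency,
}

@app.post("/call_tool")
def call_tool(body: ToolCall):
    func = TOOLS.get(body.tool)
    if func is None:
        raise HTTPException(status_code=400, detail=f"Unknown tool: {body.tool}")
    return {"result": func(**body.params)}

With the server running, the same call can be tested from curl:

curl -X POST http://127.0.0.1:8000/call_tool \
  -H "Content-Type: application/json" \
  -d '{"tool": "get_weather", "params": {"city": "Delhi"}}'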


Phase 2: LLM Tool Bridge

We then created another Python script that connects an LLM to the MCP server.

Flow:

  • The user asks a question in plain English
  • The LLM decides which tool should be used
  • We call the MCP server endpoint from Python
  • The final answer is returned to the user

This is how the LLM-agent + MCP-style integration works.
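A rough sketch of that bridge is below. The repo's llm_client.py may look different; the model name, the tool schema, and the use of OpenAI function calling here are assumptions based on the tech list (OpenAI + Requests).

# llm_client.py — illustrative sketch of the LLM → MCP bridge (not the repo's exact code)
import json
import requests
from openai import OpenAI

MCP_URL = "http://127.0.0.1:8000/call_tool"        # assumed local server address

# Describe one MCP tool to the LLM via an OpenAI function-calling schema;
# summarize_text and convert_currency would be declared the same way.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    },
]

client = OpenAI()   # reads OPENAI_API_KEY from the environment

def ask(question: str) -> str:
    # 1. Let the LLM decide whether (and which) tool to call
    resp = client.chat.completions.create(
        model="gpt-4o-mini",                        # model name is an assumption
        messages=[{"role": "user", "content": question}],
        tools=TOOLS,
    )
    msg = resp.choices[0].message
    if not msg.tool_calls:
        return msg.content                          # LLM answered directly, no tool needed

    # 2. Call the MCP server with the tool + params the LLM chose
    call = msg.tool_calls[0]
    payload = {"tool": call.function.name, "params": json.loads(call.function.arguments)}
    result = requests.post(MCP_URL, json=payload).json()

    # 3. Return the tool result as the final answer (a second LLM pass could phrase it nicely)
    return json.dumps(result)

if __name__ == "__main__":
    print(ask("What's the weather in Delhi?"))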


Result

We learned how to:

  • expose simple tools via a server
  • test them via Postman
  • let an LLM act like an agent and call those tools

This is the base concept of AI Agents + MCP style architecture.


Tech used

  • Python
  • FastAPI
  • OpenAI
  • Requests
  • Postman

Run server

uvicorn main:app --reload

Run LLM client

python llm_client.py

This is a great base for building more complex MCP tool networks later.

MCP Server Project - Phase 1 & Phase 2 README

Overview

This project demonstrates building a simple local MCP server that exposes basic capabilities, and then expanding it by connecting it to an external API. It is divided into two phases so you can follow along step by step.


Phase 1 (Local basic MCP Server)

What happens in Phase 1?

  • You create a basic MCP server using the mcp Python package.
  • You expose two simple endpoints that return static JSON responses.
  • These endpoints behave like mini tools.
  • No external API is involved in Phase 1.

Goal of Phase 1: Understand how an MCP server works, how tools are exposed, and how the MCP protocol responds.
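As an illustration only (the README does not name the two endpoints, so the tool names below are made up), a Phase 1 server built on the mcp package's FastMCP helper could look like this:

# server.py — illustrative Phase 1 sketch using the mcp package (FastMCP helper)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("learning-mcp-server")

@mcp.tool()
def hello() -> dict:
    """Hypothetical mini tool: returns a static greeting."""
    return {"message": "Hello from the MCP server"}

@mcp.tool()
def server_info() -> dict:
    """Hypothetical mini tool: returns static info about this server."""
    return {"name": "learning-mcp-server", "phase": 1}

if __name__ == "__main__":
    mcp.run()   # stdio transport by default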


Phase 2 (Connecting external API - Joke API)

What happens in Phase 2?

  • You extend the same MCP server logic.
  • You call a third‑party public Joke API.
  • The MCP server returns a real API response instead of static JSON.

Goal of Phase 2: Learn how to call external APIs (using Python requests) and send dynamic results back through MCP.
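A sketch of what that tool could look like is below; the README does not say which joke API is used, so the URL here (the public Official Joke API) is an assumption.

# Illustrative Phase 2 tool — fetches a real joke via requests instead of returning static JSON
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("joke-server")

JOKE_API_URL = "https://official-joke-api.appspot.com/random_joke"   # assumed API; the repo's choice may differ

@mcp.tool()
def getJoke() -> dict:
    """Fetch a random joke from a third-party API and return its JSON."""
    resp = requests.get(JOKE_API_URL, timeout=10)
    resp.raise_for_status()
    return resp.json()    # e.g. {"setup": "...", "punchline": "..."}

if __name__ == "__main__":
    mcp.run()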


Tech used

  • Python
  • mcp python lib
  • FastAPI (optional), for testing via Postman

How to test using Postman

  • You run your Python server using: python server.py

  • The server exposes an HTTP endpoint (FastAPI) for testing

  • You send a POST request from Postman with the body:

    {
      "tool": "getJoke"
    }
    
  • You get a JSON response back from the MCP server (a curl equivalent is shown below)
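The same request can be sent from the command line. This assumes the FastAPI wrapper listens on port 8000 and exposes a /call_tool route, mirroring Phase 1 of the first README; adjust to whatever route your server actually defines.

curl -X POST http://127.0.0.1:8000/call_tool \
  -H "Content-Type: application/json" \
  -d '{"tool": "getJoke"}'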


Summary

Phase    | Type         | API involved       | Purpose
Phase‑1  | Local        | ❌ No external API  | Learn MCP basics
Phase‑2  | External API | ✅ Yes, Joke API    | Learn MCP with real API

This project shows a clear evolution from MCP basics → real use-case integration.

Next phase ideas: integrate a database, RabbitMQ, Redis, billing, and analytics.