MCP-Tool-Calling

gopi1949/MCP-Tool-Calling

This Model Context Protocol (MCP) server connects language models to external tools, giving them access to real-time data and letting them carry out multi-step tasks through automatic tool orchestration.

MCP Tool Calling Demo with Web Search and Weather

This project demonstrates Model Context Protocol (MCP) tool calling capabilities. It shows how to build and connect MCP tools that can be chained together intelligently by language models.

What This Demonstrates

This is a practical example of MCP tool calling:

  • MCP Server with two tools (web search + weather)
  • MCP Client that connects tools to GPT-5
  • Tool Chaining - where one tool's output becomes another's input
  • Automatic Tool Selection - the model decides which tools to use

The interesting part is watching how the model automatically chains tools together to solve complex queries.

Tool Calling Examples

Single Tool Call

You: What's the weather in Tokyo?
Assistant: Weather in Tokyo: clear sky, 15°C (feels like 13°C), humidity 45%.

Tool Chaining Demo

You: What is the weather in the city where Zohran Mamdani was elected as mayor in 2025?
Assistant: Weather in New York City: clear sky, 7.2°C (feels like 4.3°C), humidity 52%.

Here's the tool calling sequence:

  1. Tool Call 1: tavily_search with query "Zohran Mamdani elected mayor 2025"
  2. Tool Response: Found he was elected mayor of NYC
  3. Tool Call 2: get_weather with city "New York City"
  4. Final Response: Combined result from both tool calls

This demonstrates automatic tool orchestration - the model analyzed the query and determined it needed two sequential tool calls.

More Tool Chaining Examples

You: What's the weather in Paris and find recent news about France
You: Check weather in London and search for best restaurants there  
You: Find where the Taj Mahal is located and tell me the weather there

How MCP Tool Calling Works

This demonstrates Model Context Protocol (MCP) - a standard for connecting language models to tools. Here's the architecture:

MCP Server (mcp_server.py):

  • Implements MCP protocol using FastMCP
  • Provides tool definitions and execution
  • Handles stdio transport

MCP Client (mcp_client.py):

  • Connects to MCP server
  • Translates MCP tools to OpenAI function calling format
  • Handles tool orchestration with GPT-5
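The translation step is mechanical: an MCP tool definition already carries a name, a description, and a JSON Schema for its input (inputSchema), which is exactly what the OpenAI tools parameter expects nested under "function". A minimal sketch (the helper name to_openai_tools and the dict-shaped tool listing are illustrative, not this project's actual code):

```python
def to_openai_tools(mcp_tools):
    """Convert MCP tool definitions into the OpenAI function-calling format.

    Each MCP tool exposes name, description, and a JSON Schema for its input;
    OpenAI expects the same three fields nested under "function".
    """
    return [
        {
            "type": "function",
            "function": {
                "name": tool["name"],
                "description": tool["description"],
                "parameters": tool["inputSchema"],
            },
        }
        for tool in mcp_tools
    ]

# Example: the weather tool in MCP shape
mcp_tools = [
    {
        "name": "get_weather",
        "description": "Real-time weather for a city",
        "inputSchema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
]
openai_tools = to_openai_tools(mcp_tools)
```

Because the schemas pass through unchanged, any tool the server adds later is usable by the model without client changes.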

Tool Selection Logic: GPT-5 analyzes each query and decides:

  • Which tools are needed
  • What arguments to pass
  • What order to call them in
  • How to use results from one tool in another

This all happens automatically through the OpenAI function calling API.
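That decide/call/feed-back cycle can be sketched as a loop. The stub fake_model below stands in for GPT-5 so the control flow runs without an API key; run_tool_loop, the dict-shaped messages, and the stub are illustrative, not this project's actual client code:

```python
import json

def run_tool_loop(model, tools, user_query):
    """Generic function-calling loop: keep asking the model until it stops
    requesting tools, feeding each tool result back into the conversation."""
    messages = [{"role": "user", "content": user_query}]
    while True:
        reply = model(messages)          # GPT-5 via the OpenAI API in the real client
        if not reply.get("tool_calls"):
            return reply["content"]      # final answer, no more tools needed
        messages.append(reply)
        for call in reply["tool_calls"]:
            result = tools[call["name"]](**json.loads(call["arguments"]))
            messages.append({"role": "tool", "tool_call_id": call["id"],
                             "content": result})

# Stub model: first requests the weather tool, then answers with its result.
def fake_model(messages):
    if messages[-1]["role"] == "user":
        return {"role": "assistant", "content": None,
                "tool_calls": [{"id": "1", "name": "get_weather",
                                "arguments": json.dumps({"city": "Tokyo"})}]}
    return {"role": "assistant", "content": messages[-1]["content"],
            "tool_calls": []}

answer = run_tool_loop(fake_model,
                       {"get_weather": lambda city: f"clear sky in {city}"},
                       "What's the weather in Tokyo?")
# answer == "clear sky in Tokyo"
```

Tool chaining falls out of the same loop: when the model issues a search call first, the search result lands in messages, and the next iteration lets it issue a weather call using what it just learned.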

Quick Setup

1. Get Your API Keys

You need three API keys (the OpenWeather and Tavily keys have free tiers):

  • OPENAI_API_KEY - from platform.openai.com
  • OPENWEATHER_API_KEY - from openweathermap.org
  • TAVILY_API_KEY - from tavily.com

2. Install Stuff

pip install fastmcp requests python-dotenv openai

3. Add Your Keys

Create a .env file:

OPENAI_API_KEY=your_openai_key_here
OPENWEATHER_API_KEY=your_weather_key_here  
TAVILY_API_KEY=your_tavily_key_here

4. Run It

python mcp_client.py

That's it! Start asking questions.

The Tools

Web Search (Tavily)

  • Searches the web in real time
  • Returns an AI-generated answer alongside raw results
  • Well suited to specific factual questions
  • Useful for current events, research, and finding locations
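As a sketch of what the tavily_search tool might do under the hood, the snippet below builds (but does not send) the HTTP request. The endpoint and payload field names follow Tavily's public REST API as commonly documented, so treat them as assumptions rather than this project's exact code:

```python
import json
import urllib.request

TAVILY_URL = "https://api.tavily.com/search"

def build_tavily_request(query: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a Tavily search request.

    include_answer asks Tavily for a short AI-generated answer
    alongside the raw search results.
    """
    payload = {
        "api_key": api_key,
        "query": query,
        "include_answer": True,
        "max_results": 5,
    }
    return urllib.request.Request(
        TAVILY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_tavily_request("Zohran Mamdani elected mayor 2025", "your_tavily_key")
```

Sending the request with urllib.request.urlopen(req) would return JSON containing the answer and result list.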

Weather Tool

  • Real-time weather for any city
  • Temperature, humidity, conditions
  • Works worldwide
  • Handles different city name formats
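A sketch of how the tool might turn OpenWeather's current-weather JSON (requested with units=metric) into the one-line summary shown in the examples above. The field paths follow OpenWeather's documented response shape; format_weather is an illustrative name, not this project's actual function:

```python
def format_weather(city: str, data: dict) -> str:
    """Format an OpenWeather 'current weather' JSON payload (units=metric)
    into the one-line summary the demo prints."""
    desc = data["weather"][0]["description"]
    main = data["main"]
    return (f"Weather in {city}: {desc}, {main['temp']}°C "
            f"(feels like {main['feels_like']}°C), humidity {main['humidity']}%.")

# Sample payload in the shape OpenWeather returns
sample = {"weather": [{"description": "clear sky"}],
          "main": {"temp": 15, "feels_like": 13, "humidity": 45}}
print(format_weather("Tokyo", sample))
# → Weather in Tokyo: clear sky, 15°C (feels like 13°C), humidity 45%.
```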

Files You Get

├── mcp_server.py     # The tools (web search + weather)
├── mcp_client.py     # Chat interface with GPT-5
├── requirements.txt  # What to install
└── README.md         # This file

Why MCP Tool Calling Matters

Without MCP: Language models are isolated - they can't access real-time data or perform actions.

With MCP: Language models become interfaces to any API or service - web search, weather, databases, file systems, etc.

The Power: Complex queries that require multiple data sources can be handled in a single conversation turn through automatic tool orchestration.

Testing Tool Calling

To see MCP tool calling in action:

  • Single tool queries: "What's the weather in London?" (uses weather tool only)
  • Chained tool queries: "What's the weather where [person] lives?" (search → weather)
  • Complex queries: "Weather and news about [location]" (uses both tools independently)

Watch the console output to see the actual tool calls being made.

Extending the MCP Server

Adding new tools is straightforward:

  1. Define the tool with @mcp.tool() decorator
  2. Add the function that implements the tool logic
  3. Restart the server - the client will automatically discover new tools

Example tool structure:

@mcp.tool()
def my_new_tool(param: str) -> str:
    """Tool description for the language model"""
    result = f"Processed: {param}"  # your tool logic here
    return result

The language model will automatically learn to use new tools based on their descriptions and parameters.

This demonstrates the extensibility of MCP for building custom tool ecosystems.