MCP Sample: Tutorial & Job Search Application

A tutorial plus an enhanced job search server, both demonstrating the Model Context Protocol (MCP).

Quick Start

Prerequisites

  • Python 3.11+
  • uv package manager

Setup

  1. Install dependencies:

    uv sync
    
  2. Environment variables: Create a .env file in the project root:

    OPENAI_API_KEY=your_openai_api_key
    LLM_MODEL=gpt-4
    RAPIDAPI_HOST=jsearch.p.rapidapi.com
    RAPIDAPI_KEY=your_rapidapi_key
    DATA_DIR=./data  # Optional, defaults to ./data
    
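The servers are expected to pick these values up from the environment at startup. As a rough sketch of how that loading could look with python-dotenv (the variable names mirror the .env above; the module and defaults are illustrative, not the project's actual config code):

    # config_sketch.py -- illustrative; the real servers may load settings differently
    import os

    from dotenv import load_dotenv  # provided by the python-dotenv package

    load_dotenv()  # reads .env from the current working directory

    OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]                         # LLM client
    LLM_MODEL = os.getenv("LLM_MODEL", "gpt-4")                           # model for tool calling
    RAPIDAPI_HOST = os.getenv("RAPIDAPI_HOST", "jsearch.p.rapidapi.com")  # JSearch host
    RAPIDAPI_KEY = os.environ["RAPIDAPI_KEY"]                             # JSearch requests
    DATA_DIR = os.getenv("DATA_DIR", "./data")                            # optional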

Running

Tutorial (Weather Demo):

# Server (terminal 1)
uv run tutorial/main.py

# Client (terminal 2)
uv run tutorial/client_stdio_llm.py
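
For orientation, a minimal sketch of what a weather server along the lines of tutorial/main.py could look like, assuming the FastMCP helper from the MCP Python SDK (the tool name and canned reply are illustrative, not the tutorial's actual code):

    # weather_sketch.py -- illustrative stand-in for tutorial/main.py
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("weather")

    @mcp.tool()
    def get_weather(city: str) -> str:
        """Return a (canned) weather report for the given city."""
        return f"It is sunny and 22 degrees C in {city}."

    if __name__ == "__main__":
        mcp.run(transport="stdio")  # stdio matches the client_stdio*.py clients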

Job Search Server:

uv run job_search/server.py
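
A hedged sketch of how one of the job search tools might wrap the JSearch API over RapidAPI (the /search endpoint, parameter names, and the search_jobs tool are assumptions; the real job_search/server.py may differ):

    # job_search_sketch.py -- illustrative, not the actual job_search/server.py
    import os

    import httpx
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("job-search")

    @mcp.tool()
    def search_jobs(query: str, num_results: int = 5) -> list[dict]:
        """Search for jobs via the JSearch API on RapidAPI."""
        response = httpx.get(
            f"https://{os.environ['RAPIDAPI_HOST']}/search",
            params={"query": query, "num_pages": 1},
            headers={
                "X-RapidAPI-Host": os.environ["RAPIDAPI_HOST"],
                "X-RapidAPI-Key": os.environ["RAPIDAPI_KEY"],
            },
            timeout=30.0,
        )
        response.raise_for_status()
        return response.json().get("data", [])[:num_results]

    if __name__ == "__main__":
        mcp.run(transport="stdio")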

Testing:

pytest
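
The integration tests (job_search/test_*.py) could exercise the server over stdio using the SDK's client; a rough sketch, assuming pytest-asyncio is installed (the test name and assertion are illustrative, not the project's actual tests):

    # test_sketch.py -- illustrative integration test
    import pytest
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    @pytest.mark.asyncio
    async def test_server_exposes_tools():
        params = StdioServerParameters(command="uv", args=["run", "job_search/server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                assert len(tools.tools) >= 1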

Claude Desktop Integration

Add to your Claude Desktop MCP settings:

"mcp-job-search": {
  "command": "/Users/<username>/.local/bin/uv",
  "args": [
    "--directory",
    "<path>/mcp-sample/job_search",
    "run",
    "server.py"
  ]
}

Replace <username> and <path> with your actual values; the entry goes in the "mcpServers" section of claude_desktop_config.json.

Project Structure

mcp-sample/
├── tutorial/                # Basic MCP tutorial examples
│   ├── main.py              # Weather server
│   ├── client_stdio.py      # Basic client
│   └── client_stdio_llm.py  # OpenAI-integrated client
├── job_search/              # Enhanced job search application
│   ├── server.py            # MCP server with tools/resources/prompts
│   └── test_*.py            # Integration tests
├── pytest.ini               # Test configuration
└── .env                     # Environment variables

Key Features

Tutorial Examples:

  • Basic weather tool demonstration
  • OpenAI function calling integration
  • Two-phase LLM processing (the first call selects a tool, the second answers from its result; see the sketch below)
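
"Two-phase" means the LLM is called twice: a first call chooses an MCP tool and its arguments, the tool runs through the MCP client, and a second call turns the tool output into the final answer. A compressed sketch of that flow with the OpenAI chat completions API (everything outside the OpenAI SDK calls is illustrative, not the tutorial's actual client code):

    # two_phase_sketch.py -- illustrative outline of the two-phase flow
    import json
    import os

    from openai import OpenAI

    client = OpenAI()  # picks up OPENAI_API_KEY from the environment
    MODEL = os.getenv("LLM_MODEL", "gpt-4")

    def two_phase_answer(question: str, tools: list[dict], call_mcp_tool) -> str:
        """tools: OpenAI-format schemas derived from the MCP server's tools.
        call_mcp_tool(name, arguments) -> str: runs the tool via the MCP client."""
        messages = [{"role": "user", "content": question}]

        # Phase 1: the model decides which tool to call and with what arguments.
        first = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
        reply = first.choices[0].message
        messages.append(reply)

        # Execute each requested tool through the MCP client and record the results.
        for call in reply.tool_calls or []:
            result = call_mcp_tool(call.function.name, json.loads(call.function.arguments))
            messages.append({"role": "tool", "tool_call_id": call.id, "content": result})

        # Phase 2: the model composes the final answer from the tool results.
        second = client.chat.completions.create(model=MODEL, messages=messages)
        return second.choices[0].message.content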

Job Search Server:

  • Job search tools (JSearch API integration)
  • Resume resource management
  • Market analysis prompts
  • Temporary caching and permanent job saving (see the persistence sketch below)
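
The caching/saving split could be as simple as writing each search's results to a temporary cache file and promoting chosen jobs into a permanent file under DATA_DIR; a rough sketch (file and function names are assumptions, not the server's actual layout):

    # persistence_sketch.py -- illustrative caching vs. permanent saving
    import json
    import os
    from pathlib import Path

    DATA_DIR = Path(os.getenv("DATA_DIR", "./data"))
    CACHE_FILE = DATA_DIR / "search_cache.json"  # temporary: overwritten on each search
    SAVED_FILE = DATA_DIR / "saved_jobs.json"    # permanent: grows as jobs are saved

    def cache_results(jobs: list[dict]) -> None:
        """Overwrite the temporary cache with the latest search results."""
        DATA_DIR.mkdir(parents=True, exist_ok=True)
        CACHE_FILE.write_text(json.dumps(jobs, indent=2))

    def save_job(job: dict) -> None:
        """Append a single job to the permanent store."""
        DATA_DIR.mkdir(parents=True, exist_ok=True)
        saved = json.loads(SAVED_FILE.read_text()) if SAVED_FILE.exists() else []
        saved.append(job)
        SAVED_FILE.write_text(json.dumps(saved, indent=2))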

MCP Components

  • Tools: Functions the LLM can call, registered with @mcp.tool() (see the sketch below)
  • Resources: Data sources accessible via URIs (resume://default)
  • Prompts: Pre-defined templates for analysis
  • Transport: stdio-based client-server communication
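
A compact sketch of how the three component types could be registered with FastMCP (the resume text and prompt wording are placeholders; only the resume://default URI comes from the list above):

    # components_sketch.py -- illustrative registration of the three MCP component types
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("job-search")

    @mcp.tool()
    def search_jobs(query: str) -> str:
        """Tool: a function the LLM can call."""
        return f"(search results for {query!r} would go here)"

    @mcp.resource("resume://default")
    def default_resume() -> str:
        """Resource: data addressable by a URI."""
        return "Placeholder resume text."

    @mcp.prompt()
    def market_analysis(role: str) -> str:
        """Prompt: a reusable analysis template."""
        return f"Analyze the current job market for {role} positions."

    if __name__ == "__main__":
        mcp.run(transport="stdio")  # Transport: stdio between client and server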

Reference

Based on: The Full MCP Blueprint by Daily Dose of Data Science

Troubleshooting

  • "Connection refused": Ensure server is running
  • "OpenAI API error": Check API key in .env
  • "Module not found": Run uv sync
  • Job search errors: Verify RAPIDAPI_KEY is valid