sofianhw/mcp-client-server-python
MCP Client-Server Python Example
This project demonstrates a simple client-server Model Context Protocol (MCP) implementation in Python.
What is MCP (Model Context Protocol)?
MCP is an open protocol introduced by Anthropic to enable large language models (LLMs) to interact with external tools, APIs, and resources in a standardized, extensible way.
It facilitates secure, multi-channel communication between AI models and external systems, supporting advanced agentic workflows and tool use. See Anthropic's announcement and the MCP documentation for details.
Features
- MCP Server: Exposes tools (e.g., addition) and resources (e.g., greetings) via SSE.
- MCP Client: Connects to the server, lists available tools, and interacts using OpenAI's chat completions.
- OpenAI Integration: Uses OpenAI's GPT models to process user queries and call server tools as needed.
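To route queries through OpenAI's function calling, the client has to advertise the server's MCP tools in the format the chat completions API expects. The exact wiring lives in client.py; the sketch below (names and dict shapes are illustrative, not the SDK's actual objects) shows the gist of that translation.

```python
# Sketch: convert MCP-style tool listings into the OpenAI
# chat-completions "tools" parameter. Shapes are illustrative; the
# real client would obtain tools via the MCP SDK's list_tools call.

def mcp_tools_to_openai(mcp_tools: list[dict]) -> list[dict]:
    """Map each MCP tool (name, description, inputSchema) to an
    OpenAI function-calling tool definition."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                "parameters": t.get("inputSchema", {"type": "object", "properties": {}}),
            },
        }
        for t in mcp_tools
    ]

# Example: the server's "add" tool as it might appear in a tool listing.
example = [{
    "name": "add",
    "description": "Add two integers.",
    "inputSchema": {
        "type": "object",
        "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
        "required": ["a", "b"],
    },
}]
print(mcp_tools_to_openai(example)[0]["function"]["name"])  # → add
```

The model then sees `add` as a callable function and can emit tool calls against it, which the client forwards to the MCP server.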
Requirements
- Python 3.12+
- MCP Python SDK
- OpenAI Python SDK
- Uvicorn (for running the server)
- python-dotenv (for loading environment variables)
- uv (fast Python package installer and resolver)
Project Structure
.
├── client.py        # MCP client implementation
├── server.py        # MCP server implementation
├── pyproject.toml   # Project metadata and dependencies
├── .env             # Environment variables (not committed)
└── README.md        # This file
Install dependencies with uv:
uv sync
(This will install all dependencies as specified in uv.lock.)
Setup
1. Environment Variables

   Create a .env file in the project directory:

   OPENAI_API_KEY=your-openai-api-key
   MCP_SSE_URL=http://localhost:8080/sse

2. Start the Server

   uv run server.py --host 0.0.0.0 --port 8080

   The server exposes tools and resources via SSE at /sse.

3. Run the Client

   In another terminal:

   uv run client.py

   The client will connect to the server, list available tools, and start an interactive chat loop.
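The project loads these variables with python-dotenv. Conceptually, that amounts to parsing KEY=VALUE lines, as in this stdlib-only sketch (for illustration; the real code should keep using load_dotenv, which also handles quoting and exports):

```python
def parse_dotenv(text: str) -> dict[str, str]:
    """Minimal .env parser: KEY=VALUE lines, blank lines and '#'
    comments ignored. A stand-in for python-dotenv's load_dotenv."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
# Credentials for the MCP client
OPENAI_API_KEY=your-openai-api-key
MCP_SSE_URL=http://localhost:8080/sse
"""
config = parse_dotenv(sample)
print(config["MCP_SSE_URL"])  # → http://localhost:8080/sse
```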
Usage
- Type your queries in the client prompt.
- The client will use OpenAI to process your query and call server tools if needed.
- Type quit to exit the client.
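When OpenAI responds with a tool call, the client executes the matching MCP tool and returns the result to the model. The sketch below shows that dispatch step with hypothetical names; the MCP session is stubbed with a plain dict so the example is self-contained (the real client would call the tool over SSE via the MCP SDK).

```python
import json

# Stand-in for the MCP session's tool invocation; in client.py the
# "add" tool would be executed on the server over SSE.
LOCAL_TOOLS = {"add": lambda a, b: a + b}

def dispatch_tool_call(tool_call: dict) -> str:
    """Execute one OpenAI-style tool call against the available tools
    and return the result as a JSON string for the follow-up message."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    result = LOCAL_TOOLS[name](**args)
    return json.dumps({"tool": name, "result": result})

# Example: the model asked to call add(a=2, b=3).
call = {"function": {"name": "add", "arguments": '{"a": 2, "b": 3}'}}
print(dispatch_tool_call(call))  # → {"tool": "add", "result": 5}
```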