Robinson777-prog/mcp-server
This document provides a guide on integrating OpenAI with a Model Context Protocol (MCP) server to enable dynamic tool usage.
OpenAI Integration with MCP
This section demonstrates how to integrate the Model Context Protocol (MCP) with OpenAI's API to create a system where OpenAI can access and use tools provided by your MCP server.
Overview
This example shows how to:
- Create an MCP server that exposes a knowledge base tool
- Connect OpenAI to this MCP server
- Allow OpenAI to dynamically use the tools when responding to user queries
Connection Methods
This example uses the stdio transport for communication between the client and server, which means:
- The client launches the server directly as a subprocess
- Client and server run on the same machine and communicate over standard input/output
- No separately started server process is needed
If you want to split your client and server into separate applications (e.g., running the server on a different machine), you'll need to use the SSE (Server-Sent Events) transport instead. For details on setting up an SSE connection, see the MCP documentation.
Data Flow Explanation
- User Query: The user sends a query to the system (e.g., "What is our company's vacation policy?")
- OpenAI API: OpenAI receives the query and available tools from the MCP server
- Tool Selection: OpenAI decides which tools to use based on the query
- MCP Client: The client receives OpenAI's tool call request and forwards it to the MCP server
- MCP Server: The server executes the requested tool (e.g., retrieving knowledge base data)
- Response Flow: The tool result flows back through the MCP client to OpenAI
- Final Response: OpenAI generates a final response incorporating the tool data
How OpenAI Executes Tools
OpenAI's function calling mechanism works with MCP tools through these steps:
- Tool Registration: The MCP client converts MCP tools to OpenAI's function format
- Tool Choice: OpenAI decides which tools to use based on the user query
- Tool Execution: The MCP client executes the selected tools and returns results
- Context Integration: OpenAI incorporates the tool results into its response
The Role of MCP
MCP serves as a standardized bridge between AI models and your backend systems:
- Standardization: MCP provides a consistent interface for AI models to interact with tools
- Abstraction: MCP abstracts away the complexity of your backend systems
- Security: MCP allows you to control exactly what tools and data are exposed to AI models
- Flexibility: You can change your backend implementation without changing the AI integration
Implementation Details
Server (server.py)
The MCP server exposes a get_knowledge_base tool that retrieves Q&A pairs from a JSON file.
Client (client.py)
The client:
- Connects to the MCP server
- Converts MCP tools to OpenAI's function format
- Handles the communication between OpenAI and the MCP server
- Processes tool results and generates final responses
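The second bullet, converting MCP tools to OpenAI's function format, is a small mechanical mapping. The helper name below is hypothetical; each MCP tool's name, description, and JSON-Schema input schema map directly onto OpenAI's `function` tool type.

```python
def mcp_tools_to_openai(tools) -> list[dict]:
    """Convert MCP tool definitions into OpenAI's function-calling format.

    `tools` is expected to be an iterable of MCP tool objects, each with
    `name`, `description`, and `inputSchema` attributes.
    """
    return [
        {
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description or "",
                "parameters": tool.inputSchema,  # already a JSON Schema
            },
        }
        for tool in tools
    ]
```

Because the mapping is attribute-for-attribute, no information is lost: OpenAI sees exactly the tool surface the MCP server chose to expose.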
Knowledge Base (data/kb.json)
Contains Q&A pairs about company policies that can be queried through the MCP server.
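The file's exact schema isn't shown in this guide; one plausible shape, assuming a flat list of `question`/`answer` pairs, would be:

```json
[
  {
    "question": "What is our company's vacation policy?",
    "answer": "Full-time employees receive 20 paid vacation days per year."
  },
  {
    "question": "How do I submit an expense report?",
    "answer": "Use the finance portal and attach receipts within 30 days."
  }
]
```

The entries above are illustrative placeholders, not contents of the actual repository file.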
Running the Example
- Ensure you have the required dependencies installed
- Set up your OpenAI API key in the .env file
- Run the client: python client.py
Note: With the stdio transport used in this example, you don't need to run the server separately as the client will automatically start it.