mcp-server

Robinson777-prog/mcp-server

This document provides a guide on integrating OpenAI with a Model Context Protocol (MCP) server to enable dynamic tool usage.

Tools: 1 · Resources: 0 · Prompts: 0

OpenAI Integration with MCP

This section demonstrates how to integrate the Model Context Protocol (MCP) with OpenAI's API to create a system where OpenAI can access and use tools provided by your MCP server.

Overview

This example shows how to:

  1. Create an MCP server that exposes a knowledge base tool
  2. Connect OpenAI to this MCP server
  3. Allow OpenAI to dynamically use the tools when responding to user queries

Connection Methods

This example uses the stdio transport for communication between the client and server, which means:

  • The client launches the server as a subprocess and communicates with it over stdin/stdout
  • Client and server run on the same machine as part of a single application
  • No separately started server process is needed
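The stdio transport described above can be illustrated with a stripped-down sketch: the client spawns the "server" as a subprocess and exchanges newline-delimited JSON over its stdin/stdout. The real MCP SDK wraps this plumbing for you (and speaks full JSON-RPC); the echo server below is a stand-in used only to show the process and data flow.

```python
# Minimal illustration of the stdio transport idea: the client starts the
# server as a subprocess and talks to it over stdin/stdout. This is NOT the
# MCP wire protocol, just the transport pattern.
import json
import subprocess
import sys

# A stand-in "server" that reads one JSON request and answers it.
SERVER_CODE = """
import json, sys
request = json.loads(sys.stdin.readline())
print(json.dumps({"id": request["id"], "result": "pong"}))
sys.stdout.flush()
"""

proc = subprocess.Popen(
    [sys.executable, "-c", SERVER_CODE],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
# Client sends a request line and reads the response line back.
proc.stdin.write(json.dumps({"id": 1, "method": "ping"}) + "\n")
proc.stdin.flush()
reply = json.loads(proc.stdout.readline())
proc.wait()
print(reply["result"])
```

Because the client owns the subprocess, the server's lifetime matches the client's: when `client.py` exits, the server goes away with it.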

If you want to split your client and server into separate applications (e.g., running the server on a different machine), you'll need to use the SSE (Server-Sent Events) transport instead. For details on setting up an SSE connection, refer to the MCP documentation on SSE transport.

Data Flow Explanation

  1. User Query: The user sends a query to the system (e.g., "What is our company's vacation policy?")
  2. OpenAI API: OpenAI receives the user query along with the tool definitions exposed by the MCP server
  3. Tool Selection: OpenAI decides which tools to use based on the query
  4. MCP Client: The client receives OpenAI's tool call request and forwards it to the MCP server
  5. MCP Server: The server executes the requested tool (e.g., retrieving knowledge base data)
  6. Response Flow: The tool result flows back through the MCP client to OpenAI
  7. Final Response: OpenAI generates a final response incorporating the tool data

How OpenAI Executes Tools

OpenAI's function calling mechanism works with MCP tools through these steps:

  1. Tool Registration: The MCP client converts MCP tools to OpenAI's function format
  2. Tool Choice: OpenAI decides which tools to use based on the user query
  3. Tool Execution: The MCP client executes the selected tools and returns results
  4. Context Integration: OpenAI incorporates the tool results into its response
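Step 1 (Tool Registration) is a straightforward format conversion. A sketch of it, assuming the MCP tool fields `name` / `description` / `inputSchema` and OpenAI's `"type": "function"` tool format:

```python
# Convert an MCP tool definition into OpenAI's function-calling format.
# MCP-side field names (name / description / inputSchema) follow the MCP
# tool schema; the result is what you pass as `tools=` to a chat completion.
def mcp_tool_to_openai(tool: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

# Example: the knowledge base tool exposed by this server.
mcp_tool = {
    "name": "get_knowledge_base",
    "description": "Retrieve the full knowledge base as Q&A pairs.",
    "inputSchema": {"type": "object", "properties": {}},
}
openai_tool = mcp_tool_to_openai(mcp_tool)
```

Because the conversion is mechanical, any tool you add to the MCP server becomes available to OpenAI without client-side changes.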

The Role of MCP

MCP serves as a standardized bridge between AI models and your backend systems:

  • Standardization: MCP provides a consistent interface for AI models to interact with tools
  • Abstraction: MCP abstracts away the complexity of your backend systems
  • Security: MCP allows you to control exactly what tools and data are exposed to AI models
  • Flexibility: You can change your backend implementation without changing the AI integration

Implementation Details

Server (server.py)

The MCP server exposes a get_knowledge_base tool that retrieves Q&A pairs from a JSON file.
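A sketch of the logic inside that tool, assuming kb.json holds a list of objects with `question` and `answer` fields (the actual server.py registers this function with the MCP SDK, e.g. via a tool decorator; the path and field names here are assumptions):

```python
# Hedged sketch of the get_knowledge_base tool logic: load Q&A pairs
# from a JSON file and format them as text the model can read.
import json
import tempfile
from pathlib import Path

def get_knowledge_base(path: str = "data/kb.json") -> str:
    pairs = json.loads(Path(path).read_text())
    lines = []
    for i, pair in enumerate(pairs, start=1):
        lines.append(f"Q{i}: {pair['question']}")
        lines.append(f"A{i}: {pair['answer']}")
    return "\n".join(lines)

# Self-contained demo: write a sample file to a temp location and read it back.
sample = [{"question": "How many vacation days do we get?",
           "answer": "25 per year."}]
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
text = get_knowledge_base(f.name)
print(text)
```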

Client (client.py)

The client:

  1. Connects to the MCP server
  2. Converts MCP tools to OpenAI's function format
  3. Handles the communication between OpenAI and the MCP server
  4. Processes tool results and generates final responses
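Steps 3 and 4 can be sketched as follows. When a chat completion comes back with tool calls, the client executes each one against the MCP server and appends the results as `"role": "tool"` messages for the follow-up completion. The response shape mirrors OpenAI's Chat Completions tool-calling format; `call_mcp_tool` is a hypothetical stand-in for the real round trip to the server.

```python
# Dispatch OpenAI tool calls to the MCP server and collect the results
# as tool messages for the next chat completion.
import json

def call_mcp_tool(name: str, arguments: dict) -> str:
    # Stand-in for forwarding the call over the MCP session.
    if name == "get_knowledge_base":
        return "Q1: What is our vacation policy? A1: 25 days per year."
    raise ValueError(f"unknown tool: {name}")

def handle_tool_calls(tool_calls: list) -> list:
    messages = []
    for call in tool_calls:
        result = call_mcp_tool(
            call["function"]["name"],
            json.loads(call["function"]["arguments"]),
        )
        messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": result,
        })
    return messages

# Example: one tool call as it would appear in a completion response.
tool_messages = handle_tool_calls([{
    "id": "call_1",
    "function": {"name": "get_knowledge_base", "arguments": "{}"},
}])
```

The `tool_call_id` ties each result back to the specific call OpenAI made, which is required for the follow-up completion to incorporate the data.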

Knowledge Base (data/kb.json)

Contains Q&A pairs about company policies that can be queried through the MCP server.
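The exact contents of data/kb.json aren't shown here; given that the tool returns Q&A pairs about company policies, a plausible shape is a list of question/answer objects:

```json
[
  {
    "question": "What is our company's vacation policy?",
    "answer": "Full-time employees receive 25 paid vacation days per year."
  },
  {
    "question": "How do I submit an expense report?",
    "answer": "Upload receipts to the expenses portal within 30 days."
  }
]
```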

Running the Example

  1. Ensure you have the required dependencies installed
  2. Set up your OpenAI API key in the .env file
  3. Run the client: python client.py

Note: With the stdio transport used in this example, you don't need to run the server separately; the client starts it automatically as a subprocess.