# MCP Knowledge Base Assistant
A demonstration of the Model Context Protocol (MCP) that connects an OpenAI-powered client to a knowledge base server. This project showcases how to build a simple but powerful AI assistant that can answer questions about company policies by accessing a knowledge base through MCP.
## 📋 Overview
This project demonstrates:
- How to build an MCP server that exposes a knowledge base as a tool
- How to create an MCP client that connects to the server
- How to integrate OpenAI's API to create a natural language interface
- How to use Docker to containerize the server component
The system allows users to ask questions in natural language about company policies, and the AI will retrieve relevant information from the knowledge base to provide accurate answers.
## 🏗️ Architecture
The project follows the MCP client-host-server architecture:
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│                 │     │                 │     │                 │
│  OpenAI Model   │◄────┤   MCP Client    │◄────┤   MCP Server    │
│    (GPT-4o)     │     │   (client.py)   │     │   (server.py)   │
│                 │     │                 │     │                 │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                                                         │
                                                         ▼
                                                ┌─────────────────┐
                                                │                 │
                                                │ Knowledge Base  │
                                                │    (kb.json)    │
                                                │                 │
                                                └─────────────────┘
```
- **MCP Server**: Exposes the knowledge base as a tool that can be queried
- **MCP Client**: Connects to the server and integrates with OpenAI's API
- **OpenAI Model**: Processes natural language queries and generates responses
- **Knowledge Base**: JSON file containing Q&A pairs about company policies
## 🚀 Getting Started
### Prerequisites
- Python 3.11 or higher
- Docker (optional, for containerized server)
- OpenAI API key
### Installation
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd MCP-Get-Started
   ```

2. Create a virtual environment and install dependencies:

   ```bash
   python -m venv venv

   # On Windows
   venv\Scripts\activate

   # On macOS/Linux
   source venv/bin/activate

   pip install -r requirements.txt
   ```

3. Create a `.env` file in the project root with your OpenAI API key:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   ```
### Running the Server
**Option 1: Run directly with Python**

```bash
python server.py
```
Option 2: Run with Docker
# Build the Docker image
docker build -t mcp-server .
# Run the container
docker run -p 8050:8050 mcp-server
### Running the Client
With the server running, open a new terminal and run:
```bash
python client.py
```
## 📝 Usage
The client will connect to the server and ask a sample question about the company's equal opportunity policy. You can modify the query in `client.py` to ask different questions about company policies (see the sketch after the example output).
Example output:

```
Connected to server with tools:
  - get_knowledge_base: Retrieve the entire knowledge base as a formatted string.

Query: What is the company's equal opportunity policy?

Response: The company's equal opportunity policy is as follows: The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic.
```
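To try a different question, change the string passed to `process_query` (a hypothetical call site; the exact line in `client.py` may differ):

```python
# Any question covered by data/kb.json works here.
response = await client.process_query("How many vacation days do employees get?")
```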
## 🔧 Project Structure
- `server.py`: MCP server implementation that exposes the knowledge base
- `client.py`: MCP client that connects to the server and integrates with OpenAI
- `data/kb.json`: Knowledge base containing company policy Q&A pairs
- `Dockerfile`: Configuration for containerizing the server
- `requirements.txt`: Python dependencies
## 🧩 How It Works
### MCP Server (server.py)
The server exposes a single tool called `get_knowledge_base` that retrieves information from the knowledge base file (`data/kb.json`). It runs using the SSE (Server-Sent Events) transport on port 8050.
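For reference, here is a minimal sketch of what such a server can look like. It assumes the official `mcp` Python SDK (FastMCP) and that `data/kb.json` holds a list of question/answer objects; the repo's actual implementation may differ in details:

```python
import json
import os

from mcp.server.fastmcp import FastMCP

# Bind to all interfaces so the server is reachable from Docker (assumption).
mcp = FastMCP("knowledge-base", host="0.0.0.0", port=8050)


@mcp.tool()
def get_knowledge_base() -> str:
    """Retrieve the entire knowledge base as a formatted string."""
    kb_path = os.path.join(os.path.dirname(__file__), "data", "kb.json")
    with open(kb_path, "r") as f:
        kb = json.load(f)
    # Assumes kb.json is a JSON array of {"question": ..., "answer": ...} objects.
    return "\n\n".join(f"Q: {item['question']}\nA: {item['answer']}" for item in kb)


if __name__ == "__main__":
    mcp.run(transport="sse")
```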
### MCP Client (client.py)
The client:
- Connects to the MCP server using SSE transport
- Retrieves the list of available tools
- Takes a natural language query from the user
- Sends the query to OpenAI along with the available tools
- If OpenAI decides to use a tool, the client executes the tool call
- Returns the final response to the user
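A condensed sketch of that flow, assuming the `mcp` SDK's SSE client and the official `openai` package (the real `client.py` wraps this logic in the `MCPOpenAIClient` class, so the helper shape here is illustrative):

```python
import asyncio
import json

from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import AsyncOpenAI


async def process_query(session: ClientSession, query: str) -> str:
    """Send a query to OpenAI, executing MCP tool calls when requested."""
    openai_client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

    # Advertise the server's tools to OpenAI in function-calling format.
    tools = await session.list_tools()
    tool_schemas = [
        {
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema,
            },
        }
        for tool in tools.tools
    ]

    messages = [{"role": "user", "content": query}]
    response = await openai_client.chat.completions.create(
        model="gpt-4o", messages=messages, tools=tool_schemas
    )
    message = response.choices[0].message

    # If the model requested a tool, run it on the MCP server and feed the
    # result back so the model can compose the final answer.
    if message.tool_calls:
        messages.append(message)
        for call in message.tool_calls:
            result = await session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": result.content[0].text,
            })
        response = await openai_client.chat.completions.create(
            model="gpt-4o", messages=messages, tools=tool_schemas
        )
        message = response.choices[0].message

    return message.content


async def main() -> None:
    # Connect over SSE to the server started on port 8050.
    async with sse_client("http://localhost:8050/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            answer = await process_query(
                session, "What is the company's equal opportunity policy?"
            )
            print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```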
### Knowledge Base (data/kb.json)
The knowledge base is a simple JSON file containing question-answer pairs about company policies. This can be extended with additional policies or information as needed.
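For illustration, the shape could look like the following; the exact field names in the repo's `kb.json` are an assumption:

```json
[
  {
    "question": "What is the company's equal opportunity policy?",
    "answer": "The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic."
  }
]
```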
## 🔄 Lifecycle Management
The client implements proper lifecycle management using Python's async context managers:
```python
async with MCPOpenAIClient() as client:
    await client.connect_to_server()
    response = await client.process_query("What is the policy?")
    # Resources are automatically cleaned up when exiting the context
```
This ensures that all resources are properly initialized and cleaned up, following MCP best practices.
## 🛠️ Customization
### Adding More Knowledge
To expand the knowledge base, simply add more question-answer pairs to the `data/kb.json` file.
### Adding More Tools
You can add more tools to the server by defining additional functions with the `@mcp.tool()` decorator in `server.py`.
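For example, a hypothetical second tool (reusing the list-of-pairs assumption for `data/kb.json`):

```python
@mcp.tool()
def count_policies() -> int:
    """Return the number of Q&A pairs in the knowledge base."""
    # Assumes data/kb.json is a JSON array of question/answer objects.
    with open("data/kb.json", "r") as f:
        return len(json.load(f))
```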
### Changing the Model
To use a different OpenAI model, modify the `model` parameter in the `MCPOpenAIClient` class in `client.py`.
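For example, assuming the model name is exposed as a constructor argument (check `client.py` for the exact parameter):

```python
# Hypothetical: select a different OpenAI model when constructing the client.
client = MCPOpenAIClient(model="gpt-4o-mini")
```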
## 📚 Learn More About MCP
The Model Context Protocol (MCP) is a standardized way for LLMs to interact with external tools and services. It provides:
- **Reusability**: Build a server once, use it with any MCP-compatible client
- **Composability**: Combine multiple servers to create complex capabilities
- **Ecosystem growth**: Benefit from servers created by others
For more information, visit the [MCP documentation](https://modelcontextprotocol.io/).