siddhivinayak-sk/oai-mcp
MCP Server with Azure OpenAI and Azure AI Search
Overview
This project implements a Model Context Protocol (MCP) server in Python using FastAPI. It provides a `/query` endpoint and a custom `query` command that, when detected in a query, enriches the request with context from Azure AI Search and sends it to Azure OpenAI for a response.
Features
- Standalone MCP server using FastAPI
- Dockerized for easy deployment
- Custom `query` command triggers Azure AI Search and OpenAI integration
- Configurable via environment variables
Configuration
Set the following environment variables (see `docker-compose.yml` for examples):
Azure OpenAI
- `AZURE_OPENAI_MODEL`: Deployed model name
- `AZURE_OPENAI_VERSION`: API version
- `AZURE_OPENAI_ENDPOINT`: Endpoint URL
- `AZURE_OPENAI_KEY`: API key

Azure AI Search
- `AZURE_SEARCH_ENDPOINT`: Endpoint URL
- `AZURE_SEARCH_KEY`: API key
- `AZURE_SEARCH_INDEX`: Index name
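For local testing, the variables above can be exported in the shell before starting the server. All values below are placeholders, not real endpoints, keys, or deployment names:

```shell
# Placeholder values — replace with your own Azure resource details.
export AZURE_OPENAI_MODEL="<your-deployment-name>"
export AZURE_OPENAI_VERSION="<api-version>"
export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
export AZURE_OPENAI_KEY="<your-openai-key>"
export AZURE_SEARCH_ENDPOINT="https://<your-search>.search.windows.net"
export AZURE_SEARCH_KEY="<your-search-key>"
export AZURE_SEARCH_INDEX="<your-index-name>"
```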
Installation
Local (Python 3.10+ required)

```bash
pip install -r requirements.txt
uvicorn mcp_server.main:app --reload
```
Docker

```bash
docker-compose up --build
```
Usage
```bash
uv run oai-mcp-server --transport sse --port 8000
```
Send a POST request to `http://localhost:8000/query` with a JSON body:

```json
{
  "query": "query What is the latest update?"
}
```

If the query begins with the `query` command, the server will use Azure AI Search and Azure OpenAI to answer.
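The detect-enrich-answer flow described above can be sketched in plain Python. The function and client names here are illustrative, not the project's actual API; `search_client` and `openai_client` stand in for the Azure AI Search and Azure OpenAI SDK clients, and their methods are hypothetical:

```python
def handle_query(raw_query: str, search_client, openai_client) -> str:
    """Illustrative handler: detect the 'query' command, enrich, then answer.

    `search_client.search` and `openai_client.complete` are hypothetical
    stand-ins for the real Azure SDK calls.
    """
    command = "query"
    if raw_query.startswith(command + " "):
        question = raw_query[len(command) + 1:]
        # Hypothetical: retrieve matching documents from the search index.
        context_docs = search_client.search(question)
        prompt = f"Context:\n{context_docs}\n\nQuestion: {question}"
    else:
        # No command detected: pass the query through unchanged.
        prompt = raw_query
    # Hypothetical: ask the deployed model for a completion.
    return openai_client.complete(prompt)
```

Queries without the command prefix skip the search step entirely, so plain requests still get a direct model response.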
Extending
- Add more commands by extending the FastAPI app and handler modules.
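One common pattern for adding commands — a sketch under assumed names, not the project's actual module structure — is a dispatch table mapping command keywords to handler functions:

```python
from typing import Callable, Dict

# Registry of command keyword -> handler; names here are illustrative.
HANDLERS: Dict[str, Callable[[str], str]] = {}

def command(name: str):
    """Decorator registering a handler for queries beginning with `name`."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        HANDLERS[name] = fn
        return fn
    return register

@command("query")
def search_and_answer(text: str) -> str:
    # A real implementation would call Azure AI Search + Azure OpenAI here.
    return f"answered: {text}"

def dispatch(raw: str) -> str:
    """Route to a registered handler, or return the query unchanged."""
    keyword, _, rest = raw.partition(" ")
    handler = HANDLERS.get(keyword)
    return handler(rest) if handler else raw
```

New commands then become one decorated function each, without touching the routing logic.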
License
MIT