# 🚀 Agentic RAG with MCP Server

## ✨ Overview
Agentic RAG with MCP Server is a powerful project that brings together an MCP (Model Context Protocol) server and client for building Agentic RAG (Retrieval-Augmented Generation) applications.
This setup empowers your RAG system with advanced tools such as:
- 🕵️‍♂️ Entity Extraction
- 🔍 Query Refinement
- ✅ Relevance Checking
The server hosts these intelligent tools, while the client shows how to seamlessly connect and utilize them.
## 🖥️ Server — `server.py`

Powered by the `FastMCP` class from the `mcp` library, the server exposes these handy tools (a minimal registration sketch follows the table):
| Tool Name | Description | Icon |
|---|---|---|
| `get_time_with_prefix` | Returns the current date & time | ⏰ |
| `extract_entities_tool` | Uses OpenAI to extract entities from a query, enhancing document retrieval relevance | 🧠 |
| `refine_query_tool` | Improves the quality of user queries with OpenAI-powered refinement | ✨ |
| `check_relevance` | Filters out irrelevant content by checking chunk relevance with an LLM | ✅ |
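
For orientation, here is a minimal sketch of how tools like these can be registered with `FastMCP`. It assumes the standard `mcp` Python SDK and the `openai` client; the server name, prompts, and the two tools shown are illustrative, not the repository's actual implementation.

```python
# Minimal sketch of a FastMCP server exposing RAG helper tools.
# Assumes the `mcp` Python SDK, `openai`, and `python-dotenv` are installed
# and that OPENAI_API_KEY is available in the environment.
import os
from datetime import datetime

from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

load_dotenv()

mcp = FastMCP("agentic-rag")   # hypothetical server name
client = OpenAI()              # reads OPENAI_API_KEY from the environment (assumed)
MODEL = os.getenv("OPENAI_MODEL_NAME", "gpt-4o-mini")  # fallback model is illustrative

@mcp.tool()
def get_time_with_prefix() -> str:
    """Return the current date and time with a prefix."""
    return f"The current date and time is {datetime.now().isoformat()}"

@mcp.tool()
def refine_query_tool(query: str) -> str:
    """Ask the LLM to rewrite a user query into a clearer retrieval query."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{
            "role": "user",
            "content": f"Rewrite this search query to be clearer and more specific: {query}",
        }],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    mcp.run()  # serves the tools over stdio by default
```

The other tools (`extract_entities_tool`, `check_relevance`) follow the same pattern: a decorated function whose docstring and signature become the tool description and input schema.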
## 🤝 Client — `mcp-client.py`
The client demonstrates how to connect and interact with the MCP server:
- Establish a connection with `ClientSession` from the `mcp` library (see the sketch after this list)
- List all available server tools
- Call any tool with custom arguments
- Process queries leveraging OpenAI or Gemini and MCP tools in tandem
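
A minimal connection sketch, assuming the server is launched as a subprocess and spoken to over stdio via the `mcp` Python SDK; the `query` argument name passed to `refine_query_tool` is an assumption for illustration.

```python
# Sketch of an MCP client that lists tools and calls one with custom arguments.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch server.py as a subprocess and connect over stdio
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List all available server tools
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call a tool with custom arguments (parameter name "query" is assumed)
            result = await session.call_tool(
                "refine_query_tool",
                arguments={"query": "latest research on agentic RAG"},
            )
            print(result.content)

asyncio.run(main())
```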
⚙️ Requirements
- Python 3.9 or higher
openai
Python packagemcp
librarypython-dotenv
for environment variable management
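
A `requirements.txt` matching this list might look like the following (illustrative only; versions unpinned):

```
openai
mcp
python-dotenv
```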
## 🛠️ Installation Guide

```bash
# Step 1: Clone the repository
git clone https://github.com/ashishpatel26/Agentic-RAG-with-MCP-Server.git

# Step 2: Navigate into the project directory
cd Agentic-RAG-with-MCP-Server

# Step 3: Install dependencies
pip install -r requirements.txt
```
## 🔐 Configuration

- Create a `.env` file (use `.env.sample` as a template)
- Set your OpenAI model name and Gemini API key in `.env` (loaded at runtime as sketched below):

```
OPENAI_MODEL_NAME="your-model-name-here"
GEMINI_API_KEY="your-gemini-api-key-here"
```
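
With `python-dotenv`, both scripts can load these values at startup; a short sketch (variable names match the sample above):

```python
# Load .env and read the configured model name and Gemini key.
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the environment
OPENAI_MODEL_NAME = os.getenv("OPENAI_MODEL_NAME")
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
```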
## 🚀 How to Use

- Start the MCP server:

  ```bash
  python server.py
  ```

- Run the MCP client:

  ```bash
  python mcp-client.py
  ```
## 📜 License
This project is licensed under the .
Thanks for Reading 🙏