
🚀 Agentic RAG with MCP Server


✨ Overview

Agentic RAG with MCP Server is a powerful project that brings together an MCP (Model Context Protocol) server and client for building Agentic RAG (Retrieval-Augmented Generation) applications.

This setup empowers your RAG system with advanced tools such as:

  • 🕵️‍♂️ Entity Extraction
  • 🔍 Query Refinement
  • Relevance Checking

The server hosts these intelligent tools, while the client shows how to seamlessly connect and utilize them.


🖥️ Server — server.py

Powered by the FastMCP class from the mcp library, the server exposes these handy tools:

| Tool Name | Description | Icon |
|-----------|-------------|------|
| get_time_with_prefix | Returns the current date & time | |
| extract_entities_tool | Uses OpenAI to extract entities from a query, enhancing document retrieval relevance | 🧠 |
| refine_query_tool | Improves the quality of user queries with OpenAI-powered refinement | |
| check_relevance | Filters out irrelevant content by checking chunk relevance with an LLM | |
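
For a concrete picture of how such tools are wired up, here is a minimal sketch in the spirit of server.py, using FastMCP from the mcp Python SDK. It is not the project's actual code: the server name, prompt wording, and model fallback are illustrative assumptions.

```python
# Minimal sketch of a FastMCP server exposing two of the tools listed above.
# Assumes OPENAI_API_KEY and OPENAI_MODEL_NAME are available in the environment.
import os
from datetime import datetime

from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

load_dotenv()
MODEL = os.getenv("OPENAI_MODEL_NAME", "gpt-4o-mini")  # fallback is an assumption

mcp = FastMCP("agentic-rag")   # server name is illustrative
client = OpenAI()              # reads OPENAI_API_KEY from the environment


@mcp.tool()
def get_time_with_prefix() -> str:
    """Return the current date and time."""
    return f"Current date and time: {datetime.now().isoformat()}"


@mcp.tool()
def extract_entities_tool(query: str) -> str:
    """Use OpenAI to extract entities from a user query."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{
            "role": "user",
            "content": f"List the named entities in this query, one per line:\n{query}",
        }],
    )
    return resp.choices[0].message.content


if __name__ == "__main__":
    mcp.run()  # serves the tools over stdio by default
```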

🤝 Client — mcp-client.py

The client demonstrates how to connect and interact with the MCP server (a minimal sketch follows this list):

  • Establish a connection with ClientSession from the mcp library
  • List all available server tools
  • Call any tool with custom arguments
  • Process queries leveraging OpenAI or Gemini and MCP tools in tandem
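
The sketch below shows the basic connection flow, assuming the server is launched over stdio with `python server.py`; the tool call and query string are illustrative, not the project's actual mcp-client.py.

```python
# Minimal sketch: connect to the MCP server over stdio, list its tools,
# and call one tool with custom arguments.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List all available server tools
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Call a tool with custom arguments (query string is an example)
            result = await session.call_tool(
                "refine_query_tool", {"query": "latest news about MCP servers"}
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```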

⚙️ Requirements

  • Python 3.9 or higher
  • openai Python package
  • mcp library
  • python-dotenv for environment variable management
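
If you need to recreate requirements.txt from this list, a minimal version would look roughly like the following (version pins omitted; add them as needed):

```text
mcp
openai
python-dotenv
```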

🛠️ Installation Guide

# Step 1: Clone the repository
git clone https://github.com/ashishpatel26/Agentic-RAG-with-MCP-Server.git

# Step 2: Navigate into the project directory
cd Agentic-RAG-with-MCP-Server

# Step 3: Install dependencies
pip install -r requirements.txt

🔐 Configuration

  1. Create a .env file (use .env.sample as a template)
  2. Set your OpenAI model name and Gemini API key in .env:
OPENAI_MODEL_NAME="your-model-name-here"
GEMINI_API_KEY="your-gemini-api-key-here"
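
For reference, these variables are typically read at runtime with python-dotenv; the snippet below is a sketch using the variable names from the example above.

```python
# Sketch: load the .env configuration with python-dotenv.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
OPENAI_MODEL_NAME = os.getenv("OPENAI_MODEL_NAME")
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
```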

🚀 How to Use

  1. Start the MCP server:
python server.py
  2. Run the MCP client:
python mcp-client.py

📜 License

This project is licensed under the .


Thanks for Reading 🙏