# local_RAG_mcp
This project provides a local RAG MCP server for interacting with Word and Excel documents, using Ollama for local language model inference so that data never leaves the user's machine.

The Local Document Q&A with Ollama & MCP project is a Python-based MCP agent for natural language interaction with local Microsoft Word (.docx) and Excel (.xlsx) documents. Because Ollama performs all language model inference locally, document contents are never sent to an external service, preserving privacy and security. The project uses LangChain community components for document loading, text splitting, and embeddings, with ChromaDB as the vector store. The MCP SDK exposes these capabilities as a set of tools, making them accessible through MCP Inspector or `mcp-cli` and letting users query their documents in natural language. The design is extensible, with potential support for additional document types such as PDFs and plain text files.
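As a rough sketch of how document loading might be wired up with the LangChain community loaders just mentioned: the loader classes below are real LangChain components, but the extension-to-loader mapping is an illustrative assumption, not code taken from this project.

```python
# Sketch: selecting a LangChain community loader by file extension.
# The mapping is an illustrative assumption, not the project's actual wiring.
from pathlib import Path

from langchain_community.document_loaders import Docx2txtLoader, UnstructuredExcelLoader

LOADERS = {".docx": Docx2txtLoader, ".xlsx": UnstructuredExcelLoader}

def load_document(path: str):
    """Return LangChain Documents for a supported .docx or .xlsx file."""
    loader_cls = LOADERS.get(Path(path).suffix.lower())
    if loader_cls is None:
        raise ValueError(f"Unsupported file type: {path}")
    return loader_cls(path).load()
```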
## Features
- Private & Local: All processing and inference occur locally, ensuring data privacy.
- Supported Document Types: Works with Microsoft Word (.docx) and Excel (.xlsx) files.
- Simple Indexing: MCP tool for scanning directories and building a searchable vector index.
- Natural Language Q&A: Enables querying documents in natural language.
- MCP Integration: Functionality exposed through MCP tools for easy access (a minimal registration sketch follows this list).
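To make the MCP integration concrete, here is a minimal sketch of how the two tools could be registered with the MCP Python SDK's FastMCP server. The server name and tool signatures are assumptions, and the bodies are stubs filled in under Tools below; this is not the project's actual code.

```python
# Minimal sketch of exposing the two tools via the MCP Python SDK.
# Server name and tool signatures are assumptions.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("local-rag")  # assumed server name

@mcp.tool()
def initialize_and_index(directory: str) -> str:
    """Scan a directory of .docx/.xlsx files and build the vector index."""
    ...  # see the indexing sketch under Tools

@mcp.tool()
def ask_question(question: str) -> str:
    """Retrieve relevant chunks and answer with the local model."""
    ...  # see the Q&A sketch under Tools

if __name__ == "__main__":
    mcp.run()  # stdio transport by default; try it with MCP Inspector or mcp-cli
```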
## Tools
### initialize_and_index

Scans a directory of supported documents, splits them into chunks, embeds the chunks, and persists a local vector database.
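A plausible implementation under the assumptions above (LangChain community loaders, Ollama embeddings, and a local Chroma directory; the chunk sizes, model name, and persistence path are illustrative, not taken from the project):

```python
# Sketch of the indexing tool; chunk sizes, embedding model, and the
# persistence directory are assumptions, not the project's configuration.
from pathlib import Path

from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.document_loaders import Docx2txtLoader, UnstructuredExcelLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

LOADERS = {".docx": Docx2txtLoader, ".xlsx": UnstructuredExcelLoader}

def initialize_and_index(directory: str, persist_dir: str = "chroma_db") -> str:
    """Scan `directory` for supported documents and persist a Chroma index."""
    docs = []
    for path in Path(directory).rglob("*"):
        loader_cls = LOADERS.get(path.suffix.lower())
        if loader_cls is not None:
            docs.extend(loader_cls(str(path)).load())

    # Overlapping chunks keep each embedded passage focused for retrieval.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
    chunks = splitter.split_documents(docs)

    # Embeddings are computed by the local Ollama instance and stored on disk.
    Chroma.from_documents(
        chunks,
        embedding=OllamaEmbeddings(model="nomic-embed-text"),  # assumed model
        persist_directory=persist_dir,
    )
    return f"Indexed {len(docs)} documents as {len(chunks)} chunks in {persist_dir}"
```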
### ask_question

Retrieves the chunks most relevant to a user question from the vector database and generates an answer with the local model.
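Under the same assumptions (a persisted Chroma index, Ollama embeddings, and a local Ollama chat model), a minimal version of this tool might look like:

```python
# Sketch of the Q&A tool; model names, k, and the prompt are assumptions.
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.llms import Ollama
from langchain_community.vectorstores import Chroma

def ask_question(question: str, persist_dir: str = "chroma_db") -> str:
    """Answer a question from the indexed documents using the local LLM."""
    db = Chroma(
        persist_directory=persist_dir,
        embedding_function=OllamaEmbeddings(model="nomic-embed-text"),  # assumed
    )
    # Top-k similarity search against the persisted index.
    hits = db.similarity_search(question, k=4)
    context = "\n\n".join(doc.page_content for doc in hits)

    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return Ollama(model="llama3").invoke(prompt)  # assumed model
```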