Go RAG
Go RAG is a Retrieval-Augmented Generation server that implements the Model Context Protocol (MCP) to provide document search capabilities for AI assistants. Built in Go, it allows AI models to search through your documents to provide more accurate, contextually relevant answers.
Features
- Document Processing: Automatically indexes documents from a specified directory
- Real-time Monitoring: Watches for file changes and updates the index automatically
- Efficient Chunking: Splits documents into optimally sized chunks with configurable overlap
- Vector Database Integration: Uses Chroma DB for efficient semantic search
- MCP Implementation: Exposes search capabilities via the Model Context Protocol
- Multiple Embedding Models: Supports both OpenAI and Google Gemini embeddings
- Universal Document Support: Reads various document formats including PDF, DOCX, ODT, TXT, and more
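The chunking strategy above (fixed-size chunks with a configurable overlap) can be sketched as follows. This is an illustrative example only, not go-rag's actual implementation; the `chunkText` helper and its parameters are hypothetical:

```go
package main

import "fmt"

// chunkText splits text into chunks of at most `size` bytes, where each
// chunk shares `overlap` bytes with the previous one. The overlap helps
// preserve context that would otherwise be cut at a chunk boundary.
// (Hypothetical sketch; go-rag's real chunker may differ.)
func chunkText(text string, size, overlap int) []string {
	if size <= 0 || overlap >= size {
		return nil // invalid configuration
	}
	var chunks []string
	step := size - overlap
	for start := 0; start < len(text); start += step {
		end := start + size
		if end > len(text) {
			end = len(text)
		}
		chunks = append(chunks, text[start:end])
		if end == len(text) {
			break
		}
	}
	return chunks
}

func main() {
	// With size 4 and overlap 2, consecutive chunks share two characters.
	for _, c := range chunkText("abcdefghij", 4, 2) {
		fmt.Println(c) // abcd, cdef, efgh, ghij
	}
}
```

A larger overlap improves retrieval recall at chunk boundaries at the cost of indexing more redundant text.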
Prerequisites
- Go 1.23 or later
- Docker and Docker Compose (for running ChromaDB)
- OpenAI API key or Google Gemini API key
Quick Start
- Clone the repository:

  ```shell
  git clone https://github.com/gamma-omg/go-rag.git
  cd go-rag
  ```

- Configure your settings:

  ```shell
  cp cfg/template.yaml cfg/config.yaml
  ```

  Edit cfg/config.yaml to set your API keys and other preferences.

- Start the tool:

  ```shell
  docker-compose up -d
  ```

- The MCP server is now available at http://localhost:3001/sse (the port can be changed in config.yaml).
Cursor
To make this tool available in Cursor, go to Settings -> MCP -> Add new global MCP server and use this configuration:
"mcpServers": {
"rag-mcp": {
"url": "http://localhost:3001/sse"
}
}