sui-mcp-server
This project provides a proof-of-concept implementation of a Model Context Protocol (MCP) server that lets an AI agent query a vector database and retrieve relevant documents for Retrieval-Augmented Generation (RAG).
The MCP Server with FAISS for RAG connects AI agents to a vector database for efficient document retrieval. It uses FastAPI to expose MCP endpoints and FAISS as the underlying vector store, and it supports document chunking and embedding so that relevant passages can be indexed and retrieved quickly. It also includes GitHub Move file extraction and processing, so Move source files can be pulled from GitHub repositories and indexed alongside other documents. LLM integration completes the RAG workflow, grounding AI-generated responses in the retrieved context. A simple client example and sample documents are included to demonstrate the workflow.
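As a rough illustration of the chunk, embed, index, and retrieve flow described above, the sketch below uses sentence-transformers for embeddings and a flat FAISS index. The chunking parameters and embedding model are assumptions chosen for the example, not settings taken from this project.

```python
# Minimal sketch of chunking documents, embedding them, indexing with FAISS,
# and retrieving the chunks most relevant to a query.
import faiss
from sentence_transformers import SentenceTransformer


def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split a document into overlapping character chunks (sizes are illustrative)."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks


# Embed the chunks and build a flat L2 FAISS index over them.
model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
documents = [
    "FAISS is a library for efficient similarity search over dense vectors.",
    "Move is a language used for Sui smart contracts.",
]
chunks = [c for doc in documents for c in chunk_text(doc)]
embeddings = model.encode(chunks).astype("float32")

index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

# Retrieve the chunks closest to a query embedding.
query = model.encode(["How do I search vectors efficiently?"]).astype("float32")
distances, ids = index.search(query, 2)
print([chunks[i] for i in ids[0]])
```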
Features
- FastAPI server with MCP endpoints
- FAISS vector database integration
- Document chunking and embedding
- GitHub Move file extraction and processing
- LLM integration for complete RAG workflow
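For the last item on the list, the sketch below shows one way retrieved chunks could be passed to an LLM to complete the RAG loop. The OpenAI client, model name, and prompt format are assumptions made for illustration; the project does not document which LLM or prompt it uses.

```python
# Minimal sketch of the generation step of a RAG workflow: build a prompt from
# retrieved chunks (e.g. from the FAISS index above) and ask an LLM to answer.
from openai import OpenAI


def answer_with_context(question: str, retrieved_chunks: list[str]) -> str:
    """Assemble retrieved chunks into a prompt and request a grounded answer."""
    context = "\n\n".join(retrieved_chunks)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model for illustration
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# Example usage with a chunk that would normally come from the FAISS search:
# print(answer_with_context("What is FAISS?",
#                           ["FAISS is a library for efficient similarity search."]))
```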