AppliNH/mcp-rag-vector
mcp-rag-vector is a Model Context Protocol (MCP) server: it exposes tools that MCP clients such as Claude Desktop can discover and invoke over a standard protocol.
Tools: 1 | Resources: 0 | Prompts: 0
Configure MCP with Claude Desktop
- Generate code and install the binary in GOBIN:
make generate-code-design
make install-bin
- Configure the following in Claude Desktop settings:
{
"mcpServers": {
"greetingmcp": {
"command": "$GOPATH/bin/mcp-rag-vector",
"args": ["mcp"]
}
}
}
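Claude Desktop resolves the command path itself, so it is worth checking that the binary actually landed in GOBIN before restarting the app. A minimal sketch, assuming GOPATH falls back to the Go default (~/go) when the variable is unset:

```shell
# Resolve the path Claude Desktop will invoke; GOPATH defaults to ~/go
# when the environment variable is unset.
BIN="${GOPATH:-$HOME/go}/bin/mcp-rag-vector"
echo "$BIN"
# Warn if the binary is not there yet (make install-bin not run).
[ -x "$BIN" ] || echo "not installed yet: run make install-bin"
```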
Local usage with ollama
- Install ollama and run
ollama serve
- Start the stack
docker compose up -d --build
- Wait for the model to be downloaded (2 GB)
- Query the API
curl -X POST http://localhost:8000/api/chat \
-H "Content-Type: application/json" \
-d '{
"model": "llama3.2:3b",
"messages": [
{ "role": "user", "content": "Use the greet tool with my name thomas, return what it says" }
],
"stream": false
}'
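With "stream": false the API returns a single JSON object; in ollama's /api/chat shape the assistant's reply sits under message.content. A small sketch of extracting it without a jq dependency, using a stand-in response (the content string here is hypothetical):

```shell
# Stand-in for a saved /api/chat response; the content value is made up.
RESP='{"model":"llama3.2:3b","message":{"role":"assistant","content":"Hello thomas"},"done":true}'
# Pull the assistant's text out of the JSON with python3.
echo "$RESP" | python3 -c 'import json,sys; print(json.load(sys.stdin)["message"]["content"])'
# prints: Hello thomas
```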
Query the MCP via cURL
- Start the API
docker compose up or make run-server
- Use
make http-call-mcp
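Under the hood, an MCP call over HTTP is a JSON-RPC request; make http-call-mcp presumably wraps a body like the one below. The greet tool name comes from the chat example above, while the endpoint in the commented curl is an assumption; check the Makefile's http-call-mcp target for the real host, port, and path.

```shell
# JSON-RPC body for an MCP tools/call; "greet" matches the tool used
# in the chat example above.
BODY='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"greet","arguments":{"name":"thomas"}}}'
echo "$BODY"
# The endpoint below is an assumed placeholder; see the Makefile for
# the real URL used by make http-call-mcp.
# curl -sS -X POST http://localhost:8000/mcp \
#   -H "Content-Type: application/json" -d "$BODY"
```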