# 🔭 SPARQL MCP server
A Model Context Protocol (MCP) server to help users write SPARQL queries for open-access SPARQL endpoints, developed for the SIB Expasy portal.
The server automatically indexes metadata from the list of SPARQL endpoints defined in a JSON config file, such as:

- SPARQL query examples,
- Endpoint schemas described using the Vocabulary of Interlinked Datasets (VoID), which can be automatically generated using the void-generator.
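
For a sense of what gets indexed, the VoID description can be queried straight from an endpoint. A minimal sketch using `curl` against the UniProt endpoint listed below, assuming the endpoint stores its own VoID description (property names are from the VoID vocabulary):

```sh
# List the classes declared in the endpoint's VoID description,
# with the number of instances per class.
curl -G 'https://sparql.uniprot.org/sparql/' \
  -H 'Accept: application/sparql-results+json' \
  --data-urlencode 'query=
    PREFIX void: <http://rdfs.org/ns/void#>
    SELECT ?class ?entities WHERE {
      [] void:classPartition [ void:class ?class ; void:entities ?entities ] .
    } ORDER BY DESC(?entities) LIMIT 10'
```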
## 🧩 Endpoints
The HTTP API comprises two main endpoints:

- `/mcp`: MCP server that searches for relevant data to answer a user question
  - Uses `rmcp` with Streamable HTTP transport
  - 🧰 Available tools:
    - `access_sparql_resources`: retrieve relevant information about the resources to help build a SPARQL query to answer the question (query examples, classes schema)
    - `get_resources_info`: retrieve relevant information about the SPARQL endpoint resources themselves (e.g. description, list of available endpoints)
    - `execute_sparql`: execute a SPARQL query against a given endpoint
- `/chat`: optional HTTP POST endpoint (JSON) to query the MCP server via an LLM provider
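
Most MCP clients handle the wire protocol for you, but for illustration, a call to the `execute_sparql` tool is a JSON-RPC 2.0 `tools/call` message. A sketch with `curl` (a real session first performs the MCP `initialize` handshake, and the argument names here are assumptions — check the tool schema the server reports):

```sh
# Hypothetical raw tools/call request to the MCP endpoint
# (argument names "query" and "endpoint_url" are assumed for illustration).
curl -X POST http://localhost:8000/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "execute_sparql",
      "arguments": {
        "query": "SELECT * WHERE { ?s ?p ?o } LIMIT 5",
        "endpoint_url": "https://sparql.uniprot.org/sparql/"
      }
    }
  }'
```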
## 🚀 Use
Use it through the `sparql-mcp` package published on PyPI:

```sh
uvx sparql-mcp ./sparql-mcp.json
```
Or download the binary corresponding to your architecture from the releases page.
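
For example, on Linux or macOS (the binary name depends on the asset you downloaded):

```sh
chmod +x sparql-mcp
./sparql-mcp ./sparql-mcp.json
```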
## 🛠️ Development
> [!IMPORTANT]
> Requirements:
>
> - Rust
> - Protobuf installed (e.g. `brew install protobuf`)
> - API key for an LLM provider (Mistral.ai, OpenAI, or Groq); you can use the free tier, you just need to log in

Recommended VSCode extension: `rust-analyzer`
### 📥 Install dev dependencies
```sh
rustup update
cargo install cargo-release cargo-deny cargo-watch git-cliff
```
Create a `.cargo/config.toml` file with your Mistral, OpenAI, or Groq API key:

```toml
[env]
MISTRAL_API_KEY = "YOUR_API_KEY"
OPENAI_API_KEY = "YOUR_API_KEY"
GROQ_API_KEY = "YOUR_API_KEY"
```
### ⚡️ Start dev server
Start the MCP server in development at http://localhost:8000/mcp, with the OpenAPI UI at http://localhost:8000/docs:

```sh
cargo run
```
Customize the server configuration through CLI arguments:

```sh
cargo run -- --force-index --mcp-only --db-path ./data/lancedb
```
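
To list all available flags (assuming the standard `--help` generated by the CLI argument parser):

```sh
cargo run -- --help
```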
Provide a custom list of endpoints through a `.json` file:

```sh
cargo run -- ./sparql-mcp.json
```
Example `sparql-mcp.json`:

```json
{
  "endpoints": [
    {
      "label": "UniProt",
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "description": "UniProt is a comprehensive resource for protein sequence and annotation data."
    },
    {
      "label": "Bgee",
      "endpoint_url": "https://www.bgee.org/sparql/",
      "description": "Bgee is a database for retrieval and comparison of gene expression patterns across multiple animal species.",
      "homepage_url": "https://www.bgee.org/"
    }
  ]
}
```
> [!TIP]
> Run and reload on change to the code:
>
> ```sh
> cargo watch -x run
> ```
> [!NOTE]
> Example `curl` request:
>
> ```sh
> curl -X POST http://localhost:8000/search -H "Content-Type: application/json" -H "Authorization: SECRET_KEY" -d '{"messages": [{"role": "user", "content": "What is the HGNC symbol for the P68871 protein?"}], "model": "mistral/mistral-small-latest", "stream": true}'
> ```
Recommended model per supported provider:

- `openai/gpt-4.1`
- `mistralai/mistral-large-latest`
- `groq/moonshotai/kimi-k2-instruct`
### 🔌 Connect MCP client
Follow the instructions of your client, and use the `/mcp` URL of your deployed server (e.g. http://localhost:8000/mcp).
#### 🐙 VSCode GitHub Copilot
Add a new MCP server through the VSCode UI:

- Open the Command Palette (`ctrl+shift+p` or `cmd+shift+p`)
- Search for `MCP: Add Server...`
- Choose `HTTP`, and provide the MCP server URL http://localhost:8000/mcp
Your VSCode `mcp.json` should look like:

```json
{
  "servers": {
    "sparql-mcp-server": {
      "url": "http://localhost:8000/mcp",
      "type": "http"
    }
  },
  "inputs": []
}
```
### 📦 Build for production
Build the binary in `target/release/`:

```sh
cargo build --release
```
> [!NOTE]
> Start the server with (change flags at your convenience):
>
> ```sh
> ./target/release/sparql-mcp ./sparql-mcp.json --force-index
> ```
>
> Start using the python wheel:
>
> ```sh
> uvx --from ./target/wheels/sparql_mcp-0.1.0-py3-none-any.whl sparql-mcp
> ```
### 🐍 Build python package
Requires `uv` installed.

Bundle the CLI as a python package in `target/wheels`:

```sh
uvx maturin build
```
### 🐳 Deploy with Docker
Create a `keys.env` file with the API keys:

```sh
MISTRAL_API_KEY=YOUR_API_KEY
SEARCH_API_KEY=SECRET_KEY_YOU_CAN_USE_IN_FRONTEND_TO_AVOID_SPAM
```
> [!TIP]
> `SEARCH_API_KEY` can be used to add a layer of protection against bots that might spam the LLM; if it is not provided, no API key will be needed to query the API.
Build and deploy the service:

```sh
docker compose up
```
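
During development it can help to force a rebuild of the image and run the stack in the background; these are standard Docker Compose flags:

```sh
docker compose up --build -d   # rebuild the image, run detached
docker compose logs -f         # follow the service logs
```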
### 🧼 Format & lint
Automatically format the codebase using `rustfmt`:

```sh
cargo fmt
```
Lint with `clippy`:

```sh
cargo clippy --all
```
Automatically apply possible fixes:

```sh
cargo fix
```
### ⛓️ Check supply chain
Check the dependency supply chain: licenses (only accept dependencies with OSI or FSF approved licenses) and vulnerabilities (CVE advisories):

```sh
cargo deny check
```
Update dependencies in `Cargo.lock`:

```sh
cargo update
```
### 🏷️ Release
Dry run:

```sh
cargo release patch
```

Or `minor`/`major`.

Create release:

```sh
cargo release patch --execute
```