
Artifex MCP Server

⚠️ Disclaimer: this project is FOR EDUCATION, NOT PRODUCTION ⚠️

This project is an example of a Model Context Protocol server, which allows a Large Language Model to interact with the Artifex Engine.

Build instructions

To build the server, execute:

cargo build
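
The resulting debug binary lands in target/debug; the commands below add this directory to PATH. For an optimized binary, build in release mode instead (and substitute target/release in the PATH entries below):

cargo build --release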

Usage

This MCP server can be used with Ollama via:

  • Ollama MCP Bridge, by using the "stdio" transport mode.
  • Ollama MCP Client, for a REPL interface.

Set up Ollama

Follow the official instructions to install Ollama on a GNU/Linux system.

Then, run Ollama:

ollama serve
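
The examples below use the qwen3:0.6b model, so pull it first to avoid stalling on a download during the first request:

ollama pull qwen3:0.6b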

Using ollama-mcp-bridge

Install ollama-mcp-bridge using the uv Python package manager:

uv tool install ollama-mcp-bridge

Then, run the bridge:

PATH=$PATH:${PWD}/target/debug ollama-mcp-bridge \
    --config ${PWD}/data/ollama-mcp-bridge/mcp-config.json \
    --host 0.0.0.0 --port 8000
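
The file passed via --config tells the bridge how to launch the MCP server. As a rough sketch of what data/ollama-mcp-bridge/mcp-config.json plausibly contains (the exact keys follow the common mcpServers layout and the binary name is an assumption; the server name matches the "artifex-engine" tool listed later):

{
  "mcpServers": {
    "artifex-engine": {
      "command": "artifex-mcp-server",
      "args": ["../../data/samples/config.toml"]
    }
  }
}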

Use curl to interact with the chat API. For example:

curl -N -X POST http://localhost:8000/api/chat \
    -H "accept: application/json" \
    -H "Content-Type: application/json" \
    -d '{
    "model": "qwen3:0.6b",
    "messages": [
      {
        "role": "system",
        "content": "You are an Artifex Engine assistant."
      },
      {
        "role": "user",
        "content": "Give me the result of the inspection."
      }
    ],
    "think": false,
    "stream": false
  }'
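
Since "stream" is false, the reply arrives as a single JSON object. Assuming the bridge preserves Ollama's chat response shape, where the assistant text sits in message.content, jq can extract just the answer:

curl -s -X POST http://localhost:8000/api/chat \
    -H "Content-Type: application/json" \
    -d '{"model": "qwen3:0.6b", "messages": [{"role": "user", "content": "Give me the result of the inspection."}], "stream": false}' \
    | jq -r '.message.content'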

Using Ollama MCP Client

🔧 The configuration file data/ollama-mcp-bridge/mcp-config.json must be modified first: the client is started from the repository root, so the path to the sample Artifex Engine configuration must be relative to it.

Change:

                "../../data/samples/config.toml"

To:

                "./data/samples/config.toml"

Install Ollama MCP Client, which is packaged as ollmcp, using uv:

uv tool install ollmcp

Then, start the client:

PATH=$PATH:${PWD}/target/debug ollmcp --model qwen3:0.6b \
    --servers-json ${PWD}/data/ollama-mcp-bridge/mcp-config.json

Use the tools command to list the available tools: "artifex-engine" should be listed. Enable it and start chatting.

License

Copyright © 2025 Eric Le Bihan

This program is distributed under the terms of the MIT License.

See the LICENSE file for license details.