
CANoe MCP Server

Project Overview

This repository hosts an asynchronous Model Context Protocol (MCP) server designed to facilitate seamless interaction with a running CANoe instance. Leveraging py_canoe for CANoe automation and fastmcp for exposing functionality as AI tools, the server allows Large Language Models (LLMs) to control and query CANoe without manual intervention. It attaches to an already-open CANoe instance, eliminating the need for hard-coded paths or specific configurations at startup.
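
The attachment itself is handled by py_canoe, but the underlying mechanism is CANoe's COM automation interface. As a conceptual sketch only (pywin32 and the "CANoe.Application" ProgID are the standard route to CANoe's COM server; py_canoe's internals may differ):

    # Conceptual sketch -- py_canoe wraps this kind of COM access internally.
    # Assumes pywin32 is installed and CANoe is already running.
    import win32com.client

    # GetActiveObject binds to the already-running COM server instead of
    # launching a new CANoe process, which is why no config path is needed.
    app = win32com.client.GetActiveObject("CANoe.Application")
    print(app.Configuration.Name)  # name of the currently loaded configuration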

Features

The CANoe MCP Server exposes the following core functionalities as AI tools; a sketch of how such tools might be registered follows the list:

  • Compile CAPL Nodes: Compile all CAPL nodes within the loaded CANoe configuration.
  • Read System Variables: Retrieve the current value of a specified CANoe system variable.
  • Write System Variables: Set the value of a specified CANoe system variable.
  • Start Measurement: Initiate the CANoe measurement.
  • Stop Measurement: Halt the currently running CANoe measurement.
  • Get Measurement State: Query the current state of the CANoe measurement (e.g., running, stopped).
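
A minimal sketch of a server exposing tools like these with fastmcp. The py_canoe method names used here (start_measurement, get_system_variable_value, and so on) are taken from py_canoe's documented API and may not match this repository's src/server.py exactly:

    # Hedged sketch, not the repository's actual src/server.py.
    from fastmcp import FastMCP
    from py_canoe import CANoe

    mcp = FastMCP("canoe")
    canoe = CANoe()  # how attachment to the running instance is wired is repo-specific

    @mcp.tool()
    def start_measurement() -> bool:
        """Start the CANoe measurement."""
        return canoe.start_measurement()

    @mcp.tool()
    def stop_measurement() -> bool:
        """Stop the currently running CANoe measurement."""
        return canoe.stop_measurement()

    @mcp.tool()
    def read_system_variable(name: str):
        """Read a CANoe system variable, e.g. 'Namespace::MyVar'."""
        return canoe.get_system_variable_value(name)

    if __name__ == "__main__":
        mcp.run()  # stdio transport by default, which Claude Desktop expects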

Benefits

  • Zero-Click Automation: Enables LLMs to interact with CANoe directly, automating routine tasks.
  • Flexibility: Attaches to any already-open CANoe instance, regardless of the loaded configuration or project path.
  • Real-time Interaction: Provides immediate feedback and control over the CANoe environment.
  • Developer-Friendly: Built with asynchronous Python, making it easy to integrate into modern AI workflows.
  • Extensible: The modular design allows for easy addition of new CANoe functionalities as MCP tools.

Prerequisites

  1. CANoe must be running before starting the MCP server.
  2. Python 3.7+

Installation

  1. Clone the repository:
    git clone https://github.com/MohamedHamed19m/CANoe_MCP.git
    cd CANoe_MCP
    
  2. Install the dependencies:
    pip install -r requirements.txt
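
The repository's requirements.txt is not reproduced here, but given the stack described above it presumably pins at least the two libraries the server is built on:

    # Plausible requirements.txt contents (an assumption -- check the repo's file):
    py_canoe
    fastmcp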
    

Usage

  1. Start CANoe: Ensure a CANoe instance is running with your desired configuration loaded.
  2. Run the server:
    python -m src.server
    
    The server will automatically attach to the running CANoe instance.
  3. Integrate with LLM: Configure your LLM environment (e.g., Claude Desktop) to connect to this MCP server. For Claude Desktop, the entry in claude_desktop_config.json might look like this:
    {
      "mcpServers": {
        "canoe": {
          "command": "python",
          "args": ["-m", "src.server"],
          "cwd": "C:/path/to/your/CANoe_MCP"
        }
      }
    }
    
  4. Interact via LLM: You can now use natural language prompts in your LLM chat to control CANoe, e.g., "read variable Namespace::MyVar", "compile nodes", "start measurement".
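
Before wiring the server into an LLM, you can also exercise the tools directly with fastmcp's client. A small sketch (the tool names are the assumed ones from the Features section; list_tools() reports what the server actually registers):

    # Hedged test harness: call the server's tools without an LLM.
    import asyncio
    from fastmcp import Client

    async def main():
        # Pointing the client at the server script runs it over stdio.
        async with Client("src/server.py") as client:
            tools = await client.list_tools()
            print([t.name for t in tools])  # discover registered tools
            result = await client.call_tool("start_measurement", {})
            print(result)

    asyncio.run(main())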

Next Steps / Potential Enhancements

  • Streaming CAPL traces: Expose an async generator that yields CAPL write() output lines as MCP log events.
  • Signal groups: Add functionality to read specific signals (e.g., read_signal("CAN1::EngineData::EngineSpeed")); a sketch follows this list.
  • Multi-instance support: Allow spawning several CANoe() objects in SIL mode and expose them as canoe_1, canoe_2, etc., via the same server.
  • Error Handling & Reconnection: Implement more robust error handling and an automatic reconnection strategy for CANoe.
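
As an illustration of the signal-groups idea, a read_signal tool could plausibly be built on py_canoe's signal accessor. The method name get_signal_value and its (bus, channel, message, signal) parameters are assumptions drawn from py_canoe's documented API, not code from this repository:

    # Hedged sketch of the proposed read_signal tool.
    from fastmcp import FastMCP
    from py_canoe import CANoe

    mcp = FastMCP("canoe")
    canoe = CANoe()

    @mcp.tool()
    def read_signal(bus: str, channel: int, message: str, signal: str):
        """Read a bus signal, e.g. read_signal('CAN', 1, 'EngineData', 'EngineSpeed')."""
        return canoe.get_signal_value(bus, channel, message, signal)

The single-string form used in the bullet above ("CAN1::EngineData::EngineSpeed") would simply need to be parsed into these parts before the call.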