
HyperAutomation MCP Server (Alpha)


Project Description

The Hyperautomation MCP Server provides a bridge between LLM clients and a middle layer of HA workflows, enabling dynamic security orchestration through natural language interactions.

This MCP server is the core component of an architecture called Interactive Security Orchestration that reimagines how standard SOAR solutions can operate.

Unlike traditional SOAR platforms, where workflows are pre-built and static, this architecture enables analysts to "create workflows" in real time: the LLM interprets high-level security goals and dynamically selects the optimal tools and actions from those available in the HyperAutomation layer.

This unlocks a new paradigm where analysts can seamlessly blend dynamic workflow orchestration with traditional static SOAR approaches, creating a hybrid system that combines the reliability and predictability of pre-built workflows with the flexibility and intelligence of user-directed automation.

📖 Companion blog post: The Interactive Security Orchestration Paradigm (Part I)

Architecture Blueprint


Primary Components

MCP Server

  • Location:
  • Purpose:
    • Core MCP protocol implementation that handles client requests and routes them to the appropriate agents, implemented as HyperAutomation workflows.
    • Retrieval of results from a cloud DB (a slight deviation from standard MCP); see the sketch after this list.
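To make the request-handling side concrete, the sketch below shows how a FastMCP server of this kind typically registers a tool and runs over stdio. It is illustrative only: the tool name and its behaviour are assumptions, not the repository's actual code.

from mcp.server.fastmcp import FastMCP  # FastMCP is listed under Dependencies

mcp = FastMCP("hyperautomation")

@mcp.tool()
def list_endpoints(query: str = "") -> str:
    """Example tool: the real server would route this request to the
    endpoint-management agent via its webhook and return the agent's result."""
    return f"Would query the endpoint-management agent with query={query!r}"

if __name__ == "__main__":
    # The Usage section starts the server over stdio; SSE is also possible (uvicorn).
    mcp.run(transport="stdio")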

Agents

The system integrates with multiple domain-specific HyperAutomation Agents through webhook endpoints (a minimal invocation sketch follows this list):

  • VirusTotal threat intelligence lookups and sample downloads
  • Singularity Data Lake query execution and analysis
  • Endpoint and asset management operations
  • Alert management and case handling
  • Remote script execution and task management
  • Storage of Agents' results in an internet-facing database
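As a rough illustration of dispatching a request to one of these agents over a webhook (the environment variable, action name, and payload fields below are assumptions, not the repository's actual contract):

import os
import uuid
import requests

def dispatch_to_agent(webhook_url: str, action: str, parameters: dict) -> str:
    """Send a task to a HyperAutomation agent and return the request ID used to
    correlate its result in the cloud DB."""
    req_id = str(uuid.uuid4())
    resp = requests.post(
        webhook_url,
        json={"req_id": req_id, "action": action, "parameters": parameters},
        timeout=30,
    )
    resp.raise_for_status()
    return req_id

# Example (hypothetical webhook variable and action name):
# req_id = dispatch_to_agent(os.environ["VT_AGENT_WEBHOOK"], "hash_lookup", {"sha256": "..."})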

Cloud DB

This component handles the retrieval, via polling, of the results generated by the HA Agents.

  • While the code included in this repo leverages Google BigQuery as the DB to temporarily store results, analysts can choose any other cloud DB, provided that they:
    1. Update the DB_Manager class to use the different database and implement the necessary polling mechanism (see the sketch below)
    2. Update the DB Agent in the HyperAutomation layer to leverage a different integration
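The polling pattern can be sketched roughly as follows. This is a minimal sketch assuming a results table keyed by the req_id column and the environment variables documented below; the class and method names are illustrative, not the repository's actual DB_Manager.

import os
import time
from google.cloud import bigquery  # Google Cloud BigQuery Python client

class DBManagerSketch:
    """Illustrative only: not the repo's DB_Manager implementation."""

    def __init__(self):
        self.client = bigquery.Client(project=os.environ["GOOGLE_CLOUD_PROJECT"])
        self.table = (
            f'{os.environ["GOOGLE_CLOUD_PROJECT"]}.'
            f'{os.environ["BIGQUERY_DATASET_ID"]}.'
            f'{os.environ["BIGQUERY_TABLE_ID"]}'
        )
        self.req_id_column = os.getenv("BIGQUERY_REQ_ID_COLUMN", "req_id")
        self.max_retries = int(os.getenv("DB_MAX_RETRIES", "200"))
        self.retry_delay = float(os.getenv("DB_RETRY_DELAY", "1"))

    def poll_result(self, req_id: str):
        """Poll the results table until a row matching req_id appears, or give up."""
        query = (
            f"SELECT * FROM `{self.table}` "
            f"WHERE {self.req_id_column} = @req_id LIMIT 1"
        )
        job_config = bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("req_id", "STRING", req_id)]
        )
        for _ in range(self.max_retries):
            rows = list(self.client.query(query, job_config=job_config).result())
            if rows:
                return dict(rows[0])  # the agent has written its result
            time.sleep(self.retry_delay)  # DB_RETRY_DELAY seconds between polls
        return None  # nothing arrived within DB_MAX_RETRIES attempts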

Environment Variables Configuration

Make sure to create a .env file at the root of this project and customise its content to your needs; a short loading sketch follows the variable lists below.

Required Variables

Database Configuration:

# Google BigQuery Configuration (required when using BigQuery)
GOOGLE_CLOUD_PROJECT="your-gcp-project-id"
BIGQUERY_DATASET_ID="your-dataset-id"  
BIGQUERY_TABLE_ID="your-table-id"

# Credentials file path (MUST be updated)
CREDENTIALS_FILE="/path/to/your/credentials/file.json"

Optional Variables

# Logging Configuration
MCP_SERVER_LOG_FILE="mcp_server.log"  # Default: mcp_server.log

# Database Polling Configuration  
DB_MAX_RETRIES=200                    # Default: 200
DB_RETRY_DELAY=1                      # Default: 1 second
DB_SERVICE_TYPE="bigquery"            # Options: bigquery, gsheet. Default: bigquery

# BigQuery Configuration (optional)
BIGQUERY_REQ_ID_COLUMN="req_id"       # Default: req_id
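A hedged sketch of how these variables might be read in code (the use of python-dotenv and the variable grouping are assumptions, not necessarily how the repo loads its settings; the defaults mirror the list above):

import os
from dotenv import load_dotenv  # python-dotenv (assumption: not listed under Dependencies)

load_dotenv()  # read .env from the project root

# Required
GOOGLE_CLOUD_PROJECT = os.environ["GOOGLE_CLOUD_PROJECT"]
BIGQUERY_DATASET_ID = os.environ["BIGQUERY_DATASET_ID"]
BIGQUERY_TABLE_ID = os.environ["BIGQUERY_TABLE_ID"]
CREDENTIALS_FILE = os.environ["CREDENTIALS_FILE"]

# Optional, with the defaults documented above
MCP_SERVER_LOG_FILE = os.getenv("MCP_SERVER_LOG_FILE", "mcp_server.log")
DB_MAX_RETRIES = int(os.getenv("DB_MAX_RETRIES", "200"))
DB_RETRY_DELAY = float(os.getenv("DB_RETRY_DELAY", "1"))
DB_SERVICE_TYPE = os.getenv("DB_SERVICE_TYPE", "bigquery")  # "bigquery" or "gsheet"
BIGQUERY_REQ_ID_COLUMN = os.getenv("BIGQUERY_REQ_ID_COLUMN", "req_id")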

Setup Instructions

  1. Clone the repository
  2. Install dependencies: uv sync (resolves from pyproject.toml / uv.lock)
  3. Configure environment variables (see instructions above)
  4. Test the server: uv run python server/server.py
    • Make sure you see no errors
    • Kill the process

Usage

Note: Testing was performed with chatgpt-4.1, which is the model I recommend using for now.

Connect the MCP server to your preferred LLM client, for example with a stdio command such as:

/PATH_TO_BINARY/uv --directory /mcp-hyperautomation/server/ run server.py --transport stdio

Run a quick test by typing "list endpoints" in your client.


Dependencies

  • Python 3.8+
  • uv (for dependency management)
  • FastMCP
  • Google Cloud BigQuery Python client
  • uvicorn (for SSE transport)
  • Additional dependencies listed in

Author: antonio.monaca@sentinelone.com