HyperAutomation MCP Server (Alpha)
Table of Contents
- Project Description
- Architecture Blueprint
- Primary Components
- Environment Variables Configuration
- Setup Instructions
- Usage
- Dependencies
Project Description
The HyperAutomation MCP Server provides a bridge between LLM clients and a middle layer of HyperAutomation (HA) workflows, enabling dynamic security orchestration through natural language interactions.
This MCP server is the core component of an architecture called Interactive Security Orchestration that reimagines how standard SOAR solutions can operate.
Unlike traditional SOAR platforms, where workflows are pre-built and static, this architecture enables analysts to "create workflows" in real time as the LLM interprets high-level security goals and dynamically selects the optimal tools and actions from those available in the HyperAutomation layer.
This unlocks a new paradigm where analysts can seamlessly blend dynamic workflow orchestration with traditional static SOAR approaches, creating a hybrid system that combines the reliability and predictability of pre-built workflows with the flexibility and intelligence of user-directed automation.
📖 Companion blog post: The Interactive Security Orchestration Paradigm (Part I)
Architecture Blueprint
Primary Components
MCP Server
- Location:
- Purpose:
  - Core MCP protocol implementation that handles client requests and routes them to the appropriate agents, implemented as HyperAutomation workflows (see the sketch below)
  - Retrieval of results from a cloud DB (a slight deviation from standard MCP)
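For illustration, here is a minimal sketch of the MCP-protocol side, assuming the `FastMCP` class from the MCP Python SDK. The tool name (`list_endpoints`) and its behaviour are hypothetical and not taken from this repository.

```python
# Minimal sketch of the MCP server side (hypothetical tool).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("HyperAutomation")

@mcp.tool()
def list_endpoints(query: str = "") -> str:
    """Hypothetical tool: ask the endpoint-management HA Agent for matching endpoints."""
    # In the real server this would dispatch the request to a HyperAutomation
    # webhook and poll the cloud DB for the Agent's result (see the sketches below).
    return f"Would query the endpoint agent with: {query!r}"

if __name__ == "__main__":
    # stdio transport, matching the launch command shown in the Usage section.
    mcp.run(transport="stdio")
```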
Agents
The system integrates with multiple domain-specific HyperAutomation Agents through webhook endpoints (a dispatch sketch follows this list):
- VirusTotal threat intelligence lookups and sample downloads
- Singularity Data Lake query execution and analysis
- Endpoint and asset management operations
- Alert management and case handling
- Remote script execution and task management
- Storage of the Agents' results in an internet-facing database
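As an illustration of the webhook hand-off, the sketch below posts a request to an Agent endpoint together with a correlation ID that the Agent can later write back to the cloud DB. The webhook URLs, payload shape, and the use of `requests` are assumptions, not the repository's actual code.

```python
# Hypothetical webhook dispatch with a req_id used to correlate results in the cloud DB.
import uuid
import requests  # assumption: any HTTP client would do

AGENT_WEBHOOKS = {
    "virustotal": "https://ha.example.com/webhook/vt",        # placeholder URLs
    "datalake": "https://ha.example.com/webhook/datalake",
    "endpoints": "https://ha.example.com/webhook/endpoints",
}

def dispatch_to_agent(agent: str, payload: dict, timeout: int = 30) -> str:
    """Send a task to a HyperAutomation Agent and return the correlation ID."""
    req_id = str(uuid.uuid4())
    body = {"req_id": req_id, **payload}
    resp = requests.post(AGENT_WEBHOOKS[agent], json=body, timeout=timeout)
    resp.raise_for_status()
    return req_id  # the caller polls the cloud DB for a row with this req_id
```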
Cloud DB
This component handles the retrieval of results generated by the HA Agents via polling.
- While the code included in this repo leverages Google BigQuery as the DB to temporarily store results, analysts can choose any other cloud DB, provided that they:
  - Update the DB_Manager class to use the different database and implement the necessary polling mechanism (see the sketch after this list)
  - Update the DB Agent in the HyperAutomation layer to leverage a different integration
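As a rough sketch of what a polling DB manager could look like, the class below queries a BigQuery results table until a row with the expected `req_id` appears, honouring the retry variables described in the next section. The table layout and method names are assumptions; adapt them to your own DB.

```python
# Hypothetical polling manager for BigQuery results (adapt for any other cloud DB).
import os
import time
from google.cloud import bigquery

class DBManager:
    def __init__(self) -> None:
        project = os.environ["GOOGLE_CLOUD_PROJECT"]
        self.client = bigquery.Client.from_service_account_json(
            os.environ["CREDENTIALS_FILE"], project=project
        )
        self.table = f'{project}.{os.environ["BIGQUERY_DATASET_ID"]}.{os.environ["BIGQUERY_TABLE_ID"]}'
        self.req_id_column = os.getenv("BIGQUERY_REQ_ID_COLUMN", "req_id")
        self.max_retries = int(os.getenv("DB_MAX_RETRIES", "200"))
        self.retry_delay = float(os.getenv("DB_RETRY_DELAY", "1"))

    def wait_for_result(self, req_id: str) -> dict:
        """Poll the results table until the Agent has written a row for req_id."""
        query = f"SELECT * FROM `{self.table}` WHERE {self.req_id_column} = @req_id"
        job_config = bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("req_id", "STRING", req_id)]
        )
        for _ in range(self.max_retries):
            rows = list(self.client.query(query, job_config=job_config).result())
            if rows:
                return dict(rows[0])
            time.sleep(self.retry_delay)
        raise TimeoutError(f"No result for req_id {req_id} after {self.max_retries} attempts")
```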
Environment Variables Configuration
Make sure to create a .env file at the root of this project. Customise the content based on your needs.
Required Variables
Database Configuration:
# Google BigQuery Configuration (required when using BigQuery)
GOOGLE_CLOUD_PROJECT="your-gcp-project-id"
BIGQUERY_DATASET_ID="your-dataset-id"
BIGQUERY_TABLE_ID="your-table-id"
# Credentials file path (MUST be updated)
CREDENTIALS_FILE="/path/to/your/credentials/file.json"
Optional Variables
# Logging Configuration
MCP_SERVER_LOG_FILE="mcp_server.log" # Default: mcp_server.log
# Database Polling Configuration
DB_MAX_RETRIES=200 # Default: 200
DB_RETRY_DELAY=1 # Default: 1 second
DB_SERVICE_TYPE="bigquery" # Options: bigquery, gsheet. Default: bigquery
# BigQuery Configuration (optional)
BIGQUERY_REQ_ID_COLUMN="req_id" # Default: req_id
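For reference, here is a minimal sketch of how these variables could be read in Python, assuming the `python-dotenv` package loads the `.env` file; the actual server may structure its configuration differently.

```python
# Sketch: reading the .env configuration (assumes the python-dotenv package).
import os
from dotenv import load_dotenv

load_dotenv()  # picks up the .env file at the project root

LOG_FILE = os.getenv("MCP_SERVER_LOG_FILE", "mcp_server.log")
DB_SERVICE_TYPE = os.getenv("DB_SERVICE_TYPE", "bigquery")   # bigquery or gsheet
DB_MAX_RETRIES = int(os.getenv("DB_MAX_RETRIES", "200"))
DB_RETRY_DELAY = float(os.getenv("DB_RETRY_DELAY", "1"))
BIGQUERY_REQ_ID_COLUMN = os.getenv("BIGQUERY_REQ_ID_COLUMN", "req_id")

# Required variables: a KeyError here points at the missing setting.
GOOGLE_CLOUD_PROJECT = os.environ["GOOGLE_CLOUD_PROJECT"]
BIGQUERY_DATASET_ID = os.environ["BIGQUERY_DATASET_ID"]
BIGQUERY_TABLE_ID = os.environ["BIGQUERY_TABLE_ID"]
CREDENTIALS_FILE = os.environ["CREDENTIALS_FILE"]
```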
Setup Instructions
- Clone the repository
- Install dependencies: `uv install` (or `uv sync` if using `uv.lock`)
- Configure environment variables (see instructions above)
- Test the server: `uv run python server/server.py` and make sure you see no errors
- Kill the process
Usage
Note: Testing was performed with chatgpt-4.1 (this is the model I recommend using for now).
Connect the MCP server to your preferred LLM client.
`/PATH_TO_BINARY/uv --directory /mcp-hyperautomation/server/ run server.py --transport stdio`
Run a quick test by typing `list endpoints` in your client.
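If you prefer a scripted smoke test instead of a chat client, the snippet below spawns the server over stdio and lists its tools using the MCP Python SDK client. The `uv` path and `--directory` argument mirror the command above and should be adjusted to your environment.

```python
# Quick programmatic smoke test using the MCP Python SDK client.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(
        command="uv",  # adjust to /PATH_TO_BINARY/uv if uv is not on your PATH
        args=["--directory", "/mcp-hyperautomation/server/",
              "run", "server.py", "--transport", "stdio"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```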
Dependencies
- Python 3.8+
- uv (for dependency management)
- FastMCP
- Google Cloud BigQuery Python client
- uvicorn (for SSE transport)
- Additional dependencies listed in
Author: antonio.monaca@sentinelone.com