AIE2E MCP Server

The Python server component of the AI-powered end-to-end testing framework. This package provides the Model Context Protocol (MCP) server that executes browser automation tasks using AI agents.

Description

AIE2E MCP Server is the backend component that powers the AI End-to-End Testing Framework. It provides a Model Context Protocol (MCP) server for running browser-based tests using AI agents and supports both stdio and HTTP transport mechanisms.

Usage

The simplest way to run the MCP server is with uvx (included with uv).

Stdio Transport (Default)

The stdio transport is used by default and is recommended for most use cases. You must specify the LLM configuration:

uvx --from aie2e aie2e-server --model "gpt-4" --llm-provider "openai" --api-key "your-api-key"

HTTP Transport

The HTTP transport uses streamable HTTP and is useful for remote connections:

uvx --from aie2e aie2e-server --transport http --host 127.0.0.1 --port 3001 --model "claude-3-sonnet" --llm-provider "anthropic" --api-key "your-api-key"

Claude Code

To use this tool with Claude Code, configure it in your Claude Code MCP settings. Example .mcp.json:

{
  "mcpServers": {
    "aie2e": {
      "command": "uvx",
      "args": [
        "--from",
        "aie2e",
        "aie2e-server",
        "--model",
        "claude-3-5-sonnet-20241022",
        "--llm-provider",
        "anthropic"
      ],
      "env": {
        "ANTHROPIC_API_KEY": "your-anthropic-api-key-here"
      }
    }
  }
}

Using Environment Variables

The API key can be set with an environment variable instead of the --api-key command-line argument. Use the standard environment variable name for your chosen provider, as supported by Browser-Use:

# Set the API key for your chosen provider (only the one matching --llm-provider is needed)
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GOOGLE_API_KEY="your-google-key"

MCP Protocol

The server implements the Model Context Protocol (MCP) and provides the following tools:

run_test_session

Executes a browser-based test session containing multiple test cases using AI agents.

Use Cases:

  • Test web applications end-to-end using AI agents
  • Validate user workflows across multiple pages or steps
  • Maintain state across a series of related test cases

MCP Tool Parameters:

  • description: Description of the test session
  • tests: Array of test cases to execute sequentially
  • allowed_domains: List of allowed domains for navigation (optional)
  • sensitive_data: Sensitive data for form filling (optional)

Returns: JSON string containing TestSessionResult with execution summary and statistics.
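As a rough sketch, a client might invoke run_test_session with arguments like the following. The values are placeholders; the shape of each entry in tests is assumed to mirror the run_test_case parameters described below, and sensitive_data is assumed to be a simple key/value map, neither of which is confirmed here:

{
  "description": "Checkout flow smoke tests",
  "tests": [
    { "task": "Log in with the provided test credentials and verify the account page loads" },
    { "task": "Add an item to the cart and complete the checkout" }
  ],
  "allowed_domains": ["shop.example.com"],
  "sensitive_data": {
    "username": "test-user",
    "password": "example-password"
  }
}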

run_test_case

Executes a single browser-based test case using AI agents.

Use Cases:

  • Test user interactions with a web application using AI agents
  • Test a single specific workflow or task
  • Validate behavior of a web page or feature

MCP Tool Parameters:

  • task: Description of the task to be performed in the test case
  • initial_actions: List of initial actions to perform before the main task (optional)
  • use_vision: Whether to use vision capabilities in the test case (optional, default: false)
  • allowed_domains: List of allowed domains for navigation (optional)
  • sensitive_data: Sensitive data for form filling (optional)

Returns: JSON string containing TestSessionResult with execution summary for the single test case.
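A comparable sketch for run_test_case, again with placeholder values. The initial_actions entries are written in a Browser-Use-style action format, which is an assumption rather than something documented here:

{
  "task": "Search the documentation site for 'installation' and verify that results appear",
  "initial_actions": [
    { "go_to_url": { "url": "https://docs.example.com" } }
  ],
  "use_vision": false,
  "allowed_domains": ["docs.example.com"]
}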

Server Configuration

Command Line Arguments: The server is configured once at startup via the following command line arguments:

  • --model: AI model to use (e.g., "gemini-2.5-pro", "gpt-4") (required)
  • --llm-provider: LLM provider (e.g., "google", "openai", "anthropic") (required)
  • --api-key: API key for the LLM provider (optional; a provider environment variable such as OPENAI_API_KEY can be used instead)
  • --transport: Transport to use, stdio (default) or http
  • --host / --port: Host and port to bind when using the HTTP transport (see the HTTP example above)
  • --headless: Run the browser in headless mode (default: false; see the sketch below)
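
For instance, the Claude Code server entry shown earlier could be extended to run the browser headlessly. This sketch assumes --headless is a bare boolean switch (consistent with its default of false, though the exact syntax is not confirmed here), with the API key supplied via the env block as in that example:

{
  "command": "uvx",
  "args": [
    "--from", "aie2e", "aie2e-server",
    "--model", "claude-3-5-sonnet-20241022",
    "--llm-provider", "anthropic",
    "--headless"
  ]
}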

Integration

The MCP server is designed to work with MCP-compatible clients. For JavaScript/TypeScript projects, use the AIE2E Node.js client which automatically connects to this server.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

Related Projects

  • AIE2E Node.js client: the JavaScript/TypeScript client (see Integration above) that connects to this server

License

MIT License - see the LICENSE file for details.