

# MCP Burst

An Express app with AI capabilities powered by MCP (Model Context Protocol).

MCP Burst supports multiple server types and includes a built-in chat interface for testing. It acts as a bridge between MCP clients and an array of tools, MCP servers, and server gateways.

## What it does

- Chat Integration: Works with Ollama, OpenAI, and Docker Model Runner
- MCP Server Support: Runs streamable HTTP servers, stdio servers such as the Docker MCP Gateway server, and n8n MCP trigger node URLs
- Custom Tools: Create discoverable tools at the `/mcp` endpoint (see the example request after this list)
- Multi-Client Support: Set `MCP_BURST_CLIENT=true` in `.env` to serve multiple clients simultaneously
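
Since the hub speaks JSON-RPC over HTTP at `/mcp`, you can probe it directly with `curl`. A minimal sketch, assuming the default port 4000 from the configuration below; streamable HTTP MCP servers generally expect an `initialize` handshake first and may return an `Mcp-Session-Id` header that later requests must echo back:

```bash
# Open an MCP session against the hub (port 4000 assumed from the sample config).
# Streamable HTTP transports typically want both JSON and SSE in the Accept header.
curl -X POST http://localhost:4000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl-probe","version":"0.1.0"}}}'

# List the discoverable tools (add "Mcp-Session-Id: <id>" if the hub returned one).
curl -X POST http://localhost:4000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
```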

## Quick Status Check

- Session debugging: `curl http://localhost:4000/status`
- Note: Multi-client sessions are still being thoroughly tested

NOTE: You may need to harden security before using this in production. Otherwise, treat it as a test setup and be patient until a security-hardened merge lands.


## Table of Contents

- [Features](#features)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Configuration](#configuration)
- [Usage](#usage)
- [Scripts](#scripts)
- [Running MCP Burst in MCP Clients](#running-mcp-burst-in-mcp-clients)
- [Prebuilt Demo Chatbot Agent App](#prebuilt-demo-chatbot-agent-app)
- [License](#license)

## Features

- Streamable HTTP MCP transport server built on the Model Context Protocol SDK
- Serve MCP Burst as an HTTP and/or stdio client _(tested with Claude Desktop, VS Code)_
- Built-in demo tools: `echo` and `update_session_planner` (a sample `tools/call` request appears after this list)
- Express façade handling JSON-RPC at `/mcp` and health checks at `/health`
- Built-in Planner Tool (required for the built-in chatbot, but not necessarily for MCP clients)
- OPTIONAL: Built-in gamification (i.e., positive reinforcement for successfully completing tasks; it worked well with Claude Desktop, so I kept it in the repo)
- Demo Chatbot Agent App to test MCP integration, now with a session planner resource for executing multiple tools
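
To exercise the built-in `echo` tool without a chat client, a `tools/call` request can go to the same `/mcp` endpoint. A minimal sketch; the argument name `message` is an assumption, so check the schema returned by `tools/list` first:

```bash
# Call the built-in echo tool; the "message" argument name is assumed, not confirmed.
curl -X POST http://localhost:4000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"echo","arguments":{"message":"hello"}}}'
```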

## Prerequisites

- Node.js v16 or higher
- npm (included with Node.js)

## Installation

1. Clone the repository:

   ```sh
   git clone https://github.com/arberrexhepi/mcpburst.git
   cd mcpburst
   ```

2. Install root dependencies:

   ```sh
   npm install
   ```

3. Install hub and server dependencies:

   ```sh
   cd hub && npm install && cd ..
   ```

## Configuration

Create environment variable files in both hub/ and server/ directories:

`hub/.env`:

```env
# MCP Burst hub/.env example

MCP_REQUIRE_AUTH=false
PORT=4000
HOST=127.0.0.1
DEBUG=mcp:*
MCP_BURST_CLIENT=true
ALLOWED_HOSTS=["127.0.0.1", "localhost"]

# Limit or extend these environment variables as needed for your tools
WINDOWS_BASE_VARS=["PATH","TEMP","TMP","LOCALAPPDATA","ProgramData"]
MAC_BASE_VARS=["PATH","TMPDIR","USER"]
LINUX_BASE_VARS=["PATH","TMPDIR","USER"]

# MCP server specific keys
BRAVE_API_KEY=your_brave_api_key
GITHUB_API_KEY=github_token
```

`server/.env`:

```env
# MCP Burst server/.env example

MCP_ENDPOINT=http://127.0.0.1:4000/mcp
PORT=3500
STRATEGY=DMR
ALLOWED_ORIGINS=["http://localhost:5173"]

# Model runner endpoints
DOCKER_MODEL_RUNNER_URL=http://localhost:12434/engines/llama.cpp/v1/chat/completions
OLLAMA_URL=http://localhost:11434/v1/chat/completions

# API keys and model names
OPENAI_API_KEY=your_openai_api_key
OPENAI_MODEL_NAME=gpt-4o-mini
DOCKER_MODEL_NAME=ai/qwen3:latest
OLLAMA_MODEL_NAME=llama3.2
```
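
Before starting MCP Burst, it can help to confirm that whichever backend your `STRATEGY` selects is actually reachable. A minimal sketch against the Ollama endpoint above; the Docker Model Runner URL can be checked the same way, since both expose an OpenAI-compatible chat completions API:

```bash
# Sanity-check the Ollama backend with a one-off chat completion.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"llama3.2","messages":[{"role":"user","content":"Say hello"}]}'
```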

## Usage

1. Build and start the MCP hub server:

   ```sh
   npm run start:hub
   ```

2. Start the front-end proxy server:

   - Without the demo app:

     ```sh
     npm run start:server
     ```

   - With the demo app:

     ```sh
     npm run dev
     ```

3. Open your browser at http://localhost:5173 to access the chat UI.

4. The hub JSON-RPC endpoint is available at http://localhost:4000/mcp (a quick smoke test follows below).
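
Once both processes are up, the hub's health and status endpoints make a quick smoke test:

```bash
# Confirm the hub is alive and inspect active sessions.
curl http://localhost:4000/health
curl http://localhost:4000/status
```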

## Scripts

All scripts are defined in the root `package.json`:

- `npm run build`: Compile TypeScript in hub and copy bridge files
- `npm run copy:bridges`: Copy bridge YAML definitions from `hub/bridges` to `hub/dist`
- `npm run start:hub`: Build and run the MCP hub server (`hub/dist/hub.js`)
- `npm run start:server`: Run the Express static server with chat UI (`server/index.js`)
- `npm run install:frontend`: Install front-end dependencies
- `npm run build:frontend`: Build the front-end app for production
- `npm run start:frontend`: Start the front-end development server (`frontend`)
- `npm run dev`: Run both the Express server and front-end concurrently

## Running MCP Burst in MCP Clients

VS Code:

1. In your MCP Burst folder, run the hub:

   ```sh
   npm run start:hub
   ```

2. Create `.vscode/mcp.json` in your workspace, for example with this config:

   ```json
   {
     "servers": {
       "mcp-burst": {
         "type": "http",
         "url": "http://127.0.0.1:4000"
       }
     }
   }
   ```

Claude Desktop: Update your `claude_desktop_config.json` to include:

```json
{
  "mcpServers": {
    "mcp-burst": {
      "command": "node",
      "args": ["/path/to/mcpburst/hub/stdio-client-hub-entry.js"]
    }
  }
}
```

## Prebuilt Demo Chatbot Agent App

In your MCP Burst folder, run these commands:

```bash
npm run start:hub
npm run dev
```

## License

This project is licensed under the Apache License 2.0. See the `LICENSE` file for details.

Copyright (c) 2025 Arbër Rexhepi (arbër inc)