UltraFast MCP Sequential Thinking
Official Docker image available at techgopal/ultrafast-mcp-sequential-thinking
A high-performance, Rust-based Model Context Protocol (MCP) server and client implementation for sequential thinking, built on the UltraFast MCP framework.
Overview
UltraFast MCP Sequential Thinking provides a structured approach to problem-solving through dynamic and reflective thinking processes. This implementation offers significant performance advantages over the official TypeScript version while maintaining full compatibility with the MCP 2025-06-18 specification.
Features
Core Capabilities
- Dynamic Problem Breakdown: Break complex problems into manageable steps
- Reflective Thinking: Revise and refine thoughts as understanding deepens
- Branching Logic: Explore alternative paths of reasoning
- Adaptive Planning: Adjust the total number of thoughts dynamically
- Solution Verification: Generate and verify solution hypotheses
- Context Preservation: Maintain thinking context across multiple steps
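As an illustration, a single thinking step can be submitted as an MCP tools/call request. This is a hedged sketch: the argument names mirror the ThoughtData fields documented in the API Reference below, and the authoritative wire format is the tool schema shipped in mcp-inspector-config.json.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "sequential_thinking",
    "arguments": {
      "thought": "Break the problem into ingestion, validation, and reporting stages.",
      "thought_number": 1,
      "total_thoughts": 5,
      "next_thought_needed": true
    }
  }
}
Later calls can set is_revision/revises_thought to refine an earlier step, or branch_from_thought/branch_id to explore an alternative path.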
Performance Advantages
- 10-100x Faster: Rust implementation provides significant performance gains
- Memory Efficient: Optimized for handling large thinking sessions
- Concurrent Processing: Async/await support for high-throughput operations
- Type Safety: Compile-time guarantees for protocol compliance
- Production Ready: Comprehensive error handling and monitoring
Enhanced Features
- Progress Tracking: Real-time progress notifications for long thinking sessions
- Cancellation Support: Interrupt thinking processes when needed
- Session Management: Persistent thinking sessions across connections
- Export Capabilities: Export thinking sessions in various formats
- Analytics: Detailed analytics on thinking patterns and efficiency
Architecture
Server Components
┌─────────────────────────────────────────────────────┐
│             Sequential Thinking Server              │
├─────────────────────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐  │
│  │  Thinking   │  │   Session   │  │  Analytics  │  │
│  │   Engine    │  │ Management  │  │  & Metrics  │  │
│  └─────────────┘  └─────────────┘  └─────────────┘  │
├─────────────────────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐  │
│  │   Branch    │  │   Context   │  │   Export    │  │
│  │ Management  │  │  Tracking   │  │  Handlers   │  │
│  └─────────────┘  └─────────────┘  └─────────────┘  │
├─────────────────────────────────────────────────────┤
│                  UltraFast MCP Core                 │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐  │
│  │  Protocol   │  │  Transport  │  │  Handlers   │  │
│  │   Layer     │  │   Layer     │  │   System    │  │
│  └─────────────┘  └─────────────┘  └─────────────┘  │
└─────────────────────────────────────────────────────┘
Client Components
┌─────────────────────────────────────────────────────┐
│             Sequential Thinking Client              │
├─────────────────────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐  │
│  │   Session   │  │  Progress   │  │   Export    │  │
│  │   Manager   │  │   Tracker   │  │   Manager   │  │
│  └─────────────┘  └─────────────┘  └─────────────┘  │
├─────────────────────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐  │
│  │  Thinking   │  │   Branch    │  │  Analytics  │  │
│  │  Interface  │  │  Explorer   │  │  Dashboard  │  │
│  └─────────────┘  └─────────────┘  └─────────────┘  │
├─────────────────────────────────────────────────────┤
│                 UltraFast MCP Client                │
└─────────────────────────────────────────────────────┘
Installation
Prerequisites
- Rust 1.70 or later
- Cargo (Rust package manager)
Quick Start
1. Clone the Repository
git clone https://github.com/techgopal/ultrafast-mcp-sequential-thinking.git
cd ultrafast-mcp-sequential-thinking
2. Build the Project
cargo build --release
3. Run the Server
# STDIO transport (default)
cargo run --bin sequential-thinking-server
# HTTP transport
cargo run --bin sequential-thinking-server -- --transport http --port 8080
4. Run the Client
# Connect to server
cargo run --bin sequential-thinking-client -- --server stdio://
# Connect to HTTP server
cargo run --bin sequential-thinking-client -- --server http://localhost:8080
5. Use MCP Inspector (Optional)
# Start HTTP server for Inspector
cargo run --bin sequential-thinking-server -- --transport http --port 8080
# Open MCP Inspector and load mcp-inspector-config.json
# Choose "sequential-thinking-server-http" for HTTP transport
Quick Start with Docker
1. Use the Official Docker Image
# Pull the latest image from Docker Hub
docker pull techgopal/ultrafast-mcp-sequential-thinking:latest
# Run HTTP server on port 8080
docker run --rm -p 8080:8080 techgopal/ultrafast-mcp-sequential-thinking:latest
# Run with custom configuration
docker run --rm -p 8080:8080 -e MAX_THOUGHTS=200 techgopal/ultrafast-mcp-sequential-thinking:latest
# Run in detached mode
docker run -d --name thinking-server -p 8080:8080 techgopal/ultrafast-mcp-sequential-thinking:latest
2. (Optional) Build the Docker Image Locally
# Build the server image locally (if you want to make changes)
docker build -t sequential-thinking-server .
# Verify the image was created
docker images | grep sequential-thinking-server
3. Connect Client to Docker Server
# Connect to HTTP server running in Docker
cargo run --bin sequential-thinking-client -- --server http://localhost:8080
# Or use curl to test the server
curl -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2025-06-18"},"id":1}'
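After initialization you can exercise tool discovery over the same endpoint. A minimal sketch, assuming default server settings; depending on the transport implementation, additional headers (for example an Accept header or an Mcp-Session-Id returned by the initialize response) may be required.
# List the available tools (illustrative follow-up request)
curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/list","params":{},"id":2}'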
Example MCP Inspector Docker Configuration
If you want to use MCP Inspector or other tools to launch the server in a Docker container with advanced options (such as analytics and increased max thoughts), use the following configuration:
{
  "mcpServers": {
    "sequential-thinking": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "techgopal/ultrafast-mcp-sequential-thinking:latest",
        "/usr/local/bin/sequential-thinking-server",
        "--transport",
        "stdio",
        "--max-thoughts",
        "200",
        "--enable-analytics"
      ],
      "env": {}
    }
  }
}
This configuration will start the server in a Docker container using STDIO transport, with analytics enabled and a higher max thoughts limit. Adjust the arguments as needed for your use case.
4. Docker Compose (Optional)
# Create docker-compose.yml for easy deployment
cat > docker-compose.yml << EOF
version: '3.8'
services:
  sequential-thinking-server:
    build: .
    ports:
      - "8080:8080"
    environment:
      - MAX_THOUGHTS=100
      - ENABLE_ANALYTICS=true
    restart: unless-stopped
EOF
# Start with Docker Compose
docker-compose up -d
# Stop the service
docker-compose down
Multi-Platform Docker Support
The Docker image supports multiple platforms for maximum compatibility:
Supported Platforms
- linux/amd64: Intel/AMD 64-bit processors
- linux/arm64: ARM 64-bit processors (Apple Silicon, ARM servers)
- linux/arm/v7: ARM 32-bit processors (Raspberry Pi, older ARM devices)
Platform-Specific Usage
# Pull for specific platform
docker pull --platform linux/amd64 techgopal/ultrafast-mcp-sequential-thinking:latest
docker pull --platform linux/arm64 techgopal/ultrafast-mcp-sequential-thinking:latest
docker pull --platform linux/arm/v7 techgopal/ultrafast-mcp-sequential-thinking:latest
# Run on specific platform
docker run --platform linux/amd64 --rm -p 8080:8080 techgopal/ultrafast-mcp-sequential-thinking:latest
docker run --platform linux/arm64 --rm -p 8080:8080 techgopal/ultrafast-mcp-sequential-thinking:latest
docker run --platform linux/arm/v7 --rm -p 8080:8080 techgopal/ultrafast-mcp-sequential-thinking:latest
Build Multi-Platform Images Locally
# Set up Docker Buildx for multi-platform builds
docker buildx create --name multiplatform --use
# Build for all supported platforms
docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 \
-t your-username/ultrafast-mcp-sequential-thinking:latest \
--push .
# Build for specific platform only
docker buildx build --platform linux/arm64 \
-t your-username/ultrafast-mcp-sequential-thinking:arm64 \
--push .
Docker Compose with Platform Specification
# docker-compose.yml
version: '3.8'
services:
  sequential-thinking:
    image: techgopal/ultrafast-mcp-sequential-thinking:latest
    platform: linux/arm64  # Specify platform for ARM64
    ports:
      - "8080:8080"
    environment:
      - MAX_THOUGHTS=200
      - ENABLE_ANALYTICS=true
    restart: unless-stopped
CI/CD Multi-Platform Pipeline
The project includes GitHub Actions workflows that automatically build and push multi-platform images:
- Docker Test: Builds and tests images on pull requests
- Docker Release: Builds and pushes multi-platform images on main branch
- Caching: Uses GitHub Actions cache for faster builds
- Provenance: Disabled for better compatibility
Build Scripts
The project includes convenient scripts for building and testing multi-platform images:
# Build multi-platform images locally
./scripts/build-multi-platform.sh --push
# Test multi-platform images
./scripts/test-multi-platform.sh
# Build for specific platforms only
./scripts/build-multi-platform.sh -p linux/amd64,linux/arm64 --push
# Test with custom image and port
./scripts/test-multi-platform.sh -i my-image -t v1.0.0 -p 9090
Configuration
MCP Inspector
This project includes an MCP Inspector configuration file (mcp-inspector-config.json) that supports both HTTP and STDIO transport modes.
Using MCP Inspector
1. Start the server (choose one):
# HTTP transport (recommended for Inspector)
cargo run --bin sequential-thinking-server -- --transport http --port 8080
# STDIO transport
cargo run --bin sequential-thinking-server -- --transport stdio
2. Open MCP Inspector and load the config file:
- Navigate to the project directory
- Select mcp-inspector-config.json
- Choose your preferred transport mode:
  - HTTP: sequential-thinking-server-http (default)
  - STDIO: sequential-thinking-server-stdio
3. Test the tools:
- sequential_thinking: Main thinking tool for problem-solving
- export_session: Export sessions in various formats
- analyze_session: Get analytics and insights
- merge_sessions: Combine multiple sessions
Config File Structure
The mcp-inspector-config.json includes:
- HTTP Server: http://localhost:8080/mcp
- STDIO Server: Cargo subprocess with stdio transport
- Tool Schemas: Complete input/output schemas for all tools
- Documentation: Detailed descriptions for each tool
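For orientation, the STDIO entry is expected to look roughly like the sketch below, which simply launches the server as a Cargo subprocess; the shipped mcp-inspector-config.json remains the source of truth and its exact keys may differ.
{
  "mcpServers": {
    "sequential-thinking-server-stdio": {
      "command": "cargo",
      "args": ["run", "--bin", "sequential-thinking-server", "--", "--transport", "stdio"],
      "env": {}
    }
  }
}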
Server Configuration
# config.toml
[server]
name = "ultrafast-sequential-thinking"
version = "1.0.0"
transport = "http"
port = 8080
[thinking]
max_thoughts_per_session = 100
max_branches_per_session = 10
session_timeout_seconds = 3600
enable_analytics = true
[export]
formats = ["json", "markdown", "pdf"]
auto_export = false
Client Configuration
# client_config.toml
[client]
server_url = "http://localhost:8080"
timeout_seconds = 30
retry_attempts = 3
[thinking]
auto_save_interval = 60
enable_progress_tracking = true
show_thought_visualization = true
API Reference
Core Types
ThoughtData
pub struct ThoughtData {
    pub thought: String,
    pub thought_number: u32,
    pub total_thoughts: u32,
    pub next_thought_needed: bool,
    pub is_revision: Option<bool>,
    pub revises_thought: Option<u32>,
    pub branch_from_thought: Option<u32>,
    pub branch_id: Option<String>,
    pub needs_more_thoughts: Option<bool>,
}
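For example, a revision of an earlier thought could be constructed like this (illustrative only; it assumes ThoughtData is in scope with the public fields shown above):
// Hypothetical example: thought 3 revises thought 2 in a five-thought plan.
let revision = ThoughtData {
    thought: "Revisit step 2: cache invalidation should be per session.".to_string(),
    thought_number: 3,
    total_thoughts: 5,
    next_thought_needed: true,
    is_revision: Some(true),
    revises_thought: Some(2),
    branch_from_thought: None,
    branch_id: None,
    needs_more_thoughts: None,
};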
ThinkingSession
pub struct ThinkingSession {
    pub session_id: String,
    pub title: String,
    pub thoughts: Vec<ThoughtData>,
    pub branches: HashMap<String, Vec<ThoughtData>>,
    pub metadata: SessionMetadata,
}
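A session collects the main line of thoughts plus any branches keyed by branch_id. A minimal sketch, assuming SessionMetadata (not shown here) implements Default:
use std::collections::HashMap;

// Hypothetical construction; SessionMetadata::default() is an assumption.
let mut session = ThinkingSession {
    session_id: "session-001".to_string(),
    title: "API design review".to_string(),
    thoughts: Vec::new(),
    branches: HashMap::new(),
    metadata: SessionMetadata::default(),
};

// An alternative line of reasoning is tracked under its branch id.
session.branches.insert("alt-caching-strategy".to_string(), Vec::new());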
Server Endpoints
Tools
- sequential_thinking: Main thinking tool (MCP 2025-06-18 compliant)
- export_session: Export thinking session in various formats
- analyze_session: Get analytics and insights from session
- merge_sessions: Merge multiple thinking sessions
Resources
- session_history: Access to thinking session history
- analytics_data: Session analytics and metrics
- export_templates: Export format templates
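The remaining tools are invoked through tools/call in the same way as sequential_thinking. The sketch below shows a hypothetical export_session call; the argument names are illustrative, and the actual contract is defined by the tool schemas in mcp-inspector-config.json.
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "export_session",
    "arguments": {
      "session_id": "session-001",
      "format": "markdown"
    }
  }
}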
Testing
Run Tests
# Run all tests
cargo test
# Run specific test suite
cargo test --test thinking_tests
# Run with coverage
cargo tarpaulin
Integration Tests
# Run integration tests
cargo test --test integration_tests
# Run performance benchmarks
cargo bench
Performance
Benchmarks
- Thought Processing: ~0.1ms per thought (vs 1-5ms in TypeScript)
- Session Creation: ~0.5ms (vs 10-20ms in TypeScript)
- Branch Management: ~0.2ms per branch (vs 2-8ms in TypeScript)
- Export Operations: ~1ms for JSON, ~5ms for Markdown (vs 50-200ms in TypeScript)
Memory Usage
- Per Session: ~2KB base + 100 bytes per thought
- Server Memory: ~10MB base + 1KB per active session
- Client Memory: ~5MB base + 500 bytes per session
Security
Features
- Input Validation: Comprehensive validation of all inputs
- Session Isolation: Complete isolation between sessions
- Rate Limiting: Built-in rate limiting for API endpoints
- Authentication: Optional OAuth 2.1 support
- Audit Logging: Complete audit trail for all operations
Best Practices
- Validate all user inputs before processing
- Implement proper session cleanup
- Use HTTPS for all HTTP communications
- Regular security audits and updates
Contributing
Development Setup
# Clone repository
git clone https://github.com/techgopal/ultrafast-mcp-sequential-thinking.git
cd ultrafast-mcp-sequential-thinking
# Install dependencies
cargo build
# Run development server
cargo run --bin sequential-thinking-server -- --dev
# Run tests
cargo test
Code Style
- Follow Rust coding standards
- Use cargo fmt for formatting
- Use cargo clippy for linting
- Write comprehensive tests
Pull Request Process
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Update documentation
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Model Context Protocol: For the excellent protocol specification
- UltraFast MCP: For the high-performance Rust framework
- Official Sequential Thinking Server: For the original TypeScript implementation
Changelog
v1.0.0
- Initial release
- Full MCP 2025-06-18 compliance
- High-performance Rust implementation
- Comprehensive testing suite
- Production-ready features
Built with ❤️ using UltraFast MCP