PocketFlow MCP Server

A Model Context Protocol (MCP) server that brings the powerful PocketFlow tutorial generation methodology to all AI assistants. Generate comprehensive, beginner-friendly tutorials from any GitHub repository using advanced AI analysis.

What is PocketFlow?

PocketFlow is an innovative methodology for automatically generating high-quality tutorials from codebases. It:

  1. Identifies Core Abstractions - Finds the key concepts and components in a codebase
  2. Maps Relationships - Understands how different parts interact with each other
  3. Orders Explanations - Determines the best sequence to explain concepts
  4. Generates Tutorials - Creates beginner-friendly, step-by-step learning content
  5. Creates Visual Diagrams - Includes Mermaid diagrams for better understanding

Features

  • Universal AI Assistant Support - Works with Cline, Cursor, Claude Desktop, and any MCP-compatible client
  • 🔍 Deep Repository Analysis - Analyzes GitHub repositories to identify key abstractions
  • 🧠 Intelligent Concept Mapping - Understands relationships between code components
  • 📚 Comprehensive Tutorial Generation - Creates structured, beginner-friendly tutorials
  • 📊 Visual Architecture Diagrams - Generates Mermaid flowcharts and sequence diagrams
  • 🌐 Multi-LLM Provider Support - OpenRouter, Google Gemini, Anthropic Claude, OpenAI
  • 🌍 Multi-Language Support - Generate tutorials in different languages
  • 🔒 Secure & Local - All processing happens locally, API keys stored securely
  • Smart Caching - Caches LLM responses for faster subsequent runs

Quick Start

Prerequisites

  • Node.js 18+
  • npm or yarn
  • An API key for your preferred LLM provider

Installation

  1. Clone and Build
git clone https://github.com/tmtcomeup/pocketflow-mcp.git
cd pocketflow-mcp
npm install
npm run build
  2. Configure Your AI Assistant

For Cline (VSCode)

Add to your Cline settings:

{
  "mcpServers": {
    "pocketflow": {
      "command": "node",
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}
For Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "pocketflow": {
      "command": "node", 
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}

Usage

Once connected, you'll have access to these tools:

analyze_github_repository

Generate a complete tutorial from any GitHub repository:

// Basic usage
analyze_github_repository({
  repo_url: "https://github.com/microsoft/vscode",
  llm_provider: "openrouter",
  api_key: "sk-or-v1-your-key-here",
  model: "anthropic/claude-3.5-sonnet"
})

// Advanced options
analyze_github_repository({
  repo_url: "https://github.com/pytorch/pytorch", 
  llm_provider: "google",
  api_key: "your-gemini-key",
  model: "gemini-2.5-pro",
  max_abstractions: 8,
  language: "spanish",
  include_patterns: ["*.py", "*.md"],
  exclude_patterns: ["*test*", "*docs/*"]
})

get_repository_structure

Explore repository structure before analysis:

get_repository_structure({
  repo_url: "https://github.com/facebook/react",
  include_patterns: ["*.js", "*.jsx", "*.ts"],
  max_depth: 3
})
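
As a hypothetical illustration of what max_depth might mean here (the server's exact semantics may differ), paths nested deeper than the limit are simply pruned from the listing:

```javascript
// Illustrative only: pruneByDepth is not the server's API, just a sketch of
// depth-limited structure listing. "src/app.js" counts as depth 2
// (one directory plus the file itself).
function pruneByDepth(paths, maxDepth) {
  return paths.filter((p) => p.split('/').length <= maxDepth);
}
```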

LLM Provider Setup

OpenRouter (Recommended)

  • Sign up at openrouter.ai
  • Get your API key from the dashboard
  • Access 100+ models including Claude, GPT-4, Gemini, and more

Google Gemini

  • Get an API key from Google AI Studio
  • Use models like gemini-2.5-pro or gemini-2.5-flash

Anthropic Claude

  • Get an API key from the Anthropic Console
  • Use models like claude-3-5-sonnet

OpenAI

  • Get an API key from the OpenAI platform
  • Use models like gpt-4o or gpt-4o-mini

How It Works

The PocketFlow methodology follows a 6-step process:

  1. Repository Fetching - Downloads and filters code files based on patterns
  2. Abstraction Identification - Uses AI to identify 5-10 core concepts in the codebase
  3. Relationship Analysis - Maps how abstractions interact with each other
  4. Chapter Ordering - Determines the optimal learning sequence
  5. Chapter Writing - Generates detailed, beginner-friendly explanations for each concept
  6. Tutorial Compilation - Combines everything into a cohesive tutorial with diagrams
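
A rough sketch of step 4, chapter ordering: if the relationship analysis reports that abstraction A builds on B, then B should be explained first. One simple way to derive such a sequence is a topological sort over the dependency pairs. The real server delegates this judgment to the LLM, so the function below (the name orderChapters is illustrative) is only an approximation:

```javascript
// Topological sort over "a depends on b" pairs: prerequisites are emitted
// before the abstractions that build on them. Assumes the dependency graph
// is acyclic, as a sensible learning order requires.
function orderChapters(abstractions, dependsOn) {
  const indegree = new Map(abstractions.map((a) => [a, 0]));
  const edges = new Map(abstractions.map((a) => [a, []]));
  for (const [a, b] of dependsOn) {
    edges.get(b).push(a);                  // once b is explained, a is unblocked
    indegree.set(a, indegree.get(a) + 1);  // a has one more prerequisite
  }
  const queue = abstractions.filter((a) => indegree.get(a) === 0);
  const order = [];
  while (queue.length) {
    const b = queue.shift();
    order.push(b);
    for (const a of edges.get(b)) {
      indegree.set(a, indegree.get(a) - 1);
      if (indegree.get(a) === 0) queue.push(a);
    }
  }
  return order; // prerequisites first, dependents later
}
```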

Example Output

The generated tutorial includes:

  • Index Page with project overview and visual architecture diagram
  • Individual Chapters for each core abstraction
  • Mermaid Diagrams showing relationships and workflows
  • Code Examples with detailed explanations
  • Cross-References between related concepts
  • Beginner-Friendly Language with analogies and examples

Configuration Options

| Parameter | Description | Default |
| --- | --- | --- |
| repo_url | GitHub repository URL | Required |
| llm_provider | AI provider (openrouter, google, anthropic, openai) | Required |
| api_key | API key for the LLM provider | Required |
| model | Specific model to use | Provider default |
| max_abstractions | Number of key concepts to identify | 10 |
| language | Tutorial language | "english" |
| include_patterns | File patterns to analyze | Common code files |
| exclude_patterns | File patterns to skip | Tests, docs, builds |
| max_file_size | Maximum file size in bytes | 100000 |
| use_cache | Enable LLM response caching | true |
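
To make the include_patterns / exclude_patterns semantics concrete, here is a minimal sketch of glob-style filtering that treats only `*` as special; the server's actual matcher may support richer patterns:

```javascript
// Sketch of pattern filtering: a file is kept if it matches at least one
// include pattern and no exclude pattern. Only '*' (any characters) is
// interpreted; everything else is matched literally.
function matches(pattern, path) {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&'); // escape regex chars
  const regex = new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
  return regex.test(path);
}

function filterFiles(paths, includePatterns, excludePatterns) {
  return paths.filter(
    (p) =>
      includePatterns.some((pat) => matches(pat, p)) &&
      !excludePatterns.some((pat) => matches(pat, p))
  );
}
```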

Contributing

We welcome contributions! Please see our contributing guidelines.

License

MIT License - see the LICENSE file for details.

Original PocketFlow

This MCP server is based on the original PocketFlow project by The-Pocket. We've adapted their brilliant methodology to work seamlessly with all MCP-compatible AI assistants.

Support


Ready to transform any codebase into a comprehensive learning resource? Start analyzing repositories with PocketFlow MCP today!