PocketFlow MCP Server
A Model Context Protocol (MCP) server that brings the powerful PocketFlow tutorial generation methodology to any MCP-compatible AI assistant. Generate comprehensive, beginner-friendly tutorials from any GitHub repository using advanced AI analysis.
What is PocketFlow?
PocketFlow is an innovative methodology for automatically generating high-quality tutorials from codebases. It:
- Identifies Core Abstractions - Finds the key concepts and components in a codebase
- Maps Relationships - Understands how different parts interact with each other
- Orders Explanations - Determines the best sequence to explain concepts
- Generates Tutorials - Creates beginner-friendly, step-by-step learning content
- Creates Visual Diagrams - Includes Mermaid diagrams for better understanding
Features
- ✅ Universal AI Assistant Support - Works with Cline, Cursor, Claude Desktop, and any MCP-compatible client
- 🔍 Deep Repository Analysis - Analyzes GitHub repositories to identify key abstractions
- 🧠 Intelligent Concept Mapping - Understands relationships between code components
- 📚 Comprehensive Tutorial Generation - Creates structured, beginner-friendly tutorials
- 📊 Visual Architecture Diagrams - Generates Mermaid flowcharts and sequence diagrams
- 🌐 Multi-LLM Provider Support - OpenRouter, Google Gemini, Anthropic Claude, OpenAI
- 🌍 Multi-Language Support - Generate tutorials in different languages
- 🔒 Secure & Local - All processing happens locally, API keys stored securely
- ⚡ Smart Caching - Caches LLM responses for faster subsequent runs
Quick Start
Prerequisites
- Node.js 18+
- npm or yarn
- An API key for your preferred LLM provider
Installation
- Clone and Build
git clone https://github.com/tmtcomeup/pocketflow-mcp.git
cd pocketflow-mcp
npm install
npm run build
- Configure Your AI Assistant
For Cline (VSCode)
Add to your Cline settings:
{
  "mcpServers": {
    "pocketflow": {
      "command": "node",
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}
For Claude Desktop
Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "pocketflow": {
      "command": "node",
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}
Usage
Once connected, you'll have access to these tools:
analyze_github_repository
Generate a complete tutorial from any GitHub repository:
// Basic usage
analyze_github_repository({
  repo_url: "https://github.com/microsoft/vscode",
  llm_provider: "openrouter",
  api_key: "sk-or-v1-your-key-here",
  model: "anthropic/claude-3.5-sonnet"
})

// Advanced options
analyze_github_repository({
  repo_url: "https://github.com/pytorch/pytorch",
  llm_provider: "google",
  api_key: "your-gemini-key",
  model: "gemini-2.5-pro",
  max_abstractions: 8,
  language: "spanish",
  include_patterns: ["*.py", "*.md"],
  exclude_patterns: ["*test*", "*docs/*"]
})
get_repository_structure
Explore repository structure before analysis:
get_repository_structure({
  repo_url: "https://github.com/facebook/react",
  include_patterns: ["*.js", "*.jsx", "*.ts"],
  max_depth: 3
})
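A typical workflow is to explore the structure first, then feed the interesting file patterns into the full analysis. The repository URL, API key, and patterns below are placeholders, shown only to illustrate how the two tools fit together:

// 1. Inspect the layout to decide which files are worth analyzing
get_repository_structure({
  repo_url: "https://github.com/your-org/your-repo",
  max_depth: 2
})

// 2. Run the full analysis, restricted to the files you care about
analyze_github_repository({
  repo_url: "https://github.com/your-org/your-repo",
  llm_provider: "openrouter",
  api_key: "sk-or-v1-your-key-here",
  model: "anthropic/claude-3.5-sonnet",
  include_patterns: ["*.ts", "*.md"],
  exclude_patterns: ["*test*"]
})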
LLM Provider Setup
OpenRouter (Recommended)
- Sign up at openrouter.ai
- Get your API key from the dashboard
- Access 100+ models including Claude, GPT-4, Gemini, and more
Google Gemini
- Get an API key from Google AI Studio
- Use models like gemini-2.5-pro or gemini-2.5-flash
Anthropic Claude
- Get an API key from console.anthropic.com
- Use models like claude-3-5-sonnet-20241022
OpenAI
- Get an API key from platform.openai.com
- Use models like gpt-4o or gpt-4o-mini
How It Works
The PocketFlow methodology follows a 6-step process:
- Repository Fetching - Downloads and filters code files based on patterns
- Abstraction Identification - Uses AI to identify 5-10 core concepts in the codebase
- Relationship Analysis - Maps how abstractions interact with each other
- Chapter Ordering - Determines the optimal learning sequence
- Chapter Writing - Generates detailed, beginner-friendly explanations for each concept
- Tutorial Compilation - Combines everything into a cohesive tutorial with diagrams
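To make the flow concrete, here is a minimal TypeScript sketch of how those six stages could compose. Every type and helper below is a hypothetical stand-in for illustration only, not the server's actual internal API:

// Illustrative pipeline sketch - the stubs stand in for the LLM-backed steps.
type FileMap = Record<string, string>;                    // path -> file contents
interface Abstraction { name: string; summary: string; }
interface Relationship { from: string; to: string; label: string; }

// Hypothetical stubs (assumptions, not the real implementation):
async function fetchRepository(url: string): Promise<FileMap> { return {}; }
async function identifyAbstractions(files: FileMap): Promise<Abstraction[]> { return []; }
async function analyzeRelationships(abs: Abstraction[], files: FileMap): Promise<Relationship[]> { return []; }
async function orderChapters(abs: Abstraction[], rels: Relationship[]): Promise<Abstraction[]> { return abs; }
async function writeChapter(a: Abstraction, files: FileMap): Promise<string> { return `# ${a.name}\n\n${a.summary}`; }
function compileTutorial(chapters: string[], rels: Relationship[]): string { return chapters.join("\n\n"); }

async function generateTutorial(repoUrl: string): Promise<string> {
  const files = await fetchRepository(repoUrl);                             // 1. Repository Fetching
  const abstractions = await identifyAbstractions(files);                   // 2. Abstraction Identification
  const relationships = await analyzeRelationships(abstractions, files);    // 3. Relationship Analysis
  const ordered = await orderChapters(abstractions, relationships);         // 4. Chapter Ordering
  const chapters: string[] = [];
  for (const a of ordered) chapters.push(await writeChapter(a, files));     // 5. Chapter Writing
  return compileTutorial(chapters, relationships);                          // 6. Tutorial Compilation
}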
Example Output
The generated tutorial includes:
- Index Page with project overview and visual architecture diagram
- Individual Chapters for each core abstraction
- Mermaid Diagrams showing relationships and workflows
- Code Examples with detailed explanations
- Cross-References between related concepts
- Beginner-Friendly Language with analogies and examples
Configuration Options
| Parameter | Description | Default |
|---|---|---|
| repo_url | GitHub repository URL | Required |
| llm_provider | AI provider (openrouter, google, anthropic, openai) | Required |
| api_key | API key for the LLM provider | Required |
| model | Specific model to use | Provider default |
| max_abstractions | Number of key concepts to identify | 10 |
| language | Tutorial language | "english" |
| include_patterns | File patterns to analyze | Common code files |
| exclude_patterns | File patterns to skip | Tests, docs, builds |
| max_file_size | Maximum file size in bytes | 100000 |
| use_cache | Enable LLM response caching | true |
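For reference, this is what a call looks like with the documented defaults written out explicitly (the repository URL and API key are placeholders):

analyze_github_repository({
  repo_url: "https://github.com/your-org/your-repo",
  llm_provider: "openrouter",
  api_key: "sk-or-v1-your-key-here",
  // model omitted -> the provider's default model is used
  max_abstractions: 10,     // default
  language: "english",      // default
  max_file_size: 100000,    // default: skip files larger than ~100 KB
  use_cache: true           // default: reuse cached LLM responses
})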
Contributing
We welcome contributions! Please see our contributing guidelines.
License
MIT License - see the LICENSE file for details.
Original PocketFlow
This MCP server is based on the original PocketFlow project by The-Pocket. We've adapted their brilliant methodology to work seamlessly with all MCP-compatible AI assistants.
Support
- 🐛 Bug Reports: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 📖 Documentation: Check this README and inline code comments
Ready to transform any codebase into a comprehensive learning resource? Start analyzing repositories with PocketFlow MCP today!