
🤖 AI CLI Chat Logger (bchat)

AI Conversation Intelligence with Technical Context Preservation


A lightweight, fully local Python utility that captures AI-powered CLI chat logs with intelligent semantic processing and preserves the crucial technical "how" and "why" that future development sessions need. Features dual AI provider support (Claude/Gemini), automatic chat analysis with implementation details, and structured JSON backups that maintain technical continuity. Simple 3-step setup with professional-grade organization and no external platform dependencies.

📋 Development Guidelines & Project Status

For Developers and AI Contributors:

  • Development Directives:
  • Project Coordination:
  • Master Context Memory:
  • Current Initiative:

Current Status: Transitioning from Phase 2 (Deep Context Engine) to BChat MCP development with structured stage progression.


🎯 Why bchat?

  • 🧠 Technical Context Preservation: Captures the crucial "how" and "why" that future development sessions need
  • 🔒 Fully Local: No data leaves your machine
  • ⚡ Simple Setup: Ready in 3 steps - clone, add API key, install
  • 🌐 Universal Access: bchat command works from anywhere in your workspace
  • 🤖 AI-Smart: Intelligent chat analysis with implementation detail extraction
  • 📦 Lightweight: Minimal dependencies, maximum functionality

✨ Features

  • 🧠 Technical Context Intelligence: Preserves implementation details, code changes, and architectural decisions that enable seamless development continuity
  • 🔍 Real-time Monitoring: Automatically watches and processes AI chat logs
  • 🧠 Dual AI Providers: Choose between the Claude and Gemini APIs for intelligent analysis
  • 📊 Structured Data: Creates machine-readable JSON indexes with technical metadata and implementation tracking
  • 🔄 Daily Consolidation: Merges multiple chat files into organized single files with context preservation
  • 🚀 Universal Access: bchat command works from any directory in your workspace
  • 💬 Multi-AI Support: Compatible with Claude Code, Gemini CLI, and extensible to other AI tools
  • 🛡️ Resilient Architecture: Circuit breaker patterns, retry logic, and graceful error handling (see the sketch after this list)
  • ⚡ Professional Organization: Clean workspace structure with essential files at root level
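
The resilience bullet above refers to standard patterns. The following is a minimal, illustrative Python sketch of retry-with-backoff guarded by a simple circuit breaker; it is not bchat's actual implementation, and the class and function names here are hypothetical.

import time

class CircuitBreaker:
    """Illustrative circuit breaker: stop calling a failing API for a cool-down period."""

    def __init__(self, failure_threshold=3, reset_timeout=60.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None

    def allow(self):
        # Closed (normal) state, or an open state whose cool-down has expired
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.reset_timeout:
            self.opened_at = None   # half-open: allow one trial call
            self.failures = 0
            return True
        return False

    def record_success(self):
        self.failures = 0
        self.opened_at = None

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.failure_threshold:
            self.opened_at = time.monotonic()

def call_with_retries(fn, breaker, max_attempts=3, base_delay=1.0):
    """Retry a flaky call with exponential backoff, guarded by the circuit breaker."""
    if not breaker.allow():
        raise RuntimeError("circuit open: skipping API call")
    for attempt in range(1, max_attempts + 1):
        try:
            result = fn()
            breaker.record_success()
            return result
        except Exception:
            breaker.record_failure()
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...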

🚀 Quick Start

System Requirements

  • Python 3.8+ (required)
  • AI API Key (required for intelligent processing): Anthropic API key for Claude OR Google API key for Gemini
  • Node.js 16+ (optional, for Gemini CLI integration)
  • Git (for installation)

Installation (macOS/Linux)

  1. Clone the repository:

    git clone https://github.com/Nyrk0/bchat.git
    cd bchat
    
  2. Configure your API key:

    cp .env.example .env
    # Edit .env and add your API key:
    # For Claude (recommended): ANTHROPIC_API_KEY=your_anthropic_api_key_here
    # For Gemini: GOOGLE_API_KEY=your_google_api_key_here
    
  3. Run the installer:

    ./install.sh
    
  4. Start using bchat:

    # Check system status (works without API key)
    ./bchat --status
    
    # Backup and process chat conversation (requires API key)
    ./bchat
    
    # Use Gemini CLI with logging (requires API key)
    ./bchat -p "Explain quantum computing"
    

💡 Note: Basic commands like --status work immediately. Intelligent processing features require an API key configured in step 2.

Windows Support

Windows installation is not yet supported. The core Python functionality should work on Windows with manual setup.


🎯 How to Use

The Universal bchat Command

The bchat command works from anywhere in your workspace:

Backup Mode (no arguments)
# From any directory - triggers chat backup/consolidation
bchat
  • From AI CLI windows: Saves current AI conversation to structured logs
  • From VSCode terminal: Consolidates recent chat activity
  • From any location: Works globally across the workspace
Gemini CLI Mode (with arguments)
bchat -p "Explain quantum computing"
bchat -a -p "Analyze this project structure"  # Include all files
bchat --help                                  # See all Gemini options

Monitoring Commands

# Start monitoring system
./start

# Control monitoring (from the bchat directory)
./rchat --help          # View all options
./runchat              # Alternative command (same as rchat)

# Manual consolidation
./rchat --consolidate

šŸ“ What Gets Created

When you clone from GitHub:

cd /your/workspace/
git clone https://github.com/Nyrk0/bchat.git

You will get:

your-workspace/
├── your-existing-file.txt      # User's files (no conflict)
├── your-config.json            # User's files (no conflict)
├── your-install.sh             # User's files (no conflict)
└── bchat/                      # All bchat files contained here
    ├── README.md               # Documentation
    ├── LICENSE                 # MIT License
    ├── CLAUDE.md               # Claude Code instructions
    ├── bchat                   # Main executable
    ├── install.sh              # Installation script
    ├── requirements.txt        # Python dependencies
    ├── .env.example            # Environment template
    ├── bin/                    # All executable scripts
    │   ├── bchat-status        # System status checker
    │   ├── rchat               # Chat monitor launcher
    │   ├── runchat             # Alternative launcher
    │   └── start               # Quick start script
    ├── config/                 # Configuration files
    │   ├── config.json         # Main config (Claude Sonnet 4 default)
    │   └── wrappers/
    │       ├── claude_wrapper.sh   # Claude CLI logging wrapper
    │       └── gemini_wrapper.sh   # Gemini CLI logging wrapper
    ├── core/                   # Python source code
    │   └── src/
    │       ├── chat_monitor.py # Core monitoring system
    │       └── utils/
    │           └── path_manager.py # Path resolution utilities
    ├── data/                   # Runtime data (created during use)
    │   ├── chats/              # Chat logs and processed JSON
    │   │   ├── chat_index.json      # Searchable session index
    │   │   ├── context_summary.json # Cross-session analysis
    │   │   ├── chat_log_*.json      # Individual session logs
    │   │   ├── claude_current_day_raw.log   # Raw Claude logs
    │   │   └── gemini_current_day_raw.log   # Raw Gemini logs
    │   └── logs/
    │       └── bchat.log       # System operation logs
    ├── dev/                    # Development tools
    │   ├── venv/               # Virtual environment (created by install)
    │   └── dev_directives/
    │       └── general.md      # Development guidelines
    └── docs/                   # Complete documentation
        ├── user-guide.md       # User documentation
        ├── ai-integration.md   # AI integration guide
        ├── CHANGELOG.md        # Project history
        └── structure.md        # Workspace organization guide

Perfect Namespace Isolation: All bchat files are contained within the bchat/ directory, preventing any conflicts with your existing files. You can have your own install.sh, config.json, etc. without any naming conflicts.


⚙️ Configuration

Environment Variables (.env)

# API Keys (choose your preferred provider)
GOOGLE_API_KEY=your_google_api_key_here        # For Gemini provider
ANTHROPIC_API_KEY=your_anthropic_api_key_here  # For Claude provider

# Optional: Chat log retention (default: 90 days)
CHAT_LOG_RETENTION_DAYS=90

# Optional: Debug mode (default: false)
CHAT_MONITOR_DEBUG=false

Advanced Configuration (config.json)

{
  "system": {
    "project_name": "your_project",
    "log_level": "INFO"
  },
  "api": {
    "provider": "gemini",
    "model": "gemini-2.5-flash",
    "claude": {
      "model": "claude-3-5-sonnet-20241022"
    }
  },
  "monitoring": {
    "enabled": true,
    "debounce_delay": 2.0,
    "triggers": ["bchat", "backup chat"]
  }
}
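
To use Claude for analysis instead of Gemini, switch the provider field. A minimal sketch, assuming "claude" is the value the config accepts for the Claude provider (an inference from the dual-provider setup above, not a documented guarantee):

{
  "api": {
    "provider": "claude",
    "claude": {
      "model": "claude-3-5-sonnet-20241022"
    }
  }
}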

🧠 Technical Context Intelligence

The Core Purpose: bchat solves the critical problem of technical context continuity in AI-assisted development sessions.

The Problem

Traditional chat logging captures what was decided but loses the crucial how and why:

  • āŒ Specific code changes and their locations
  • āŒ Root cause analysis of issues
  • āŒ System architecture understanding
  • āŒ Implementation strategies and technical decisions
  • āŒ Development stage progress and status

bchat's Solution

bchat preserves technical implementation context that future development sessions need:

  • ✅ Code Change Tracking: Documents specific files modified and why
  • ✅ Architecture Mapping: Captures component relationships and system understanding
  • ✅ Stage Progress: Tracks development methodology progress (ST_00 → ST_01 → ST_02...)
  • ✅ Issue Resolution: Preserves root cause analysis and solution implementation
  • ✅ Technical Decisions: Documents the reasoning behind implementation choices
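
To make this concrete, a preserved session record might capture entries like the sketch below. This is illustrative only; the field names are hypothetical and not bchat's actual schema (check the generated JSON files for the real structure).

{
  "session": "2025-01-15_claude",
  "stage": "ST_02",
  "code_changes": [
    {"file": "core/src/chat_monitor.py", "why": "debounce log writes to avoid duplicate processing"}
  ],
  "decisions": [
    "Wrap API calls in a circuit breaker so monitoring survives provider outages"
  ],
  "open_issues": ["Windows installer support"]
}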

Foundation Audit Results

Our comprehensive system analysis reveals that while basic JSON processing works excellently, enhanced technical context capture is essential for development continuity; the foundation audit documents the detailed findings.

Key Discovery: Context continuity gaps were identified as a HIGH priority issue affecting development efficiency and technical knowledge preservation.


📊 Usage Examples

AI Development Sessions

# Start a Claude Code session
claude

# After productive conversation, backup progress
bchat

# Continue in VSCode terminal and save work
bchat

Team Collaboration

# After significant progress
bchat

# System creates:
# ✅ chats/chat_backup_YYYY-MM-DD.md     # Human-readable
# ✅ chats/chat_index.json              # Machine-readable
# ✅ chats/context_summary.json         # Cross-session context
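
Teammates can then query the machine-readable files directly. A minimal sketch, assuming chat_index.json holds a JSON list of session objects with "timestamp" and "summary" fields (an assumption about the schema, made only for illustration):

import json
from pathlib import Path

# Path follows the workspace layout shown above
index_path = Path("bchat/data/chats/chat_index.json")
sessions = json.loads(index_path.read_text())

# NOTE: the "timestamp" and "summary" field names are assumptions for this
# example; inspect your own chat_index.json for the actual schema.
for session in sessions:
    if "circuit breaker" in session.get("summary", "").lower():
        print(session.get("timestamp"), "-", session.get("summary"))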

šŸ¤ Contributing

We welcome contributions! Here's how to get started:

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/your-feature
  3. Make your changes and test
  4. Commit with clear messages: git commit -m "Add Windows installer support"
  5. Push and create a pull request

🔥 High Priority Contributions Needed

  • Windows installer script - Adapt install.sh for Windows/PowerShell
  • Linux distribution testing - Test on Ubuntu, Debian, Fedora, etc.
  • Performance optimizations - Async processing, caching
  • Web dashboard - Browser interface for chat analytics
  • Unit tests - Test coverage for all components

See the development guidelines in dev/dev_directives/ for detailed contribution guidance.


šŸ” Troubleshooting

Common Issues

Installation fails
# Check Python version
python3 --version  # Should be 3.8+

# Install dependencies manually
pip3 install watchdog google-generativeai python-dotenv

API errors
# Verify API key
echo $GOOGLE_API_KEY | head -c 10

# Test API connection
python3 -c "import google.generativeai as genai; genai.configure(api_key='$GOOGLE_API_KEY'); print('API works')"

bchat not found
# Re-run installer
./install.sh

# Check symlink
ls -la ../bchat

Getting Help

  • 📖 Documentation: Check the docs/ directory
  • 🐛 Bug Reports: Create an issue
  • 💬 Discussions: GitHub Discussions
  • 📧 Contact: Open an issue for questions

📄 License

MIT License - see the LICENSE file for details.

Copyright (c) 2025 Alex Walter Rettig Eglinton


⭐ Support This Project

If this project helps you, please consider:

  • ⭐ Starring the repository
  • 🐛 Reporting bugs via Issues
  • 💡 Suggesting features via Issues
  • 🔧 Contributing code via Pull Requests
  • 📢 Sharing with others who might find it useful



🚀 Ready to get started? Run ./install.sh and you'll be monitoring AI conversations in under a minute!