Nyrk0/bchat
AI CLI Chat Logger (bchat)
AI Conversation Intelligence with Technical Context Preservation
A lightweight, fully local Python utility that captures AI-powered CLI chat logs with intelligent semantic processing and preserves the crucial technical "how" and "why" that future development sessions need. Features dual AI provider support (Claude/Gemini), automatic chat analysis with implementation details, and structured JSON backups that maintain technical continuity. Simple 3-step setup with professional-grade organization and no external platform dependencies.
Development Guidelines & Project Status
For Developers and AI Contributors:
- Development Directives:
- Project Coordination:
- Master Context Memory:
- Current Initiative:
Current Status: Transitioning from Phase 2 (Deep Context Engine) to BChat MCP development with structured stage progression.
Why bchat?
- Technical Context Preservation: Captures the crucial "how" and "why" that future development sessions need
- Fully Local: No data leaves your machine
- Simple Setup: Ready in 3 steps - clone, add API key, install
- Universal Access: The `bchat` command works from anywhere in your workspace
- AI-Smart: Intelligent chat analysis with implementation detail extraction
- Lightweight: Minimal dependencies, maximum functionality
Features
- Technical Context Intelligence: Preserves implementation details, code changes, and architectural decisions that enable seamless development continuity
- Real-time Monitoring: Automatically watches and processes AI chat logs
- Dual AI Providers: Choose between the Claude and Gemini APIs for intelligent analysis
- Structured Data: Creates machine-readable JSON indexes with technical metadata and implementation tracking
- Daily Consolidation: Merges multiple chat files into organized single files with context preservation
- Universal Access: The `bchat` command works from any directory in your workspace
- Multi-AI Support: Compatible with Claude Code, Gemini CLI, and extensible to other AI tools (a wrapper sketch follows this list)
- Resilient Architecture: Circuit breaker patterns, retry logic, and graceful error handling
- Professional Organization: Clean workspace structure with essential files at root level
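To illustrate how such CLI logging wrappers can work, here is a minimal sketch. It is an assumption inferred from the `config/wrappers/` scripts listed in the directory layout below, not a copy of bchat's actual wrappers, and the log path is likewise assumed.

```bash
#!/usr/bin/env bash
# Illustrative sketch only: pass all arguments through to the Claude CLI
# and append a copy of the session output to a raw daily log file.
LOG_FILE="data/chats/claude_current_day_raw.log"
mkdir -p "$(dirname "$LOG_FILE")"
claude "$@" 2>&1 | tee -a "$LOG_FILE"
```

The real wrappers in `config/wrappers/` may add timestamps, session markers, or trigger detection on top of this basic capture idea.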
Quick Start
System Requirements
- Python 3.8+ (required)
- AI API Key (required for intelligent processing): Anthropic API key for Claude OR Google API key for Gemini
- Node.js 16+ (optional, for Gemini CLI integration)
- Git (for installation)
Installation (macOS/Linux)
1. Clone the repository:

```bash
git clone https://github.com/Nyrk0/bchat.git
cd bchat
```

2. Configure your API key:

```bash
cp .env.example .env
# Edit .env and add your API key:
# For Claude (recommended):
ANTHROPIC_API_KEY=your_anthropic_api_key_here
# For Gemini:
GOOGLE_API_KEY=your_google_api_key_here
```

3. Run the installer:

```bash
./install.sh
```

4. Start using bchat:

```bash
# Check system status (works without an API key)
./bchat --status

# Backup and process the chat conversation (requires an API key)
./bchat

# Use the Gemini CLI with logging (requires an API key)
./bchat -p "Explain quantum computing"
```
Note: Basic commands like `--status` work immediately. Intelligent processing features require an API key configured in step 2.
Windows Support
Windows installation is not yet supported. The core Python functionality should work on Windows with manual setup.
How to Use
The Universal `bchat` Command
The `bchat` command works from anywhere in your workspace:
Backup Mode (no arguments)
```bash
# From any directory - triggers chat backup/consolidation
bchat
```
- From AI CLI windows: Saves current AI conversation to structured logs
- From VSCode terminal: Consolidates recent chat activity
- From any location: Works globally across the workspace
Gemini CLI Mode (with arguments)
bchat -p "Explain quantum computing"
bchat -a -p "Analyze this project structure" # Include all files
bchat --help # See all Gemini options
Monitoring Commands
```bash
# Start the monitoring system
./start

# Control monitoring (from the bchat directory)
./rchat --help          # View all options
./runchat               # Alternative command (same as rchat)

# Manual consolidation
./rchat --consolidate
```
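To confirm that monitoring is active, you can also tail the system log; the path below follows the default layout shown later in this README and may differ in your setup.

```bash
# Follow bchat's operation log in real time
tail -f data/logs/bchat.log
```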
What Gets Created
When you clone from GitHub:
```bash
cd /their/workspace/
git clone https://github.com/Nyrk0/bchat.git
```
You will get:
```
their-workspace/
├── their-existing-file.txt            # User's files (no conflict)
├── their-config.json                  # User's files (no conflict)
├── their-install.sh                   # User's files (no conflict)
└── bchat/                             # All bchat files contained here
    ├── README.md                      # Documentation
    ├── LICENSE                        # MIT License
    ├── CLAUDE.md                      # Claude Code instructions
    ├── bchat                          # Main executable
    ├── install.sh                     # Installation script
    ├── requirements.txt               # Python dependencies
    ├── .env.example                   # Environment template
    ├── bin/                           # All executable scripts
    │   ├── bchat-status               # System status checker
    │   ├── rchat                      # Chat monitor launcher
    │   ├── runchat                    # Alternative launcher
    │   └── start                      # Quick start script
    ├── config/                        # Configuration files
    │   ├── config.json                # Main config (Claude Sonnet 4 default)
    │   └── wrappers/
    │       ├── claude_wrapper.sh      # Claude CLI logging wrapper
    │       └── gemini_wrapper.sh      # Gemini CLI logging wrapper
    ├── core/                          # Python source code
    │   └── src/
    │       ├── chat_monitor.py        # Core monitoring system
    │       └── utils/
    │           └── path_manager.py    # Path resolution utilities
    ├── data/                          # Runtime data (created during use)
    │   ├── chats/                     # Chat logs and processed JSON
    │   │   ├── chat_index.json        # Searchable session index
    │   │   ├── context_summary.json   # Cross-session analysis
    │   │   ├── chat_log_*.json        # Individual session logs
    │   │   ├── claude_current_day_raw.log  # Raw Claude logs
    │   │   └── gemini_current_day_raw.log  # Raw Gemini logs
    │   └── logs/
    │       └── bchat.log              # System operation logs
    ├── dev/                           # Development tools
    │   ├── venv/                      # Virtual environment (created by install)
    │   └── dev_directives/
    │       └── general.md             # Development guidelines
    └── docs/                          # Complete documentation
        ├── user-guide.md              # User documentation
        ├── ai-integration.md          # AI integration guide
        ├── CHANGELOG.md               # Project history
        └── structure.md               # Workspace organization guide
```
Perfect Namespace Isolation: All bchat files are contained within the `bchat/` directory, preventing any conflicts with your existing files. You can have your own `install.sh`, `config.json`, etc. without any naming conflicts.
Configuration
Environment Variables (.env)
```bash
# API Keys (choose your preferred provider)
GOOGLE_API_KEY=your_google_api_key_here          # For Gemini provider
ANTHROPIC_API_KEY=your_anthropic_api_key_here    # For Claude provider

# Optional: Chat log retention (default: 90 days)
CHAT_LOG_RETENTION_DAYS=90

# Optional: Debug mode (default: false)
CHAT_MONITOR_DEBUG=false
```
Advanced Configuration (config.json)
```json
{
  "system": {
    "project_name": "your_project",
    "log_level": "INFO"
  },
  "api": {
    "provider": "gemini",
    "model": "gemini-2.5-flash",
    "claude": {
      "model": "claude-3-5-sonnet-20241022"
    }
  },
  "monitoring": {
    "enabled": true,
    "debounce_delay": 2.0,
    "triggers": ["bchat", "backup chat"]
  }
}
```
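To analyze with Claude instead of Gemini, the `provider` field appears to select the backend; the following is a minimal sketch assuming the key names from the default config above, not a documented reference.

```json
{
  "api": {
    "provider": "claude",
    "claude": {
      "model": "claude-3-5-sonnet-20241022"
    }
  }
}
```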
Technical Context Intelligence
The Core Purpose: bchat solves the critical problem of technical context continuity in AI-assisted development sessions.
The Problem
Traditional chat logging captures what was decided but loses the crucial how and why:
- Specific code changes and their locations
- Root cause analysis of issues
- System architecture understanding
- Implementation strategies and technical decisions
- Development stage progress and status
bchat's Solution
bchat preserves technical implementation context that future development sessions need:
- Code Change Tracking: Documents specific files modified and why
- Architecture Mapping: Captures component relationships and system understanding
- Stage Progress: Tracks development methodology progress (ST_00 → ST_01 → ST_02 ...)
- Issue Resolution: Preserves root cause analysis and solution implementation
- Technical Decisions: Documents the reasoning behind implementation choices
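For a concrete sense of what this can look like on disk, here is an illustrative `chat_index.json` entry; the field names and structure are assumptions for illustration only, not bchat's documented schema.

```json
{
  "sessions": [
    {
      "date": "2025-01-15",
      "provider": "claude",
      "summary": "Refactored the path resolution utilities",
      "files_modified": ["core/src/utils/path_manager.py"],
      "decisions": ["Centralize path handling in path_manager.py"],
      "stage": "ST_02"
    }
  ]
}
```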
Foundation Audit Results
Our comprehensive system analysis shows that while basic JSON processing works well, enhanced technical context capture is essential for development continuity. See the project documentation for the detailed findings.
Key Discovery: Context continuity gaps were identified as a HIGH priority issue affecting development efficiency and technical knowledge preservation.
Usage Examples
AI Development Sessions
```bash
# Start a Claude Code session
claude

# After a productive conversation, back up progress
bchat

# Continue in the VSCode terminal and save work
bchat
```
Team Collaboration
```bash
# After significant progress
bchat

# The system creates:
#   chats/chat_backup_YYYY-MM-DD.md   # Human-readable
#   chats/chat_index.json             # Machine-readable
#   chats/context_summary.json        # Cross-session context
```
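Because the index is plain JSON, teammates can query it with standard tools. The example below assumes the illustrative field names shown earlier and the default `data/chats/` location; adjust both to your actual schema and layout.

```bash
# List date and summary for each recorded session (assumed schema)
jq -r '.sessions[] | "\(.date)  \(.summary)"' data/chats/chat_index.json
```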
Contributing
We welcome contributions! Here's how to get started:
- Fork the repository
- Create a feature branch: `git checkout -b feature/your-feature`
- Make your changes and test
- Commit with clear messages: `git commit -m "Add Windows installer support"`
- Push and create a pull request
High-Priority Contributions Needed
- Windows installer script - Adapt `install.sh` for Windows/PowerShell
- Linux distribution testing - Test on Ubuntu, Debian, Fedora, etc.
- Performance optimizations - Async processing, caching
- Web dashboard - Browser interface for chat analytics
- Unit tests - Test coverage for all components
See the development guidelines in `dev/dev_directives/general.md` for details.
Troubleshooting
Common Issues
Installation fails
```bash
# Check Python version
python3 --version   # Should be 3.8+

# Install dependencies manually
pip3 install watchdog google-generativeai python-dotenv
```
API errors
```bash
# Verify the API key is set
echo $GOOGLE_API_KEY | head -c 10

# Test the API connection
python3 -c "import google.generativeai as genai; genai.configure(api_key='$GOOGLE_API_KEY'); print('API works')"
```
bchat not found
```bash
# Re-run the installer
./install.sh

# Check the symlink
ls -la ../bchat
```
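If the symlink is missing and re-running the installer does not restore it, one manual workaround (a fallback suggestion, not the documented setup) is to add the cloned directory to your PATH for the current shell:

```bash
# Run from inside the cloned bchat/ directory; affects this shell only
export PATH="$PATH:$(pwd)"
bchat --status
```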
Getting Help
- Documentation: Check the `docs/` directory
- Bug Reports: Create an issue
- Discussions: GitHub Discussions
- Contact: Open an issue for questions
License
MIT License - see the LICENSE file for details.
Copyright (c) 2025 Alex Walter Rettig Eglinton
Related Projects
- Claude Code - Official Claude CLI
- Gemini CLI - Google's Gemini CLI
- Watchdog - Python file monitoring
Support This Project
If this project helps you, please consider:
- Starring the repository
- Reporting bugs via Issues
- Suggesting features via Issues
- Contributing code via Pull Requests
- Sharing with others who might find it useful
Ready to get started? Run `./install.sh` and you'll be monitoring AI conversations in under a minute!