Bolt - Self-Hosted AI Data Agent
A powerful self-hosted Gen AI agent with an integrated MCP server for solving an organization's data-driven queries and tasks
Features • Quick Start • Documentation • Contributing • Support
What is Bolt?
Bolt is a cutting-edge self-hosted AI data agent that combines the power of Ollama's local AI models with Snowflake's data warehouse through the Model Context Protocol (MCP). It enables organizations to ask natural language questions about their data and get intelligent, AI-generated SQL queries and insights - all while keeping everything secure and on-premises.
Why Bolt?
- Privacy First: Keep your data and AI processing completely local
- Smart Data Queries: Natural language to SQL conversion using AI
- Enterprise Ready: Seamless Snowflake integration for organizational data
- Real-time Insights: Instant data analysis and visualization
- Cost Effective: Self-hosted solution with no external API costs
- Easy Setup: One-click deployment on Google Colab
Features
AI-Powered Data Interaction
- Natural Language Queries: Ask questions in plain English about your data
- Smart SQL Generation: AI automatically generates optimized Snowflake queries
- Multiple Model Support: Choose from 7B to 13B parameter models based on your needs
- Context-Aware Responses: Understands your data schema and relationships
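The natural-language-to-SQL step can be sketched as a prompt that combines the user's question with schema context, sent to a locally running Ollama server. This is an illustrative sketch, not Bolt's actual API: `build_sql_prompt` and `generate_sql` are hypothetical names, and only Ollama's standard `/api/generate` endpoint is assumed.

```python
# Hypothetical sketch of the NL-to-SQL step: build a prompt from the user's
# question plus schema context, then ask a local Ollama model for a query.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_sql_prompt(question: str, schema: str) -> str:
    """Combine the schema description and the question into one model prompt."""
    return (
        "You are a Snowflake SQL assistant.\n"
        f"Schema:\n{schema}\n"
        f"Question: {question}\n"
        "Respond with a single Snowflake SQL query and nothing else."
    )


def generate_sql(question: str, schema: str, model: str = "llama2:7b") -> str:
    """Send the prompt to a locally running Ollama server (network call)."""
    payload = json.dumps({
        "model": model,
        "prompt": build_sql_prompt(question, schema),
        "stream": False,  # return one JSON object instead of a token stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

Feeding the schema into the prompt is what makes the responses context-aware: the model sees table and column names, not just the question.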
Seamless Integrations
- Snowflake Integration: Direct connection to your Snowflake data warehouse
- MCP Protocol: Industry-standard Model Context Protocol implementation
- Ollama Support: Local hosting of Llama 2, Mistral, CodeLlama, and more
- Google Colab Ready: Zero-setup deployment in cloud notebooks
Security & Privacy
- On-Premises AI: No data sent to external AI services
- Credential Management: Secure handling of database credentials
- Environment Isolation: Containerized execution environment
- Access Control: Built-in security best practices
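One concrete form the security bullets above can take is a read-only guard that rejects anything other than a single SELECT statement before it reaches Snowflake. This is a minimal sketch of the idea; Bolt's actual checks may differ, and `is_safe_query` is an illustrative name.

```python
# Minimal read-only guard: allow a single SELECT (or WITH ... SELECT)
# statement and reject DML/DDL keywords or stacked statements.
import re

_FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|grant|merge)\b",
    re.IGNORECASE,
)


def is_safe_query(sql: str) -> bool:
    """Return True only for a single read-only statement."""
    stmt = sql.strip().rstrip(";")
    if ";" in stmt:  # multiple statements stacked together
        return False
    if not re.match(r"(?is)^\s*(select|with)\b", stmt):
        return False
    return not _FORBIDDEN.search(stmt)
```

A guard like this is a last line of defense, not a substitute for proper Snowflake role-based access control on the connecting user.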
Analytics & Insights
- Interactive Results: Beautiful data visualization and formatting
- Query History: Track and reuse previous queries
- Performance Metrics: Monitor query execution times and results
- Export Capabilities: Save results in multiple formats
Quick Start
Option 1: Google Colab (Recommended)
1. Upload Notebooks: Upload model.ipynb and mcp.ipynb to your Colab
2. Follow the Guided Setup:
   - Start with model.ipynb to set up Ollama
   - Then run mcp.ipynb for Snowflake integration
Option 2: Local Setup
# Clone the repository
git clone https://github.com/Pratik-11/Bolt.git
cd Bolt
# Install Ollama (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh
# Install Python dependencies
pip install snowflake-connector-python==3.0.0 python-dotenv jupyter
# Start Jupyter
jupyter notebook
Set Up Your First Query
# 1. Start with model setup
# Open model.ipynb and follow the cells to:
# - Install Ollama
# - Download your preferred AI model (7B or 14B)
# - Test the model
# 2. Configure data connection
# Open mcp.ipynb and:
# - Set up Snowflake credentials
# - Initialize MCP server
# - Test data connectivity
# 3. Ask your first question!
result = ai_with_data.execute_ai_query("Show me top 10 customers by order value")
Documentation
Architecture Overview
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   User Query    │────▶│    Ollama AI    │────▶│    Snowflake    │
│ (Natural Lang.) │     │   (SQL Gen.)    │     │  (Data Exec.)   │
└─────────────────┘     └─────────────────┘     └─────────────────┘
        ▲                        │                        │
        │                        ▼                        ▼
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│    Formatted    │◀────│   MCP Server    │◀────│  Query Results  │
│     Results     │     │ (Orchestrator)  │     │   (Raw Data)    │
└─────────────────┘     └─────────────────┘     └─────────────────┘
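The flow in the diagram can be sketched end to end in a few lines. Here `generate_sql` and `run_query` are stand-ins for the Ollama and Snowflake calls (passed in as functions so the orchestration step is clear); only the result formatter is concrete. None of these names are Bolt's actual API.

```python
# End-to-end orchestration sketch mirroring the diagram:
# question -> SQL generation -> execution -> formatted results.
def format_results(columns, rows):
    """Render raw query results as a plain-text table."""
    widths = []
    for i, col in enumerate(columns):
        cells = [str(col)] + [str(r[i]) for r in rows]
        widths.append(max(len(s) for s in cells))
    header = " | ".join(str(c).ljust(w) for c, w in zip(columns, widths))
    sep = "-+-".join("-" * w for w in widths)
    body = [" | ".join(str(v).ljust(w) for v, w in zip(row, widths))
            for row in rows]
    return "\n".join([header, sep, *body])


def answer(question, generate_sql, run_query):
    """Orchestrator step: NL question -> SQL -> execution -> formatted text."""
    sql = generate_sql(question)          # Ollama AI (SQL Gen.)
    columns, rows = run_query(sql)        # Snowflake (Data Exec.)
    return format_results(columns, rows)  # Formatted Results
```

In Bolt the MCP server plays the role of `answer`: it owns the round trips to the model and the warehouse so the notebook only ever sees the question and the formatted result.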
Supported Models
| Model | Size | Memory | Use Case |
|---|---|---|---|
| Llama 2 7B | 4-6GB | Colab Free | General purpose, fast responses |
| Mistral 7B | 4-6GB | Colab Free | Efficient, multilingual |
| CodeLlama 7B | 4-6GB | Colab Free | Code generation, SQL focused |
| Llama 2 13B | 8-12GB | Colab Pro | Higher quality, complex queries |
| CodeLlama 13B | 8-12GB | Colab Pro | Advanced code generation |
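The table above maps directly to a model-selection rule: pick the largest model of the family you want that fits the memory you have. A hypothetical helper (the `pick_model` name and the memory figures below are ours, taken from the table, not from Bolt's code):

```python
# Hypothetical helper: choose an Ollama model tag from the table above
# based on memory available to the runtime, largest-first.
MODELS = [
    # (ollama tag, approx. memory needed in GB)
    ("llama2:13b", 12),
    ("codellama:13b", 12),
    ("llama2:7b", 6),
    ("mistral:7b", 6),
    ("codellama:7b", 6),
]


def pick_model(available_gb: float, prefer: str = "llama2") -> str:
    """Return the largest model of the preferred family that fits in memory."""
    for tag, need in MODELS:  # ordered largest-first
        if need <= available_gb and tag.startswith(prefer):
            return tag
    raise ValueError(f"no {prefer} model fits in {available_gb} GB")
```

On a free Colab instance (roughly 12 GB of RAM, some of it already in use) the 7B models are the practical choice; Colab Pro's high-RAM runtimes can hold the 13B variants.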
Environment Variables
# Snowflake Configuration
SNOWFLAKE_USER=your_username
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_WAREHOUSE=your_warehouse
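One way to wire these variables into a connection, assuming a `.env` file plus the `snowflake-connector-python` and `python-dotenv` packages from the local setup step. The `load_snowflake_config` helper is illustrative, not part of Bolt:

```python
# Load the Snowflake settings above and open a connection, failing fast
# if any required variable is missing.
import os

REQUIRED = ("SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD", "SNOWFLAKE_ACCOUNT",
            "SNOWFLAKE_DATABASE", "SNOWFLAKE_WAREHOUSE")


def load_snowflake_config(env=os.environ):
    """Collect the required settings as connect() keyword arguments."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise KeyError(f"missing Snowflake settings: {', '.join(missing)}")
    # SNOWFLAKE_USER -> user, SNOWFLAKE_WAREHOUSE -> warehouse, etc.
    return {k.removeprefix("SNOWFLAKE_").lower(): env[k] for k in REQUIRED}


def connect():
    """Open a Snowflake session from the environment (not executed here)."""
    from dotenv import load_dotenv  # python-dotenv
    import snowflake.connector
    load_dotenv()  # pull .env values into os.environ
    return snowflake.connector.connect(**load_snowflake_config())
```

Keeping the credentials in `.env` (and `.env` in `.gitignore`) is what the "Credential Management" feature above amounts to in practice: secrets never land in the notebooks or the repository.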
Development
Project Structure
Bolt/
├── model.ipynb        # Ollama model setup and configuration
├── mcp.ipynb          # MCP server and Snowflake integration
├── README.md          # Project documentation
├── .gitignore         # Git ignore rules
└── requirements.txt   # Python dependencies (if local setup)
Key Components
- EmbeddedMCPServer: Core MCP server implementation
- OllamaWithMCP: AI client with data integration
- Snowflake Connector: Secure database connectivity
- Query Generator: Natural language to SQL conversion
Contributing
We welcome contributions from the community! Here's how you can help:
Ways to Contribute
- Bug Reports: Found an issue? Open an issue
- Feature Requests: Have an idea? Start a discussion
- Documentation: Improve our docs and examples
- Testing: Help test new features and edge cases
- Code: Submit pull requests for bug fixes and features
Development Setup
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Make your changes and test thoroughly
- Commit your changes: git commit -m 'Add amazing feature'
- Push to your branch: git push origin feature/amazing-feature
- Open a Pull Request
Contribution Guidelines
- Follow the existing code style and structure
- Add tests for new features
- Update documentation as needed
- Ensure all notebooks run successfully
- Add comments for complex logic
Testing
Manual Testing
- Model Testing: Verify AI model responses
- Database Testing: Test Snowflake connectivity
- Integration Testing: End-to-end query execution
- Performance Testing: Monitor response times
Test Scenarios
- Simple data queries (SELECT statements)
- Complex analytical queries (JOINs, aggregations)
- Error handling (invalid queries, connection issues)
- Different model sizes and types
Roadmap
Version 1.1 (Coming Soon)
- Support for additional databases (PostgreSQL, MySQL)
- Web-based UI interface
- Query result caching
- Enhanced error handling
Version 1.2 (Future)
- Multi-tenant support
- Advanced visualization capabilities
- API endpoints for external integration
- Docker containerization
Version 2.0 (Vision)
- Multi-modal AI support (text + charts)
- Real-time data streaming
- Advanced security features
- Enterprise SSO integration
Contributors
Core Team
- Pratik Singh: Creator & Lead Developer
Want to join this list?
Check out our contributing guidelines and start contributing today!
License
This project is licensed under the MIT License - see the LICENSE file for details.
MIT License - Feel free to use, modify, and distribute
Commercial use permitted - Build amazing products with Bolt!
Support & Community
Get Help
- Documentation: Check our comprehensive guides above
- Issues: GitHub Issues for bug reports
- Discussions: GitHub Discussions for questions
- Email: For private inquiries and enterprise support
Stay Connected
- Star this repo to show your support
- Watch for updates and new releases
- Fork to start building your own version
- Share with your team and community
Acknowledgments
Special thanks to:
- Ollama for making local AI accessible
- Snowflake for their robust data platform
- MCP Protocol for standardizing AI-data connections
- Google Colab for free cloud computing
- Open Source Community for inspiration and support