suryanshp1/sqlite-mcp-server-streamlit-app

100% Local MCP Client

A fully local Model Context Protocol (MCP) client application that runs entirely without external dependencies, featuring a Streamlit UI, local LLM via Ollama, and SQLite-based data persistence.

Features

  • 100% Local Operation: No external API calls or cloud dependencies
  • MCP Protocol Compliance: Full support for tools, resources, and prompts
  • Local LLM: llama3.2 served via Ollama
  • Persistent Storage: SQLite database for conversations and knowledge
  • Context-Aware: Maintains memory across sessions
  • Web UI: Interactive Streamlit interface
  • Docker Support: Easy deployment with Docker Compose
  • Multi-Client Support: Compatible with Claude Desktop, Cursor, and other MCP clients

Quick Start

  1. Clone the repository:
git clone <repository-url>
cd 100-local-mcp-client
  2. Start with Docker Compose.
  3. Access the UI in your browser.
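The steps above amount to roughly the following. The compose invocation and the UI port (Streamlit's default, 8501) are assumptions, not taken from this repository's compose file:

```shell
# Clone and enter the project (substitute the real repository URL)
git clone <repository-url>
cd 100-local-mcp-client

# Build and start all services in the background
docker compose up -d

# The Streamlit UI is then typically reachable at:
#   http://localhost:8501
```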

Configuration for External Clients

Claude Desktop

Copy the configuration to your Claude Desktop config file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "100-local-mcp-client": {
      "command": "docker",
      "args": [
        "exec", "-i", "100-local-mcp-client-mcp-client-1",
        "python", "/app/src/mcp_server.py"
      ]
    }
  }
}

Cursor IDE

  1. Go to Settings → MCP → Add new global MCP server
  2. Use the configuration from config/cursor_config.json
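The repository ships the exact file; as a sketch, assuming config/cursor_config.json mirrors the Claude Desktop entry and uses the same docker exec transport, it would look roughly like:

```json
{
  "mcpServers": {
    "100-local-mcp-client": {
      "command": "docker",
      "args": [
        "exec", "-i", "100-local-mcp-client-mcp-client-1",
        "python", "/app/src/mcp_server.py"
      ]
    }
  }
}
```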

Available Tools

  • add_data: Store information in the local database
  • fetch_data: Retrieve information with optional filtering
  • get_memory_context: Access conversation history and context

Architecture

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”    ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”    ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│  Streamlit UI   │◄──►│    MCP Client    │◄──►│    Local LLM    │
│                 │    │                  │    │   (Llama3.2)    │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜    ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜    ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
                                │
                                ā–¼
                       ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”    ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
                       │    MCP Server    │◄──►│ SQLite Database │
                       │                  │    │                 │
                       ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜    ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜

Development

To run in development mode:

  1. Install dependencies: pip install -r requirements.txt
  2. Start Ollama: ollama serve
  3. Pull model: ollama pull llama3.2:1b
  4. Run MCP server: python src/mcp_server.py
  5. Run Streamlit: streamlit run src/streamlit_ui.py

License

MIT License