100% Local MCP Client
A fully local Model Context Protocol (MCP) client application that runs entirely without external dependencies, featuring a Streamlit UI, local LLM via Ollama, and SQLite-based data persistence.
Features
- 100% Local Operation: No external API calls or cloud dependencies
- MCP Protocol Compliance: Full support for tools, resources, and prompts
- Local LLM: llama3.2 served via Ollama
- Persistent Storage: SQLite database for conversations and knowledge
- Context-Aware: Maintains memory across sessions
- Web UI: Interactive Streamlit interface
- Docker Support: Easy deployment with Docker Compose
- Multi-Client Support: Compatible with Claude Desktop, Cursor, and other MCP clients
Quick Start
- Clone the repository:
git clone <repository-url>
cd 100-local-mcp-client
- Start with Docker Compose:
docker compose up -d
- Access the UI:
- Streamlit UI: http://localhost:8501
- MCP Server: http://localhost:3001
Configuration for External Clients
Claude Desktop
Copy the configuration to your Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "100-local-mcp-client": {
      "command": "docker",
      "args": [
        "exec", "-i", "100-local-mcp-client-mcp-client-1",
        "python", "/app/src/mcp_server.py"
      ]
    }
  }
}
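With this configuration, Claude Desktop attaches to the container's stdin/stdout via `docker exec` and speaks JSON-RPC 2.0 to the MCP server. As a rough sketch, the first message a client sends over that pipe looks like the following (field values here are illustrative, not captured from this server):

```python
import json

# Minimal MCP handshake request (JSON-RPC 2.0 over stdio).
# Values are illustrative; real clients negotiate capabilities.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Each message travels as one line of JSON on the server's stdin.
wire_message = json.dumps(initialize_request)
print(wire_message)
```

The server replies with its own capabilities, after which the client can list and call the tools described below.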
Cursor IDE
- Go to Settings → MCP → Add new global MCP server
- Use the configuration from config/cursor_config.json
Available Tools
- add_data: Store information in the local database
- fetch_data: Retrieve information with optional filtering
- get_memory_context: Access conversation history and context
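These tools map onto straightforward SQLite operations. A minimal sketch of how `add_data` and `fetch_data` might be backed (the table and column names here are assumptions, not the server's actual schema):

```python
import sqlite3

# Hypothetical schema; the real server's tables may differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS knowledge ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  topic TEXT,"
    "  content TEXT)"
)

def add_data(topic: str, content: str) -> int:
    """Store a piece of information; returns its row id."""
    cur = conn.execute(
        "INSERT INTO knowledge (topic, content) VALUES (?, ?)",
        (topic, content),
    )
    conn.commit()
    return cur.lastrowid

def fetch_data(topic=None) -> list:
    """Retrieve rows, optionally filtered by topic."""
    if topic is None:
        return conn.execute("SELECT topic, content FROM knowledge").fetchall()
    return conn.execute(
        "SELECT topic, content FROM knowledge WHERE topic = ?", (topic,)
    ).fetchall()

add_data("mcp", "Model Context Protocol handshake notes")
add_data("ollama", "llama3.2 served locally via Ollama")
print(fetch_data("mcp"))
```

`get_memory_context` would follow the same pattern over a conversation-history table, which is what lets the app maintain memory across sessions.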
Architecture
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│   Streamlit UI   │────▶│    MCP Client    │────▶│    Local LLM     │
└──────────────────┘     └──────────────────┘     │   (Llama3.2)     │
                                  │               └──────────────────┘
                                  ▼
                         ┌──────────────────┐     ┌──────────────────┐
                         │    MCP Server    │────▶│ SQLite Database  │
                         └──────────────────┘     └──────────────────┘
Development
To run in development mode:
- Install dependencies:
pip install -r requirements.txt
- Start Ollama:
ollama serve
- Pull model:
ollama pull llama3.2:1b
- Run MCP server:
python src/mcp_server.py
- Run Streamlit:
streamlit run src/streamlit_ui.py
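Once `ollama serve` is running and the model is pulled, Ollama exposes a local HTTP API on port 11434, which the client can call for generation. A sketch of that call, not the app's actual code (`generate` assumes a running Ollama instance):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3.2:1b") -> dict:
    # Payload shape for Ollama's /api/generate endpoint;
    # stream=False returns one JSON object instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str) -> str:
    # Requires `ollama serve` running with llama3.2:1b pulled.
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(build_request("Why is the sky blue?"))
```

Because everything runs on localhost, no prompt or response ever leaves the machine.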
License
MIT License