Contexta CLI
Contexta local CLI tool - Zero-config MCP server for AI assistants.
Features
- 🚀 Zero Configuration: Works with just the command name - no manual ports or URLs
- 🔒 Privacy First: Fully local - no data sent to cloud
- 🔄 Hybrid Transport: STDIO → HTTP handshake for seamless AI assistant integration
- 📦 Lightweight: Standalone Python package with minimal dependencies
- 🎯 MCP Compatible: Standard Model Context Protocol implementation
Installation
pip install contexta-cli
Quick Start
For AI Assistants (Claude Desktop, Cursor, etc.)
Simply configure with the command name - no extra configuration needed:
{
  "mcpServers": {
    "contexta": {
      "command": "contexta",
      "args": []
    }
  }
}
The CLI automatically:
- Starts a local HTTP server on an available port (8090-8190)
- Sends a handshake message over STDIO with the endpoint URL
- Handles all MCP requests over HTTP
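The port discovery and handshake steps above can be sketched as follows. Note this is an illustration, not the package's actual code, and the handshake field names (`type`, `endpoint`) are assumptions, since the wire format is not documented here:

```python
import json
import socket

def find_free_port(start=8090, end=8190):
    """Return the first port in the range that binds on loopback."""
    for port in range(start, end + 1):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError(f"no free port in {start}-{end}")

port = find_free_port()
# Handshake line the AI assistant reads from the child process's STDOUT
print(json.dumps({"type": "handshake", "endpoint": f"http://127.0.0.1:{port}/mcp"}))
```

Because the assistant only needs to launch the command and read one line from STDOUT, no port or URL ever has to appear in the client configuration.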
For Developers
# Start local MCP server
contexta service
# Test MCP client
contexta mcp-client
# Setup wizard
contexta setup
# Connect to AI platform
contexta connect --platform claude-desktop
Architecture
AI Assistant
│
├─ STDIO ──────► contexta command starts
│ │
│ ├─ Find available port (8090-8190)
│ ├─ Start HTTP server (127.0.0.1:PORT)
│ └─ Send handshake via STDOUT
│
└─ HTTP ───────► All MCP requests to http://127.0.0.1:PORT/mcp
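After the handshake, each MCP request is a standard JSON-RPC 2.0 message POSTed to the advertised endpoint. A minimal sketch of building such a request body (the helper name is illustrative, but JSON-RPC 2.0 framing and `tools/list` are standard MCP):

```python
import json

def mcp_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body, the framing MCP uses over HTTP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params or {},
    })

# e.g. ask the server which tools it exposes;
# POST this to http://127.0.0.1:PORT/mcp with Content-Type: application/json
body = mcp_request("tools/list")
```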
CLI Commands
contexta service
Start the MCP HTTP server (main command for AI assistants).
Options:
--port PORT: Manual port override (default: auto-discover in 8090-8190)
--project PATH: Project directory to analyze (default: current directory)
contexta setup
Interactive setup wizard.
Flow:
- Authentication (optional)
- Platform detection
- Project indexing
- Platform configuration
contexta connect
Configure AI platform to use Contexta MCP.
Options:
--platform PLATFORM: Target platform (claude-desktop, cursor, windsurf, etc.)
--scope SCOPE: Configuration scope (project or global)
contexta mcp-client
Test MCP protocol locally.
Options:
--endpoint URL: MCP server endpoint (default: http://127.0.0.1:8090/mcp)
Configuration
Local configuration stored in ~/.contexta/:
~/.contexta/
├── config.json # Global configuration
├── projects/ # Per-project settings
│ └── <project-id>/
│ ├── config.json # Project configuration
│ └── index.db # Local SQLite index
└── tokens.json # Authentication tokens
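Reading the global configuration is then a matter of loading one JSON file. A sketch (the keys inside `config.json` are not specified here, so none are assumed):

```python
import json
from pathlib import Path

CONFIG_DIR = Path.home() / ".contexta"

def load_global_config():
    """Return the parsed global config, or an empty dict before first setup."""
    path = CONFIG_DIR / "config.json"
    if not path.exists():
        return {}
    return json.loads(path.read_text())

config = load_global_config()
```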
MCP Protocol
Supports standard MCP methods:
Tools:
- contexta/index: Index project files
- contexta/search: Search code by pattern
- contexta/query: Query symbols and dependencies
Resources:
- project://summary: Project overview
- project://index/status: Index statistics
- project://config: Configuration details
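Invoking one of these tools goes through the standard MCP `tools/call` method. In this sketch the envelope (`name`/`arguments` under `params`) follows the MCP specification, but the tool's argument name (`pattern`) is an assumption about its schema:

```python
import json

# JSON-RPC envelope for calling the contexta/search tool
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "contexta/search",
        "arguments": {"pattern": "def main"},  # assumed argument shape
    },
}
print(json.dumps(call))
```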
Development
Setup
# Clone repository
git clone https://github.com/usecontexta/contexta-cli.git
cd contexta-cli
# Install in editable mode
pip install -e ".[dev]"
# Run tests
pytest tests/ -v
Testing
# Unit tests
pytest tests/
# Type checking
mypy src/contexta_cli/
# Linting
ruff check src/
# Formatting
black src/ tests/
License
Licensed under either of:
- Apache License, Version 2.0
- MIT License
at your option.