Semantic Analysis MCP Server
A powerful multi-agent semantic analysis system built with the Graphite framework, providing comprehensive code and conversation analysis capabilities through multiple interfaces.
Overview
This MCP server implements a sophisticated 7-agent architecture for semantic analysis:
- Coordinator Agent - Workflow orchestration and quality assurance
- Semantic Analysis Agent - Core LLM analysis with multi-provider fallback
- Knowledge Graph Agent - Entity and relationship management with UKB integration
- Web Search Agent - Context-aware search and validation
- Synchronization Agent - Data sync across MCP Memory, Graphology DB, and shared-memory files
- Deduplication Agent - Similarity detection and entity merging
- Documentation Agent - Automated documentation generation
Features
Multi-Interface Access
- MCP Server - Direct integration with Claude Code
- HTTP API - REST endpoints for VSCode CoPilot extension
- CLI - Command-line interface (the `sal` command)
API Key Flexibility
A 3-tier API key fallback system for maximum compatibility:
- `ANTHROPIC_API_KEY` (Claude) - Primary
- `OPENAI_API_KEY` (OpenAI) - Secondary
- `OPENAI_BASE_URL` + `OPENAI_API_KEY` (custom OpenAI-compatible endpoint) - Tertiary
- UKB-CLI fallback mode (no AI) - Final fallback
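The fallback order above can be sketched as a small resolver. This is illustrative only; the function name and return labels are assumptions, not the server's actual API:

```python
import os

def resolve_provider(env=None):
    """Sketch of the 3-tier API key fallback (plus the non-AI final
    fallback) described above. Labels are illustrative."""
    env = os.environ if env is None else env
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"              # primary: Claude
    if env.get("OPENAI_API_KEY"):
        if env.get("OPENAI_BASE_URL"):
            return "openai-compatible"  # tertiary: custom endpoint
        return "openai"                 # secondary: stock OpenAI
    return "ukb-cli"                    # final fallback: no AI
```

Passing an explicit dict (rather than reading `os.environ`) makes the resolution order easy to unit-test.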
Advanced Capabilities
- Workflow Orchestration - Complex multi-step analysis workflows
- Quality Assurance - Agent output validation and auto-correction
- Event Sourcing - Durable workflow state and recovery
- Cross-Directory Execution - Works from any directory
- Incremental Analysis - Delta analysis since last run
- Knowledge Synchronization - Multi-system data consistency
Installation
This system is automatically installed as part of the main coding tools:
# Install the entire coding system (includes this semantic analysis server)
./install.sh
Usage
Command Line Interface
# Interactive semantic analysis
sal
# Repository analysis
sal --repository /path/to/repo
# Conversation analysis
sal --conversation /path/to/conversation.md
# Incremental analysis since last run
sal --incremental
# Pattern extraction
sal --pattern "architectural-patterns,design-patterns"
# Check workflow status
sal --status
# Get help
sal --help
MCP Tools (Claude Integration)
- `determine_insights` - Analyze repository or conversation for insights
- `analyze_repository` - Extract patterns and architectural analysis
- `update_knowledge_base` - Sync insights to knowledge systems
- `lessons_learned` - Extract lessons from code or conversations
HTTP API (CoPilot Integration)
RESTful endpoints are available at http://localhost:8765 when the server is running:
- `POST /analyze/repository` - Repository analysis
- `POST /analyze/conversation` - Conversation analysis
- `POST /workflows/start` - Start custom workflow
- `GET /workflows/{id}/status` - Get workflow status
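As an illustration, a request to the first endpoint could be assembled like this. The JSON field name `repository_path` is an assumption here — check the server's actual request schema:

```python
import json

BASE_URL = "http://localhost:8765"  # default port from this README

def repository_analysis_request(repo_path: str):
    """Build the URL, headers, and JSON body for POST /analyze/repository.
    The payload field name is illustrative, not a confirmed schema."""
    url = f"{BASE_URL}/analyze/repository"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"repository_path": repo_path})
    return url, headers, body

url, headers, body = repository_analysis_request("/path/to/repo")
```

The returned triple can be handed to any HTTP client (e.g. `requests.post(url, headers=headers, data=body)`).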
Architecture
Agent Responsibilities
- Coordinator - Manages workflows, coordinates between agents, performs QA
- Semantic Analysis - Core LLM-powered analysis with provider fallback
- Knowledge Graph - Entity/relationship management, UKB integration
- Web Search - Context gathering and validation
- Synchronization - Data consistency across storage systems
- Deduplication - Similarity detection and entity merging
- Documentation - Auto-generated reports and documentation
Data Flow
User Request → Coordinator → Workflow Engine → Agents → QA Validation → Knowledge Sync → Results
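A minimal sketch of that flow, with placeholder stage functions standing in for the real agents and workflow engine:

```python
def run_request(request: dict) -> dict:
    """Illustrative pipeline matching the data flow above; the stage
    functions are placeholders, not the server's actual agents."""
    def analyze(r):   # Agents: core semantic analysis
        return {**r, "insights": ["example insight"]}

    def validate(r):  # QA Validation: reject empty agent output
        assert r.get("insights"), "QA failed: no insights produced"
        return r

    def sync(r):      # Knowledge Sync: mark result as persisted
        return {**r, "synced": True}

    result = request
    for stage in (analyze, validate, sync):
        result = stage(result)
    return result     # Results returned to the user
```

Each stage receives the previous stage's output, mirroring how the coordinator threads state through the workflow engine.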
Storage Systems
- MCP Memory - Session-based memory for Claude integration
- Graphology DB - Graph database for CoPilot integration
- Shared Memory Files - Persistent JSON files for team sharing
Configuration
Configuration is handled through environment variables:
# API Keys (3-tier fallback)
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
OPENAI_BASE_URL=your-custom-endpoint # For custom OpenAI-compatible APIs
# Paths
CODING_TOOLS_PATH=/path/to/coding/repo
# Optional: Custom configuration
SEMANTIC_ANALYSIS_CONFIG=/path/to/config.json
Development
Setting up Development Environment
# Clone the repository (if developing standalone)
git clone <repository-url>
cd mcp-server-semantic-analysis
# Create virtual environment
python -m venv venv
source venv/bin/activate # or `venv\Scripts\activate` on Windows
# Install dependencies
pip install -e ".[dev]"  # quotes avoid shell globbing of the extras suffix
# Run tests
pytest
# Format code
black .
ruff check --fix .
Adding New Agents
- Create the agent file in the `agents/` directory
- Implement the agent using Graphite framework patterns
- Register it with the coordinator in `config/agent_config.py`
- Add tests in `tests/agents/`
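A hypothetical agent skeleton, for orientation only — the actual Graphite base class, method names, and registration hook may differ, so consult the framework documentation:

```python
class ExampleAgent:
    """Hypothetical shape of a new agent; not the real Graphite API."""
    name = "example"

    def run(self, payload: dict) -> dict:
        # Echo the payload with a marker so the coordinator can trace
        # which agent handled it.
        return {**payload, "handled_by": self.name}

# Registration would normally happen in config/agent_config.py.
```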
Adding New Workflows
- Create the workflow file in the `workflows/` directory
- Define the workflow as a Graphite Assistant
- Register it in the coordinator's workflow engine
- Add tests and documentation
Integration
This semantic analysis server integrates with:
- Claude Code - Via MCP server protocol
- VSCode CoPilot - Via HTTP API and bridge
- UKB Tools - Direct integration and fallback
- Knowledge Management System - Bi-directional sync
- Git Repositories - Direct analysis capabilities
Troubleshooting
Common Issues
- API Key Issues: Check the 3-tier fallback chain
- Port Conflicts: System uses intelligent port management
- Permission Issues: Ensure proper file permissions
- Memory Issues: Large repositories may need increased limits
Logging
Comprehensive logging available at multiple levels:
- Agent-specific logs
- Workflow execution logs
- API request/response logs
- Error and debugging logs
Contributing
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
License
MIT License - See LICENSE file for details.