# Context Optimizer MCP
A Model Context Protocol (MCP) server for intelligent conversation context management with token optimization, designed to enhance Claude's ability to retain and utilize conversation context across interactions.
## Purpose
This MCP server provides advanced context management capabilities that help AI assistants maintain better conversation continuity, reduce token usage through intelligent optimization, and prevent context loss during long conversations.
## Features

### Core Context Management
- Context Storage & Retrieval: Store and retrieve important conversation context with categorization
- Conversation Summarization: Generate intelligent summaries to preserve key information
- Topic Flow Tracking: Monitor conversation evolution and topic transitions
- Actionable Item Extraction: Identify and track tasks, decisions, and follow-ups
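As a rough illustration of what a categorized context record could look like (a hypothetical shape, not the server's actual schema):

```javascript
// Hypothetical shape of a stored context record (illustrative only;
// field names mirror the tool parameters documented in this README).
const record = {
  contextType: "decision", // decision | preference | fact | goal | constraint | insight
  content: "Use PostgreSQL for persistence",
  importance: 9, // 1 (low) to 10 (critical)
  tags: ["architecture", "database"],
  expiresAfter: "permanent",
};

// Group records by category so retrieval can filter by type.
function groupByType(records) {
  const groups = {};
  for (const r of records) (groups[r.contextType] ??= []).push(r);
  return groups;
}
```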
### Token Optimization
- Smart Compression: Multiple compression strategies to reduce token usage
- Budget Management: Configurable token budgets with automatic optimization
- Usage Analytics: Detailed token usage statistics and recommendations
- Priority-Based Filtering: Preserve high-importance context while optimizing low-priority content
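Budget management of this kind needs a cheap token estimate. A common heuristic, assumed here purely for illustration, is roughly four characters per token for English text (not necessarily the estimator this server uses):

```javascript
// Rough token estimate: ~4 characters per token for English text.
// This heuristic is an assumption for illustration, not the server's tokenizer.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Check a set of context items against a configured token budget.
function withinBudget(texts, budgetTokens) {
  const total = texts.reduce((sum, t) => sum + estimateTokens(t), 0);
  return total <= budgetTokens;
}
```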
### Advanced Features
- Context Gap Detection: Identify missing context that could improve assistance quality
- Anti-Hallucination Validation: Built-in validation to ensure context accuracy
- Persistent Storage: Optional permanent storage for critical context
- Flexible Configuration: Customizable settings for different use cases
## Installation

### Prerequisites

- Node.js ≥ 18.0.0
- Claude Desktop or compatible MCP client
### Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/MrNitro360/Context-Optimizer-MCP.git
   cd Context-Optimizer-MCP
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Configure Claude Desktop to use this MCP server by adding the following to your `claude_desktop_config.json`:

   ```json
   {
     "mcpServers": {
       "context-optimizer": {
         "command": "node",
         "args": ["path/to/Context-Optimizer-MCP/src/index.js"]
       }
     }
   }
   ```

4. Start the server:

   ```bash
   npm start
   ```
## Available Tools

### Context Management Tools
#### store_conversation_context

Store important conversation context for future reference.

- Parameters: `contextType`, `content`, `importance` (1-10), `tags`, `expiresAfter`
- Context Types: `decision`, `preference`, `fact`, `goal`, `constraint`, `insight`
#### retrieve_relevant_context

Retrieve stored context relevant to the current conversation.

- Parameters: `query`, `contextTypes`, `maxResults`, `minImportance`, `tokenBudget`
#### summarize_conversation

Create intelligent summaries of conversation content.

- Parameters: `conversationText`, `focusAreas`, `includeCodeContext`, `tokenBudget`
#### track_conversation_flow

Track conversation evolution and topic transitions.

- Parameters: `currentMessage`, `messageType`, `topicShift`
#### extract_actionable_items

Extract tasks, decisions, and follow-ups from conversation.

- Parameters: `conversationText`, `itemTypes`
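A naive extraction pass might pattern-match common task and decision phrases, as in the toy sketch below (the patterns are illustrative assumptions, not the server's actual rules):

```javascript
// Toy actionable-item extractor: pattern-match common task/decision phrases.
// These regexes are illustrative assumptions, not the server's implementation.
const PATTERNS = {
  task: /\b(todo|need to|should|must)\b/i,
  decision: /\b(decided|agreed|chose)\b/i,
  followUp: /\b(follow up|revisit|check back)\b/i,
};

function extractActionableItems(conversationText, itemTypes = Object.keys(PATTERNS)) {
  const items = [];
  for (const line of conversationText.split("\n")) {
    for (const type of itemTypes) {
      if (PATTERNS[type]?.test(line)) items.push({ type, text: line.trim() });
    }
  }
  return items;
}
```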
### Optimization Tools
#### optimize_context_for_continuation

Optimize and compress context for efficient conversation continuation.

- Parameters: `fullContext`, `targetLength`, `preserveTypes`, `tokenBudget`
#### detect_context_gaps

Identify missing context that could improve assistance quality.

- Parameters: `currentRequest`, `availableContext`, `domain`
#### get_token_usage_stats

Get detailed token usage statistics and optimization recommendations.

- Parameters: `includeRecommendations`, `analyzeRedundancy`
#### optimize_tokens

Apply token optimization using various compression strategies.

- Parameters: `strategy`, `targetReduction`, `preserveImportant`
- Strategies: `priority_filtering`, `content_compression`, `context_merging`, `aggressive_compression`
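As a sketch of how a strategy such as `priority_filtering` might behave, the following drops the least important items until the requested reduction is met (illustrative logic with an assumed ~4 chars/token estimate; the server's real implementation may differ):

```javascript
// Illustrative priority filtering: remove lowest-importance items first
// until estimated token usage drops by targetReduction percent.
function priorityFilter(items, targetReduction, preserveImportant = true) {
  const estimate = (it) => Math.ceil(it.content.length / 4); // ~4 chars/token (assumption)
  const total = items.reduce((s, it) => s + estimate(it), 0);
  const target = total * (1 - targetReduction / 100);

  // Sort ascending by importance so low-priority items are dropped first.
  const sorted = [...items].sort((a, b) => a.importance - b.importance);
  let kept = [...sorted];
  let usage = total;
  for (const it of sorted) {
    if (usage <= target) break;
    if (preserveImportant && it.importance >= 8) continue; // keep critical context
    kept = kept.filter((k) => k !== it);
    usage -= estimate(it);
  }
  return kept;
}
```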
#### set_token_budget

Configure token budgets and optimization settings.

- Parameters: `maxContextTokenBudget`, `compressionThreshold`, `tokenOptimizationEnabled`, `maxTokensPerContext`
## Configuration

### Default Settings
- Max Context Token Budget: 8,000 tokens
- Compression Threshold: 0.8 (80%)
- Max Tokens Per Context: 1,000 tokens
- Token Optimization: Enabled
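With these defaults, compression is triggered once context usage crosses 80% of the 8,000-token budget, i.e. 6,400 tokens. A minimal sketch of that check:

```javascript
// Compression is triggered when usage exceeds threshold * budget.
const DEFAULTS = {
  maxContextTokenBudget: 8000,
  compressionThreshold: 0.8,
};

function shouldCompress(usedTokens, cfg = DEFAULTS) {
  return usedTokens > cfg.maxContextTokenBudget * cfg.compressionThreshold;
}
// With the defaults, the trigger point is 8000 * 0.8 = 6400 tokens.
```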
### Environment Variables

Set these environment variables to customize behavior:

- `CONTEXT_STORAGE_PATH`: Directory for persistent context storage
- `MAX_TOKEN_BUDGET`: Override the default token budget
- `DEBUG_MODE`: Enable detailed logging
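Reading these variables in Node.js could look like the following (a sketch; the fallback values shown are illustrative assumptions, not documented defaults, except for the 8,000-token budget above):

```javascript
// Build configuration from environment variables, with fallbacks.
// The "./context-store" default is a hypothetical placeholder.
function loadConfig(env = process.env) {
  return {
    storagePath: env.CONTEXT_STORAGE_PATH ?? "./context-store",
    maxTokenBudget: Number(env.MAX_TOKEN_BUDGET ?? 8000),
    debug: env.DEBUG_MODE === "true",
  };
}
```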
## Usage Examples

### Basic Context Storage

```javascript
// Store a user preference
await store_conversation_context({
  contextType: "preference",
  content: "User prefers concise explanations with code examples",
  importance: 8,
  tags: ["communication", "coding"],
  expiresAfter: "permanent"
});
```
### Context Retrieval

```javascript
// Retrieve relevant context for the current question
await retrieve_relevant_context({
  query: "How should I explain this technical concept?",
  minImportance: 6,
  maxResults: 5
});
```
### Token Optimization

```javascript
// Optimize stored context to reduce token usage
await optimize_tokens({
  strategy: "priority_filtering",
  targetReduction: 25,
  preserveImportant: true
});
```
## Architecture

### Core Components
- ContextOptimizerServer: Main MCP server implementation
- ContextOptimizer: Advanced context processing and optimization
- TokenOptimizer: Token usage analysis and compression
- FileManager: Persistent storage management
- DateUtils: Time-based context management
### Data Flow
- Context ingestion through MCP tools
- Processing and categorization
- Storage in memory/disk with metadata
- Retrieval with relevance scoring
- Token optimization when needed
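The retrieval-with-relevance-scoring step above can be sketched with a simple keyword-overlap score weighted by importance (a toy model; the server's actual scoring is presumably more sophisticated):

```javascript
// Toy relevance score: keyword overlap between query and content,
// weighted by stored importance. Illustrative only.
function relevanceScore(query, item) {
  const words = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  const hits = item.content
    .toLowerCase()
    .split(/\W+/)
    .filter((w) => words.has(w)).length;
  return hits * item.importance;
}

function retrieve(query, items, maxResults = 5) {
  return [...items]
    .map((it) => ({ it, score: relevanceScore(query, it) }))
    .filter((x) => x.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, maxResults)
    .map((x) => x.it);
}
```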
## Development

### Scripts
- `npm start`: Start the production server
- `npm run dev`: Start with nodemon for development
- `npm run lint`: Check code style
- `npm run format`: Format code with Prettier
### Testing

Run the server in test mode:

```bash
node src/index.js
```
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Links
- Repository: https://github.com/MrNitro360/Context-Optimizer-MCP
- Issues: https://github.com/MrNitro360/Context-Optimizer-MCP/issues
- MCP Documentation: https://modelcontextprotocol.io/docs
## Performance
- Memory Efficient: Intelligent context pruning and compression
- Fast Retrieval: Optimized search algorithms for context lookup
- Scalable: Handles thousands of context items efficiently
- Token Optimized: Reduces token usage by up to 50% through compression
Built with ❤️ for the Claude AI ecosystem by mrnitro360