context-optimizer-mcp
An MCP server that uses Redis and in-memory caching to optimize and extend context windows for large chat histories.
The Context Optimizer MCP server enhances the handling of large chat histories by optimizing context windows. It acts as middleware between applications and LLM providers, currently supporting Anthropic's Claude models. To manage conversation summaries efficiently, it employs dual-layer caching: a fast in-memory LRU cache backed by Redis for persistent storage. Older messages are automatically summarized to keep the conversation within token limits while preserving conversational flow. The server is API-compatible with Anthropic, making it a drop-in replacement with enhanced context handling, and it adds Redis-based rate limiting with burst protection as well as built-in performance monitoring and logging.
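The dual-layer lookup described above can be sketched as follows. This is an illustrative Python sketch, not the server's actual code: a plain dict stands in for the Redis tier, and class and method names are hypothetical.

```python
from collections import OrderedDict

class DualLayerCache:
    """Illustrative two-tier cache: a small in-memory LRU in front of a
    slower persistent store (a plain dict stands in for Redis here)."""

    def __init__(self, capacity=128):
        self.capacity = capacity
        self.lru = OrderedDict()   # fast in-memory tier
        self.persistent = {}       # stand-in for the Redis tier

    def get(self, key):
        if key in self.lru:
            self.lru.move_to_end(key)   # mark as most recently used
            return self.lru[key]
        if key in self.persistent:      # memory miss, persistent hit
            value = self.persistent[key]
            self._promote(key, value)   # pull hot entry back into memory
            return value
        return None

    def set(self, key, value):
        self.persistent[key] = value    # write through to the slow tier
        self._promote(key, value)

    def _promote(self, key, value):
        self.lru[key] = value
        self.lru.move_to_end(key)
        if len(self.lru) > self.capacity:
            self.lru.popitem(last=False)  # evict least-recently-used entry
```

The write-through `set` keeps the persistent tier authoritative, so an entry evicted from the LRU layer can still be served (and re-promoted) from the slower store.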
Features
- Dual-Layer Caching: Combines fast in-memory LRU cache with persistent Redis storage.
- Smart Context Management: Automatically summarizes older messages to maintain context within token limits.
- Rate Limiting: Redis-based rate limiting with burst protection.
- API Compatibility: Drop-in replacement for Anthropic API with enhanced context handling.
- Metrics Collection: Built-in performance monitoring and logging.
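The smart context management feature can be sketched like this. This is a minimal, hypothetical illustration of the idea of folding the oldest messages into a summary once a token budget is exceeded; the `summarize` callback and the whitespace-based token counter are placeholders, not the server's real summarizer or tokenizer.

```python
def fit_to_budget(messages, summarize, max_tokens,
                  count_tokens=lambda m: len(m.split())):
    """Trim a chat history to a token budget by replacing the oldest
    messages with a single summary message.

    messages:     list of message strings, oldest first
    summarize:    callback that condenses a list of messages into one
                  (in practice this would be an LLM call)
    max_tokens:   target budget for the trimmed history
    count_tokens: naive whitespace token counter, a stand-in for a
                  real tokenizer
    """
    total = sum(count_tokens(m) for m in messages)
    older, recent = [], list(messages)
    while total > max_tokens and len(recent) > 1:
        msg = recent.pop(0)          # drop the oldest message first
        older.append(msg)
        total -= count_tokens(msg)
    if older:
        # Note: a real implementation must also budget for the tokens
        # the summary itself consumes.
        return [summarize(older)] + recent
    return recent
```

Recent messages are kept verbatim while only the head of the history is condensed, which preserves the immediate conversational flow the overview describes.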