OpenRouter MCP Server


A powerful Model Context Protocol (MCP) server providing unified access to 400+ AI models through OpenRouter's API

πŸš€ Overview

OpenRouter MCP Server is a Python-based tool that bridges the gap between MCP clients (like Claude Code) and OpenRouter's extensive AI model ecosystem. It provides seamless access to models from OpenAI, Anthropic, Meta, Google, Mistral, and many other providers through a single, unified interface.

✨ Key Features

  • πŸ€– Multi-Model Access: Connect to 400+ AI models from 30+ providers
  • πŸ”„ Conversation Continuity: Persistent chat history with UUID-based sessions
  • 🎯 Smart Model Selection: Natural language model aliases ("gemini" β†’ "google/gemini-2.5-pro")
  • πŸ“ Multi-Modal Support: Handle text, files, and images seamlessly
  • 🐳 Docker Ready: Containerized deployment with security best practices
  • ⚑ Performance Optimized: Intelligent caching and token management
  • πŸ”§ Developer Friendly: Comprehensive logging and debugging tools

πŸ—οΈ Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   MCP Client    │───▢│  OpenRouter MCP │───▢│   OpenRouter    β”‚
β”‚  (Claude Code)  β”‚    β”‚     Server      β”‚    β”‚      API        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                β”‚
                                β–Ό
                       β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
                       β”‚  Conversation   β”‚
                       β”‚    Storage      β”‚
                       β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Core Components

Component                  Description
server.py                  Main MCP server with JSON-RPC protocol implementation
config.py                  Configuration management and model alias resolution
conversation_manager.py    Persistent conversation storage with UUID sessions
docker_manager.py          Docker container lifecycle management
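The components above communicate over MCP's JSON-RPC protocol. As a rough sketch of what a `tools/call` request to the chat tool looks like on the wire (the argument names follow this README's usage examples; treat the exact schema as an assumption, not the server's actual contract):

```python
import json

# A hypothetical MCP "tools/call" request invoking the chat tool.
# Field names follow the JSON-RPC 2.0 / MCP conventions; the tool
# argument schema ("prompt", "model") is assumed from this README.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chat",
        "arguments": {"prompt": "Hello!", "model": "gemini"},
    },
}

# MCP servers over stdio read one JSON message per line.
wire_message = json.dumps(request)
```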

πŸ“¦ Installation

Prerequisites

  • Python 3.12+ (Required)
  • Docker & Docker Compose (Recommended)
  • OpenRouter API Key (Required)

Method 1: Docker Deployment (Recommended)

  1. Clone the repository:

    git clone https://github.com/slyfox1186/claude-code-openrouter.git
    cd claude-code-openrouter
    
  2. Set up environment:

    cp .env.example .env
    # Edit .env with your OpenRouter API key
    nano .env
    
  3. Build and run:

    ./scripts/build.sh
    ./scripts/run.sh
    

Method 2: Direct Python Installation

  1. Clone and install dependencies:

    git clone https://github.com/slyfox1186/claude-code-openrouter.git
    cd claude-code-openrouter
    pip install -r requirements.txt
    
  2. Configure environment:

    cp .env.example .env
    # Add your OpenRouter API key to .env
    
  3. Run the server:

    python run_server.py
    

βš™οΈ Configuration

Environment Variables

Create a .env file based on .env.example:

# OpenRouter API Configuration
OPENROUTER_API_KEY=your_openrouter_api_key_here
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1

# Default Model Settings
DEFAULT_MODEL=moonshotai/kimi-k2
DEFAULT_TEMPERATURE=0.7
DEFAULT_MAX_TOKENS=4096

# Tool Configuration
ENABLE_WEB_SEARCH=true
MAX_CONTEXT_TOKENS=100000
TOKEN_BUDGET_LIMIT=50000

# Logging Configuration
LOG_LEVEL=INFO
LOG_FILE=openrouter_mcp.log

# Optional: Rate limiting
RATE_LIMIT_REQUESTS_PER_MINUTE=60
RATE_LIMIT_TOKENS_PER_MINUTE=100000

# Docker Compose Bake delegation for better build performance
COMPOSE_BAKE=true
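If you need to read these settings from your own scripts, a minimal sketch looks like this (it assumes python-dotenv or Docker's `--env-file` has already populated the environment; `load_setting` is a hypothetical helper, not part of this codebase):

```python
import os

def load_setting(name: str, default: str) -> str:
    """Read a setting from the environment, falling back to a default.

    Mirrors the .env-driven configuration above: the .env file is
    assumed to have been loaded into os.environ already.
    """
    return os.environ.get(name, default)

# Hypothetical usage mirroring the template above.
default_model = load_setting("DEFAULT_MODEL", "moonshotai/kimi-k2")
temperature = float(load_setting("DEFAULT_TEMPERATURE", "0.7"))
max_tokens = int(load_setting("DEFAULT_MAX_TOKENS", "4096"))
```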

πŸ”‘ Getting Your OpenRouter API Key

  1. Visit OpenRouter.ai
  2. Sign up for an account
  3. Navigate to the API Keys section
  4. Generate a new API key
  5. Add it to your .env file as OPENROUTER_API_KEY

🎯 Usage

MCP Tools Available

The server exposes these MCP tools for client interaction:

Tool                   Description
chat                   Main chat interface with model selection and conversation continuation
list_conversations     View all stored conversation summaries
get_conversation       Retrieve full conversation history by ID
delete_conversation    Remove a conversation from storage

Model Selection

The server supports intelligent model aliases:

{ "model": "gemini" }         →  google/gemini-2.5-pro
{ "model": "claude" }         →  anthropic/claude-4-sonnet
{ "model": "claude opus" }    →  anthropic/claude-4-opus
{ "model": "kimi" }           →  moonshotai/kimi-k2
{ "model": "gpt-4" }          →  openai/gpt-4
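A minimal sketch of how this alias resolution can work (the real mapping lives in config.py and may differ):

```python
# Alias table assumed from this README; the authoritative mapping
# is in config.py and may differ.
MODEL_ALIASES = {
    "gemini": "google/gemini-2.5-pro",
    "claude": "anthropic/claude-4-sonnet",
    "claude opus": "anthropic/claude-4-opus",
    "kimi": "moonshotai/kimi-k2",
    "gpt-4": "openai/gpt-4",
}

def resolve_model(name: str) -> str:
    """Map a friendly alias to its full OpenRouter model ID.

    Unknown names pass through unchanged, so fully qualified IDs
    (e.g. "meta-llama/llama-3.1-8b-instruct") still work.
    """
    return MODEL_ALIASES.get(name.strip().lower(), name)
```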

Conversation Continuity

Each chat session returns a continuation_id that can be used to maintain context:

{
  "prompt": "Follow up question...",
  "continuation_id": "uuid-from-previous-response"
}

Multi-Modal Input

Support for various input types:

{
  "prompt": "Analyze this code",
  "files": ["/path/to/file.py"],
  "images": ["/path/to/screenshot.png"],
  "model": "gemini"
}
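Under the hood, image inputs have to be folded into the OpenAI-style message format that OpenRouter accepts. One common approach is to inline the image as a base64 data URI; a hedged sketch (`build_multimodal_message` is a hypothetical helper, not this server's API):

```python
import base64

def build_multimodal_message(prompt: str, image_bytes: bytes) -> dict:
    """Build one OpenAI-style chat message mixing text and an image.

    OpenRouter accepts the OpenAI content-parts format; inlining the
    image as a base64 data URI is one common way to attach it.
    """
    data_uri = "data:image/png;base64," + base64.b64encode(image_bytes).decode()
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": data_uri}},
        ],
    }
```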

🐳 Docker Management

Use the included Docker manager for easy container operations:

# Build image
python tools/docker_manager.py build

# Start container
python tools/docker_manager.py start

# View logs
python tools/docker_manager.py logs

# Interactive shell
python tools/docker_manager.py shell

# Stop container
python tools/docker_manager.py stop

# Full restart
python tools/docker_manager.py restart

# Check status
python tools/docker_manager.py status

πŸ”§ Development

Project Structure

claude-code-openrouter/
β”œβ”€β”€ server.py              # Main MCP server implementation
β”œβ”€β”€ config.py              # Configuration and model management
β”œβ”€β”€ conversation_manager.py # Conversation persistence
β”œβ”€β”€ docker_manager.py      # Docker operations
β”œβ”€β”€ requirements.txt       # Python dependencies
β”œβ”€β”€ Dockerfile             # Container definition
β”œβ”€β”€ docker-compose.yml     # Service orchestration
β”œβ”€β”€ build.sh               # Build script
β”œβ”€β”€ run.sh                 # Runtime script
β”œβ”€β”€ .env.example           # Environment template
β”œβ”€β”€ .gitignore             # Git ignore patterns
β”œβ”€β”€ CLAUDE.md              # Claude Code instructions
└── README.md              # This file

Logging and Debugging

  • Server logs: /tmp/openrouter_debug.log
  • Application logs: openrouter_mcp.log
  • Container logs: docker logs openrouter

Model Capabilities

The server automatically detects model capabilities:

  • Vision Models: Handle image inputs (Gemini Pro, GPT-4V)
  • Large Context: Support extended conversations (Kimi K2, Gemini)
  • Function Calling: Tool use capabilities (Gemini Pro, GPT-4)

🚨 Security & Best Practices

Environment Security

  • βœ… Never commit .env files (protected by .gitignore)
  • βœ… Use .env.example for templates
  • βœ… Run containers as non-root user
  • βœ… Read-only volume mounts

API Key Management

  • πŸ” Store API keys in .env only
  • πŸ” Use environment variables in production
  • πŸ” Rotate keys regularly
  • πŸ” Monitor usage and billing

πŸ“Š Monitoring & Performance

Token Management

  • Automatic conversation truncation to prevent API limits
  • Token usage tracking and reporting
  • Configurable token budgets per request
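Truncation to a token budget can be sketched as follows (the words-based token estimate and the `truncate_history` helper are illustrative assumptions, not the server's actual accounting):

```python
def truncate_history(messages: list[dict], budget: int) -> list[dict]:
    """Keep the most recent messages whose rough token count fits the budget.

    Walks the history newest-first so recent context survives; older
    turns drop off once the budget is exhausted. The words-based cost
    estimate is a deliberate simplification.
    """
    kept: list[dict] = []
    used = 0
    for message in reversed(messages):
        cost = len(message["content"].split())
        if used + cost > budget:
            break
        kept.append(message)
        used += cost
    kept.reverse()
    return kept
```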

Performance Features

  • In-memory conversation caching
  • Efficient JSON-RPC protocol
  • Streaming response support
  • Request rate limiting

πŸ”„ Supported Models

Popular Model Aliases

Alias          Full Model Name                    Provider
gemini         google/gemini-2.5-pro              Google
claude         anthropic/claude-4-sonnet          Anthropic
claude-opus    anthropic/claude-4-opus            Anthropic
kimi           moonshotai/kimi-k2                 Moonshot
gpt-4          openai/gpt-4                       OpenAI
llama          meta-llama/llama-3.1-8b-instruct   Meta

Model Categories

  • πŸ’¬ Chat Models: General conversation and reasoning
  • πŸ‘οΈ Vision Models: Image understanding and analysis
  • πŸ”§ Function Models: Tool use and function calling
  • πŸ“š Long Context: Extended conversation memory
  • ⚑ Fast Models: Quick responses and low latency

πŸ› οΈ Troubleshooting

Common Issues

Server won't start:

# Check environment configuration
python -c "from src.config import validate_config; print(validate_config())"

# Verify API key
echo $OPENROUTER_API_KEY

Container issues:

# Check container status
python tools/docker_manager.py status

# View detailed logs
python tools/docker_manager.py logs

# Restart everything
python tools/docker_manager.py restart

Model selection problems:

# Test model alias resolution
python -c "from src.config import get_model_alias; print(get_model_alias('gemini'))"

Debug Mode

Enable detailed logging:

export LOG_LEVEL=DEBUG
python run_server.py

πŸ“„ License

This project is licensed under the MIT License.

🀝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Development Setup

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

πŸ™ Acknowledgments

πŸ“ž Support


Made with ❀️ for the AI development community
