

Grok MCP Server šŸš€

A production-ready Model Context Protocol (MCP) server for Grok AI models, built with FastMCP for a clean, Pythonic implementation. Developed using my own Superprompt System.

Created by Igor Warzocha | LinkedIn | GitHub

ā˜• Support This Project

If this project has helped you, consider buying me a coffee so I can finance my Opus credits!

ko-fi

Your support helps me create more AI optimization resources and keep this project updated!

Features

  • Full Model Support: Access to all Grok models (grok-2-latest, grok-3, grok-3-reasoner, grok-3-deepsearch, grok-3-mini-beta)
  • Intelligent Model Selection: Automatically choose the best model based on task complexity (sketched after this list)
  • Vision Capabilities: Analyze images with Grok's visual understanding
  • Text Embeddings: Generate embeddings for semantic search and analysis
  • Retry Logic: Built-in exponential backoff for robust API interactions
  • Comprehensive Error Handling: Detailed error messages and graceful degradation
  • FastMCP Integration: Clean, decorator-based API for easy extension
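
The intelligent model selection maps a task_complexity value onto a model. The mapping below is only an illustration based on the model comparison table further down in this README; the server's actual selection logic may differ.

# Illustrative sketch only - the server's real selection logic may differ.
# Model names follow the comparison table later in this README.
COMPLEXITY_TO_MODEL = {
    "simple": "grok-3-mini-beta",     # quick responses, simple tasks
    "complex": "grok-3",              # complex tasks, advanced features
    "reasoning": "grok-3-reasoner",   # multi-step reasoning, analysis
    "research": "grok-3-deepsearch",  # research, fact-checking
}

def pick_model(task_complexity: str, default: str = "grok-3-mini-beta") -> str:
    """Return a model name for the given task_complexity value."""
    return COMPLEXITY_TO_MODEL.get(task_complexity, default)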

Quick Start

1. Installation

# Clone the repository
git clone https://github.com/IgorWarzocha/TheGrokMCP.git
cd TheGrokMCP

# Install dependencies
pip install -r requirements.txt

2. Configuration

# Copy the example environment file
cp .env.example .env

# Edit .env and add your Grok API key
# XAI_API_KEY=your_actual_api_key_here

3. Run the Server

# Run the MCP server
python -m src.server

# Or with debug logging
DEBUG=true python -m src.server

4. Connect with Claude Desktop

Add to your Claude Desktop configuration (claude_desktop_config.json):

{
  "mcpServers": {
    "grok": {
      "command": "python",
      "args": ["-m", "src.server"],
      "cwd": "/path/to/TheGrokMCP",
      "env": {
        "XAI_API_KEY": "your_api_key_here"
      }
    }
  }
}

Available Tools

1. chat_completion

Send chat messages to Grok with intelligent model selection.

# Simple chat
await chat_completion([{"role": "user", "content": "Hello!"}])

# Auto-select model based on complexity
await chat_completion(
    messages=[{"role": "user", "content": "Explain quantum computing"}],
    task_complexity="complex"  # Options: simple, complex, reasoning, research
)

# Use specific model
await chat_completion(
    messages=[{"role": "user", "content": "Solve this puzzle"}],
    model="grok-3-reasoner"
)

2. image_understanding

Analyze images using Grok's vision capabilities.

# From file path
await image_understanding(
    image_path="/path/to/image.jpg",
    prompt="What's in this image?"
)

# From base64 data
await image_understanding(
    image_base64="base64_encoded_data",
    prompt="Describe the scene"
)
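
If the image lives on disk but you want to pass base64 data, the standard library can encode it. A minimal sketch (encode_image is a hypothetical helper, not part of this server):

import base64

def encode_image(path: str) -> str:
    """Read a local file and return its contents as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

await image_understanding(
    image_base64=encode_image("/path/to/image.jpg"),
    prompt="Describe the scene"
)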

3. create_embeddings

Generate text embeddings for semantic analysis.

# Single text
await create_embeddings("Hello world")

# Multiple texts
await create_embeddings([
    "First document",
    "Second document",
    "Third document"
])
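
Embeddings are commonly compared with cosine similarity for semantic search. A minimal sketch, assuming each embedding comes back as a plain list of floats (the exact return shape isn't documented here):

import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)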

4. list_models

Get information about all available Grok models.

models = await list_models()
# Returns model capabilities, context windows, and specifications

Model Comparison

Model               Best For                            Context Window (tokens)   Max Output (tokens)
grok-3-mini-beta    Quick responses, simple tasks       64,000                    4,096
grok-2-latest       General purpose, image analysis     128,000                   4,096
grok-3              Complex tasks, advanced features    128,000                   8,192
grok-3-reasoner     Multi-step reasoning, analysis      128,000                   8,192
grok-3-deepsearch   Research, fact-checking             128,000                   8,192

Configuration Options

Environment Variables

  • XAI_API_KEY (required): Your Grok API key
  • DEFAULT_MODEL (optional): Default model to use (default: grok-3-mini-beta)
  • DEBUG (optional): Enable debug logging (default: false)
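
A complete .env using these variables might look like this (the API key is a placeholder):

# .env - example values only
XAI_API_KEY=your_actual_api_key_here
DEFAULT_MODEL=grok-3-mini-beta
DEBUG=false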

Troubleshooting

Common Issues

  1. API Key Not Found

    Error: XAI_API_KEY not found in environment variables
    

    Solution: Ensure your .env file contains XAI_API_KEY=your_key_here

  2. Model Not Available

    Error: Invalid model: grok-4
    

    Solution: Use list_models() to see available models

  3. Rate Limiting

    Error: Rate limit exceeded
    

    Solution: The client includes automatic retry with exponential backoff (see the sketch after this list)

  4. Image Analysis Fails

    Error: Image file not found
    

    Solution: Ensure the image path is absolute or use base64 encoding
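
For reference, the retry behaviour mentioned under Rate Limiting works along these lines. This is an illustrative sketch of exponential backoff with jitter, not the server's exact implementation:

import asyncio
import random

async def with_retries(call, max_retries: int = 5, base_delay: float = 1.0):
    """Retry an async API call, backing off exponentially between attempts."""
    for attempt in range(max_retries):
        try:
            return await call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # Sleep 1s, 2s, 4s, ... plus a little jitter before retrying.
            await asyncio.sleep(base_delay * (2 ** attempt) + random.random())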

Debug Mode

Enable detailed logging:

DEBUG=true python -m src.server

Development

Project Structure

TheGrokMCP/
ā”œā”€ā”€ src/
│   ā”œā”€ā”€ __init__.py
│   ā”œā”€ā”€ server.py          # Main MCP server
│   ā”œā”€ā”€ grok_client.py     # Grok API client
│   └── utils/             # Helper utilities
ā”œā”€ā”€ tests/                 # Test suite
ā”œā”€ā”€ docs/                  # Additional documentation
ā”œā”€ā”€ requirements.txt       # Python dependencies
ā”œā”€ā”€ .env.example          # Environment template
ā”œā”€ā”€ README.md             # This file
└── TODO.md               # Development roadmap

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=src

Adding New Tools

  1. Add the tool to server.py using the @mcp.tool decorator:
@mcp.tool
async def my_new_tool(param1: str, param2: int) -> Dict[str, Any]:
    """Tool description here."""
    # Implementation
    pass
  2. Update the README with usage examples
  3. Add tests for the new functionality

License

MIT License - see LICENSE file for details

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request


Acknowledgments

  • Built with FastMCP - The Pythonic way to build MCP servers
  • Powered by Grok AI - Advanced AI models by xAI