Blackjack MCP Server
A turn-based blackjack game where you play as the dealer against an LLM player powered by Ollama and LlamaIndex. The game uses the Model Context Protocol (MCP) to expose game actions as tools that the LLM can use to make strategic decisions.
Features
- MCP Server: Exposes blackjack game as MCP tools and resources
- LLM Player: Uses LlamaIndex with Ollama to make strategic decisions
- Turn-Based Gameplay: You act as the dealer, the LLM plays against you
- Betting System: LLM manages chip balance and places bets
- Strategic AI: LLM explains its reasoning for each decision
- Configurable: Customize LLM model, starting chips, and game rules
Architecture
┌─────────────────┐
│ Main CLI │ You (Dealer)
│ (main.py) │
└────────┬────────┘
│
├──────────────┐
│ │
┌────────▼────────┐ ┌──▼──────────────┐
│ LlamaIndex │ │ MCP Server │
│ Agent │◄─┤ (Game Logic) │
│ (Ollama LLM) │ │ Tools/Resources│
└─────────────────┘ └─────────────────┘
MCP Tools
The MCP server exposes these tools to the LLM:
- place_bet(amount) - Place a bet to start a new round
- view_table() - View current game state (hands, dealer upcard, chips)
- hit() - Draw another card
- stand() - End turn and let dealer play
- get_hand_value() - Get current hand value
- get_chip_balance() - Get current chip balance
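Under the hood, an MCP client invokes these tools as JSON-RPC 2.0 tools/call requests. For illustration only (the id value is arbitrary), a hit() call sent to the server might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "hit",
    "arguments": {}
  }
}
```

The server replies with the tool's result (for example, the drawn card and new hand value), which the LLM reads before deciding its next action.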
MCP Resources
- blackjack://game/state - Current game state as JSON
- blackjack://rules - Blackjack rules and strategy guide
Installation
Prerequisites
- Ollama - Install from ollama.ai, or use an instance running on a remote server
- LLM model - Ensure a model is available (e.g., ollama pull llama3.2)
- Docker & Docker Compose (recommended) OR Python 3.10+
Option 1: Docker (Recommended)
- Clone the repository:
git clone <repo-url>
cd blackjack-mcp-server
- Update config.yaml with your Ollama URL:
llm:
base_url: "http://192.168.33.12:11434" # Your Ollama server
- Run with Docker Compose:
docker compose run --rm --service-ports blackjack
That's it! The game will start automatically in the container.
Option 2: Local Python Setup
- Clone the repository:
git clone <repo-url>
cd blackjack-mcp-server
- Install dependencies:
pip install -e .
Or manually:
pip install mcp llama-index llama-index-llms-ollama pydantic pyyaml
- Ensure Ollama is running (locally or remote):
ollama serve # If running locally
- Update config.yaml if using a remote Ollama:
llm:
base_url: "http://your-ollama-server:11434"
Configuration
Edit config.yaml to customize the game:
game:
starting_chips: 1000 # Starting chip balance
minimum_bet: 10
maximum_bet: 500
dealer_stands_on: 17 # Dealer stands on this value
llm:
model: "llama3.2" # Ollama model to use
base_url: "http://localhost:11434"
temperature: 0.7
request_timeout: 60.0
display:
show_reasoning: true # Show LLM's thought process
clear_screen: false
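As a rough sketch of how the game section of this file might be consumed in Python (the GameConfig dataclass and helper below are illustrative, not the project's actual code):

```python
from dataclasses import dataclass


@dataclass
class GameConfig:
    # Defaults mirror the sample config.yaml above.
    starting_chips: int = 1000
    minimum_bet: int = 10
    maximum_bet: int = 500
    dealer_stands_on: int = 17


def game_config_from_dict(raw: dict) -> GameConfig:
    # Unknown keys are ignored; missing keys fall back to the defaults above.
    fields = GameConfig.__dataclass_fields__
    return GameConfig(**{k: v for k, v in raw.items() if k in fields})


cfg = game_config_from_dict({"starting_chips": 500, "dealer_stands_on": 17})
print(cfg.starting_chips, cfg.minimum_bet)  # 500 10
```

Merging parsed YAML over dataclass defaults like this keeps config.yaml optional: any key you omit simply keeps its default.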
Usage
Run the Game
With Docker:
docker compose up
Without Docker:
python main.py
Gameplay
- The game starts with the LLM holding its starting chip balance (default: 1000)
- Each round:
- You (dealer) set the bet amount or use default
- LLM places bet and receives cards
- LLM makes decisions (hit/stand) using strategy
- Dealer (you) plays automatically based on rules
- Winner is determined and chips are awarded
- Continue playing rounds until LLM runs out of chips or you quit
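The dealer's automatic play in step 4 follows the standard rule: keep drawing until the hand reaches the dealer_stands_on threshold. A minimal sketch (the function name and draw callable are illustrative):

```python
from typing import Callable


def dealer_play(hand_total: int, draw: Callable[[], int], stands_on: int = 17) -> int:
    """Dealer draws card values until reaching the stand threshold."""
    while hand_total < stands_on:
        hand_total += draw()
    return hand_total


# Deterministic example: dealer starts at 12, then draws a 3 and a 4.
cards = iter([3, 4])
print(dealer_play(12, lambda: next(cards)))  # 19
```

Because the dealer has no choices, this loop is all the "dealer AI" the game needs; only the LLM side requires actual decision-making.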
Example Session
🎰 BLACKJACK - You vs LLM 🎰
LLM Player starting chips: 1000
LLM Model: llama3.2
Dealer stands on: 17
🎲 Round 1
LLM Chips: 1000
Press Enter to let LLM bet 50 chips (or type amount/q to quit):
NEW ROUND - LLM Betting 50 chips
💭 LLM Reasoning:
I'll view the table first to see my hand...
My hand is [K♠ 7♥] = 17
Dealer shows: 5♦
Since I have 17 and dealer shows a weak card (5), I'll stand.
Result: LLM wins! +50 chips
Remaining chips: 1050
Running as MCP Server
You can also run just the MCP server for integration with other MCP clients:
python -m blackjack_mcp.server
Or use the installed script:
blackjack-mcp
Development
Project Structure
blackjack-mcp-server/
├── src/
│ ├── blackjack_mcp/
│ │ ├── __init__.py
│ │ ├── game.py # Core game logic
│ │ └── server.py # MCP server
│ └── agent.py # LlamaIndex agent
├── main.py # CLI interface
├── config.yaml # Configuration
├── pyproject.toml # Project metadata
└── README.md
Running Tests
pytest
Code Formatting
black src/ main.py
Game Rules
- Objective: Get closer to 21 than the dealer without going over
- Card Values:
- Number cards (2-10): Face value
- Face cards (J, Q, K): 10 points
- Aces: 11 or 1 (automatically adjusted)
- Dealer Rules: Dealer must hit until reaching 17 or higher
- Payouts:
- Blackjack (21 with 2 cards): 1.5x bet
- Regular win: 1x bet
- Push (tie): Bet returned
- Loss: Lose bet
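The ace adjustment and the payout table above can be sketched in Python (function names are illustrative; the project's game.py may differ):

```python
def hand_value(cards: list[int]) -> int:
    """cards holds raw values: 2-10, face cards as 10, aces as 11.
    Each ace is downgraded from 11 to 1 while the hand would bust."""
    total = sum(cards)
    aces = cards.count(11)
    while total > 21 and aces:
        total -= 10
        aces -= 1
    return total


def payout(bet: int, result: str, blackjack: bool = False) -> int:
    """Chip change for the player: +1.5x for blackjack, +1x for a win,
    0 for a push, -1x for a loss."""
    if result == "win":
        return int(bet * 1.5) if blackjack else bet
    if result == "push":
        return 0
    return -bet  # loss


print(hand_value([11, 10]))               # 21 (blackjack)
print(hand_value([11, 11, 10]))           # 12 (one ace drops to 1)
print(payout(50, "win", blackjack=True))  # 75
```

Note that the while loop is what the rules mean by aces being "automatically adjusted": an ace only counts as 1 when counting it as 11 would bust the hand.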
LLM Strategy
The LLM uses basic blackjack strategy:
- Hit on 11 or less (can't bust)
- Stand on 17 or more (dealer threshold)
- Between 12-16: Consider dealer's upcard
- Dealer shows 2-6 (weak): Stand
- Dealer shows 7-Ace (strong): Hit
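This decision table reduces to a small function. A sketch (the name decide is illustrative, and soft hands and doubling are ignored for simplicity, matching the rules above):

```python
def decide(hand_total: int, dealer_upcard: int) -> str:
    """Simplified hard-hand strategy; dealer_upcard is 2-11 (ace = 11)."""
    if hand_total <= 11:
        return "hit"    # cannot bust
    if hand_total >= 17:
        return "stand"  # dealer's own stand threshold
    # 12-16: stand against a weak dealer upcard, hit against a strong one
    return "stand" if 2 <= dealer_upcard <= 6 else "hit"


print(decide(16, 5))   # stand (dealer is weak, let them bust)
print(decide(16, 10))  # hit (dealer likely has 17+)
```

This matches the reasoning shown in the example session: with 17 against a dealer 5, the function returns "stand".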
Troubleshooting
Ollama Connection Error
Ensure Ollama is running:
ollama serve
Check the base URL in config.yaml matches your Ollama installation.
Model Not Found
Pull the model specified in config:
ollama pull llama3.2
Import Errors
Make sure all dependencies are installed:
pip install -e .
License
MIT License
Contributing
Contributions welcome! Please open an issue or submit a PR.
Acknowledgments
- Built with Model Context Protocol (MCP)
- Powered by LlamaIndex
- Uses Ollama for local LLM inference