MartinHodges/present-list
# Pressie List - Santa's Present Request Agent
A LangGraph-based agent that uses MCP (Model Context Protocol) tools to help manage Christmas present requests. The agent checks if you've been naughty or nice and submits present requests to Santa.
## Architecture
The project consists of:
- **LangGraph Agent** (`src/agent/`) - Orchestrates the present request workflow using ChatGPT
  - Connects to the standalone MCP HTTP server
  - Wraps MCP tools as LangChain tools
  - Routes between agent and tool execution nodes
- **MCP HTTP Server** (`src/mcp/http-server.ts`) - Standalone HTTP server on port 3100 providing:
  - `check_naughty_or_nice` - Checks if someone has been naughty or nice
  - `request_presents` - Submits present requests to Santa based on naughty/nice status
  - Session management for multiple concurrent clients
  - JSON-RPC over HTTP with SSE for notifications
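The routing between the agent node and the tool-execution node can be sketched as a plain loop, without LangGraph itself. All names below (`runModel`, `executeTool`, `runAgent`) are hypothetical stand-ins for illustration, not the project's actual API:

```typescript
// Simplified sketch of the agent <-> tool routing loop (no LangGraph dependency).
// runModel stands in for the ChatGPT call; executeTool stands in for an HTTP
// call to the MCP server. Both are hypothetical, not the project's real code.

type ToolCall = { name: string; args: Record<string, unknown> };
type ModelResult = { toolCalls: ToolCall[]; reply: string };

// Stand-in model: keeps requesting tools until both have been called.
function runModel(history: string[]): ModelResult {
  if (!history.some((m) => m.startsWith("tool:check_naughty_or_nice"))) {
    return { toolCalls: [{ name: "check_naughty_or_nice", args: { userName: "Alice" } }], reply: "" };
  }
  if (!history.some((m) => m.startsWith("tool:request_presents"))) {
    return { toolCalls: [{ name: "request_presents", args: { userName: "Alice", presents: ["bike"] } }], reply: "" };
  }
  return { toolCalls: [], reply: "Santa approved your list!" };
}

// Stand-in tool executor: records the call in the conversation history.
function executeTool(call: ToolCall): string {
  return `tool:${call.name} -> ok`;
}

// The routing loop: alternate between the agent node and the tool node
// until the model stops requesting tool calls, then return its reply.
function runAgent(userInput: string): string {
  const history: string[] = [`user:${userInput}`];
  for (;;) {
    const result = runModel(history);
    if (result.toolCalls.length === 0) return result.reply;
    for (const call of result.toolCalls) history.push(executeTool(call));
  }
}
```

In the real agent, the model decides which tool to call and with what arguments; the loop structure (call model, execute any requested tools, feed results back, repeat) is the part this sketch illustrates.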
## Setup
- Install dependencies:

  ```bash
  npm install
  ```

- Create a `.env` file with your OpenAI API key:

  ```bash
  cp .env.example .env
  # Edit .env and add your OPENAI_API_KEY
  ```
## Running the Application
The application requires two processes: the MCP server and the agent.
**Terminal 1 - Start the MCP HTTP server:**

```bash
npm run mcp:http
```
**Terminal 2 - Run the agent:**

Interactive mode (chat with Santa's elf):

```bash
npm run dev
```
You'll chat with one of Santa's helpful elves who will:
- Listen to what presents you want
- Keep track of your list as you chat
- Intelligently detect when you're done (phrases like "that's all", "I'm done", etc.)
- Check if you're naughty/nice and submit your list to Santa
Type "exit" at any time to quit.
Example session:

```
🎅 Welcome to Santa's Present Request Service!
I'm one of Santa's elves, here to help you build your Christmas list.

You: I want a bike for Christmas
Elf: Great choice! I've added a bike to your list. What else would you like?

You: Also some books
Elf: Wonderful! Books are added. Anything else?

You: And a video game. That's all!
Elf: Perfect! Let me check if you've been naughty or nice...

🎄 Processing your request...

Elf: Good news! You've been nice this year! Santa has approved all 3 presents on your list.
```
**Single request mode:**

```bash
npm run dev "I want a PlayStation 5 and some games"
```

**Legacy stdio mode:**

```bash
npm run mcp  # Old stdio-based server (deprecated)
```

**Build for production:**

```bash
npm run build
npm start
```
## How It Works
1. **Startup**: MCP HTTP server starts on port 3100
2. **Connection**: Agent connects to the server via HTTP (`StreamableHTTPClientTransport`)
3. **Tool Registration**: MCP tools are converted to LangChain `DynamicStructuredTool`s
4. **User Request**: User enters a present request via the command line
5. **Agent Reasoning**: ChatGPT decides which tools to call based on the request
6. **Tool Execution**:
   - First calls `check_naughty_or_nice` via HTTP to get the user's status
   - Then calls `request_presents` with the naughty/nice score
7. **Response**: Agent returns Santa's decision to the user
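On the wire, each tool execution is a JSON-RPC request to the MCP server. The `tools/call` method name comes from the MCP specification; the exact endpoint path and headers used by this server are assumptions, so treat this as an illustrative payload rather than the server's documented contract:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "check_naughty_or_nice",
    "arguments": { "userName": "Alice" }
  }
}
```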
## MCP Tools
The MCP server communicates via HTTP and exposes two tools:
`check_naughty_or_nice`:
- Input: `userName` (required), `context` (optional)
- Returns: `{ status: 'naughty' | 'nice', score: 0-100, reason: string }`

`request_presents`:
- Input: `userName`, `presents[]`, `naughtyNiceScore`
- Returns: `{ approved: boolean, presents: Present[], message: string }`
- Logic: score >= 50 means all presents are approved; score < 50 means only low-priority items are approved
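The approval rule can be expressed directly in code. This is a minimal sketch of the documented logic only; the `Present` shape with a `priority` field is an assumption inferred from the "low priority items" wording, not the server's actual type:

```typescript
// Sketch of the documented approval rule: score >= 50 approves everything,
// score < 50 keeps only low-priority items. The Present type (in particular
// its "priority" field) is an assumption, not the server's real schema.

type Present = { name: string; priority: "low" | "medium" | "high" };
type RequestResult = { approved: boolean; presents: Present[]; message: string };

function requestPresents(presents: Present[], naughtyNiceScore: number): RequestResult {
  if (naughtyNiceScore >= 50) {
    return { approved: true, presents, message: "You've been nice - everything approved!" };
  }
  // Naughty: filter the list down to low-priority items only.
  const lowOnly = presents.filter((p) => p.priority === "low");
  return {
    approved: false,
    presents: lowOnly,
    message: "Naughty this year - only low-priority items approved.",
  };
}
```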