# marcusjhang/mcp_server
The BookOps MCP + LLM Project integrates a Model Context Protocol (MCP) server with Large Language Model (LLM) powered business logic to handle financial data queries. It exposes two tools:

- `explain_pnl`: explains profit and loss for a given book and date.
- `audit_exposure`: audits the main exposures for a given book.
## BookOps MCP + LLM Project
This repository demonstrates a fast setup for:
- An MCP server built with FastMCP over stdio (stdin/stdout) transport, sketched below.
- LLM-powered business logic inside the MCP tool functions.
- An agent that handles user input and orchestrates LLM function-calling + MCP tool invocation.
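For orientation, here is a minimal sketch of the server side, assuming FastMCP's decorator API. The tool names mirror this repo, but the bodies are placeholders, not the actual implementation:

```python
# Sketch of a server.py-style FastMCP server (placeholder tool bodies).
from fastmcp import FastMCP

mcp = FastMCP("BookOps")

@mcp.tool()
def explain_pnl(book_id: str, date: str) -> dict:
    """Explain profit and loss for a given book and date."""
    # The real tool loads dummy_data/trades.csv, computes PnL,
    # and asks the LLM for a natural-language summary.
    return {"book_id": book_id, "date": date, "pnl": 0.0, "summary": "..."}

@mcp.tool()
def audit_exposure(book_id: str) -> dict:
    """Audit the main exposures for a given book."""
    return {"book_id": book_id, "top_exposures": [], "summary": "..."}

if __name__ == "__main__":
    mcp.run()  # stdio transport by default, matching STDIO mode below
```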
### File Structure
```
.
├── agent.py          # LLM agent that handles queries and calls MCP via subprocess
├── server.py         # FastMCP server defining the tools explain_pnl & audit_exposure
├── dummy_data/
│   ├── trades.csv    # Contains book_id, date, symbol, qty, side, price
│   └── positions.csv # Contains book_id, symbol, sector, qty
└── README.md         # This documentation
```
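To make the dummy data concrete: one naive way `explain_pnl` could derive a number from `trades.csv` is to sum signed cash flows for the requested book and date. This aggregation is an illustrative assumption, not the repo's actual calculation:

```python
import pandas as pd

def naive_pnl(book_id: str, date: str) -> float:
    """Sum signed cash flows for one book on one date (illustrative only)."""
    trades = pd.read_csv("dummy_data/trades.csv")
    day = trades[(trades["book_id"] == book_id) & (trades["date"] == date)]
    # Sells bring cash in (+), buys pay cash out (-).
    signs = day["side"].str.lower().map({"sell": 1, "buy": -1})
    return float((signs * day["qty"] * day["price"]).sum())
```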
### Requirements
- Python 3.10+ (3.12 recommended)
- Install the dependencies:

  ```bash
  pip install fastmcp openai pandas python-dotenv
  ```

- Ensure your `.env` includes:

  ```
  OPENAI_API_KEY=sk-xxx
  ```
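Both scripts can then pick up the key at startup; the usual `python-dotenv` pattern (how the repo actually loads it is an assumption):

```python
import os
from dotenv import load_dotenv

load_dotenv()                            # reads .env from the working directory
api_key = os.environ["OPENAI_API_KEY"]   # raises KeyError if the key is missing
```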
---
### Running the System (STDIO Mode)
1. Start the MCP server (in one terminal):

   ```bash
   python server.py
   ```

   The server then waits for client calls over stdin/stdout.

2. Run the agent (in another terminal):

   ```bash
   python agent.py
   ```

3. Example interaction:

   ```
   Explain the PnL for book HF123 on 2024-06-01
   LLM chose explain_pnl …
   { "book_id": "HF123", "date": "...", "pnl": -72500.0, "summary": "…" }
   ```

   Or ask:

   ```
   What are the main exposures for HF123?
   LLM chose audit_exposure …
   { "book_id": "HF123", "top_exposures": [...], "summary": "..." }
   ```
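The `LLM chose …` lines come from OpenAI function-calling. Here is a sketch of the shape `FUNCTIONS` could take under the current tools-style schema; the exact schema in `agent.py` may differ:

```python
# Hypothetical shape of FUNCTIONS in agent.py (OpenAI tool-calling schema).
FUNCTIONS = [
    {
        "type": "function",
        "function": {
            "name": "explain_pnl",
            "description": "Explain profit and loss for a given book and date.",
            "parameters": {
                "type": "object",
                "properties": {
                    "book_id": {"type": "string"},
                    "date": {"type": "string", "description": "YYYY-MM-DD"},
                },
                "required": ["book_id", "date"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "audit_exposure",
            "description": "Audit the main exposures for a given book.",
            "parameters": {
                "type": "object",
                "properties": {"book_id": {"type": "string"}},
                "required": ["book_id"],
            },
        },
    },
]
```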
---
### How It Works

1. The agent uses LLM function-calling to choose a tool and its arguments from `FUNCTIONS`.
2. The agent launches the server subprocess and sends MCP JSON-RPC messages over stdio:
   - `initialize`
   - `notifications/initialized`
   - `tools/call`
3. The server processes the request, executing the business Python and making internal LLM calls.
4. The server returns a JSON response.
5. The agent captures and displays it.
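A condensed sketch of steps 2 to 5 as they might look in `agent.py`: the handshake fields follow the MCP spec's stdio framing (newline-delimited JSON-RPC), but the exact code and protocol version used in the repo are assumptions:

```python
import json
import subprocess

# Launch server.py as a subprocess and speak newline-delimited JSON-RPC to it.
proc = subprocess.Popen(
    ["python", "server.py"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

def send(msg: dict) -> None:
    proc.stdin.write(json.dumps(msg) + "\n")
    proc.stdin.flush()

# Handshake: initialize, read the response, then signal readiness.
send({"jsonrpc": "2.0", "id": 1, "method": "initialize",
      "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                 "clientInfo": {"name": "agent", "version": "0.1"}}})
init_response = json.loads(proc.stdout.readline())
send({"jsonrpc": "2.0", "method": "notifications/initialized"})

# Call the tool the LLM chose, with the arguments it produced.
send({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
      "params": {"name": "explain_pnl",
                 "arguments": {"book_id": "HF123", "date": "2024-06-01"}}})
print(proc.stdout.readline())  # JSON-RPC response containing the tool result
```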
### Next Steps & Future Improvements
- **Expand the Toolset**: add more BookOps analytical tools, such as historical PnL comparison, VaR (Value at Risk) calculations, or trade anomaly detection.
- **Add Session Memory and Context**: implement lightweight session memory so the agent can track past queries (e.g., remember the last book/date asked about, or support follow-up questions).
- **Switch to Persistent HTTP Deployment**: move from launching the MCP server per query (STDIO mode) to a persistent HTTP-based deployment (via Flask, FastAPI, or fastmcp-http-proxy); see the sketch below.
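For that last item, FastMCP itself can serve over HTTP, so the server-side change may be a one-liner; transport names vary by FastMCP version, so treat this as a sketch:

```python
# server.py: serve persistently over HTTP instead of per-query stdio.
# In recent FastMCP releases the transport is named "http"
# (older releases call it "streamable-http").
if __name__ == "__main__":
    mcp.run(transport="http", host="127.0.0.1", port=8000)
```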