damianboh/fmp_mcp_server
Financial Modeling Prep (FMP) MCP Server
Medium Article Explanation here: Build Your Own Financial Data MCP Server That ChatGPT Can Talk To
A lightweight, production-ready Model Context Protocol (MCP) server that brings real-time financial data directly to AI agents or LLM Apps — fundamentals, ratios, price data, transcripts, macro indicators, and more.
Overview
The Financial Modeling Prep MCP Server acts as a bridge between AI agents or LLM Apps and the Financial Modeling Prep (FMP) API, offering a structured and safe interface to query real-world market and company data.
This server gives you a single MCP endpoint your AI tools can call for:
- Fundamentals – Income statements, balance sheets, cash flows
- Valuation Metrics – P/E, P/B, ROE, margins, leverage
- News – Latest market or ticker-specific headlines
- Earnings Transcripts – Full text of company calls
- Macro Data – CPI, GDP, employment, and event calendars
- Insider Trades – Real-time Form-4 style transactions
Requirements
| Requirement | Version | Notes |
|---|---|---|
| Python | 3.9+ | Recommended: 3.10 or higher |
| Dependencies | httpx, mcp, uvicorn, starlette | Install via pip install -r requirements.txt |
| Environment | Set FMP_API_KEY | Defaults to "demo" with rate limits |
Getting a Financial Modeling Prep (FMP) API Key
To use this MCP server with full access (beyond the limited demo key), you’ll need your own FMP API key.
- Go to the official registration page: https://site.financialmodelingprep.com/register
- Create a free account using your email.
- After registration, navigate to your Dashboard → API Key section to get your API key.
Note that some FMP API endpoints are unavailable on the free tier.
Installation
git clone https://github.com/damianboh/fmp_mcp_server.git
cd fmp_mcp_server
pip install -r requirements.txt
Then set your API key:
export FMP_API_KEY=your_fmp_api_key_here # macOS/Linux
setx FMP_API_KEY your_fmp_api_key_here # Windows
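If FMP_API_KEY is not set, the server falls back to the rate-limited demo key (per the Requirements table). A minimal sketch of that lookup — resolve_fmp_api_key is an illustrative name, not the server's actual function:

```python
import os

def resolve_fmp_api_key() -> str:
    """Return the FMP API key from the environment, falling back to the
    rate-limited "demo" key the server uses by default."""
    return os.environ.get("FMP_API_KEY", "demo")

# Without the variable set, the limited demo key is used.
os.environ.pop("FMP_API_KEY", None)
print(resolve_fmp_api_key())  # demo
```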
Running the Server
The script supports three launch modes, selected with the --transport flag:
1. STDIO Mode (Default)
For local development and direct MCP use via CLI or ChatGPT desktop app.
python fmp_mcp_server.py
Equivalent to:
python fmp_mcp_server.py --transport stdio
This mode just communicates over standard input/output — ideal for embedding in local AI environments.
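In STDIO mode, a client writes newline-delimited JSON-RPC 2.0 messages to the server's stdin. A minimal sketch of one such request (tools/list is a standard MCP method; the initialization handshake that precedes it is omitted):

```python
import json

# A minimal JSON-RPC 2.0 request of the kind MCP clients send over stdio.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",  # ask the server which tools it exposes
}
print(json.dumps(request))
```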
2. SSE Mode (Server-Sent Events)
For streaming output via FastMCP’s SSE transport.
python fmp_mcp_server.py --transport sse
This is used by frameworks like LangChain MCP or OpenDevin that consume event streams.
3. Streamable HTTP Mode
Runs a Starlette / Uvicorn HTTP server for remote access (ideal for ChatGPT or cloud tunnels).
python fmp_mcp_server.py --transport streamable-http --host 127.0.0.1 --port 8000
You’ll see:
Starting FMP MCP Server (Streamable HTTP mode) on http://127.0.0.1:8000
API Key configured: Yes
Streamable HTTP endpoint (path hint): http://127.0.0.1:8000/mcp/
Then test:
curl http://127.0.0.1:8000/health
Expected:
{"status": "healthy", "service": "fmp-mcp-server"}
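You can also validate the health response programmatically; a stdlib sketch (is_healthy and check_health are illustrative names, not part of the server):

```python
import json
from urllib.request import urlopen

def is_healthy(payload: bytes) -> bool:
    """Check a /health response body for the expected status fields."""
    data = json.loads(payload)
    return data.get("status") == "healthy" and data.get("service") == "fmp-mcp-server"

def check_health(base_url: str = "http://127.0.0.1:8000") -> bool:
    """Fetch /health from a running server and validate it."""
    with urlopen(f"{base_url}/health", timeout=5) as resp:
        return is_healthy(resp.read())

# Validate the documented response body without needing a live server:
print(is_healthy(b'{"status": "healthy", "service": "fmp-mcp-server"}'))  # True
```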
Exposing the Server to the Web
This step is needed if you want LLM apps hosted on the web, such as ChatGPT, to reach your server.
Option A: Cloudflare Tunnel (Recommended)
Cloudflare is fast, free, and doesn’t require installing ngrok.
- Install Cloudflare Tunnel (cloudflared)
- Authenticate:
cloudflared login
- Run the tunnel:
cloudflared tunnel --url http://127.0.0.1:8000
Example output:
https://fmp-mcp-server-sg.trycloudflare.com
Copy this URL and use it as your MCP server endpoint inside ChatGPT or any MCP-capable client.
Option B: Ngrok
ngrok http 8000
Then use the forwarded URL in your MCP config, e.g.:
https://1234-56-78-90-123.ngrok-free.app/mcp/
Example: Using with ChatGPT (Custom MCP Server) (Paid Plan Required)
1. Open ChatGPT → Settings → Apps & Connectors → turn Developer Mode on. You need a paid (ChatGPT Plus) account to enable Developer Mode, and this mode is required for creating and connecting to your own custom MCP server.
2. Go back to Apps & Connectors and click Create.
3. Paste your tunnel URL:
https://whatever-subdomain-assigned-to-you.trycloudflare.com/mcp/
4. ChatGPT will now auto-discover all available tools and resources:
- /stable/profile → Company profile
- /stable/ratios → Financial ratios
- /stable/earning-call-transcript → Earnings transcript
- /stable/news → Stock news
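The /stable paths above are ordinary FMP REST endpoints that the server wraps. A stdlib sketch of how such a request URL is composed — the base URL and build_fmp_url are illustrative assumptions, not the server's actual code:

```python
from urllib.parse import urlencode

FMP_BASE = "https://financialmodelingprep.com"  # assumed FMP API base URL

def build_fmp_url(path: str, **params: str) -> str:
    """Compose a stable-API request URL like the ones the server calls."""
    params.setdefault("apikey", "demo")  # rate-limited default key
    return f"{FMP_BASE}{path}?{urlencode(params)}"

print(build_fmp_url("/stable/profile", symbol="AAPL"))
# https://financialmodelingprep.com/stable/profile?symbol=AAPL&apikey=demo
```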
You can now ask questions that draw on real financial data. The first time ChatGPT calls a tool, you will need to click "Confirm" to approve the call. From then on, ChatGPT knows about and can call the correct FMP MCP tools.
Included Tools
| Category | Tool | Description |
|---|---|---|
| Company Fundamentals | company_profile | Overview: price, market cap, CEO, sector, identifiers |
| Statements | income_statement, balance_sheet, cash_flow | Financials across periods |
| Ratios | financial_ratios | P/E, ROE, margins, leverage, etc. |
| Price Data | historical_price_eod_full | Full OHLCV daily bars |
| Transcripts | earnings_call_transcript | Management Q&A and remarks |
| Macroeconomics | economic_indicators, economic_calendar | GDP, CPI, NFP, and calendar events |
| News | stock_news_latest, stock_news_search | General and ticker-specific headlines |
| Insider Trades | insider_trading_latest | Form-4 type recent insider transactions |
| Utilities | ping, when_should_i_use_fmp | Health check and routing hint |
When to Use This Server
- You need factual stock or macro data
- You’re analyzing fundamentals or transcripts
- You’re building RAG or agentic pipelines that rely on finance data
Health Check
Verify the server is running:
curl http://127.0.0.1:8000/health
Response:
{"status": "healthy", "service": "fmp-mcp-server"}
Advanced Options
| Flag | Description | Default |
|---|---|---|
| --host | Host interface | 127.0.0.1 |
| --port | Server port | 8000 |
| --path | HTTP path prefix | /mcp/ |
| --stateless | Run in stateless HTTP mode | False |
| --json-response | Use JSON responses (instead of SSE) | False |
Example:
python fmp_mcp_server.py --transport streamable-http --stateless --json-response
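The flags above can be modeled with a standard argparse setup; this is an illustrative reconstruction of the documented interface, not the server's actual source:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Parser mirroring the documented flags and their defaults."""
    p = argparse.ArgumentParser(prog="fmp_mcp_server")
    p.add_argument("--transport", choices=["stdio", "sse", "streamable-http"],
                   default="stdio")
    p.add_argument("--host", default="127.0.0.1")
    p.add_argument("--port", type=int, default=8000)
    p.add_argument("--path", default="/mcp/")
    p.add_argument("--stateless", action="store_true")
    p.add_argument("--json-response", action="store_true")
    return p

args = build_parser().parse_args(
    ["--transport", "streamable-http", "--stateless", "--json-response"]
)
print(args.transport, args.stateless, args.json_response)  # streamable-http True True
```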
Quick Reference
| Purpose | Command |
|---|---|
| Run locally | python fmp_mcp_server.py |
| Stream SSE | python fmp_mcp_server.py --transport sse |
| HTTP endpoint | python fmp_mcp_server.py --transport streamable-http |
| Health check | curl http://127.0.0.1:8000/health |
| Expose via Cloudflare | cloudflared tunnel --url http://127.0.0.1:8000 |
| Connect to ChatGPT | Use the Cloudflare URL as MCP endpoint |
Have fun!
Cheers, Damian