adam-j-baron/mcp-finance-search-agents
Financial AI Agent with MCP Tools, including Live Web Search
This repository contains an experimental and educational project demonstrating how to build a financial AI agent using a local Large Language Model (LLM) and a custom tool server. The tool server includes live web search capability.
The project consists of two main files:
- mcp_server_yfinance_search.py: A Model Context Protocol (MCP) server that provides financial data and live web search tools.
- mcp_client_yfinance_search.ipynb: A Jupyter Notebook that acts as a client: it connects to the server, creates a ReAct agent, and runs a sample financial analysis query.
🚀 Getting Started
To run this project, first start the MCP server, then run the client notebook.
Step 1: Start the MCP Server
Open your terminal and run the Python server file. This will make the financial data and live web search tools available to your client.
```shell
python mcp_server_yfinance_search.py
```
Ensure the OLLAMA_API_KEY environment variable is set in your terminal beforehand, or the MCP server will not run. Get your free Ollama API key here: https://ollama.com/settings/keys
For Windows PowerShell users, the command is:
```powershell
$Env:OLLAMA_API_KEY = "your_api_key"
```
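On macOS/Linux shells (bash/zsh), the equivalent is:

```shell
# Replace the placeholder with your real key from https://ollama.com/settings/keys
export OLLAMA_API_KEY="your_api_key"
```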
Step 2: Run the Client Notebook
Open mcp_client_yfinance_search.ipynb in your Jupyter environment. You can then run the cells in the notebook sequentially to:
- Connect to the MCP server.
- Initialize the LLM (Ollama model).
- Create the ReAct agent.
- Ask questions and see the agent's financial analysis.
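The reason/act/observe cycle that the agent runs through can be sketched in a toy, dependency-free form. Everything below is illustrative only: the fake tool, the canned prices, and the scripted "reasoning" stand in for the real setup, where langgraph's prebuilt ReAct agent drives an Ollama model and the MCP tools.

```python
def fake_stock_tool(ticker: str) -> str:
    """Stand-in for the server's yfinance tool (returns canned data)."""
    prices = {"AAPL": 190.0, "MSFT": 410.0}
    return f"{ticker} last price: {prices.get(ticker, 'unknown')}"

def toy_react_agent(question: str, max_steps: int = 3) -> str:
    """Scripted stand-in for the LLM: decide to call a tool, then answer."""
    observations = []
    for _ in range(max_steps):
        if not observations:
            # Reason: no data yet, so act by calling the tool
            ticker = question.split()[-1]          # crude ticker extraction
            observations.append(fake_stock_tool(ticker))  # Observe the result
        else:
            # Reason: enough information gathered, produce a final answer
            return f"Based on the tool output ({observations[-1]}), done."
    return "No answer within the step budget."

print(toy_react_agent("What is the latest price of AAPL"))
```

The real agent differs only in who does the reasoning: the model decides which tool to call and when to stop, while the loop structure stays the same.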
🔧 Project Details
This project uses the following key technologies:
- MCP (Model Context Protocol): For building and communicating with the tool server.
- yfinance: The library used by the server to fetch financial data.
- ollama: A local LLM used to power the AI agent. Ollama recently added web_search and web_fetch tools for access to live web results.
- langgraph: A library for building agents with reasoning and action capabilities (ReAct).
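To show how these pieces fit together, here is a dependency-free sketch of the tool-server pattern: tools are registered with a name and description, then dispatched by name, which is roughly what an MCP server does over its protocol. The decorator, registry, and stubbed data are all hypothetical stand-ins; the real server registers its tools through the MCP SDK and backs them with yfinance and Ollama's web search.

```python
TOOLS = {}

def tool(description):
    """Register a function as a callable tool (toy MCP-style registry)."""
    def register(fn):
        TOOLS[fn.__name__] = {"description": description, "fn": fn}
        return fn
    return register

@tool("Return the latest stock price for a ticker (stubbed).")
def get_stock_price(ticker: str) -> float:
    # The real server would query yfinance here
    stub_prices = {"AAPL": 190.0}
    return stub_prices.get(ticker.upper(), float("nan"))

@tool("Search the live web for a query (stubbed).")
def web_search(query: str) -> str:
    # The real server would call Ollama's web_search here
    return f"(stub) top result for: {query}"

def call_tool(name, **kwargs):
    """Dispatch a registered tool by name, as an MCP client would."""
    return TOOLS[name]["fn"](**kwargs)

print(call_tool("get_stock_price", ticker="aapl"))
```

The agent never imports yfinance directly; it only sees the tool names and descriptions, which is what lets the same agent work against any MCP server.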
This is a simplified setup intended for learning. It's not designed for production use, as noted in the notebook, due to the way connections are handled for ease of experimentation.