Financial AI Agent Crew
This repository contains an experimental and educational project demonstrating how to build a Financial AI Agent Crew using a local Large Language Model (LLM) and a custom tool server. The tool server includes tools to access yfinance data and run live web searches.
The project consists of two main files:
- `mcp_server_financial_tools.py`: A Model Context Protocol (MCP) server that provides financial data and live web search tools.
- `financial_agent_crew.ipynb`: A Jupyter Notebook that acts as a client to connect to the MCP server, creates a Financial AI Agent Crew, performs a complex multi-step analysis for a single stock, and generates a report summarizing the stock analysis.
🚀 Getting Started
To run this project, you need to first start the MCP server, and then run the client notebook.
Step 1: Start the MCP Server
Open your terminal and run the Python server file. This will make the financial data and live web search tools available to your client.
```shell
python mcp_server_financial_tools.py
```
Ensure that the `OLLAMA_API_KEY` environment variable is set in your terminal beforehand, or the MCP server will not run. This key is needed for the live web search. Get your free Ollama API key here: https://ollama.com/settings/keys
For Windows PowerShell users, the command is:

```shell
$Env:OLLAMA_API_KEY = "your_api_key"
```
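On macOS or Linux, the equivalent command (assuming a POSIX shell such as bash or zsh) is:

```shell
# Set the key for the current shell session only
export OLLAMA_API_KEY="your_api_key"
```

The variable lasts only for the current session; add the line to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`) to make it persistent.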
Step 2: Run the Client Notebook
Open financial_agent_crew.ipynb in your Jupyter environment. You can then run the cells in the notebook sequentially to:
- Connect to the MCP Server. The server must already be running (see Step 1), with the `OLLAMA_API_KEY` environment variable set in that terminal.
- Retrieve and list all the available MCP Tools.
- Define the LLM to be used by each Agent. I'm using the same Ollama model locally (llama3.1), but I use different `temperature` settings to give different agents more creativity (i.e. a higher temperature).
- Define the Agents to be used in the Crew. Each Agent has a specialized ability (e.g. analyze stock prices, analyze estimates data, run web searches, write stock reports). I've enabled `reasoning` for all Agents since these tasks are somewhat complex and need some autonomous thought.
- Define the Tasks to be run by the Crew. It's important that follow-on Tasks receive the `context` from the previous Task they depend on (e.g. an Agent can't search for bullish analyst reasons if it doesn't know who the most bullish analysts are).
- Define the Crew. Since the process is `sequential`, the order of Tasks matters.
- Review the final Stock Report.
- (Optional) Debugging Zone to explore individual Tools, Agents and Tasks.
🔧 Project Details
This project uses the following key technologies:
- MCP (Model Context Protocol): For building and communicating with the tool server.
- `yfinance`: The library used by the server to fetch financial data.
- `ollama`: A local LLM used to power the AI agents. Ollama recently added `web_search` and `web_fetch` tools for access to live web results.
- `crewai`: A library for building agent crews to perform multi-step tasks.
This is a simplified setup intended for learning. It's not designed for production use, as noted in the notebook, due to the way connections are handled for ease of experimentation.