# Amazon Reviews MCP Server (FastMCP)
This repo contains a Model Context Protocol (MCP) server built with FastMCP that exposes an `amazon_reviews` dataset and tools to analyze brand mentions. It also includes a small Python demo client.
## What’s included
- `server.py` — FastMCP server exposing:
  - resource: `amazon_reviews_csv` (first ~1 MB of the CSV)
  - tool: `dataset_info()` — columns & quick row count
  - tool: `count_brand_mentions(brand, text_column=None, regex=False)` — counts mentions
  - tool: `sample_reviews_with_brand(brand, limit=5, text_column=None)` — example snippets
  - tool: `explain_result_with_llm(brand, provider='openai'|'ollama', model=...)` — optional LLM explanation
- `demo_client.py` — minimal Python client that launches the server over stdio and calls the tools
- `requirements.txt`
## Quick start
1. Place the dataset

   Put your CSV at one of:
   - Same folder, named `amazon_reviews.csv`, or
   - Set an environment variable:

     ```bash
     export AMAZON_REVIEWS_CSV=/absolute/path/to/amazon_reviews.csv
     ```

2. Install dependencies

   ```bash
   python -m venv .venv && source .venv/bin/activate
   pip install -r requirements.txt
   ```
3. Run the server (for inspection)

   ```bash
   python server.py
   ```

   The server speaks MCP over stdio; use an MCP-compatible client or the included demo.

4. Run the demo client

   ```bash
   python demo_client.py Apple
   ```

   It will:
   - list available tools
   - call `count_brand_mentions("Apple")`
   - fetch a few sample snippets
   - (optionally) ask an LLM to summarize the finding if configured
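Step 1's lookup order (environment variable first, then a CSV beside the server) can be sketched as below. `resolve_dataset_path` is a hypothetical helper name, not necessarily what `server.py` uses.

```python
import os
from pathlib import Path

def resolve_dataset_path(script_dir: Path = Path(".")) -> Path:
    """Hypothetical helper: the env var takes precedence over a local file."""
    env = os.environ.get("AMAZON_REVIEWS_CSV")
    if env:
        return Path(env)
    # Fall back to amazon_reviews.csv next to the server script
    return script_dir / "amazon_reviews.csv"
```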
## Optional: LLM explanation
You can ask the server to generate a 1‑paragraph explanation of the metric using OpenAI or Ollama.
### OpenAI

```bash
export OPENAI_API_KEY=sk-...
python demo_client.py Apple
```

The demo calls `explain_result_with_llm(provider='openai', model='gpt-4o-mini')` via the server.
### Ollama (local)

```bash
# Ensure Ollama is running locally (defaults to http://localhost:11434)
# e.g., ollama run llama3.1
export OLLAMA_HOST=http://localhost:11434
python demo_client.py Apple
```
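Provider selection driven by these environment variables might look roughly like the following; `pick_llm_provider` and its return shape are purely illustrative, not the server's actual API.

```python
import os

def pick_llm_provider(requested=None):
    """Hypothetical: choose provider settings from env vars.

    An explicit request wins; otherwise OLLAMA_HOST in the environment
    selects Ollama, and OpenAI is the default.
    """
    if requested == "ollama" or (requested is None and "OLLAMA_HOST" in os.environ):
        return {
            "provider": "ollama",
            "host": os.environ.get("OLLAMA_HOST", "http://localhost:11434"),
        }
    return {"provider": "openai", "model": "gpt-4o-mini"}
```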
## Grading notes

- FastMCP is used to define the MCP server (`server.py`) and its tools/resources.
- Tool requirement: `count_brand_mentions(brand)` implements the brand mention count (case-insensitive and word-boundary by default; can switch to regex).
- Demo: `demo_client.py` launches the MCP server as a subprocess via stdio and calls the tools end-to-end.
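The default matching behavior described above (case-insensitive, word-boundary, with an optional raw-regex mode) can be sketched like this; the standalone function name and signature are illustrative, not copied from `server.py`:

```python
import re

def count_brand_mentions(texts, brand, use_regex=False):
    """Count occurrences of a brand across review texts.

    By default the brand is escaped and wrapped in word boundaries, so
    "Apple" matches "Apple" but not "Applesauce"; with use_regex=True
    the brand string is treated as a raw regular expression.
    """
    pattern = brand if use_regex else r"\b" + re.escape(brand) + r"\b"
    rx = re.compile(pattern, re.IGNORECASE)
    return sum(len(rx.findall(text)) for text in texts)
```

For example, `count_brand_mentions(["I love my apple iPhone", "Applesauce"], "Apple")` returns 1: the word boundary excludes "Applesauce".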
## Troubleshooting

- If you see `Dataset not found`, set `AMAZON_REVIEWS_CSV` or place the CSV beside `server.py`.
- If your dataset's text column isn't `review_body`, the server will auto-detect a likely column (e.g., `reviewText`, `reviews.text`, etc.), or you can pass `text_column` explicitly.
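One common way to implement that auto-detection is to check a ranked list of likely column names with a keyword fallback; the candidate list and helper name below are illustrative, not taken from `server.py`.

```python
def detect_text_column(columns):
    """Pick the most likely review-text column from a list of column names."""
    # Hypothetical candidates, checked in priority order
    candidates = ["review_body", "reviewText", "reviews.text", "text", "body"]
    for name in candidates:
        if name in columns:
            return name
    # Fall back to any column whose name mentions "review" or "text"
    for name in columns:
        if "review" in name.lower() or "text" in name.lower():
            return name
    return None
```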