# Newsroom MCP Server
This repository contains a fully deterministic Model Context Protocol (MCP) server that simulates a local newsroom workflow. The goal is to showcase how a client can orchestrate article ingestion, entity analysis, fact checking, ranking, and digest delivery using a chain of MCP tools without relying on third-party APIs.
## What's Included
- Sample article corpora in `resources/sample_articles.json`
- A suite of lightweight tools in `tools/` that implement each newsroom stage
- Typed models in `newsroom/types.py` so the generated tool schemas stay informative
- A `server.py` entrypoint that registers the tools with FastMCP (sketched below)
- A `main.py` helper that runs the entire pipeline locally for verification
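A minimal sketch of what that FastMCP registration pattern typically looks like, using the `mcp` Python SDK; the server name and tool body here are illustrative, not the repository's actual code:

```python
# Illustrative sketch of the registration pattern, not the repository's exact code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("newsroom")

@mcp.tool()
def fetch_articles(source: str = "sample") -> list[dict]:
    """Return raw articles from a named source or an RSS URL."""
    ...  # the real logic lives in tools/

if __name__ == "__main__":
    mcp.run()  # FastMCP speaks MCP over stdio by default
```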
## Quickstart

- Install dependencies (the project uses uv):

  ```
  uv sync
  ```

- Run the local demo pipeline and inspect its output:

  ```
  uv run python main.py
  ```

  The script prints every intermediate payload so you can see how data flows from article fetches to a compiled digest.
**Live sources:** `fetch_articles` now supports RSS URLs (for example, `https://rss.cnn.com/rss/cnn_topstories.rss`) when the server has outbound network access. The call automatically falls back to the canned dataset when you pass a named source such as `"sample"`.
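If you want to replicate that URL-versus-named-source dispatch in your own code, here is a hedged sketch; it assumes the third-party `feedparser` package, and the dictionary shape is illustrative rather than the repository's actual schema:

```python
import json
from pathlib import Path

import feedparser  # third-party RSS parser (pip install feedparser)

def fetch_articles(source: str = "sample") -> list[dict]:
    """Fetch entries from an RSS URL, or fall back to the canned dataset."""
    if source.startswith(("http://", "https://")):
        feed = feedparser.parse(source)
        return [
            {"title": entry.get("title", ""), "content": entry.get("summary", "")}
            for entry in feed.entries
        ]
    # Named sources such as "sample" resolve to the bundled corpus.
    return json.loads(Path("resources/sample_articles.json").read_text())
```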
## Running the MCP Server
You can bring the tools into any MCP-compatible client. Two handy options during development are:
- Using the MCP CLI:

  ```
  uv run mcp dev ./server.py
  ```

  The CLI prints the JSON messages exchanged over stdio and is useful for debugging.
- Embedding the server in an MCP-aware IDE/agent: point the client at `python server.py` (or the equivalent `uv run python server.py`) and it will speak MCP over stdio.
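A third option is a programmatic client. The sketch below uses the official `mcp` Python SDK to spawn the server and call a tool over stdio; the tool name and arguments anticipate the flow in the next section:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(command="uv", args=["run", "python", "server.py"])

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.list_tools()
            print([tool.name for tool in listing.tools])
            result = await session.call_tool("fetch_articles", {"source": "sample"})
            print(result.content)

asyncio.run(main())
```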
## Recommended Tool Flow
While the tools are designed to be composable, the example workflow below mirrors the logic in `main.py`:

1. `fetch_articles(source="sample")`
2. `extract_passages(article_id, content)` for each article
3. `extract_entities(passages)`
4. `disambiguate_entities(entities)`
5. `tag_entities(resolved_entities)`
6. `classify_topic(passages)` and `analyze_sentiment(passages)` as needed
7. `summarize_tags(tagged_entities, passages)`
8. `fact_check(claims)` (optional)
9. `rank_stories(user_profile, tag_summaries, articles)`
10. `compile_digest(ranked_summaries, format="markdown")`
11. `deliver_digest(digest, delivery_channel, user_id)`
Each tool returns structured JSON so downstream steps can consume the results directly.
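Expressed as straight-line Python, the same flow might look like the sketch below. The import layout and the article/user-profile dictionary shapes are assumptions for illustration; consult `main.py` for the authoritative version:

```python
# Straight-line sketch of the flow above. The import layout is an assumption;
# check how main.py actually wires the tools in this repository.
from tools import (
    fetch_articles, extract_passages, extract_entities, disambiguate_entities,
    tag_entities, classify_topic, analyze_sentiment, summarize_tags,
    rank_stories, compile_digest, deliver_digest,
)

articles = fetch_articles(source="sample")
tag_summaries = []
for article in articles:
    # Assumes each article dict carries "id" and "content" keys.
    passages = extract_passages(article["id"], article["content"])
    entities = extract_entities(passages)
    tagged = tag_entities(disambiguate_entities(entities))
    classify_topic(passages)     # optional enrichment
    analyze_sentiment(passages)  # optional enrichment
    tag_summaries.append(summarize_tags(tagged, passages))

ranked = rank_stories(user_profile={"topics": ["technology"]},
                      tag_summaries=tag_summaries, articles=articles)
digest = compile_digest(ranked, format="markdown")
deliver_digest(digest, delivery_channel="stdout", user_id="demo-user")
```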
## Enabling LLM-Backed Steps
Several tools can optionally call ChatGPT for richer reasoning. To enable this path:
- Export your OpenAI key (`export OPENAI_API_KEY=sk-...`).
- Optionally pick a model with `NEWSROOM_OPENAI_MODEL` (defaults to `gpt-4o-mini`).
- Toggle the LLM-aware tools by setting `NEWSROOM_USE_LLM=true` before running the demo or starting the server.

When the environment variables are missing, the pipeline stays fully deterministic and uses the rule-based fallbacks outlined in `newsroom/llm.py`.
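The opt-in logic amounts to an environment-variable check before each LLM-aware tool runs. Here is a minimal sketch of that gating pattern; all function bodies are hypothetical stand-ins, with the real fallbacks living in `newsroom/llm.py`:

```python
import os

def llm_enabled() -> bool:
    """Opt in only when the flag is set and a key is available."""
    return (
        os.environ.get("NEWSROOM_USE_LLM", "").lower() == "true"
        and bool(os.environ.get("OPENAI_API_KEY"))
    )

def classify_with_keywords(passages: list[str]) -> str:
    """Deterministic fallback: naive keyword match (hypothetical rules)."""
    text = " ".join(passages).lower()
    return "technology" if "chip" in text else "general"

def classify_with_llm(passages: list[str]) -> str:
    """LLM-backed path; body elided, would call the OpenAI API."""
    raise NotImplementedError

def classify_topic(passages: list[str]) -> str:
    return classify_with_llm(passages) if llm_enabled() else classify_with_keywords(passages)
```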
## Testing Changes
The quickest way to validate modifications is to run `uv run python main.py` after your changes. Because the dataset is static, the output should remain deterministic unless you intentionally alter the business logic.
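If you prefer an automated check, a snapshot-style test can exploit that determinism. A sketch, assuming a saved snapshot file at a hypothetical path:

```python
import subprocess
from pathlib import Path

def test_pipeline_output_is_stable():
    """Run the demo and diff its stdout against a saved snapshot."""
    result = subprocess.run(
        ["uv", "run", "python", "main.py"],
        capture_output=True, text=True, check=True,
    )
    snapshot = Path("tests/pipeline_snapshot.txt")  # hypothetical snapshot file
    assert result.stdout == snapshot.read_text()
```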
## Next Steps
- Swap `resources/sample_articles.json` for feeds from your CMS
- Extend the keyword lists or plug in an LLM for richer analysis
- Wire the delivery tool to your messaging stack once you are ready for real side effects