What Hearst’s AI Playbook Can Teach Smaller Newsrooms

I first came across this in the BoSacks newsletter. The original article — Hearst Newspapers leverages AI for a human-centred strategy by Paula Felps at INMA — lays out how Hearst is rolling out AI across its network.

Now, you might be thinking: “That’s great for a chain with San Francisco-based innovation teams and a dozen staffers dedicated to new tools… but what about us smaller or niche outlets that don’t have a DevHub?”

That’s exactly why this is worth paying attention to. Hearst’s approach isn’t just about expensive tech — it’s about structure, guardrails, and culture. Those translate no matter the newsroom size.

Hearst’s AI Guiding Principles

✅ What We Do

  • Embrace generative AI responsibly.
  • Stay aligned with Legal and leadership.
  • Involve newsrooms and journalists across the organization.
  • Create scalable tools that help journalists.
  • Keep humans deeply involved.

🚫 What We Don’t Do

  • Tarnish our brands for quick wins.
  • Mass-publish AI-generated slop.
  • Mislead our audience or avoid transparency.
  • Let bots run without oversight.
  • Do nothing out of fear of change.

Here’s the big picture:

  • Clear principles: They’ve drawn a hard line on what AI will and won’t do. It’s in writing. It’s shared. And everyone’s on the same page.
  • Human-first workflows: Every AI-assisted output gets human review. No shortcuts.
  • Small tools, big wins: Their AI isn’t all moonshots. Some of the biggest gains come from automating grunt work — things every newsroom wrestles with.

Why smaller newsrooms should take notes

  • You might not have a Slack-integrated bot like Hearst’s Producer-P, but you could set up a lightweight GPT workflow for headlines, SEO checks, or quick summaries.
  • You probably can’t scrape and transcribe every public meeting in the state, but you could start with one high-value local board or commission using free/cheap transcription paired with keyword alerts.
  • You might not launch a public-facing Chow Bot, but you could make a reader tool that solves one local pain point — from school board jargon busters to a property tax appeal explainer.
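The transcript idea above is easier to pilot than it sounds. Here's a minimal sketch, assuming you already have a plain-text transcript from any free or cheap transcription service, plus a hypothetical watch-list of keywords for one local board (the keywords and sample transcript below are illustrative, not from the article):

```python
import re

# Hypothetical watch-list for one local board -- tune this to your beat.
KEYWORDS = ["rezoning", "budget", "contract", "lawsuit", "variance"]

def flag_transcript(transcript: str, keywords=KEYWORDS):
    """Return (line_number, line) pairs that mention any watched keyword."""
    pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
    hits = []
    for num, line in enumerate(transcript.splitlines(), start=1):
        if pattern.search(line):
            hits.append((num, line.strip()))
    return hits

# A made-up snippet of a planning commission transcript, for illustration.
sample = """Call to order at 6:02 p.m.
Item 3: rezoning request for 114 Elm Street.
Public comment on the sidewalk repair contract.
Adjourned at 8:15 p.m."""

for num, line in flag_transcript(sample):
    print(f"line {num}: {line}")
```

That's the whole "keyword alert": a reporter skims the flagged lines instead of a three-hour meeting, and decides what's worth a follow-up call.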

The secret here isn’t deep pockets — it’s intentional design. Hearst put thought into categories (digital production, news gathering, audience tools), built policies to match, and then trained their people. That part costs time, not millions.

As Tim O’Rourke of Hearst put it:

“We try to build around the expertise in our local newsrooms. That’s our value — not the tech.”

For smaller outlets, that’s the blueprint. Start with what you do best. Add AI where it can actually save time or uncover new reporting angles. Keep your humans in control. And make sure your audience always knows you value accuracy over speed.


Quick wins for small newsrooms

  • Write your own “What We Do / What We Don’t Do” AI policy in plain language.
  • Pick one workflow bottleneck and pilot an AI tool to tackle it.
  • Build an internal “AI tips” Slack channel or email chain to share wins and lessons.

You don’t need a DevHub to start. You just need a plan — and maybe the courage to experiment without losing sight of your values.

AI in the Newsroom: Why It Should Be Your Smartest Intern, Not Your Star Reporter

Practical AI tools and governance tips for small and niche newsrooms that want smarter reporting, not robot reporters.

If you’ve been anywhere near a journalism conference in the past year, you’ve probably heard the AI hype: “It’s going to replace reporters.” “It’s the future of investigative journalism.” “It’s going to write all our stories for us.”

But here’s the reality check, courtesy of journalist-technologist Jaemark Tordecilla — someone who’s actually been in the trenches building AI for newsrooms. In a recent INMA piece, Tordecilla put it plainly: AI is a terrible journalist. It doesn’t chase leads, smell a rat, or spot the story between the lines. What it does do exceptionally well is the grunt work — the sifting, sorting, and summarizing that lets you get to the important stuff faster.

And that’s the mental shift small and niche news organizations need to make: stop asking AI to be the reporter, and start asking it to make your reporters’ jobs easier.


Tools That Complement, Not Replace, Human Skill

If you’re running a small newsroom with limited staff, think of AI as your hyper-efficient intern — one that doesn’t sleep, doesn’t take lunch breaks, and doesn’t mind doing the boring bits.

Here are a few practical tools you could build or adopt:

  • Data Sifters
    AI models that can ingest giant PDF reports, meeting transcripts, or spreadsheets and spit out bullet-point summaries or proposed headlines. Your reporter glances at the output and decides if it’s worth a deeper dive.
  • Budget Chatbots
    Modeled on Tordecilla’s tool for “chatting” with the Philippines’ 700,000-line national budget. For local publishers, this could mean feeding your city or county budget into an AI tool and asking questions like “How much did we spend on police overtime last year?” or “Which departments’ budgets increased the most?”
  • Pattern Spotters
    Tools that flag anomalies or trends in datasets — e.g., tracking how often a government department awards contracts to the same vendor, or how property sales spike in certain neighborhoods.
  • Fast-Format Converters
    AI-assisted workflows that can take a long-form investigative article and quickly produce a podcast script, social video captions, or illustrated explainers. The key: these formats should be reviewed and fine-tuned by humans before publishing.
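Of the four, the pattern spotter is the easiest to prototype, and it doesn't even need an AI model to start. Here's a hedged sketch assuming your contract awards are exported as a CSV with hypothetical `vendor` and `amount` columns (the column names and sample data are assumptions for illustration):

```python
import csv
import io
from collections import Counter

def repeat_vendors(csv_text: str, min_awards: int = 3):
    """Flag vendors that appear in the award log at least `min_awards` times."""
    rows = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row["vendor"].strip() for row in rows)
    return {vendor: n for vendor, n in counts.items() if n >= min_awards}

# Made-up award log for illustration.
log = """vendor,amount
Acme Paving,120000
Acme Paving,98000
Elm St. Electric,45000
Acme Paving,134000
"""

print(repeat_vendors(log))
```

A dozen lines like this won't find the story, but they'll point a reporter at the vendor worth asking about. Layering an AI summarizer on top comes later, if at all.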

The Governance Question: Who’s Driving This Thing?

If AI is going to become part of your newsroom’s workflow, you need rules of the road. For small and niche publishers, governance doesn’t have to be a 40-page corporate policy, but it does need to answer some core questions:

  • Transparency: Will you disclose when AI is used in research, production, or content creation? How?
  • Attribution: Who “owns” AI-generated outputs in your newsroom — and how do you credit sources if AI pulls from third-party data?
  • Bias Checks: How will you review AI-generated summaries or insights for skew, especially when dealing with politically sensitive topics?
  • Ethical Boundaries: Where will you not use AI? (For example, generating deepfake-like images of people, or creating composite quotes.)
  • Review Protocol: Who signs off on AI-assisted work before it goes public? Even small teams should have a second set of eyes on anything AI touches.

A lightweight governance structure might be as simple as a one-page “AI Use Policy” taped to the newsroom wall. The important part is that everyone knows the rules — and follows them.


Why This Matters for Small Newsrooms

Big national outlets can afford to burn cycles experimenting with AI. You probably can’t. That’s why your AI playbook should focus on high-leverage tasks: the work that’s essential but time-consuming, where AI can give you a multiplier effect without compromising your credibility.

The payoff? More time for your reporters to be out in the community, making calls, filing FOIA requests, and doing the human work AI can’t touch.


Memorable Takeaway:
“AI is good at finding patterns in data; humans are good at finding meaning in those patterns. Keep it that way.”