Practical AI tools and governance tips for small and niche newsrooms that want smarter reporting, not robot reporters.
If you’ve been anywhere near a journalism conference in the past year, you’ve probably heard the AI hype: “It’s going to replace reporters.” “It’s the future of investigative journalism.” “It’s going to write all our stories for us.”
But here’s the reality check, courtesy of journalist-technologist Jaemark Tordecilla — someone who’s actually been in the trenches building AI for newsrooms. In a recent INMA piece, Tordecilla put it plainly: AI is a terrible journalist. It doesn’t chase leads, smell a rat, or spot the story between the lines. What it does do exceptionally well is the grunt work — the sifting, sorting, and summarizing that lets you get to the important stuff faster.
And that’s the mental shift small and niche news organizations need to make: stop asking AI to be the reporter, and start asking it to make your reporters’ jobs easier.
Tools That Complement, Not Replace, Human Skill
If you’re running a small newsroom with limited staff, think of AI as your hyper-efficient intern — one that doesn’t sleep, doesn’t take lunch breaks, and doesn’t mind doing the boring bits.
Here are a few practical tools you could build or adopt:
- Data Sifters: AI models that ingest giant PDF reports, meeting transcripts, or spreadsheets and spit out bullet-point summaries or proposed headlines. Your reporter glances at the output and decides if it’s worth a deeper dive.
- Budget Chatbots: Modeled on Tordecilla’s tool for “chatting” with the Philippines’ 700,000-line national budget. For local publishers, this could mean feeding your city or county budget into an AI tool and asking questions like: How much did we spend on police overtime last year? or Which departments’ budgets increased the most?
- Pattern Spotters: Tools that flag anomalies or trends in datasets, such as how often a government department awards contracts to the same vendor, or how property sales spike in certain neighborhoods.
- Fast-Format Converters: AI-assisted workflows that take a long-form investigative article and quickly produce a podcast script, social video captions, or illustrated explainers. The key: these formats should be reviewed and fine-tuned by humans before publishing.
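A "pattern spotter" doesn't have to be exotic. Here's a minimal sketch in Python using pandas that flags vendors capturing an outsized share of a department's contract spending. The column names, the sample data, and the 60% threshold are all assumptions for illustration; swap in your own procurement export and tune the threshold to your beat.

```python
# Hypothetical sketch: flag vendors winning a suspiciously large share
# of a department's contract dollars. Column names and the 60% cutoff
# are assumptions -- adapt them to your own data.
import pandas as pd

# In practice you'd load your procurement export instead, e.g.:
# contracts = pd.read_csv("contracts.csv")
contracts = pd.DataFrame({
    "department": ["Public Works", "Public Works", "Public Works",
                   "Public Works", "Parks", "Parks"],
    "vendor": ["Acme Paving", "Acme Paving", "Acme Paving",
               "Roadstar Inc", "GreenCo", "TurfPro"],
    "amount": [120_000, 95_000, 110_000, 40_000, 15_000, 18_000],
})

# Each contract's share of its department's total spending.
totals = contracts.groupby("department")["amount"].transform("sum")
contracts["share"] = contracts["amount"] / totals

# Sum shares per (department, vendor) pair.
vendor_share = (contracts.groupby(["department", "vendor"])["share"]
                .sum().reset_index())

# Flag any vendor capturing more than 60% of a department's spending.
flags = vendor_share[vendor_share["share"] > 0.60]
print(flags)
```

Run against real data, a script like this surfaces leads, not conclusions; the flagged rows are where a reporter starts making calls.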
The Governance Question: Who’s Driving This Thing?
If AI is going to become part of your newsroom’s workflow, you need rules of the road. For small and niche publishers, governance doesn’t have to be a 40-page corporate policy, but it does need to answer some core questions:
- Transparency: Will you disclose when AI is used in research, production, or content creation? How?
- Attribution: Who “owns” AI-generated outputs in your newsroom — and how do you credit sources if AI pulls from third-party data?
- Bias Checks: How will you review AI-generated summaries or insights for skew, especially when dealing with politically sensitive topics?
- Ethical Boundaries: Where will you not use AI? (For example, generating deepfake-like images of people, or creating composite quotes.)
- Review Protocol: Who signs off on AI-assisted work before it goes public? Even small teams should have a second set of eyes on anything AI touches.
A lightweight governance structure might be as simple as a one-page “AI Use Policy” taped to the newsroom wall. The important part is that everyone knows the rules — and follows them.
Why This Matters for Small Newsrooms
Big national outlets can afford to burn cycles experimenting with AI. You probably can’t. That’s why your AI playbook should focus on high-leverage tasks: the work that’s essential but time-consuming, where AI can give you a multiplier effect without compromising your credibility.
The payoff? More time for your reporters to be out in the community, making calls, filing FOIA requests, and doing the human work AI can’t touch.
Memorable Takeaway:
“AI is good at finding patterns in data; humans are good at finding meaning in those patterns. Keep it that way.”