AI Could Save Local Journalism – or Bury It in Misinformation
There's much we don't know about the future of AI, but Dr Adam North discusses how it could make or break local journalism.
Local news has historically held a special role in communities. It’s often the most trusted source of information because it’s grounded in proximity: your local school, hospital, street, sports clubs, and council. Readers can see journalists at events, recognise their stories, and know they’re part of the same world. Yet local news is disappearing.
How Could AI Help?
In an era when local journalism is vanishing faster than local pubs, artificial intelligence might seem like an unlikely saviour. Newsrooms have been gutted, reporters laid off, and many communities, especially rural or lower-income areas, are now “news deserts”. However, as AI becomes more sophisticated, there’s growing interest in whether it can help reverse this decline by making local journalism cheaper, faster, and more scalable.
The promise is compelling. AI can automate routine reporting, such as local council and election updates, police reports, school board decisions, or sports results and summaries. It can also edit articles and perform production tasks quickly and cost-effectively. AI can transcribe meetings, translate community content into multiple languages, or even identify patterns in local data that might otherwise go unnoticed, such as sudden spikes in rent prices or pothole complaints. For underfunded local papers, this kind of support isn’t just helpful; it could be a matter of survival.
AI-assisted reporting could therefore enable small outlets to stretch limited resources further, freeing human journalists to focus on in-depth investigations and original interviews.
AI can also lower the barrier to entry for new media startups. A neighbourhood newsletter doesn’t need a full editorial staff if it can use AI to help curate and generate articles based on verified public data. Combined with human oversight, AI can give a voice to communities that have long been ignored by legacy media, which is what we intend to do at The Northern Rose.
Could The Same Tools Used to Rescue Local Journalism Also Destroy Its Credibility?
The threat isn’t theoretical. AI-generated content can be misused to spread disinformation, create fake stories about local crime, stoke racial tensions, or fabricate quotes from officials. In polarised communities or politically volatile areas, the damage from such disinformation can be severe, as we saw with the Southport riot in 2024. What’s more, generative AI tools can produce content that sounds plausible, making it harder for readers to distinguish fact from fiction.
The economics of AI-generated content make this even more dangerous. Whereas traditional disinformation campaigns require time, money, and a team, AI can now create dozens of “local news” websites filled with clickbait or ideological propaganda at almost no cost. In the US, there are already examples of politically motivated actors setting up entire networks of AI-generated local news sites, using templates that sound authoritative but are designed to push disinformation.
While AI might make it easier to increase local reporting, quantity doesn’t guarantee quality. Journalism isn’t just about recounting events; it’s about context, accountability, and trust. An AI can write that a planning application was approved; it can’t ask the awkward follow-up question about who profits from the development. Without a human watchdog, civic corruption and malpractice can slip by unnoticed under the guise of automated efficiency.
There’s also the risk of homogenisation. If AI systems trained on national or global datasets start churning out “local” content, stories may lose their cultural specificity and relevance. Communities are not algorithms. The way people talk about issues in London is different from the way they talk about them in Manchester. Local journalism, at its best, reflects the voice of its people and not the statistical median of a training dataset.
Where Does This Leave Us?
AI has the potential to reinvigorate local journalism, but only if it is used responsibly, transparently, and in partnership with human journalists. Regulators, publishers, and tech companies must come together to set clear ethical standards for AI use in local media. Communities should know when a story was AI-generated and when a reporter was on the ground. Transparency about sourcing and fact-checking will be crucial to maintaining trust.
An algorithm cannot knock on doors, sit through a contentious public meeting, or win the trust of a grieving family. It cannot dig into public records, confront officials, or follow up on a tip from a concerned citizen. AI can summarise what happened, but it can’t hold power to account. And without that accountability, communities suffer.
If AI is to have a role in local journalism, it must be as a tool and not a replacement. The best model is hybrid: AI does the grunt work; journalists do the storytelling. AI helps uncover patterns, but people still decide what matters and why.
Ultimately, the goal shouldn’t be to create journalism that is simply faster or cheaper, but journalism that is fairer, more accessible, and more deeply embedded in the communities it serves. We’re at a fork in the road. We can choose to let AI happen to journalism, or we can decide how it should happen. The future of local news may depend on which path we take.