Is “artificial intelligence” an oxymoron in local news?
The real importance of what we do is the process.
By Alice Dreger
They’re coming to “help” us.
Google, Rolli, Hacks/Hackers – these organizations are collectively spending millions to create artificial intelligence programs designed to write articles about local happenings. They say their efforts will enable independent local news producers like us to churn out way more relevant local content without having to add people to our payrolls.
The chief example being floated envisions the bots taking a press release and dicing it up with internet-sourced “expert” quotes to produce something that – ta da! – can be consumed as if it were an “independent” news article, as if locally aware humans were involved in its production.
Yeah, I know. We’re talking about spit-polishing press releases rather than questioning them.
And you know what really makes me go “hmmm”? The claim that these AI publishing systems will ultimately reduce disinformation and misinformation. As opposed to creating it at light speed.
Look, I’m not a Luddite. I can imagine some fine uses for AI in local news production.
Like the rest of you, I’ve had some genuinely bad string reporters. My God, I would have given up my one reliable home printer to have had access to an AI system that could have taken the same information as those folks and written something readable – something I didn’t have to spend more time editing than it would have taken me to research and write it myself.
Then there’s my top AI fantasy: Our green reporters use AI to help them compose their articles, producing work that has appropriate ledes, correct grammar, and readable content. Little by little, the green reporters see what the AI bots are doing and learn how to write, just like the desire to play complex video games taught my kid how to use a keyboard.
And, just as many of us already use Otter.ai to transcribe recordings of meetings and interviews, I can imagine using AI to summarize huge municipal or school board reports, so that we know where to start in terms of our own reporting.
Also, while I am not a fan of sports, I get that a lot of readers of local publications want to know who threw the ball to whom at the high school and who fumbled the puck on the layup. It makes total sense that we might use AI to produce “articles” about local sporting matches.
Bonus: When parents call to yell at us that we didn’t feature their kids, we could let our phones transcribe the voicemails and then ask Siri to respond with generated AI text telling the irate parents…whatever. Lather, rinse, repeat, amirite?
But every time I imagine the use of AI in local news production in practice rather than in theory, I imagine disaster.
I picture a press release turning into an AI product that leads to a defamation suit.
I can see the bots quoting an “expert” who turns out to either not exist or be a charlatan or a Nazi – leading to a crash in reader confidence.
I envision politicians and bureaucrats figuring out how to game the system to control the narratives as never before.
And looming in the background of it all is an image of a whole parallel universe of “local news” sites created to look real but in fact produced by multinational corporations – or the national Democratic or Republican parties, or the Chinese or Russian governments – that don’t actually give a crap what happens in or to our communities. These AI-generated “local news” outlets would quickly destabilize and defund our legitimate operations.
The fact that the tech corporations want to “help” us by providing tools that do things like convert press releases into so-called news articles reveals they don’t really get what we do. They think content production is our purpose. They have no concept that the real importance of what we do is the process.
The way I built a viable independent news operation for East Lansing, Michigan, was through incredibly picky reporting work – struggling to be fair, thorough, accurate, and precise – work that, year by year, built up a network of trust in my town. I know many of you have done the same, painstakingly using your editorial judgment to figure out for and with your communities what’s important, what’s relevant, and what’s true.
Yet all over America, even in places with high-quality news, news consumers don’t understand that it’s the process that makes journalism epistemologically essential to democracy. What makes journalists important is not just that they tell the truth, but that they work like heck to discern it.
News articles are not journalism; they are manifestations of journalism. And AI-generated “news” articles are certainly not journalism; they are manifestations of computer algorithms.
Before the bots come for us, we’re going to have to start explaining like never before – including to the tech people who seem to think they’re much smarter than us – what it is we do.
Note, Mar. 18, 2024: The word "collectively" has been added to the second sentence of this post following a note from Burt Herman, co-founder of Hacks/Hackers, who wrote to say his 501(c)(3) operation "apply[s] for grant funding to support specific objectives, and as of this year, we’ve received grants in the low five figures for work on journalism and artificial intelligence."
Alice Dreger is a journalist, historian, and the publisher of Local News Blues. She founded East Lansing Info, a nonprofit digital investigative news service, and ran the operation for about ten years. Read more at the Local News Blues contributors page.