
A prominent editor has unleashed a scathing attack on journalism schools for what he claims is their retrograde attitude toward artificial intelligence. Since the editor, Chris Quinn of Cleveland.com and The Plain Dealer, is invading my turf, I thought I’d take a look at what he has to say and offer some context.
Quinn begins his latest “Letter from the Editor” column with an anecdote about a recent college graduate who turned down a job because of the way Quinn’s publications use AI. Increasingly, they ask reporters to do nothing but report, turning their notes over to AI to be transformed into news stories, with human editors looking them over to make sure the final product is accurate and coherent.
As Quinn observes, the AI chatbot is performing the role of what used to be called a “rewrite man,” taking calls from reporters in the field and banging out stories on deadline. Automating that process, he argues, is the future. He writes:
Because we want reporters gathering information, these jobs are 100 percent reporting. We have an AI rewrite specialist who turns their material into drafts. We fact-check everything. Editors review it. Reporters get the final say. Humans — not AI — control every step.
By removing writing from reporters’ workloads, we’ve effectively freed up an extra workday for them each week. They’re spending it on the street — doing in-person interviews, meeting sources for coffee. That’s where real stories emerge, and they’re returning with more ideas than we can handle.
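To put Quinn’s workflow in software terms, it’s a human-in-the-loop pipeline: the model sits between a reporter’s notes and two layers of human sign-off. Here’s a minimal sketch in Python of how such a pipeline might be wired up; every name and function in it is hypothetical, my illustration rather than anything Advance Local actually runs.

```python
# A sketch of a human-in-the-loop rewrite pipeline. The LLM client, the
# review callables and all names here are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class Draft:
    text: str
    approved_by_editor: bool = False
    approved_by_reporter: bool = False


def ai_rewrite(notes: str, llm) -> Draft:
    """Turn a reporter's raw notes into a story draft via the model."""
    prompt = (
        "Rewrite these reporter's notes as a clear news story. "
        "Do not add any fact that is not in the notes.\n\n" + notes
    )
    return Draft(text=llm.complete(prompt))


def produce_story(notes: str, llm, editor_review, reporter_review) -> str:
    draft = ai_rewrite(notes, llm)
    # Fact-check and edit: a human must sign off before anything moves on.
    draft.approved_by_editor = editor_review(notes, draft.text)
    # The reporter gets the final say, as Quinn describes.
    draft.approved_by_reporter = (
        draft.approved_by_editor and reporter_review(draft.text)
    )
    if not draft.approved_by_reporter:
        raise ValueError("Draft rejected; back to the humans, not into print.")
    return draft.text
```

The detail that matters is the order of operations: the model produces a draft, but it never gets the last word.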
Quinn also takes several shots at journalism schools for teaching their students that AI is evil. Here’s a sample:
Journalism programs are decades behind. Many graduating students have unrealistic expectations. They imagine themselves as long-form magazine storytellers, chasing a romanticized version of journalism that largely never existed.
That’s what they’re taught….
If you’re a student considering journalism, I’d skip that degree. Study political science. Learn technology. Understand how government, businesses and nonprofits work. Take communications law and ethics as electives. Skip much of the rest.
Well, now. I don’t recognize our journalism school in Quinn’s screed. Northeastern has a deal with Anthropic to make the enterprise version of Claude available to all of us, faculty and students, and though I’m ambivalent, I’ve found that it has its uses. Just recently I’ve used it to brainstorm some questions for an interview, and I frequently use it as a search engine, finding that it can go deeper and ferret out more obscure information than I can get from Google, which itself is now choked with ads and suffused with AI. I always ask Claude to give me links, and I follow them, because the hallucination problem is real.
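For the technically curious, that habit of demanding links can be built right into the prompt. Here’s a minimal sketch using Anthropic’s Python SDK; the model alias is a placeholder, and the whole thing is my illustration of the habit, not anyone’s production setup.

```python
# A sketch of using Claude for research while guarding against hallucination:
# demand a source URL for every claim, then follow the links by hand.
# Assumes Anthropic's Python SDK (pip install anthropic) and an
# ANTHROPIC_API_KEY in the environment; the model alias is a placeholder.

import anthropic

client = anthropic.Anthropic()


def research(question: str) -> str:
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # substitute whatever your plan offers
        max_tokens=1024,
        messages=[{
            "role": "user",
            "content": question
            + "\n\nCite a source URL for every claim so I can follow "
            "and verify each one.",
        }],
    )
    return response.content[0].text
```

The verification step stays human: the code only makes sure there are links to check.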
As I’ve said before, I also like what they’re doing at the Midcoast Villager in Camden, Maine, where reporters are given AI-generated summaries of governmental meetings in the 43 towns it covers so that they can identify stories worth following up on.
My j-school colleagues and I do wrestle, though, with using AI as a writing tool. Bottom line: We generally prohibit our students from taking that final step, because one of the most important things you’re supposed to learn in j-school is how to write. If Quinn ends up hiring them to report for Cleveland.com/The Plain Dealer and they have to turn over the writing part of their job to AI, well, I suppose that’s all right given the protections that he says are in place. Quinn’s news organization is part of the Advance Local chain, which is one of the better ones; it’s doing interesting work at NJ.com and, in Massachusetts, at MassLive.com, which is loosely attached to a daily newspaper that Advance also owns, The Republican of Springfield.
That said, it doesn’t sound like much fun. I’ve always found that writing is the most enjoyable part of my job, because it gives me a chance to make sense of the information I’ve gathered. There are also a lot of nuanced judgments about what to include, what to leave out and what to emphasize. AI can’t make those calls as well as I can; neither could an old-fashioned rewrite man. I suppose automated rewriting is fine for breaking news about the house fire, the court hearing or the demonstration you just finished covering, but it’s not going to work for the deeper writing that comes from interviewing a number of people. Could AI at least help with that process? Yes. But in that circumstance I’d want the reporter to stay in control rather than turn it over entirely to AI and the editors who look over the results.
Jeff Jarvis, a leading thinker on journalism and technology, took to Facebook to defend Quinn, though even he agrees that the brakes Quinn taps are vital:
The cautions over using AI are obvious and well-known: Never, never let its product loose without responsible, human oversight. The fears are well-documented: that AI will cause job loss. Well, that’s happening anyway. Quinn is right to look for ways to enable reporters to do more reporting. But let’s look at ways to break out of the presumptions of our grandfather’s newspapers.
In his newsletter Working Systems, Damon Kiesow, a professor at the Missouri School of Journalism, offers some additional cautions:
AI is just the latest tech, and also a unique threat. AI imposes a third wave of disruption on journalism: information is no longer expensive to distribute (the web), access (the mobile web), or create (GenAI). We must rebuild the business and readership without those three defensive moats.
We must ask questions about technology. What is it (AI in this case) good for? What are the costs & benefits to our staff, the community, the business, our values and ethics, and society? How does it work; who built it; who is profiting from it? Why, how, and where should we use it? And we must keep in mind “no” is often a rational and appropriate answer.
As Kiesow suggests, AI and journalism can turn into a toxic mix very quickly. At The Baltimore Sun, reporters recently discovered that AI had been used to generate two news analyses with little if any human intervention. As Fern Shen reports for Baltimore Brew: “Instead of a byline, a note at the top said these ‘analyses’ were ‘generated by an artificial intelligence tool at the request of the Baltimore Sun and reviewed by staff members.’” One referred to Donald Trump as the “former president” twice.
“How thorough was that ‘review’ of the AI slop, anyway?” the union asked, according to Shen’s report.
In New York State, lawmakers are considering legislation that “would require news organizations to label AI-generated material and mandate that humans review any such content before publication,” according to Nieman Lab. That sort of government intrusion into journalism probably violates the First Amendment. But as a statement of best practices that news organizations ought to follow, it’s on target.
“Artificial intelligence is not bad for newsrooms. It’s the future of them,” writes Chris Quinn. “It already allows us to be faster, more thorough and more comprehensible. It frees time for what matters most: gathering facts and developing stories to serve you.”
Those of us who teach future journalists don’t necessarily disagree. I wouldn’t push it as far as Quinn has, but AI is here to stay, and there’s no doubt that it can be a valuable newsroom tool. At the same time, though, many newsrooms are not going to follow the common-sense ethical guidelines that Quinn lays out. I’d like to think that journalism educators can help students sort through the ethical and unethical uses of AI and prepare them for the newsrooms of the future. Unfortunately, Quinn has already declared us irrelevant.
Efficiency v. Authenticity?
It is in the writing process that reporters (or their editors) often first perceive holes in stories. This is what I call the upstream theory of problem-solving in journalism: If you can’t get your 15th graf (newsroom shorthand for paragraph) to work, your real problem is probably in the 10th graf. If you can’t get the writing to work, the real problem is probably in your reporting. If you can’t get your reporting to work, your real problem is probably in your story idea. Using AI to treat writing as an administrative chore rather than as a crucial part of telling the story risks damaging the organic nature of storytelling as journalism.