I haven’t played enough with the newest version of Google Bard to know whether it’s better than ChatGPT, but Bard has some advantages. You don’t have to log in — if you’re like most people, you’re already logged in through Google. The database is more up to date: It knows that Maura Healey is governor, whereas ChatGPT still thinks Charlie Baker is in the corner office. And it provides links. My misgivings about artificial intelligence aside, I’m impressed.
The question of whether Google should pay for news is about to get a lot more complicated. The Wall Street Journal is reporting that news publishers are freaking out over a new search tool powered by artificial intelligence that Google is working on.
The problem is that current Google search protocols drive a lot of traffic to news websites, and that could change. AI-powered search may very well keep users inside Google, thus denying clicks to the originators of the journalism that users are looking for. As an example, here is what The Atlantic believes it’s up against, according to the Journal’s Keach Hagey, Miles Kruppa and Alexandra Bruell:
About 40% of the magazine’s web traffic comes from Google searches, which turn up links that users click on. A task force at the Atlantic modeled what could happen if Google integrated AI into search. It found that 75% of the time, the AI-powered search would likely provide a full answer to a user’s query and the Atlantic’s site would miss out on traffic it otherwise would have gotten.
That 40% figure is typical for news publications. And though Google executives say that they intend to roll out AI search in such a way that journalism will continue to benefit, the Journal story makes it clear that’s nothing more than a vague promise at the moment.
The AI threat comes at a time when much of the media business is pushing for passage of the Journalism Competition and Preservation Act (JCPA), which would require that Google and Facebook come to the bargaining table and reach a deal to compensate news organizations for repurposing their content. It’s a dicey proposition — Facebook has been moving away from news, and as the Journal story shows, publishers are dependent on traffic from Google even as they insist that Google ought to pay them.
Just this week, Brier Dudley of The Seattle Times wrote that the NewsGuild-CWA, the union that represents 26,000 employees at a number of news outlets, now supports the JCPA as the result of a possible tweak to the legislation that would be more explicit about protecting jobs. Brier also touted a recent study that claims the two tech giants should be paying news organizations some $12 billion a year.
Despite some bipartisan support for the JCPA, finding agreement within our dysfunctional Congress may prove impossible. And the rise of AI-based search isn’t going to make passage any easier.
Time-Life co-founder Henry Luce in 1954. Photo via the Library of Congress.
Fake journalism produced by artificial intelligence is quickly devolving into a fiasco. The latest scandal involves Sports Illustrated, once a great magazine that was part of the Time-Life empire, now — well, who knows? It’s owned by something called The Arena Group, whose holdings also include TheStreet and Parade magazine (remember them?), and whose website says the company “combines powerful brands, in areas consumers are passionate about and delivers compelling experiences.” Corporate gobbledygook perfected except for the misplaced comma.
On Monday, Maggie Harrison of Futurism reported that SI had published articles generated by AI and — get this — included bylines and writer profiles that also had been generated by AI. Fake writers producing fake stories, in other words. All we need are fake readers. Harrison wrote: “After we reached out with questions to the magazine’s publisher, The Arena Group, all the AI-generated authors disappeared from Sports Illustrated’s site without explanation.”
SI later posted a message on X/Twitter that almost literally says, No, we did not publish any AI content. What actually happened was that we published AI content. Huh? The message is worth reproducing in full:
We didn’t do it! The third-party content provider did it! Well, all right then. Poynter media analyst Tom Jones, himself a former sports writer, has a lot to say this morning. He does not seem impressed with The Arena Group’s attempt to deflect blame, writing, “The stories in question do not appear to be the traditional sports features we’re all familiar with when it comes to Sports Illustrated. The stories were more along the lines of product features and reviews. For example, one story from 2022 was about the best volleyballs. Not that it makes any difference.” No, it doesn’t.
The real threat coming from AI-produced fake journalism is that bottom-feeders with no interest in quality are going to load up on the stuff, thus harming the reputation of quality news organizations as well. NewsGuard recently conducted a study that found 49 content farms were using material that seemed to be “almost entirely written” by AI. Even in its shrunken form, Sports Illustrated is better than a content farm. Even so, Henry Luce is rolling over in his grave.
I got some great feedback on my post about using artificial intelligence to summarize a panel discussion. This is an issue I’ll continue to keep a close eye on and to experiment with. I want to surface a few comments I received and gather them together here.
From Ed Miller, editor of The Provincetown Independent: “I was there at the Radically Rural panel discussion, and I’m not sure I would call this summary ‘shockingly good,’ Dan. It is pretty good, but it completely misses the most important point in Victoria Bouloubasis’s presentation, which is that the Iowa poultry plant workers’ organizing efforts mostly failed to persuade local officials to help them.” OK, I guess I could have said “shockingly coherent” rather than “shockingly good.”
From Tom Johnson: “Any idea what it means to ‘empower stereotypes’? Some specifics sure would help.” Johnson is referring to a section of the summary that says, “The story aimed to empower and defy stereotypes surrounding immigrant communities, contrasting with typical narratives of victimization.” I would agree that ChatGPT is no A.J. Liebling, but overall I thought we got the drift.
From Rebecca Rainey, writing on Threads: “Worth noting: The summaries are incredibly boring. I would much rather read your reporting and light analysis, which would tell me what matters most in the grand scheme of things.” My response is that such summaries would be more for internal newsroom use than for public consumption. The next step is to take such a summary and see if ChatGPT can transform it into a news story. I’ll be looking for a suitable event sometime in the near future.
Meanwhile, OpenAI, the company that rolled out ChatGPT a year ago, is in utter turmoil. Co-founder Sam Altman was fired over the weekend and is now moving to Microsoft. The speculation is that the OpenAI board wanted to proceed more slowly and was concerned that Altman was too dismissive of AI’s potential dangers. Presumably Microsoft will let him pick up the pace, so overall this is not good news.
Strictly as an experiment, I produced an AI-generated summary of a panel discussion that took place in September at the Radically Rural conference in Keene, New Hampshire. Titled “How’d They Do That? Groundbreaking Journalism,” the panel was moderated by Jack Rooney, managing editor for audience development at The Keene Sentinel, which co-sponsored the event along with the Hannah Grimes Center for Entrepreneurship.
The hour-long panel featured Victoria Bouloubasis, a filmmaker who showed part of her Emmy-nominated documentary, “Rising Up in the Heartland: Latino Workers Fight for Pandemic Relief,” and discussed it; Samantha Hogan, an investigative reporter with The Maine Monitor, who talked about her reporting on the state’s public defense system for the indigent; and Adam Ganucheau, editor-in-chief of Mississippi Today, who described his news organization’s Pulitzer Prize-winning reporting into the Backchannel scandal, which encompassed figures ranging from former Gov. Phil Bryant to former NFL quarterback Brett Favre.
A note on methodology: I took some real chances with this. I captured the audio using Otter.ai, a transcription service that uses artificial intelligence. Then I fed each of the three parts of the panel discussion into ChatGPT and asked it to produce 400-word summaries. I left out the audience Q&A that followed the panel. Let me stipulate for the record that I did not watch the video except for a few brief clips, and I skimmed quickly through the Otter transcript, which was 7,000 words long. I don’t recommend this as sound journalistic practice, but, as I said, I was experimenting. What I did not try to do was ask ChatGPT to generate a news story; I’ll try that another time.
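For anyone who wants to try the same experiment, the workflow above can be sketched in a few lines of Python. This is my own illustrative reconstruction, not code from the post: the function names, the section markers, and the commented-out API call are all assumptions, and you would adapt them to your own transcript.

```python
# Hypothetical sketch of the workflow described above: split an Otter
# transcript into per-panelist sections, then build a 400-word summary
# prompt for each section. Names and markers are illustrative.

def build_summary_prompt(section_text: str, word_limit: int = 400) -> str:
    """Compose the instruction sent to the model for one panel section."""
    return (
        f"Summarize the following panel-discussion transcript in about "
        f"{word_limit} words, using short headed paragraphs:\n\n{section_text}"
    )

def split_transcript(transcript: str, markers: list) -> list:
    """Split a transcript wherever a line starts with a panelist's name."""
    sections, current = [], []
    for line in transcript.splitlines():
        if any(line.strip().startswith(m) for m in markers) and current:
            sections.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("\n".join(current))
    return sections

# The actual model call would look something like this (requires an
# OpenAI API key; shown here only as a sketch):
# from openai import OpenAI
# client = OpenAI()
# for section in split_transcript(raw_text, ["Victoria", "Samantha", "Adam"]):
#     resp = client.chat.completions.create(
#         model="gpt-4",
#         messages=[{"role": "user", "content": build_summary_prompt(section)}],
#     )
#     print(resp.choices[0].message.content)
```

Feeding each section separately, rather than the whole 7,000-word transcript at once, keeps each request comfortably inside the model's context window and yields one summary per panelist.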
What appears below, produced entirely by ChatGPT, looks shockingly good. I just hope that it’s an accurate and useful summary.
***
Victoria Bouloubasis, an Emmy-nominated journalist and filmmaker, based in Durham, North Carolina, shared insights into her work and a specific project during a panel discussion. Here’s a summary of the key points:
Journalist’s Background: Victoria Bouloubasis is an experienced journalist with a focus on dispelling myths about the global South. Her career, primarily in community journalism, spans 15 years, emphasizing labor rights, healthcare, and language access in North Carolina’s agricultural and food industries. Growing up in the South, she covers labor issues, especially concerning first-generation Americans and their families in Latin America.
Coverage of Rural Communities: Bouloubasis delves into the complexities of covering rural communities, particularly in North Carolina, known for its prominence in pork and tobacco production. She highlights the evolving demographics, with a shift from primarily Mexican to more Central American and Indigenous communities. Language diversity adds complexity to her reporting, as many interviewed speak languages other than Spanish.
Project Overview: The discussed project, an Emmy-nominated piece for Univision, originally focused on long COVID’s impact on poultry plant workers. However, Bouloubasis shifted the narrative after discovering a grassroots organizing story among immigrant workers in Iowa. The story aimed to empower and defy stereotypes surrounding immigrant communities, contrasting with typical narratives of victimization.
Challenges Faced: The challenges in the project included last-minute changes to the story focus and navigating language barriers during interviews. Bouloubasis emphasizes the importance of terminology, noting the shifting dynamics of minority-majority populations in small towns. The story showcased the struggles of immigrant workers, demanding pandemic relief funds for essential workers who didn’t receive stimulus checks due to eligibility criteria.
Community Perspectives: Bouloubasis highlights the discomfort and confusion of local officials who couldn’t understand Spanish, providing a unique perspective on the challenges faced by non-English speakers. The mayor’s lack of understanding of the struggles faced by undocumented immigrants, particularly in receiving stimulus checks, became a focal point in the story.
Positive Outcomes: Despite the challenges, Bouloubasis expresses pride in the project’s recognition at the Emmys, emphasizing its hopeful and empowering narrative. The immigrant community in Iowa successfully distributed federal pandemic relief aid to meatpacking and farm workers, showcasing the impact of grassroots organizing.
Inspiration from the Community: Bouloubasis commends the courage of immigrant workers who put themselves at risk by publicly acknowledging their undocumented status. The project seeks to elevate their voices and bring attention to social determinants of health, emphasizing the importance of dialogue and understanding in journalism.
This summary captures Victoria Bouloubasis’s insights into her journalism career, the challenges of covering immigrant communities, and the impact of her Emmy-nominated project, which sheds light on the struggles and resilience of rural immigrant workers.
***
Samantha Hogan, investigative reporter with The Maine Monitor, shared insights into her work, focusing on Maine’s indigent public defense system. Here’s a summary of the key points:
Reporter’s Background: Samantha Hogan, an investigative reporter, joined The Maine Monitor in 2019, specializing in court system reporting. Her work, supported by Report for America, delves into Maine’s troubled indigent public defense system.
Maine’s Demographics: Maine, with a predominantly white population, faces economic disparities rather than racial issues. The community includes both lifelong residents and seasonal “snowbirds.” Hogan decided to investigate how individuals unable to afford lawyers were represented in the state.
Indigent Public Defense System Issues: Hogan discovered issues within Maine’s indigent public defense system, where private defense lawyers were contracted without proper oversight. Unlike most states, Maine lacked a statewide public defender system, relying on contracts with private lawyers. Some lawyers had criminal convictions and professional misconduct violations.
Investigative Approach: A whistleblower alerted Hogan to problems in the indigent defense system. She initiated a data-driven investigation, utilizing public records requests and analyzing billing data. Hogan revealed instances of lawyers billing implausible hours and identified disciplinary issues among contracted lawyers.
Scandalous Cases and Red Flags: Hogan uncovered cases where lawyers with serious criminal convictions or professional misconduct allegations were contracted for public defense. The investigation raised red flags, such as lawyers billing excessive hours and serious ethical violations within the system.
Impact and Legal Consequences: The investigative report led to significant consequences, including a government watchdog verifying Hogan’s findings. The Maine Commission on Indigent Legal Services faced a civil lawsuit, acknowledging negligent misrepresentation of billing by an attorney. The ACLU filed a lawsuit against the state, citing an ineffective public defense system.
Persistence and Impactful Journalism: Hogan’s relentless pursuit of public records, court filings, and data analysis turned anecdotal evidence into hard numbers. The investigative reporting became a pivotal factor in legal actions against the flawed public defense system. The ACLU lawsuit, citing Hogan’s work, puts the spotlight on the need for a functional public defense system in Maine. (1)
Ongoing Legal Battles: Hogan highlighted ongoing legal battles, including the ACLU lawsuit, indicating that the impact of investigative journalism extends beyond immediate results. The story, a four-year project, continues to evolve, emphasizing the importance of persistent reporting and its contribution to systemic change.
Samantha Hogan’s work exemplifies the power of investigative journalism in exposing systemic flaws, holding institutions accountable, and driving legal actions to address critical issues within the indigent public defense system in Maine.
***
Adam Ganucheau, editor-in-chief of Mississippi Today, shared insights into the impactful investigative journalism that led to winning a Pulitzer Prize for the Backchannel investigation. Here’s a summary of the key points:
Introduction: Adam expressed his gratitude to be on stage with fellow journalists and introduced Adam Ganucheau, editor-in-chief of Mississippi Today. Ganucheau oversees Mississippi’s largest newsroom, focusing on stories relevant to everyday Mississippians.
Mississippi’s Communities: Ganucheau highlighted the diversity of Mississippi’s communities, dispelling monolithic views. He emphasized the shared dreams and goals of Mississippians across different backgrounds, beliefs, and economic statuses. The challenge lies in providing equal access and opportunities for these dreams, especially in a state facing numerous problems.
Mississippi Today’s Role: Mississippi Today, a nonprofit newsroom launched in 2016, positions itself between everyday Mississippians and decision-makers. With a team of about 15, mostly Mississippi natives deeply connected to their community, the newsroom aims to bridge the gap by listening to people’s struggles, reporting on their concerns, and holding decision-makers accountable.
The Backchannel Investigation: The focus shifted to the Backchannel investigation, which earned a Pulitzer Prize in 2022. The investigation began with reporter Anna Wolfe’s exploration of poverty in Mississippi. She discovered that federal funds, intended for the state’s poorest residents, were being diverted to nonprofit programs with little accountability. (2)
Welfare Spending Anomalies: Wolfe delved into the anomalies in welfare spending, revealing that Mississippi approved less than 1% of applications for cash assistance. The investigation exposed a shift in funds from direct cash assistance to nonprofit initiatives with questionable accounting. Wolfe’s collaboration with the state auditor ultimately led to six arrests, including the welfare agency director.
Role of Former Governor: The investigation extended to the role of the former governor in redirecting federal funds and changing the philosophy behind poverty assistance programs. Wolfe spent years probing the governor’s involvement, culminating in a seven-part series titled “The Backchannel.”
Impactful Visuals and Reporting: Mississippi Today used charts and visuals to illustrate the impact of welfare spending changes over time. They scrutinized the former governor’s philosophy shift and its effects on assistance programs. The investigation revealed instances like wealthy individuals, including NFL quarterback Brett Favre, receiving welfare funds. (3)
Series Conclusion and Everyday Stories: The series aimed to provide accountability to state leaders while keeping the stories of everyday Mississippians at the forefront. Ganucheau highlighted the series’ conclusion in December 2022, where they circled back to the people who inspired the investigation. Everyday Mississippians, like Daniel Thomas, shared their experiences, expressing gratitude for the reporting but emphasizing the lack of tangible help.
Journalistic Mission: Ganucheau underscored that Mississippi Today’s mission is not driven by awards or notoriety but by a commitment to helping everyday people better understand their situations and hold decision-makers accountable.
The Backchannel investigation exemplifies the impactful role of investigative journalism in bringing systemic issues to light, holding those in power accountable, and giving a voice to those affected by government decisions.
Notes
1. Did someone say “lawsuit”? When I saw that word in the ChatGPT summary, I decided I’d better check it out. The verdict? It’s true.
2. Although I had already provided Otter, and thus ChatGPT, with the correct names of the panelists, ChatGPT was on its own with Anna Wolfe — and it spelled her name correctly.
3. ChatGPT got Brett Favre’s name right, too. I did not, and I had to scroll back to the introduction to fix it.
Today at What Works, we have a story by one of our Northeastern graduate students, Ian Dartley, about Inside Arlington, a local news project powered by artificial intelligence. It’s no substitute for the human touch, and in fact the town already has a very good nonprofit news organization, YourArlington. But it’s an interesting experiment, and Ian does a great job of explaining it.
We also decided to have a little fun. The headline and the bullet points used to summarize Ian’s story were written by ChatGPT. So was the social media post we used to promote the story. Here’s how it looks on Threads:
How about ChatGPT finding that dog emoji? Good boy! I thought it was interesting that ChatGPT wrote fairly dull headlines and bullet points but suddenly developed a sense of fun when it came time to write a social media post.
Among the more venerable local news startups in the Boston area is YourArlington, which has been publishing in one form or another since 2006. Founded by veteran journalist Bob Sprague, the digital-only site in the past couple of years has gone nonprofit, added a governing board, and hired an editor, Judith Pfeffer, who succeeded Sprague when he retired during the summer. YourArlington offers fairly comprehensive coverage of the town and has paid freelancers. (Disclosure: Some of those paid freelancers have been Northeastern students, and I’ve been asked to speak at Sprague’s retirement party in November.)
So imagine my surprise when I read Boston Globe tech reporter Hiawatha Bray’s story about Inside Arlington, a new project that is mainly produced by artificial intelligence: feed in the transcript of a select board meeting and publish what comes out the other side. Mainly I was surprised that Bray let cofounder Winston Chen get away with this whopper: “The town of Arlington, for practical purposes, is a news desert.” Bray offered no pushback, and there’s no mention of YourArlington. (Gannett merged the weekly Arlington Advocate with the Winchester Star about a year and a half ago and eliminated nearly all town-based coverage in favor of regional stories. There’s also a local Patch.)
Bray is properly skeptical, noting that several experiments in AI-generated stories have come to a bad end and that there’s no substitute for having a reporter on site who can ask follow-up questions. Still, there’s no question that AI news reporting is coming. Nieman Lab recently reported on a hyperlocal news organization in California that’s been giving AI a workout, although that organization — so far — has had the good sense not to publish the results.
But it’s disheartening to see the Globe take at face value the claim that Arlington lacks a local news organization. Scanning through YourArlington right now, I see a story about affordable housing that was posted today, a restaurant review, a story and photos from Town Day and a reception for the new town manager. Such coverage is the lifeblood of community journalism, and it can’t be replicated with AI — and I don’t see any of it at Inside Arlington.
Thank you to Paul Letendre, the host of “SouthCoast Matters,” and state Rep. Carol Doherty, D-Taunton, for having me on for two recent episodes. We talked about media trust, the challenges facing local news, artificial intelligence and more. “SouthCoast Matters” is recorded at Taunton Community Access and Media and is carried on cable stations in Taunton and the surrounding area. You can watch the two episodes below. (If you’re an email reader, you may have to click through for the web version of this post.)
Six years ago, The Washington Post announced that it would begin producing stories about high school football games using artificial intelligence. The expanded use of Heliograf, the Post’s “in-house automated storytelling technology,” would allow the news organization “to cover all Washington, D.C.-area high school football games every week,” according to a press release. The press release linked to an example of such coverage — a mundane article that begins:
In the second quarter, The Patriots’ Paul Dalzell was the first to put points on the board with a two-yard touchdown reception off a pass from quarterback William Porter.
Yet now, with AI tools having improved considerably, Gannett is running into trouble for doing exactly the same thing. Writing for Axios Columbus, Tyler Buchanan reports that The Columbus Dispatch had suspended AI-generated local sports coverage after the tool, LedeAI, came in for criticism and mockery. As Buchanan observes, one such article “was blasted on social media for its robotic style, lack of player names and use of awkward phrases like ‘close encounter of the athletic kind.’”
Has AI gone backwards since 2017? Obviously not. So what went wrong? It’s hard to say, but it could be that the generative AI tools that started becoming available late last year, with ChatGPT in the forefront, are more finicky than the blunt instrument developed by the Post some years back. In theory, generative AI can write a more natural-sounding story than the robotic prose produced by Heliograf and its ilk. In practice, if an AI tool like LedeAI is trained on a corpus of material loaded with clichés, then the output is going to be less than stellar.
Clare Duffy of CNN found that Gannett’s use of AI was not limited to Columbus. Other outlets that ran LedeAI-generated sports stories included the Courier Journal of Louisville, Kentucky; AZ Central; Florida Today; and the Journal Sentinel of Milwaukee, Wisconsin. Duffy reported that one story, before it was revised, included this Grantland Rice-quality gem: “The Worthington Christian [[WINNING_TEAM_MASCOT]] defeated the Westerville North [[LOSING_TEAM_MASCOT]] 2-1 in an Ohio boys soccer game on Saturday.”
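Those double-bracketed placeholders are a tell: the story was generated by filling slots in a template, and at least two slots never got filled. Here is a minimal sketch — my own illustration, not LedeAI’s actual code — of how that failure mode happens and how a one-line pre-publication check could catch it.

```python
import re

# Placeholders look like [[WINNING_TEAM_MASCOT]]: double brackets
# around an all-caps key.
PLACEHOLDER = re.compile(r"\[\[([A-Z_]+)\]\]")

def fill_template(template: str, data: dict) -> str:
    """Replace each [[KEY]] with data[KEY]; leave it untouched if missing."""
    return PLACEHOLDER.sub(lambda m: data.get(m.group(1), m.group(0)), template)

def unfilled_placeholders(story: str) -> list:
    """Return any placeholder keys that survived substitution."""
    return PLACEHOLDER.findall(story)

template = ("The Worthington Christian [[WINNING_TEAM_MASCOT]] defeated the "
            "Westerville North [[LOSING_TEAM_MASCOT]] 2-1.")
# Only one of the two keys is supplied, mimicking a data-feed gap.
story = fill_template(template, {"WINNING_TEAM_MASCOT": "Warriors"})
print(unfilled_placeholders(story))  # ['LOSING_TEAM_MASCOT']
```

A check like `unfilled_placeholders()` run before publishing would have flagged the story instead of letting raw template keys reach readers — which is exactly the kind of human (or at least automated) review that appears to have been skipped.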
There’s another dynamic that needs to be considered as well. The Washington Post, a regional newspaper under the Graham family, repositioned itself as a national digital news organization after Amazon founder Jeff Bezos bought it in 2013. Regional coverage is secondary to its mission, and if it weren’t covering high school football games with AI, then it wouldn’t be covering them at all.
By contrast, you’d think that high school sports would be central to the mission at Gannett’s local and regional dailies. Turning such coverage over to AI and then not bothering to check what they were publishing is exactly the sort of move you’d expect from the bottom-line-obsessed chain, though it obviously falls short of its obligation to the communities it serves.
Poynter media columnist Tom Jones, a former sportswriter, raises another issue worth pondering — the elimination of an important training ground for aspiring sports journalists:
There is still a contentious debate about how publishers should use AI. Obviously, journalists will be (and should be) upset if AI is being used to replace human beings to cover events. As someone who started his career covering high school football, I can tell you that invaluable lessons learned under the Friday night lights laid the foundation for covering events such as the Olympics and Stanley Cup finals and college football national championships in the years after that.
At a moment when AI is the hottest of topics in journalistic circles, Gannett’s botched experiment demonstrated that there is no substitute for actual reporters.
By the way, I asked ChatGPT to write a six- to eight-word headline for this post. The result: “AI-Generated Sports Coverage Faces Scrutiny: What Went Wrong?” Not bad, but lacking the specificity I was looking for.
I don’t want to come off as a total Luddite when it comes to artificial intelligence and journalism. Well, OK, maybe I do. Because even though I have no problem with using AI for certain mindless, repetitive tasks, such as transcribing interviews and finding patterns in public records, I think we need to be cautious about using such tools to actually produce journalism — whether it be reports about real estate transactions (thus missing the opportunity to dig more deeply) or stories about high school sports. With that in mind, I want to call your attention to three troubling developments.
For those who thought the notion of robot reporters was still quite a ways off, the first development is the most alarming. According to a recent article at Nieman Lab by Sophie Culpepper, an independent publisher has been experimenting with just that in his spare time, and the results are, well, not bad.
Mark Talkington, who runs a hyperlocal start-up called The Palm Springs Post in California, has been feeding governmental meetings that are available on YouTube into an AI system designed by a friend of his. Importantly, it’s not an off-the-shelf product like ChatGPT or Google Bard. Rather, it’s been trained on reliable news and information from his coverage area, which reduces if not eliminates the likelihood of “hallucinations,” the term for false but plausible-sounding output produced by AI.
The example Culpepper quoted from reads like what journalism professors disapprovingly tell their students is an “agenda story” — that is, it begins with something like Members of the board of sewer commissioners began their meeting by saluting the flag rather than with a lead grounded in the most interesting thing that happened. Nor has Talkington actually published any AI-generated stories yet. He said in his interview with Culpepper that he’s concerned about AI missing out on body language and, of course, on the ability to snag an elected official in the hallway during a break in the meeting.
But he said he could see using it to take notes and, eventually, to cover meetings that his thinly stretched staff can’t get to. And that’s how it begins: with a sympathetic hyperlocal publisher using AI to extend his reach, only to see the technology adopted by cost-cutting newspaper chains looking to dump reporters.
My second example might be called “speaking of which.” Because Gannett, whose 200 or so daily newspapers make it the largest corporate chain, announced recently that it, too, is experimenting with generative AI. Helen Coster of Reuters reports that, at first, AI will be used to generate content like bullet points that summarize the most important facts in a story, and that humans will check its work. That feature will be rolled out in the chain’s flagship newspaper, USA Today, later this year.
Gannett is hardly the only news organization that’s playing with AI; The New York Times, The Washington Post, The Boston Globe and others are all looking into ways to make use of it. But Gannett is notoriously tight-fisted and, as Coster notes, has slashed and burned its way to tenuous profitability. “Gannett’s journalists are fighting to ensure that they aren’t replaced by the technology,” she wrote. “Hundreds walked off the job over staff cuts and stagnant wages on June 5. Generative AI is a sticking point in some negotiations with the company, the union said.”
The third warning sign comes from Sebastian Grace, who passed along a truly disturbing item that the German tabloid Bild is laying off about 200 journalists while ramping up its use of AI. (Seb recently wrote a fine piece on journalism and AI for our website What Works: The Future of Local News.) Although those two developments at Bild are said to be unrelated, Jon Henley of The Guardian writes that Mathias Döpfner, the CEO of Bild’s corporate owner, Axel Springer, has said that ChatGPT and its ilk could “make independent journalism better than it ever was — or replace it.”
Axel Springer, by the way, also owns Politico, an important U.S. outlet for news about politics and public policy.
Do I think AI will soon replace reporters who do the hard work of going out and getting stories? No — at least not right away. But we’ve been losing journalists for 25 years now, and it seems certain that AI will be used, misused and abused in ways that accelerate that trend.