A new report finds that content farms are loading up on AI. Will local news be next?

Meet your new reporting staff. Photo (cc) 2023 by Dan Kennedy.

A recent report by NewsGuard, a project that evaluates news organizations for reliability and transparency, found that clickbait generated by artificial intelligence is on the rise. McKenzie Sadeghi and Lorenzo Arvanitis write:

NewsGuard has identified 49 news and information sites that appear to be almost entirely written by artificial intelligence software. A new generation of content farms is on the way.

The report didn’t specifically identify any local news websites that are using AI to write low-quality stories aimed at getting clicks and programmatic advertising. Perhaps non-local stories about health, entertainment and tech, to name three of the topics for which content farms are using AI, more readily fly under the radar. If you’re going to use AI to produce articles about the local tax rate or the women’s track team, you’re going to get caught pretty quickly when the results prove to be wrong. Still, the use of AI to produce some forms of local news, such as routine articles about real-estate transactions, is not new.

According to the NewsGuard report, there doesn’t seem to be a concerted effort yet to use AI in order to produce deliberately false stories, although there have been a few examples, including a celebrity death site that claimed President Biden had “passed away peacefully in his sleep.”

Call this Pink Slime 3.0. Version 1.0 was low-tech compared to what’s available today. Back in 2012, the public radio program “This American Life” found that a company called Journatic (pronounced “joor-NAT-ik,” though I always thought it should be “JOOR-nuh-tik”) was producing local content for newspapers using grossly underpaid, out-of-town reporters — including cheap Filipino workers who wrote articles under fake bylines.

Pink Slime 2.0, of more recent vintage, consists of hundreds of websites launched to exploit the decline of local news. Under such banners as “North Boston News” (!), these sites purport to offer community journalism but are actually a cover for political propaganda. Nearly all of them serve right-wing interests, though there are a few on the left as well.

Pink Slime 3.0 threatens to become more insidious as AI continues to improve. As Seth Smalley wrote for Poynter Online, this is “pink slime on steroids.”

Of course, AI could prove to be a boon for local news, as Sebastian Grace wrote last week for What Works, our Northeastern journalism project tracking developments in community journalism. By eliminating repetitive drudge work, AI can free journalists to produce high-value stories that really matter.

Still, bottom-feeders like CNET — not exactly a content farm, but not much better than that, either — have already been caught publishing error-laden stories with AI. You can only imagine what sort of advice these content farms are going to give people about dealing with their medical problems.

OpenAI, which likes to portray itself as a responsible player in discussions about the future of AI, would not respond to NewsGuard’s inquiries. Neither would Facebook, which is amplifying AI-generated content.

The only thing we can be sure of is that a new, more insidious version of pink slime is coming to a website near you — if it hasn’t already.


Artificial intelligence writes an op-ed piece on gun control

Photo (cc) 2011 by Louis K.

I’ve been having some fun with AI-generated content recently, posting a description of the MBTA in the voice of Herman Melville and a profile of me that is thoroughly bollixed up with a marketing guru who’s also named Dan Kennedy. Today, though, I offer something a little more serious and practical. I directed ChatGPT to write a 600-word opinion article in favor of gun control. The result is below. It needs a little editing, especially in the lead. I didn’t check the numbers, and ChatGPT is known for producing false facts. The pro-gun control argument is not especially sophisticated. But you can imagine a state legislator doing something like this to write an op-ed for a local news outlet. You may think that’s fine. I think it’s chilling.

Gun control has been a controversial issue in the United States for many years, with strong opinions on both sides of the debate. As an AI language model, I do not have a political affiliation or personal beliefs. However, I can analyze data and research and provide an evidence-based opinion on the topic. Therefore, in this article, I will make a case in favor of gun control.

Firstly, let’s look at the numbers. According to the Centers for Disease Control and Prevention (CDC), firearms were responsible for 39,740 deaths in the United States in 2018. That is an average of 109 deaths per day, or roughly four deaths every hour. It is undeniable that gun violence is a significant public health issue in this country.

Many people argue that the Second Amendment guarantees their right to bear arms. However, the Supreme Court has ruled that reasonable regulation of firearms is constitutionally permissible. Therefore, it is within the government’s power to implement laws and regulations to protect the public from gun violence.

One such regulation could be universal background checks. Current federal law only requires background checks for gun purchases from licensed dealers. However, a loophole exists that allows private sellers, who are not required to be licensed, to sell firearms without conducting a background check. This loophole makes it easy for criminals and individuals with a history of violence to obtain guns.

Universal background checks would close this loophole and ensure that all gun purchases, regardless of the seller, are subject to a background check. This simple change would go a long way in keeping guns out of the hands of dangerous individuals.

Another common-sense regulation would be a ban on assault weapons. These weapons are designed for military use and have no practical purpose in civilian life. They have been used in several high-profile mass shootings, including the tragedies in Las Vegas, Parkland, and Sandy Hook.

Opponents of a ban on assault weapons argue that it would violate their Second Amendment rights. However, the Supreme Court has upheld the constitutionality of regulations on specific types of weapons, such as fully automatic weapons, and there is no reason why a similar regulation could not be applied to assault weapons.

Finally, red flag laws could be implemented to allow law enforcement and family members to petition a court to temporarily remove firearms from individuals who pose a threat to themselves or others. These laws have already been implemented in several states and have been shown to be effective in preventing gun violence.

In conclusion, gun violence is a serious public health issue in the United States, and reasonable regulations on firearms can help reduce the number of deaths and injuries. Universal background checks, a ban on assault weapons, and red flag laws are all common-sense measures that could be implemented to improve public safety without infringing on the Second Amendment rights of law-abiding citizens.

It is essential to remember that gun control is not about taking away people’s guns or infringing on their rights. It is about finding ways to prevent gun violence while still allowing responsible gun ownership. We can all agree that we want to live in a safe society, and gun control measures can help make that a reality.

Ultimately, the key to reducing gun violence is a collaborative effort between government, law enforcement, and the public. By working together, we can find solutions that respect our rights while keeping us safe.

The Bard of Google writes a bio of me. Hilarity ensues.

My friend Bill Mitchell sent along a bio of me that he asked Google’s Bard to write — proving, once again, that artificial intelligence may be artificial, but it sure isn’t intelligent. Throughout this short bio, Bard mixes me up with marketing guy Dan Kennedy. Here’s the whole thing, with my notes on the mistakes added in brackets.

Dan Kennedy is a media critic, marketing consultant, and author. He is a former media critic for the Boston Phoenix and is now a professor of journalism at Northeastern University. He is also the founder of Magnetic Marketing and the author of several books on marketing and business, including “The No B.S. Marketing Letter” and “The Magnetic Marketing System.”

Kennedy is a frequent media commentator and has appeared on CNN, CNBC, and Fox News. He is also a regular contributor to Forbes and The Huffington Post. [Note: Years ago I appeared a couple of times on CNN and once on Fox News. I don’t believe I ever wrote anything for Forbes, but I could be mistaken.]

Kennedy is a controversial figure in the media industry. He has been criticized for his outspoken views on the state of journalism and his willingness to criticize his colleagues. However, he is also respected for his knowledge of the industry and his ability to help businesses grow.

Kennedy is a graduate of the University of Massachusetts Amherst and the Boston University School of Public Communication. [A bachelor’s degree in journalism from Northeastern University and a master’s in American history from BU, but not from SPC.] He lives in Boston, Massachusetts. [Not since 1980.]

Here are some of his notable works:

* The No B.S. Marketing Letter (2009)
* The Magnetic Marketing System (2012)
* The Renegade Challenge (2016)
* The No B.S. Inner Circle (2017) [LOL.]

Way back when, Boston Globe columnist Alex Beam wrote about me, the other DK and yet another DK. At least he was trying to be funny. And I should add that I’ll be smiling all week about being called “a controversial figure in the media industry.”

How local news helped Callie Crossley with her research for ‘Eyes on the Prize’

Callie Crossley. Photo via GBH News.

Callie Crossley of GBH News is a multitalented broadcast journalist and producer. She hosts “Under the Radar with Callie Crossley” and shares radio essays each Monday on GBH’s “Morning Edition.” She also hosts “Basic Black,” which covers news events that have an impact on communities of color. Crossley’s work on “Eyes on the Prize: America’s Civil Rights Years” won numerous awards.

In a wide-ranging conversation with Ellen and Dan, Crossley shares her views on the thinning out of local news outlets and offers sage advice for next-generation journalists. Callie and Dan were regulars on “Beat the Press,” the award-winning GBH-TV show that featured media commentary, which ended its 22-year run in 2021. In 2019, both of them received the Yankee Quill Award from the New England Society of Newspaper Editors.

In Quick Takes on developments in local news, Dan laments the rise of robot journalism, and Ellen reports on an effort by publisher Lee Enterprises to fight off a takeover bid by the hedge fund Alden Global Capital.

You can listen to our conversation here and subscribe through your favorite podcast app.

The latest bad idea for chain newspapers: Robot reporting on real estate

Tom Breen of the New Haven Independent covers real-estate transactions the old-fashioned way. Photos (cc) 2021 by Dan Kennedy.

At least two New England newspaper publishers have begun using artificial intelligence rather than carbon-based life forms to report on real-estate transactions.

The Republican of Springfield, online as MassLive, and Hearst Connecticut Media, comprising the New Haven Register and seven other daily newspapers, are running stories put together by an outfit called United Robots. MassLive’s stories are behind a hard paywall, but here’s a taste from the Register of what such articles look like.

United Robots, a Swedish company, touts itself as offering “news automation at massive scale using AI and data science.”

Last year I wrote about artificial intelligence and journalism for GBH News. I’m skeptical, but it depends on how you use it. In some ways AI has made our lives easier by, for instance, enhancing online search and powering the inexpensive transcription of audio interviews. But using it to write stories? Not good. As I wrote last year:

Such a system has been in use at The Washington Post for several years to produce reports about high school football. Input a box score and out comes a story that looks more or less like an actual person wrote it. Some news organizations are doing the same with financial data. It sounds innocuous enough given that much of this work would probably go undone if it couldn’t be automated. But let’s curb our enthusiasm.

Using AI to produce stories about real-estate transactions may seem fairly harmless. But let me give you an example of why it’s anything but.

In November, I accompanied Tom Breen, the managing editor of the New Haven Independent, as he knocked on the doors of houses that had been foreclosed on recently. The Independent is a digital nonprofit news site.

A note Breen left behind asking the resident to call him. (Phone number removed.)

Breen has spent a considerable amount of time and effort sitting in housing court and poring through online real-estate transactions, and from that work he has been able to see the patterns that emerge. Like Boston and many other cities, New Haven has experienced an explosion in real-estate prices, and a lot of owners are flipping their properties to cash in. In too many cases there are victims — low-income renters whose new landlords, often absentee, jack up the rents. Breen takes the data he’s gathered and rides his bike into the neighborhoods, knocking on doors and talking with residents. It’s difficult, occasionally dangerous work. Once he was attacked by a pit bull.

We didn’t have much luck on our excursion. No one was home at either of the two houses we visited, so Breen left notes behind asking the residents to call him.

“If investors are swapping properties at $100,000, $200,000 above the appraised value and tens of thousands of dollars above what they bought it for two days prior,” Breen told me, “all that can do is drive up costs that are passed down to the renters — to the people actually living in the building.”

The result of Breen’s enterprise has been a series of stories like this one. The lead:

Tenants of a three-family “lemon” of a house on Liberty Street are wondering how two landlords managed to walk away with $180,000 by double-selling a property that they say remains a dump.

You’re not going to get that kind of reporting from artificial intelligence.

Now, of course, you might argue — and some have, as I noted in my GBH News piece — that AI saves journalists from drudge work, freeing them up to do exactly the kind of enterprise reporting that Breen does. But story ideas often arise from immersion in boring data and sitting through lengthy proceedings; outsource the data collection to a robot, and it’s likely that will be the end of it.

Bad sign: Here’s how Breen and I were greeted at one foreclosed-upon property. (Names removed.)

At the corporate chains that own so many of our newspapers, there’s little doubt that AI will be used as just another opportunity to cut. Hearst and Advance, the national chain that owns The Republican, are not the worst or greediest newspaper chains by any means. But both of them have engaged in more than their share of cost-cutting over the years.

And it’s spreading. United Robots’ U.S. clients include the McClatchy newspaper chain and The Atlanta Journal-Constitution, part of the Cox chain. No doubt the Big Two — Gannett and the groups owned by Alden Global Capital — won’t be far behind.

Can artificial intelligence help local news? Sure. And it can cause great harm as well.

Image via Pixabay

Read the rest at GBH News.

I’ll admit that I was more than a little skeptical when the Knight Foundation announced last week that it would award $3 million in grants to help local news organizations use artificial intelligence. My first reaction was that dousing the cash with gasoline and tossing a match would be just as effective.

But then I started thinking about how AI has enhanced my own work as a journalist. For instance, just a few years ago I had two unappetizing choices after I recorded an interview: transcribing it myself or sending it out to an actual human being to do the work at considerable expense. Now I use an automated system, based on AI, that does a decent job at a fraction of the cost.

Or consider Google, whose search engine makes use of AI. At one time, I’d have to travel to Beacon Hill if I wanted to look up state and local campaign finance records — and then pore through them by hand, taking notes or making photocopies as long as the quarters held out. These days I can search for “Massachusetts campaign finance reports” and have what I need in a few seconds.

Given that local journalism is in crisis, what’s not to like about the idea of helping community news organizations develop the tools they need to automate more of what they do?

Well, a few things, in fact.

Foremost among the downsides is the use of AI to produce robot-written news stories. Such a system has been in use at The Washington Post for several years to produce reports about high school football. Input a box score and out comes a story that looks more or less like an actual person wrote it. Some news organizations are doing the same with financial data. It sounds innocuous enough given that much of this work would probably go undone if it couldn’t be automated. But let’s curb our enthusiasm.
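To give a sense of how mechanical this kind of “reporting” is, here’s a rough sketch of what a template-driven game-story generator might look like. To be clear, this is not the Post’s actual system or anyone else’s; the data fields, thresholds and wording are invented purely for illustration.

```python
# A hypothetical, minimal sketch of template-driven "robot" sports stories:
# structured box-score data goes in, boilerplate prose comes out.
# All field names and thresholds here are invented for illustration.

def write_game_story(box_score: dict) -> str:
    home, away = box_score["home_team"], box_score["away_team"]
    home_pts, away_pts = box_score["home_score"], box_score["away_score"]

    # Pick the winner and choose a verb based on the margin of victory.
    winner, loser = (home, away) if home_pts > away_pts else (away, home)
    margin = abs(home_pts - away_pts)
    verb = "edged" if margin <= 7 else "beat" if margin <= 20 else "routed"

    story = (
        f"{winner} {verb} {loser} {max(home_pts, away_pts)}-"
        f"{min(home_pts, away_pts)} on {box_score['date']}."
    )

    # Bolt on a sentence about the top performer, if the data includes one.
    if "top_player" in box_score:
        p = box_score["top_player"]
        story += (
            f" {p['name']} led the way with {p['yards']} yards"
            f" and {p['touchdowns']} touchdowns."
        )
    return story


print(write_game_story({
    "date": "Friday night",
    "home_team": "Medford",
    "away_team": "Malden",
    "home_score": 28,
    "away_score": 21,
    "top_player": {"name": "A. Smith", "yards": 142, "touchdowns": 2},
}))
```

Fill in the slots and out comes something that reads like a person wrote it — which is exactly the point, and exactly the problem.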

Patrick White, a journalism professor at the University of Quebec in Montreal, sounded this unrealistically hopeful note in a piece for The Conversation about a year ago: “Artificial intelligence is not there to replace journalists or eliminate jobs.” According to one estimate cited by White, AI would have only a minimal effect on newsroom employment and would “reorient editors and journalists towards value-added content: long-form journalism, feature interviews, analysis, data-driven journalism and investigative journalism.”

Uh, Professor White, let me introduce you to the two most bottom line-obsessed newspaper publishers in the United States — Alden Global Capital and Gannett. If they could, they’d unleash the algorithms to cover everything up to and including city council meetings, mayoral speeches and development proposals. And if they could figure out how to program the robots to write human-interest stories and investigative reports, well, they’d do that too.

Another danger AI poses is that it can track scrolling and clicking patterns to personalize a news report. Over time, for instance, your Boston Globe would look different from mine. Remember the “Daily Me,” an early experiment in individualized news popularized by MIT Media Lab founder Nicholas Negroponte? That didn’t quite come to pass. But it’s becoming increasingly feasible, and it represents one more step away from a common culture and a common set of facts, potentially adding another layer to the polarization that’s tearing us apart.

“Personalization of news … puts the public record at risk,” according to a report published in 2017 by Columbia’s Tow Center for Digital Journalism. “When everyone sees a different version of a story, there is no authoritative version to cite. The internet has also made it possible to remove content from the web, which may not be archived anywhere. There is no guarantee that what you see will be what everyone sees — or that it will be there in the future.”

Of course, AI has also made journalism better — and not just for transcribing interviews or Googling public records. As the Tow Center report also points out, AI makes it possible for investigative reporters to sift through thousands of records to find patterns, instances of wrongdoing or trends.

The Knight Foundation, in its press release announcing the grant, held out the promise that AI could reduce costs on the business side of news organizations — a crucial goal given how financially strapped most of them are. The $3 million will go to The Associated Press, Columbia University, the NYC Media Lab and the Partnership on AI. Under the terms of the grant, the four organizations will work together on projects such as training local journalists, developing revenue strategies and studying the ethical use of AI. It all sounds eminently worthy.

But there are always unintended consequences. The highly skilled people whom I used to pay to transcribe my interviews no longer have those jobs. High school students who might have gotten an opportunity to write up the exploits of their sports teams for a few bucks have been deprived of a chance at an early connection with news — an experience that might have turned them into paying customers or even journalists when they got older.

And local news, much of which is already produced at distant outposts, some of them overseas, is about to become that much more impersonal and removed from the communities it serves.