Facebook’s brain-dead algorithm is censoring important public-safety information, reports WHAV in Haverhill, Massachusetts. WHAV is a nonprofit news organization with a low-power/online radio station as well as a website.
According to WHAV general manager Tim Coco:
In the last week, a local news warning about the sinkhole along the southbound lanes of Interstate 495 near Ward Hill was flagged as spam and removed by one social media site. Another blocked WHAV story was news of possible restoration of Haverhill’s 1845-era (gun) powder house. The tech giant behind these removals piles on with intimidation by writing “Repeatedly breaking our rules can cause more account restrictions.”
Even more mind-boggling, Coco writes, is that when the Haverhill Police Department attempted to share WHAV’s item on the sinkhole, Facebook removed that, too.
This is nothing new for Facebook. In “What Works in Community News,” the book that Ellen Clegg and I wrote, we tell the story of an emergency evacuation route that the sheriff’s department had shared with The Mendocino Voice in Northern California during a wildfire. The Voice posted it on its Facebook page, one of its primary distribution channels — and then watched in alarm as it disappeared. In the excerpt below, we talked with Kate Maxwell, then the publisher of the Voice, and Adrian Fernandez Baumann, then the editor:
The sheriff’s department asked the Voice to get the word out that people living in the national forest would run into danger if they tried to evacuate through the nearby community of Covelo. It was potentially lifesaving information, but Facebook took it down. “It had like a thousand shares in an hour,” said Maxwell. “Facebook flagged that post and deleted it.” The article was restored about a half-hour later following an uproar from the community. Maxwell said she never got a good explanation of what happened, even after talking with someone from Facebook at a conference. Maybe it was because the algorithms identified it as fake news. Maybe, as Baumann speculated, it was because the article included a reference to “Indian Dick Road.”
Coco doesn’t identify Facebook as the culprit, but the screenshots that he posted are clearly from that platform. He’s asking his readers and listeners in the Haverhill area to stop relying on social media for WHAV stories and instead to subscribe directly to the news outlet’s daily email newsletter.
Coco, by the way, is in our book and has been a guest on the “What Works” podcast.
Facebook’s parent company, Meta, is also getting swamped with complaints about entirely harmless posts being removed from its Threads platform because of algorithmic decisions made with no human involvement. I can speak from personal experience, too. Twice over the past year or so, I’ve responded to questions about great song lyrics by going with “I shot a man in Reno just to watch him die,” from Johnny Cash’s “Folsom Prison Blues.” Both times, my posts were removed from Facebook and Threads, and I was given a warning.
So let me repeat something I’ve said a number of times: News organizations should not rely on social media any more than absolutely necessary. Do what Coco is doing: Push newsletter subscriptions, because that’s a platform that you control and own.