What does it mean to “publish” something? In the pre-social media era, that question was easy enough to answer. It became a little more complicated in 1996, when Congress passed Section 230 of the Communications Decency Act, which protects internet providers from liability for any third-party content that might be posted on their sites.
But those early online publishers were newspapers and other news organizations as well as early online services such as CompuServe, AOL and Prodigy. None of them was trying to promote certain types of third-party content in order to drive up engagement and, thus, ad revenues.
Today, of course, that’s the whole point. Social media companies such as Meta (Facebook, Instagram and Threads), Twitter and TikTok employ sophisticated algorithms that figure out what kind of content you are more likely to engage with so they can show you more of it. Such practices have been linked to, among other things, genocide in Myanmar as well as depression and other mental health issues.
So again, what does it mean to “publish”? I’ve argued since as far back as 2017 that elevating some third-party content over others could be considered publication rather than simply acting as a passive receptacle of whatever stuff comes in over the digital transom.
A print publication, after all, is legally responsible for everything it contains, including ads (the landmark Times v. Sullivan libel decision involved an advertisement) and letters to the editor. It would be neither practical nor desirable to hold social media companies responsible for all third-party content. But again, if they are boosting some content to make it more visible because they (or, rather, their unblinking algorithms) think it will get them more engagement and make them more money, how is that not an act of publishing? Why should it be protected by federal law?
Earlier this week, investigative journalist Julia Angwin wrote an op-ed piece for The New York Times (gift link) arguing that the tide may be turning against the social media giants, in part because of TikTok’s aggressive use of its algorithmic “For You” feed, which has been emulated by the other platforms. A showdown over Section 230 may be headed for the Supreme Court. She writes:
If tech platforms are actively shaping our experiences, after all, maybe they should be held liable for creating experiences that damage our bodies, our children, our communities and our democracy….
My hope is that the erection of new legal guardrails would create incentives to build platforms that give control back to users. It could be a win-win: We get to decide what we see, and they get to limit their liability.
I don’t think there’s a good-faith argument to be made that reforming Section 230 would harm the First Amendment. We would still have the right to publish freely, subject to long-existing prohibitions against libel, incitement, serious breaches of national security and obscenity. And internet providers would still be held harmless for any content posted by their users. But it would end the legal absurdity that a tech platform can boost harmful content and then claim immunity because that content originated with someone else. (Ironically, those third-party posters are fully liable for their content if they can be identified and tracked down.)
As Angwin notes, Ethan Zuckerman of UMass Amherst, a respected thinker about all things digital, is suing Meta for the right to develop software that would allow users to control their own experience on Facebook. Angwin also touts Bluesky, a Twitter alternative that allows its users to design their own feeds (you can find me at @dankennedy-nu.bsky.social).
We should all have the right to freedom of speech and freedom of the press. But the platforms that control so much of our lives should have the same freedoms that the rest of us have — and that should not include the freedom to boost harmful content without any legal consequences because of the fiction that they are not engaged in an act of publishing. It’s long past time to make some changes to Section 230.
Olivia Nuzzi departs
Olivia Nuzzi’s separation agreement with New York magazine was heavily lawyered, according to reports, and that shouldn’t come as a surprise to anyone. But the magazine’s statement that its law firm found “no inaccuracies nor evidence of bias” in her work needs to be placed in context. Liam Reilly and Hadas Gold of CNN report on Nuzzi’s departure.
Nuzzi, you may recall, was involved in some sort of sexual (but not physical) relationship with Robert F. Kennedy Jr. that may have encompassed sexting and nude selfies — we still don’t know.
But as I wrote last month, after Nuzzi’s relationship with Kennedy became public, she wrote a very tough piece about President Biden’s alleged age-related infirmities while Kennedy was still a presidential candidate and an oddly sympathetic profile of Donald Trump after Kennedy had left the race, endorsed Trump and made it clear that he was hoping for a high-level job in a Trump White House.
Maybe Nuzzi would have written those two stories exactly the same way even if she had never met Kennedy. But we’ll never know.
Media notes
• Billionaire ambitions. Benjamin Mullin of The New York Times reports (gift link) that a Florida billionaire named David Hoffmann has bought 5% of the cost-cutting Lee Enterprises newspaper chain, and that he hopes to help revive the local news business. “These local newspapers are really important to these communities,” Hoffmann told Mullin. “With the digital age and technology, it’s changing rapidly. But I think there’s room for both, and we’d like to be a part of that.” Lee owns media properties in 73 U.S. markets, including well-known titles such as the St. Louis Post-Dispatch and The Buffalo News.
• Silent treatment. Patrick Soon-Shiong, whose ownership of the Los Angeles Times has been defined by vaulting ambitions and devastating cuts, has stumbled once again. Max Tani of Semafor reports that the Times will not endorse in this year’s presidential contest, even though it published endorsements in state and local races just last week. The decision to abstain from choosing between Kamala Harris and Donald Trump, Tani writes, came straight from Soon-Shiong, who made his wealth in the health-care sector. Closer to home, The Boston Globe endorsed Harris earlier this week.
• Reaching young voters. Santa Cruz Local, a digital nonprofit, has announced an ambitious idea to engage with young people: news delivered by text messages and Instagram. “We want to reach thousands of students with civic news and help first time voters get to the ballot box,” writes Kara Meyberg Guzman, the Local’s co-founder and CEO. The Local’s Instagram-first election guide will be aimed at 18- to 29-year-olds in Santa Cruz County, with an emphasis on reaching local college students; Guzman is attempting to raise $10,000 in order to fund it. Santa Cruz Local was one of 205 local news organizations to receive a $100,000 grant from Press Forward last week. Guzman was also interviewed in the book that Ellen Clegg and I wrote, “What Works in Community News,” and on our podcast.
“But again, if they are boosting some content to make it more visible because they (or, rather, their unblinking algorithms) think it will get them more engagement and make them more money, how is that not an act of publishing?”
I don’t disagree with Dan’s overall point, but imo arguing about whether content curation or promotion equals publishing just confuses things. Bookstores, magazine stands, and libraries have long enjoyed protection from liability over third-party content even though they do lots of curation and promotion of the works they sell or lend. Curation or promotion of content alone doesn’t make them publishers.
IMO, the publisher/not-a-publisher distinction just isn’t particularly useful in the brave new world of algorithmic content curation and promotion. Instead, maybe we should focus on the specific harmful activities and conduct that social media companies engage in, and think about what changes to laws are required to better manage this conduct.
Great piece! I disagree with this sentence: “It would be neither practical nor desirable to hold social media companies responsible for all third-party content.” Facebook, Twitter, etc. are different from “news organizations” in only one way, in my opinion: They reach a much bigger audience and make a lot more money, and therefore should be more, not less, legally responsible for the harm caused by the material they publish for profit. What might not be practical is having those platforms operate at the hyper-scale they do, when so much work must be automated (like insanely inaccurate and ineffective AI- and algorithm-based moderation of comments and postings). So what if that’s not practical? It’s evil and counterproductive to operate that way, without benefiting society, so it wouldn’t be the end of the world to return to smaller platforms where humans are responsible for what they publish.