
Section 230 of the Communications Decency Act of 1996 protects website owners from liability over third-party content. The classic example would be an anonymous commenter who libels someone. The offended party would be able to sue the commenter but not the publishing platform, although the platform might be required to turn over information that would help identify the commenter.
But where is the line between passively hosting third-party content and actively promoting certain types of material in order to boost engagement and, thus, profitability? That question will go before the Massachusetts Supreme Judicial Court on Friday, reports Jennifer Smith of CommonWealth Beacon.
At issue is a lawsuit brought against Meta by 42 state attorneys general, including Andrea Campbell of Massachusetts. Meta operates Facebook, Instagram, Threads and other social media platforms, and it has long been criticized for using algorithms and other tactics that keep users hooked on content that provokes anger and depression and, in some cases, has even been linked to suicide. Smith writes:
The Massachusetts complaint alleges that Meta violated state consumer protection law and created a public nuisance by deliberately designing Instagram with features like infinite scroll, autoplay, push notifications, and “like” buttons to addict young users, then falsely represented the platform’s safety to the public. The company has also been reckless with age verification, the AG argues, and allowed children under 13 years old to access its content.
Meta and its allies counter that Section 230 protects not just the third-party content they host but also how Facebook et al. display that content to their users.
In an accompanying opinion piece, attorney Megan Iorio of the Electronic Privacy Information Center, computer scientist Laura Edelson of Northeastern University and policy analyst Yaël Eisenstat of Cybersecurity for Democracy argue that Section 230 was not designed to protect website operators who put their thumbs on the scale to favor one type of third-party content over another. As they put it in describing the amicus brief they have filed:
Our brief explains how the platform features at the heart of the Commonwealth’s case — things like infinite scroll, autoplay, the timing and batching of push notifications, and other tactics borrowed from the gambling industry — have nothing to do with content moderation; they are designed to elicit a behavior on the part of the user that furthers the company’s own business goals.
As Smith makes clear, this is a long and complex legal action, and the SJC is being asked to rule only on the narrow question of whether Campbell can move ahead with the lawsuit to which she has lent the state’s support. (Double disclosure: I am a member of CommonWealth Beacon’s editorial advisory board as well as a fellow Northeastern professor.)
I’ve long argued (as I did in this GBH News commentary from 2020) that, just as a matter of logic, favoring some types of content over others is a publishing activity that goes beyond the mere passive hosting of third-party content, and thus website operators should be liable for whatever harm those decisions create. That argument has not found much support in the courts, however. It will be interesting to see how this plays out.





