By Dan Kennedy

Thinking through a social-contract framework for reforming Section 230

Mary Anne Franks. Photo (cc) 2014 by the Internet Education Foundation.

The Lawfare podcasts are doing an excellent job of making sense of complicated issues at the intersection of media and technology. Last week I recommended a discussion of Australia’s new law mandating that Facebook and Google pay for news. Today I want to tell you about an interview with Mary Anne Franks, a law professor at the University of Miami, who is calling for the reform of Section 230 of the Communications Decency Act.

The host, Alan Rozenshtein, guides Franks through a paper she’s written titled “Section 230 and the Anti-Social Contract,” which, as he points out, is short and highly readable. Franks’ overriding argument is that Section 230 — which protects internet services, including platform companies such as Facebook and Twitter, from being sued for what their users post — is a way of entrenching the traditional white male power structure.

That might strike you as a bit much, and, as you’ll hear, Rozenshtein challenges her on it, pointing out that some members of disenfranchised communities have been adamant about retaining Section 230 in order to protect their free-speech rights. Nevertheless, her thesis is elegant, encompassing everyone from Thomas Jefferson to John Perry Barlow, the author of the 1996 document “A Declaration of the Independence of Cyberspace,” of which she takes a dim view. Franks writes:

Section 230 serves as an anti-social contract, replicating and perpetuating long-standing inequalities of gender, race, and class. The power that tech platforms have over individuals can be legitimized only by rejecting the fraudulent contract of Section 230 and instituting principles of consent, reciprocity, and collective responsibility.

So what is to be done? Franks pushes back on Rozenshtein’s suggestion that Section 230 reform has attracted bipartisan support. Republicans such as Donald Trump and Sen. Josh Hawley, she notes, are talking about changes that would force the platforms to publish content whether they want to or not — a nonstarter, since that would be a violation of the First Amendment.

Democrats, on the other hand, are seeking ways to limit the Section 230 protections that the platform companies now enjoy without tearing down the entire law. Again, she writes:

Specifically, a true social contract would require tech platforms to offer transparent and comprehensive information about their products so that individuals can make informed choices about whether to use them. It would also require tech companies to be held accountable for foreseeable harms arising from the use of their platforms and services, instead of being granted preemptive immunity for ignoring or profiting from those harms. Online intermediaries must be held to similar standards as other private businesses, including duty of care and other collective responsibility principles.

Putting a little more meat on the bones, Franks adds that Section 230 should be reformed so as to “deny immunity to any online intermediary that exhibits deliberate indifference to harmful conduct.”

Today’s New York Times offers some details as to what that might look like:

One bill introduced last month would strip the protections from content the companies are paid to distribute, like ads, among other categories. A different proposal, expected to be reintroduced from the last congressional session, would allow people to sue when a platform amplified content linked to terrorism. And another that is likely to return would exempt content from the law only when a platform failed to follow a court’s order to take it down.

Since its passage in 1996, Section 230 has been an incredible boon to internet publishers that open their gates to third-party content. They’re under no obligation to take down material that is libelous or threatening. Quite the contrary: they can make money from it.

This is hardly what the First Amendment envisioned, since publishers in other spheres are legally responsible for every bit of content they put before their audiences, up to and including advertisements and letters to the editor. The internet as we know it would be an impossibility if Section 230 didn’t exist in some form. But it may be time to rein it in, and Franks has put forth a valuable framework for how we might think about that.



Comments

  1. Steve Ross

    Great links. I will carve out some time to explore in detail. I’m certainly leaning toward a rewrite of 230, not only because of the political harm it has caused, but also to protect proactively against such abominations as revenge porn.

    I do note that almost all news organizations air and distribute without limit political attack ads that, at best, are half-truths and at worst are outright lies.

    Newspapers and networks that vet toothpaste ads have no problem when it comes to selling politicians! But every proposal I have ever seen on the matter seems either toothless or draconian.


  2. MagellanNH

    From Franks’ Digital Social Contract paper: “This will require, among other things, amending Section 230(c)(1) to deny immunity to any online intermediary that exhibits deliberate indifference to harmful conduct”

    IMO, Franks’ proposal is extremely radical. It seems to be derived from Critical Race Theory and is unlike anything I’ve ever seen proposed for section 230 reform.

    She goes on to posit a constitutional amendment along the lines of this: “Congress and the several States shall take legislative and other measures to prevent or redress any disadvantage suffered by individuals or groups because of past and/or present inequality as prohibited by this Amendment, and shall take all steps requisite and effective to abolish prior laws, policies, or constitutional provisions that impede equal political representation”

    She continues: “A revolutionary social contract would acknowledge the role that the tech industry has played in sustaining hierarchies of race, class, and gender and in eroding democracy. It would involve tech companies committing to concrete steps to making amends for these harms, including dedicating significant resources to develop innovative practices and policies to combat racism, misogyny, and extremism; increasing accessibility to underserved communities; and funding nonprofit and advocacy efforts to protect the rights and liberties of all people.”

    I don’t even know where to start with a critique of this.

    For me, the main thing that gets missed in Section 230 discussions is that its role was mainly to codify into law the previously existing standards of liability for content distributors. Section 230 mainly resolved the uncertainty about those standards while several cases worked through the legal system. Many experts believe that even a total repeal of Section 230 wouldn’t change the liability picture for platforms very much, because of existing law and precedent.

    As I understand it, prior to the Internet, precedent held that content distributors like newsstands, bookstores, and public libraries weren’t liable for the content of the publications they distributed. The general idea was that liability rested with publishers, not distributors. Some platforms had begun to do some content moderation to improve the quality of their service, much as a bookstore curates the list of books it puts on its shelves. This raised questions about whether platforms were acting like publishers by curating content, and the legal system was working through those questions. One thing is clear: if platforms didn’t moderate content at all, they wouldn’t be liable even if 230 were repealed. The grey area comes in only when they start moderating.

    So one question I have for people who think Section 230 is a gift to platforms is whether they think Target, Walmart, and Amazon should be held liable for the contents of the physical books, magazines, and newspapers that they sell. If not, does Amazon banning marketplace sellers from selling Dr. Seuss books (content moderation) change your mind about this?

    • Dan Kennedy

      I would argue that Facebook is a publisher masquerading as a neutral distributor. Any protections it has for third-party content go beyond what publishers normally enjoy. We need some changes.

      • MagellanNH

        >> Any protections it has for third-party content go beyond what publishers normally enjoy.
        Is that true? I’m not sure about letters to the editor, but I thought comments and other third-party content on the NYT site enjoy the same protections as third-party content on any non-publisher site. Do I have that wrong?

        Also, is there something specific that Facebook does, compared to, say, a generic blog comments section like this one, that earns them a publisher designation? Is it their content moderation, their feed algorithms, or something else? Does this opinion also apply to other social sites like twitter, reddit, youtube, tiktok, etc.? What about sites that host third-party blogs and podcasts like wordpress.org, apple podcasts, spotify, audible, patreon, and substack?

        IMO, whatever standards are set for facebook should be justifiable using generic reasoning, not just because they’re the big bad wolf right now that everyone hates.

        (full disclosure – I’ve never had a personal facebook account and think the world would probably be better off if facebook just decided to shut itself down tomorrow).

      • Dan Kennedy

        @MagellanNH: It’s all the same. All publishers, from Facebook to the New York Times to the New Haven Independent, are legally responsible for every piece of content they publish — articles, letters, ads — except content posted online by third parties. Remember, Section 230 predates Facebook by quite a few years, and was originally intended with newspaper websites in mind. I cite @pauljbass below because he’s the editor and founder of the Independent. Read his comment. It’s important. I do think online publishers need *some* protection for third-party content, and maybe more than Franks wants to see. But the current situation is chaotic hell.

        • pauljbass

          I think we should have zero protection for third-party content. I think that is a cop-out.

  3. pauljbass

    I have never understood why social media publishers (or corporate news organizations that don’t want to pay staff to monitor comments) should avoid the same libel responsibility we have had as news publishers for content in our stories and “letters.” Libel law, when not abused through SLAPPs, plays such a vital role IMHO.

  4. Steve Ross

    Keep in mind that regular publishers generally avoid liability for inaccurate and outright fake political advertising and politically planted articles, in the name of “well, the opponent can pay for an ad to rebut,” and with the understanding that the target of a lie has to prove the lie was deliberate. Old media taught new media how to be obnoxious and cruel.
