By Dan Kennedy • The press, politics, technology, culture and other passions


Why Facebook’s new oversight board is destined to be an exercise in futility

Former Guardian editor Alan Rusbridger is among the board members. Photo (cc) 2012 by Internaz.

Previously published at WGBHNews.org.

To illustrate how useless the newly unveiled Facebook oversight board will be, consider the top 10 fake-news stories shared by its users in 2019.

As reported by Business Insider, the list included such classics as “NYC Coroner who Declared Epstein death ‘Suicide’ worked for the Clinton foundation making 500k a year up until 2015,” “Omar [as in U.S. Rep. Ilhan Omar] Holding Secret Fundraisers with Islamic Groups Tied to Terror,” and “Pelosi Diverts $2.4 Billion From Social Security To Cover Impeachment Costs.”

None of these stories was even remotely true. Yet none of them would have been removed by the oversight board. You see, as Mathew Ingram pointed out in his Columbia Journalism Review newsletter, the 20-member board is charged only with deciding whether content that has already been taken down should be restored.

Now, it’s fair to acknowledge that Facebook CEO Mark Zuckerberg has an impossible task in bringing his Frankenstein’s monster under control. But that doesn’t mean any actual good is going to come of this exercise.

The board, which will eventually be expanded to 40, includes a number of distinguished people. Among them: Alan Rusbridger, the respected former editor of The Guardian, as well as international dignitaries and a Nobel Prize laureate. It has independent funding, Zuckerberg has agreed that its decisions will be binding, and eventually its purview may expand to removing false content.

But, fundamentally, this can’t work because Facebook was not designed to be controllable. In The New York Times, technology columnist Kara Swisher explained the problem succinctly. “Facebook’s problems are structural in nature,” she wrote. “It is evolving precisely as it was designed to, much the same way the coronavirus is doing what it is meant to do. And that becomes a problem when some of what flows through the Facebook system — let’s be fair in saying that much of it is entirely benign and anodyne — leads to dangerous and even deadly outcomes.”

It’s not really about the content. Stop me if you’ve heard this before, but what makes Facebook a threat to democracy is the way it serves up that content. Its algorithms — which are not well understood by anyone, even at Facebook — are aimed at keeping you engaged so that you stay on the site. And the most effective way to drive engagement is to show users content that makes them angry and upset.

Are you a hardcore supporter of President Donald Trump? If so, you are likely to see memes suggesting that COVID-19 is some sort of Democratic plot to defeat him for re-election — as was the case with a recent semi-fake-news story reporting that hospitals are being paid to attribute illnesses and deaths to the coronavirus even when they’re not. Or links to the right-wing website PJ Media aimed at stirring up outrage over “weed, opioids, booze and ciggies” being given to homeless people in San Francisco who’ve been quarantined. If you are a Trump opponent, you can count on Occupy Democrats to pop up in your feed and keep you in a constant state of agitation.

Now, keep in mind that all of this — even the fake stuff — is free speech that’s protected by the First Amendment. And all of this, plus much worse, is readily available on the open web. What makes Facebook so pernicious is that it amplifies the most divisive speech so that you’ll stay longer and be exposed to more advertising.

What is the oversight board going to do about this? Nothing.

“The new Facebook review board will have no influence over anything that really matters in the world,” wrote longtime Facebook critic Siva Vaidhyanathan at Wired, adding: “The board can’t say anything about the toxic content that Facebook allows and promotes on the site. It will have no authority over advertising or the massive surveillance that makes Facebook ads so valuable. It won’t curb disinformation campaigns or dangerous conspiracies…. And most importantly, the board will have no say over how the algorithms work and thus what gets amplified or muffled by the real power of Facebook.”

In fact, Facebook’s algorithm has already been trained to remove some speech or post warning labels on it. In practice, though, such mechanized censorship is aggravatingly inept. Recently the seal of disapproval was slapped on an ad called “Mourning in America” by the Lincoln Project, a group of “Never Trump” Republicans, because the fact-checking organization PolitiFact had called it partly false. The Lincoln Project, though, claimed that PolitiFact was wrong.

I recently received a warning for posting a photo of Benito Mussolini as a humorous response to a picture of Trump. No doubt the algorithm was too dumb to understand that I was making a political comment and was not expressing my admiration for Il Duce. Others have told me they’ve gotten warnings for referring to trolls as trolls, or for calling unmasked protesters against COVID-19 restrictions “dumber than dirt.”

So what is Facebook good for? I find it useful for staying in touch with family and friends, for promoting my work and for discussing legitimate news stories. Beyond that, much of it is a cesspool of hate speech, fake news and propaganda.

If it were up to me, I’d ban the algorithm. Let people post what they want, but don’t let Facebook robotically weaponize divisive content in order to drive up its profit margins. Zuckerberg himself has said that he expects the government will eventually impose some regulations. Well, this is one way to regulate it without actually making judgments about what speech will be allowed and what speech will be banned.

Meanwhile, I’ll watch with amusement as the oversight board attempts to wrestle this beast into submission. As Kara Swisher said, it “has all the hallmarks of the United Nations, except potentially much less effective.”

The real goal, I suspect, is to provide cover for Zuckerberg and make it appear that Facebook is doing something. In that respect, this initiative may seem harmless — unless it lulls us into complacency about more comprehensive steps that could be taken to reduce the harm that is being inflicted on all of us.

Talk about this post at Facebook.

Facing up to the damage wrought by Facebook

Previously published at The Arts Fuse.

Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, by Siva Vaidhyanathan. Oxford University Press, 288 pages, $24.95.

The reason that Facebook is so evil is that Mark Zuckerberg is so good. According to Siva Vaidhyanathan, a professor of media studies at the University of Virginia, the former wunderkind has drunk deeply of his own Kool-Aid and genuinely believes that his creation is a benevolent force in the world. “Zuckerberg has a vibrant moral passion,” Vaidhyanathan writes in his new book, Antisocial Media. “But he lacks a historical sense of the horrible things that humans are capable of doing to each other and the planet.”

From propagating fake news to violating our privacy, from empowering authoritarian regimes to enabling anti-Semitic advertising, Facebook has become the social network everyone loves to hate. Vaidhyanathan, whose previous books include The Googlization of Everything — and Why We Should Worry (2011), has produced a valuable guide, written in clear, non-academic prose, to the monstrous force Facebook has become. And if his overview of what’s gone wrong with Facebook will seem familiar to those of us who obsess about these things, it nevertheless serves as a worthwhile introduction to the Zuckerborg and all that it has wrought. If only Vaidhyanathan had some compelling ideas on what to do about it. If only any of us did.

Facebook’s malign omnipresence came about quickly. Founded in 2004, the company didn’t become a behemoth until the dawn of the current decade. With 2.2 billion monthly active users, Facebook is, for many people, synonymous with the internet itself — the place where your aunt and uncle share photos of their pets, updates from their vacations, and, of course, links to memes and conspiracy theories about George Soros’s non-existent Nazi past and the “deep state” plot to overthrow President Trump.

Such craziness has serious real-world consequences. It may not be an exaggeration to say that Trump became president partly because of Facebook, as Russian propagandists, Cambridge Analytica, and the Trump campaign itself all bought ads to bolster Trump’s message and to persuade possible Hillary Clinton voters to stay home on Election Day. The Facebook effect was probably not as powerful as James Comey’s bizarre obsession with Clinton’s emails — or, for that matter, Electoral College math. But given that Trump was elected by just a handful of votes in a few swing states, it seems plausible that Clinton might otherwise have overcome those obstacles.

There’s nothing new about political advertising, even if Facebook’s tools for microtargeting tiny slices of users based on the information they themselves have provided are unusually precise and pernicious. More ominous, Vaidhyanathan argues, is that the Facebook environment encourages the sort of fragmented thinking and emotional reactions that are antithetical to healthy civic engagement and that help give rise to an authoritarian figure like Trump. And since Facebook’s algorithm is designed to give you more of the type of content that you interact with, you become increasingly sealed off from viewpoints you don’t agree with. Vaidhyanathan’s attempt to shoehorn Trump into his overarching theory of Facebook is a bit awkward given that Trump’s social-media drug of choice is Twitter. Nevertheless, he is surely on to something in arguing that the reductive discourse that characterizes Facebook helped fuel Trump’s rise.

“After a decade of deep and constant engagement with Facebook, Americans have been conditioned to experience the world Trump style,” Vaidhyanathan writes. “It’s almost as if Trump were designed for Facebook and Facebook were designed for him. Facebook helped make America ready for Trump.”

Vaidhyanathan is not the first to take note of the distractedness that has come to define the digital age. Nicholas Carr, in his 2010 book The Shallows: What the Internet Is Doing to Our Brains, laments that the internet has given rise to a culture of skimming rather than deep reading and warns: “As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it — and, eventually, if we use it enough, it changes who we are, as individuals and as a society.” Carr barely mentions Facebook, which at the time had not yet become a hegemonic force. But there is little doubt that it has only accelerated those trends.

So what is to be done? In a healthier political climate, Vaidhyanathan writes, we might expect our elected officials to act — by mandating greater privacy protections and by forcing Facebook to sell off some of its related businesses such as Instagram, WhatsApp, and Messenger. But he holds out little hope, even though Europe is moving in that direction. And he identifies a specific reason for his pessimism by describing two competing philosophies of corporate leadership in the United States, neither suited to dealing with the menace we face. One, market fundamentalism, holds that the sole obligation of a corporation is to make as much money as possible for its shareholders. The other, the social responsibility model, sees a role for corporations — but not for government — in addressing environmental and cultural concerns and in helping to make the world better. Vaidhyanathan places Facebook squarely within the latter tradition. Remember, he sees Zuckerberg at root as an earnest if misguided idealist.

The problem is that both of these philosophies are based on differing notions of corporate libertarianism. Each exalts the business leader as the exemplar to which society should aspire. By embracing a binary view of the corporation’s role, we have, Vaidhyanathan argues, essentially eliminated the public sphere from the discussion of how to solve universal problems. Rather than looking to elected leaders, we look to people like Bill Gates, Elon Musk, Laurene Powell Jobs, and, yes, Mark Zuckerberg. We embrace “innovation” rather than real progress that benefits everyone. Given the state of our politics, that might seem like logical behavior. But it’s also behavior based on the nostrum popularized by Ronald Reagan that government is the problem, not the solution. Say something often enough over the course of nearly four decades and it becomes true.

There is some hope. Although Vaidhyanathan doesn’t mention it, there are signs that journalism is becoming less dependent on Facebook. According to the web metrics firm Chartbeat, news organizations are seeing a decreasing amount of referral traffic from Facebook and an increasing amount of direct traffic to their websites and other digital platforms. “The increase in direct traffic matters because it enables publishers to control their own destiny,” writes Lucia Moses of Digiday. “They have more data on reader behavior, which enables them to better target readers with more content and offers for subscriptions and other revenue drivers.” Given the parlous state of the news business, any shift away from Facebook is a positive development.

Moreover, there are signs that we have reached peak Facebook, with young people in particular turning away from the service. According to Hanna Kozlowska, writing in Quartz, Facebook usage among 12- to 24-year-olds is declining, and overall usage in the United States and Canada is starting to shrink as well. That’s not to say Facebook is about to go the way of Friendster or MySpace. But perhaps a shrinking user base, combined with the controversy and legal woes Zuckerberg is dealing with over privacy violations and other scandals, will lead to a kinder, gentler Facebook.

Ultimately, Vaidhyanathan says, it’s up to us. “Reviving a healthy social and political life would require a concerted recognition of the damage Facebook has done and a campaign to get beyond its spell,” he writes. “If millions were urged to put Facebook in its proper place, perhaps merely as a source of social and familial contact rather than political knowledge or activism, we could train ourselves out of the habit.” Later he writes: “Resistance is futile. But resistance seems necessary.”

Talk about this post on, well, you know, Facebook.
