From COVID to our crisis of democracy, 2021 turned out to be a scant improvement over 2020

Photo (cc) 2021 by Blink O’fanaye

Previously published at GBH News.

Hopes were running high when we all turned the calendar to 2021. Would the worst 12 months in anyone’s memory give way to the best year of our lives?

Not quite. Yes, it was better than 2020, but 2021 was hardly a return to paradise. The joy of vaccinations gave way to the reality that COVID-19 is likely to be with us for a long time. The economy recovered rapidly — accompanied by the highest rate of inflation in 40 years. Worst of all, the end of the Trump presidency morphed into a crisis of democracy that is starting to look as ominous as the run-up to the Civil War.

During the past year, I’ve been struggling to make sense of the highs, the lows and the in-betweens through the prism of the media. Below are 10 of my GBH News columns from 2021. They’re in chronological order, with updates on many of the pieces posted earlier this year. If there’s a unifying theme, it’s that we’re in real trouble — but that, together, we can get through this.

The end of the Trump bump, Jan. 27. Even as he was denouncing journalists as “enemies of the people,” Donald Trump, both before and during his presidency, was very, very good for the media. Cable TV ratings soared. The New York Times and The Washington Post signed up subscribers by the bucketload. Several weeks after Trump departed from the White House, though, there were questions about what would happen once he was gone. We soon got an answer. Even though Trump never really left, news consumption shrank considerably. That may be good for our mental health. But for media executives trying to make next quarter’s numbers, it was an unpleasant new reality.

Local news in crisis, Feb. 23. The plague of hedge funds undermining community journalism continued unabated in 2021. The worst newspaper owner of them all, Alden Global Capital, acquired Tribune Publishing and its eight major-market papers, which include the Chicago Tribune, New York’s Daily News and, closer to home, the Hartford Courant. When the bid was first announced, there was at least some hope that one of those papers, The Baltimore Sun, would be spun off. Unfortunately, an epic battle between Alden and Baltimore hotel mogul Stewart Bainum resulted in Alden grabbing all of them. Bainum, meanwhile, is planning to launch a nonprofit website to compete with the Sun that will be called The Baltimore Banner.

The devolution of Tucker Carlson, April 15. How did a stylish magazine writer with a libertarian bent reinvent himself as a white-supremacist Fox News personality in thrall to Trump and catering to dangerous conspiracy theories ranging from vaccines (bad) to the Jan. 6 insurrection (good)? There are millions of possible explanations, and every one of them has a picture of George Washington on it. Carlson got in trouble last spring — or would have gotten in trouble if anyone at Fox cared — when he endorsed “replacement theory,” a toxic trope that liberal elites are deliberately encouraging immigration in order to dilute the power of white voters. A multitude of advertisers have bailed on Carlson, but it doesn’t matter — Fox today makes most of its money from cable fees. And Carlson continues to spew his hate.

How Black Lives Matter exposed journalism, May 26. A teenager named Darnella Frazier exposed an important truth about how reporters cover the police. The video she recorded of Minneapolis police officer Derek Chauvin literally squeezing the life out of George Floyd as he lay on the pavement proved that the police lied in their official report of what led to Floyd’s death. For generations, journalists have relied on law enforcement as their principal — and often only — source for news involving the police. That’s no longer good enough; in fact, it was never good enough. Frazier won a Pulitzer Prize for her courageous truth-telling. And journalists everywhere were confronted with the reality that they need to change the way they do their jobs.

The 24th annual New England Muzzle Awards, July 1. For 24 years, the Muzzle Awards have singled out enemies of free speech. The Fourth of July feature made its debut in The Boston Phoenix in 1998 and has been hosted by GBH News since 2013, the year that the Phoenix shut down. This year’s lead item was about police brutality directed at Black Lives Matter protesters in Boston and Worcester the year before — actions that had escaped scrutiny at the time but that were exposed by bodycam video obtained by The Appeal, a nonprofit news organization. Other winners of this dubious distinction included former Boston Mayor Marty Walsh, retired Harvard Law School professor Alan Dershowitz and the aforementioned Tucker Carlson, who unleashed his mob to terrorize two freelance journalists in Maine.

How to help save local news, July 28. Since 2004, some 2,100 newspapers have closed, leaving around 1,800 communities across the country bereft of coverage. It’s a disaster for democracy, and the situation is only growing worse. The Local Journalism Sustainability Act, a bipartisan proposal to provide indirect government assistance in the form of tax credits for subscribers, advertisers and publishers, could help. The bill is hardly perfect. Among other things, it would direct funds to corporate chains as well as to independent operators, thus rewarding owners who are hollowing out their papers. Nevertheless, the idea may well be worth trying. At year’s end, the legislation was in limbo, but it may be revived in early 2022.

Democracy in crisis, Sept. 29. As summer turned to fall, the media began devoting some serious attention to a truly frightening development: the deterioration of the Republican Party into an authoritarian tool of Trump and Trumpism, ready to hand the presidency back to its leader in 2024 through a combination of antidemocratic tactics. These include the disenfranchisement of Black voters through partisan gerrymandering, the passage of new laws aimed at suppressing the vote and the handing of state electoral authority over to Trump loyalists. With polls showing that a majority of Republicans believe the 2020 election was stolen, it’s only going to get worse in the months ahead.

Exposing Facebook’s depravity, Oct. 27. The social media giant’s role in subverting democracy in the United States and fomenting chaos and violence around the world is by now well understood, so it takes a lot to rise to the level of OMG news. Frances Haugen, though, created a sensation. The former Facebook executive leaked thousands of documents to the Securities and Exchange Commission and spoke out — at first anonymously, in The Wall Street Journal, and later on “60 Minutes” and before a congressional committee. Among other things, the documents showed that Facebook’s leaders were well aware of how much damage the service’s algorithmic amplification of conspiracy theories and hate speech was causing. By year’s end, lawyers for Rohingya refugees from Myanmar were using the documents to sue Facebook for $150 billion, claiming that Mark Zuckerberg and company had whipped up a campaign of rape and murder.

COVID-19 and the new normal, Nov. 17. By late fall, the optimism of June and July had long since given way to the reality of delta. I wrote about my own experience of trying to live as normally as possible — volunteering at Northeastern University’s long-delayed 2020 commencement and taking the train for a reporting trip in New Haven. Now, of course, we are in the midst of omicron. The new variant may prove disastrous, or it may end up being mild enough that it’s just another blip on our seemingly endless pandemic journey. In any case, omicron was a reminder — as if we needed one — that boosters, masking and testing are not going away any time soon.

How journalism is failing us, Dec. 7. Washington Post columnist Dana Milbank created a sensation when he reported the results of a content analysis he had commissioned. The numbers showed that coverage of President Joe Biden from August to November 2021 was just as negative as, if not more negative than, coverage of then-President Trump had been during the same four-month period a year earlier. Though some criticized the study’s methodology, it spoke to a very real problem: Too many elements of the media are continuing to cover Trump and the Republicans as legitimate political actors rather than as what they’ve become: malign forces attempting to subvert democracy. The challenge is to find ways to hold Biden to account while avoiding mindless “both sides” coverage and false equivalence.

A year ago at this time we may have felt a sense of optimism that proved to be at least partly unrealistic. Next year, we’ll have no excuses — we know that COVID-19, the economy and Trumpism will continue to present enormous challenges. I hope that, at the end of 2022, we can all say that we met those challenges successfully.

Finally, my thanks to GBH News for the privilege of having this platform and to you for reading. Best wishes to everyone for a great 2022.

A $150 billion lawsuit over genocide may force Facebook to confront its dark side

Displaced Rohingya Muslims. Photo (cc) 2017 by Tasnim News Agency.

Previously published at GBH News.

How much of a financial hit would it take to force Mark Zuckerberg to sit up and pay attention?

We can be reasonably sure he didn’t lose any sleep when British authorities fined Facebook a paltry $70 million earlier this fall for withholding information about its acquisition of Giphy, an app for creating and hosting animated graphics. Maybe he stirred a bit in July 2019, when the Federal Trade Commission whacked the company with a $5 billion penalty for violating its users’ privacy — a punishment described by the FTC as “the largest ever imposed” in such a case. But then he probably rolled over and caught a few more z’s.

OK, how about $150 billion? Would that do it?

We may be about to find out. Because that’s the price tag lawyers for Rohingya refugees placed on a class-action lawsuit they filed in California last week against Facebook — excuse me, make that Meta Platforms. As reported by Kelvin Chan of The Associated Press, the suit claims that Facebook’s actions in Myanmar stirred up violence in a way that “amounted to a substantial cause, and eventual perpetuation of, the Rohingya genocide.”

Even by Zuckerberg’s standards, $150 billion is a lot of money. Facebook’s revenues in 2020 were just a shade under $86 billion. And though the price tags lawyers affix to lawsuits should always be taken with several large shakers of salt, the case over genocide in Myanmar could be just the first step in holding Facebook to account for the way its algorithms amplify hate speech and disinformation.

The lawsuit is also one of the first tangible consequences of internal documents provided earlier this fall by Frances Haugen, a former Facebook employee turned whistleblower who went public with information showing that company executives knew its algorithms were wreaking worldwide havoc and did little or nothing about it. In addition to providing some 10,000 documents to the U.S. Securities and Exchange Commission, Haugen told her story anonymously to The Wall Street Journal, and later went public by appearing on “60 Minutes” and testifying before Congress.

The lawsuit is a multi-country effort, as Mathew Ingram reports for the Columbia Journalism Review, and the refugees’ lawyers are attempting to apply Myanmar’s laws in order to get around the United States’ First Amendment, which — with few exceptions — protects even the most loathsome speech.

But given that U.S. law may prevail, the lawyers have also taken the step of claiming that Facebook is a “defective” product. According to Tim De Chant, writing at Ars Technica, that claim appears to be targeted at Section 230, which would normally protect Facebook from legal liability for any content posted by third parties.

Facebook’s algorithms are programmed to show you more and more of the content you engage with, which leads to the amplification of the sort of violent posts that helped drive the genocide against the Rohingya. A legal argument built on the algorithm-driven spread of that content, rather than on the content itself, would presumably find more favor in the U.S. court system.

“While the Rohingya have long been the victims of discrimination and persecution, the scope and violent nature of that persecution changed dramatically in the last decade, turning from human rights abuses and sporadic violence into terrorism and mass genocide,” the lawsuit says. “A key inflection point for that change was the introduction of Facebook into Burma in 2011, which materially contributed to the development and widespread dissemination of anti-Rohingya hate speech, misinformation, and incitement of violence — which together amounted to a substantial cause, and perpetuation of, the eventual Rohingya genocide.”

Facebook has previously admitted that its response to the violence in Myanmar was inadequate. “We weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” the company said in 2018.

The lawsuit at least theoretically represents an existential threat to Facebook, and no doubt the company will fight back hard. Still, its initial response emphasized its regrets and steps it has taken over the past several years to lessen the damage. A Meta spokesperson recently issued this statement to multiple news organizations: “We’re appalled by the crimes committed against the Rohingya people in Myanmar. We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw [the Burmese armed forces], disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content. This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.”

No doubt Zuckerberg and company didn’t knowingly set out to contribute to a human-rights disaster that led to a rampage of rape and murder, with nearly 7,000 Rohingyas killed and 750,000 forced out of the country. Yet this tragedy was the inevitable consequence of the way Facebook works, and of its top executives’ obsession with growth over safety.

As University of Virginia media studies professor and author Siva Vaidhyanathan has put it: “The problem with Facebook is Facebook.”

Maybe the prospect of being forced to pay for the damage they have done will, at long last, force Zuckerberg, Sheryl Sandberg and the rest to do something about it.

A tidal wave of documents exposes the depths of Facebook’s depravity

Photo (cc) 2008 by Craig ONeal

Previously published at GBH News.

How bad is it for Facebook right now? The company is reportedly planning to change its name, possibly as soon as this week — thus entering the corporate equivalent of the Witness Protection Program.

Surely, though, Mark Zuckerberg can’t really think anyone is going to be fooled. As the tech publisher Scott Turman told Quartz, “If the general public has a negative and visceral reaction to a brand then it may be time to change the subject. Rebranding is one way to do that, but a fresh coat of lipstick on a pig will not fundamentally change the facts about a pig.”

And the facts are devastating, starting with “The Facebook Files” in The Wall Street Journal at the beginning of the month; accelerating as the Journal’s once-anonymous source, former Facebook executive Frances Haugen, went public, testified before Congress and was interviewed on “60 Minutes”; and then exploding over the weekend as a consortium of news organizations began publishing highlights from a trove of documents Haugen gave the Securities and Exchange Commission.

No one can possibly keep up with everything we’ve learned about Facebook — and, let’s face it, not all that much of it is new except for the revelations that Facebook executives were well aware of what their critics have been saying for years. How did they know? Their own employees told them, and begged them to do something about it to no avail.

If it’s possible to summarize, the meta-critique is that, no matter what the issue, Facebook’s algorithms boost content that enrages, polarizes and even depresses its users — and that Zuckerberg and company simply won’t take the steps that are needed to lower the volume, since that might result in lower profits as well. This is the case across the board, from self-esteem among teenage girls to the Jan. 6 insurrection, from COVID disinformation to factional violence in other countries.

In contrast to past crises, when Facebook executives would issue fulsome apologies and then keep right on doing what they were doing, the company has taken a pugnacious tone this time around, accusing the media of bad faith and claiming it has zillions of documents that contradict the damning evidence in the files Haugen has provided. For my money, though, the quote that will live in infamy is one that doesn’t quite fit the context — it was allegedly spoken by Facebook communications official Tucker Bounds in 2017, and it wasn’t for public consumption. Nevertheless, it is perfect:

“It will be a flash in the pan,” Bounds reportedly said. “Some legislators will get pissy. And then in a few weeks they will move onto something else. Meanwhile we are printing money in the basement, and we are fine.”

Is Facebook still fine? Probably not. At the moment, at least, it is difficult to imagine that Facebook won’t be forced to undergo some fundamental changes, either through public pressure or by force of law. A number of news organizations have published overviews to help you make sense of the new documents. One of the better ones was written by Adrienne LaFrance, the executive editor of The Atlantic, who was especially appalled by new evidence of Facebook’s own employees pleading with their superiors to stop amplifying the extremism that led to Jan. 6.

“The documents are astonishing for two reasons: First, because their sheer volume is unbelievable,” she said. “And second, because these documents leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.”

LaFrance offers some possible solutions, most of which revolve around changing the algorithm to optimize safety over growth — that is, not censoring speech, but taking steps to stop the worst of it from going viral. Keep in mind that one of the key findings from the past week involved a test account set up for a fictional conservative mother in North Carolina. Within days, her news feed was loaded with disinformation, including QAnon conspiracy theories, served up because the algorithm had figured out that such content would keep her engaged. As usual, Facebook’s own researchers sounded the alarm while those in charge did nothing.

In assessing what we’ve learned about Facebook, it’s important to differentiate between pure free-speech issues and those that involve amplifying bad speech for profit. Of course, as a private company, Facebook needn’t worry about the First Amendment — it can remove anything it likes for any reason it chooses.

But since Facebook is the closest thing we have to a public square these days, I’m uncomfortable with calls that certain types of harmful content be banned or removed. I’d rather focus on the algorithm. If someone posts, say, vaccine disinformation on the broader internet, people will see it (or not) solely on the basis of whether they visit the website or discussion board where it resides.

That doesn’t trouble me any more than I’m bothered by people handing out pamphlets about the coming apocalypse outside the subway station. Within reason, Facebook ought to be able to do the same. What it shouldn’t be able to do is make it easy for you to like and share such disinformation and keep you engaged by showing you more — and more extreme — versions of it.

And that’s where we might be able to do something useful about Facebook rather than just wring our hands. Reforming Section 230, which provides Facebook and other internet publishers with legal immunity for any content posted by their users, would be a good place to start. If 230 protections were removed for services that use algorithms to boost harmful content, then Facebook would change its practices overnight.

Meanwhile, we wait with bated breath for word on what the new name for Facebook will be. Friendster? Zucky McZuckface? The Social Network That Must Not Be Named?

Zuckerberg has created a two-headed beast. For most of us, Facebook is a fun, safe environment to share news and photos of our family and friends. For a few, it’s a dangerous place that leads them down dark passages from which they may never return.

In that sense, Facebook is like life itself, and it won’t ever be completely safe. But for years now, the public, elected officials and even Facebook’s own employees have called for changes that would make the platform less of a menace to its users as well as to the culture as a whole.

Zuckerberg has shown no inclination to change. It’s long past time to force his hand.