Photo (cc) 2011 by thierry ehrmann

Previously published at GBH News.

For researchers, Facebook is something of a black box. It’s hard to know what its 2.8 billion active users across the globe are seeing at any given time because the social media giant keeps most of its data to itself. If some users are seeing ads aimed at “Jew haters,” or Russian-generated memes comparing Hillary Clinton to Satan, well, so be it. Mark Zuckerberg has his strategy down cold: apologize when exposed, then move on to the next appalling scheme.

Some data scientists, though, have managed to pierce the darkness. Among them are Laura Edelson and Damon McCoy of New York University’s Center for Cybersecurity. With a tool called Ad Observer, which volunteers add to their browsers, they were able to track ads that Facebook users were being exposed to and draw some conclusions. For instance, they learned that users are more likely to engage with extreme falsehoods than with truthful material, and that more than 100,000 political ads are missing from an archive Facebook set up for researchers.
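For readers curious about what a tool like Ad Observer actually does, the sketch below shows the general shape of a data-donation browser extension: a content script watches the page for posts labeled “Sponsored” and forwards basic metadata to a server the volunteer has opted into. To be clear, this is a hypothetical illustration, not NYU’s code; the endpoint, the DOM selector, and the record fields are all placeholders.

```ts
// Hypothetical content-script sketch of a data-donation extension.
// The endpoint, selector, and fields are placeholders, not Ad Observer's.

const RESEARCH_ENDPOINT = "https://research.example.org/ads"; // placeholder URL

function collectSponsoredPosts(): void {
  // Facebook labels paid posts "Sponsored"; this selector is a guess
  // and would need to track the site's actual markup.
  document.querySelectorAll<HTMLElement>("div[role='article']").forEach((post) => {
    const text = post.textContent ?? "";
    if (!text.includes("Sponsored")) return;

    const record = {
      adText: text.slice(0, 500),       // truncated ad copy
      seenAt: new Date().toISOString(), // when the volunteer saw it
      pageUrl: window.location.href,
    };

    // Fire-and-forget upload. A real tool would batch requests,
    // deduplicate ads, and strip anything that identifies the volunteer.
    void fetch(RESEARCH_ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(record),
    });
  });
}

// New posts load as the user scrolls, so re-scan on DOM changes.
new MutationObserver(collectSponsoredPosts).observe(document.body, {
  childList: true,
  subtree: true,
});
collectSponsoredPosts();
```

The key design point, and the one Princeton’s Jonathan Mayer makes later in this piece, is that everything a script like this collects is information the volunteer can already see in their own browser.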

As you would expect, Facebook executives took these findings seriously. So what did they do? Did they change the algorithm to make it more likely that users would see reliable information in their news feed? Did they restore the missing ads and take steps to make sure such omissions wouldn’t happen again?

They did not. Instead, they shut down Edelson’s and McCoy’s accounts, making it harder for them to dig up such embarrassing facts in the future.

“There is still a lot of important research we want to do,” they wrote in a recent New York Times op-ed. “When Facebook shut down our accounts, we had just begun studies intended to determine whether the platform is contributing to vaccine hesitancy and sowing distrust in elections. We were also trying to figure out what role the platform may have played leading up to the Capitol assault on Jan. 6.”

In other words, they want to find out how responsible Zuckerberg, Sheryl Sandberg and the rest are for spreading a deadly illness and encouraging an armed insurrection. No wonder Facebook looked at what the researchers were doing and told them, gee, you know, we’d love to help, but you’re violating our privacy rules.

But the privacy justification doesn’t hold up. Writing at the Columbia Journalism Review, Mathew Ingram points out that the privacy rules Facebook agreed to following the Cambridge Analytica scandal apply to Facebook itself, not to users who volunteer information to researchers.

Ingram quotes Princeton professor Jonathan Mayer, an adviser to Vice President Kamala Harris when she was a senator, who tweeted: “Facebook’s legal argument is bogus. The order restricts how *Facebook* shares user information. It doesn’t preclude *users* from volunteering information about their experiences on the platform, including through a browser extension.”

As Ingram describes it, and as Edelson and McCoy themselves acknowledge, Facebook’s actions didn’t stop their work altogether, but they have slowed it down and made it more difficult. Needless to say, the company should be doing everything it can to help with such research. Then again, Zuckerberg has never shown much regard for such mundane matters as public health and the future of democracy, especially when there’s money to be made.

By contrast, Facebook’s social media competitor Twitter has been much more open about making its data available to researchers. My Northeastern colleague John Wihbey, who co-authored an important study several years ago about how journalists use Twitter, says the difference explains why more studies have been published about Twitter than about Facebook. “This is unfortunate,” he says, “as it is a smaller network and less representative of the general public.”

It’s like the old saw about looking for your car keys under a street light because that’s where the light is. Trouble is, with fewer than 400 million active users, Twitter is little more than a rounding error in Facebook’s universe.

Earlier this year, MIT Technology Review published a remarkable story documenting how Facebook shied away from cracking down on extremist content, focusing instead on placating Donald Trump and other figures on the political right before the 2020 election. Small wonder, then, that the NYU researchers represent an especially potent threat to the Zuckerborg: they plan to focus on the role Facebook played in amplifying the disinformation that led to the insurrection, whose aftermath continues to befoul our body politic.

When the history of this ugly era is written, the two media giants that will stand out for their malignity are Fox News, for knowingly poisoning tens of millions of people with toxic falsehoods, and Facebook, for allowing its platform to be used to amplify those falsehoods. Eventually the truth will be told, no matter what steps Zuckerberg takes to slow it down. There should be hell to pay.