Everything you know is wrong (Facebook edition)

Like many observers, I have often cited Facebook, along with Fox News, as one of the most dangerous forces promoting disinformation and polarization. Its algorithms feed you what keeps you engaged, and what keeps you engaged is what makes you angry and upset.

But what if most Facebook users don’t even see news? Nieman Lab editor Laura Hazard Owen conducted a real-world experiment. And what she found ought to give us pause:

Even using a very generous definition of news (“Guy rollerblades with 75-pound dog on his back”), the majority of people in our survey (54%) saw no news within the first 10 posts in their feeds at all.

Moreover, the top three most frequently seen news sources weren’t the likes of Newsmax, Breitbart and Infowars — they were CNN, The New York Times and NBC News, which epitomize the mainstream.

I asked Owen to clarify whether her definition of news popping up in people’s feeds was restricted to content that came directly from news organizations or whether it included news stories shared by friends. “It was ALL news,” she replied, “whether shared by a news organization or a friend.”

Is it possible that we all misunderstand the effect that Facebook is having (or not having) on our democracy?

Comments are open. Please include your full name, first and last, and speak with a civil tongue.

3 thoughts on “Everything you know is wrong (Facebook edition)”

  1. aaronread1

    In her own words: “My sample size was small and shouldn’t be used to draw sweeping conclusions about Facebook news consumers overall.”

    I have to do more of a deep dive into her methodology, but that right there is what jumped out at me: she had a sample size of a mere 173 participants and, worse, they were all self-selected. I think her article is interesting, but at most it’s a call for more research. A call that, unfortunately, is presented in a way that a lot of people could very easily cherry-pick as “oh, I guess Facebook is harmless after all.”

    If there’s one thing we’ve learned, it’s that Facebook is amazingly good at manipulating data.

  2. Deborah Nam-Krane

    Interesting! The two takeaways I have from that piece: 1) conservative or very conservative people see more news than others, and 2) the more news you read away from Facebook, the more you see on Facebook. For me, the issue with Facebook in general, but with news especially, is the “filter bubble” effect: the algorithms follow you around the internet as well as the site, get a sense of what you respond to, then keep giving you more of the same thing. I’m glad to hear that the company is probably not spreading disinformation to everyone, but it’s still pretty damaging to entrench people’s opinions rather than challenge them.

  3. Laurel Strand

    Until a month or two ago, my Facebook feed was set to give priority to posts from media outlets such as the New York Times, the Washington Post, the Atlantic, and two local outlets, the Seattle Times and Crosscut. Then one day my feed suddenly and inexplicably stopped featuring those posts. Instead, I now see posts by friends, a large portion of which are shared memes and music videos, and a *lot* of sponsored ads. When I try to reset my News Feed Preferences, this error message pops up: “Sorry, something went wrong. We are working on it and we’ll get it fixed as soon as we can.”

Comments are closed.