By Dan Kennedy • The press, politics, technology, culture and other passions

Tag: Facebook

The Buffalo horror raises thorny issues about hate speech and the media

Image via Today’s Front Pages at FreedomForum.org.

Correction: An earlier version of this post identified 4chan’s hosting service. In fact, it was a porn site that uses the name 4chan but is otherwise unrelated.

Our thoughts at this time need to be with the Black community of Buffalo — and everywhere — as we process the horror of one of the worst mass murders of recent years. We need to do something substantive about guns, racism and white supremacy. What actually happened, and what we can do to prevent such horrific events from happening again, must be at the top of our agenda.

This blog, though, is primarily about the media and often about free speech. So let me address some of the secondary issues. The shootings intersect with notions of hate speech, social media and the role of Fox News in mainstreaming dangerous racist ideologies such as so-called replacement theory, which holds that the left is trying to push out white people in favor of non-white immigrants in order to obtain an electoral advantage.

First, keep in mind that hate speech is legal. The New York Times today says this about New York Gov. Kathy Hochul:

When pressed on how she planned to confront such hate speech online, without impinging on First Amendment rights, Ms. Hochul noted that “hate speech is not protected” and said she would soon be calling meetings with social media companies.

Hochul is wrong, and the Times shouldn’t have used “noted,” which implies that she knows what she’s talking about. If hate speech were illegal, Tucker Carlson would have been kicked off Fox long ago.

What’s illegal is incitement to violence, and you might think whipping up racist hatred would qualify. In fact, it does not — and the very Supreme Court case that made that clear was about a speaker at a rally who whipped up racist hatred. Brandenburg v. Ohio (1969) held that a ranting Ku Klux Klan thug demanding “revengeance” against Jews and Black people had not engaged in incitement because his threats were non-specific.

Hochul can cajole and threaten. And she should. But it’s going to be difficult to do much more than that.

As for the media themselves, that’s a morass, and it’s too early to start sorting this out. But the shooter reportedly fell down the 4chan hole during the pandemic, immersing himself in the racism and hate that permeate the dark corners of the internet. There are a lot of moving parts here, but it seems unlikely that a young mass murderer-in-the-making was sitting around watching Fox, even if some of his rants paralleled Carlson’s rhetoric. Fox’s role is to mainstream such hatred for its frightened, elderly viewers. The radicalization itself happens elsewhere.

So, are we going to ban 4chan? How would that even work? If the government tried to shut the site down, it could simply re-emerge somewhere else. I’m sure Vladimir Putin would be happy to play host.

4chan represents the bottom of this toxic food chain; Fox News is at the top. In the middle are the mainstream social media platforms — Facebook, Twitter, Twitch (which allowed the shooter to livestream his rampage for nearly two minutes before taking it down) and the like. It’s too early to say what, if anything, will happen on that front. But it’s probably not a good time to be a billionaire who wants to buy Twitter so that there will be less moderation on the platform than there is currently.

As it turns out, that billionaire, Elon Musk, may be backing away.

Why dark money in the Sarah Palin libel case could distort justice

Peter Thiel. Photo (cc) 2012 by Hubert Burda Media.

Jack Shafer asks an important question: Who is funding Sarah Palin’s legal battle against The New York Times? As Shafer observes in his new Politico Magazine piece, Palin’s legal team overlaps with the lawyers who represented Hulk Hogan in his lawsuit against Gawker. That effort turned out to be funded by Facebook billionaire Peter Thiel, who was aggrieved at having been outed by a Gawker-owned website. Shafer writes:

Nobody can criticize Palin for passing the hat to finance her case — if that’s what she did. Lawsuits are expensive and crowdfunding them without naming the funders is a time-honored practice — civil liberties groups do it routinely — and the practice is especially praiseworthy when the litigation is of the “impact” variety, designed to change the law and protect rights. But as the Gawker case demonstrated, such lawsuits can also be seen as punitive exercises, financed by a third party as payback.

The problem is that when lawsuits are funded by vast sums of dark money, they can have a distorting effect. Hogan’s invasion-of-privacy suit, filed after Gawker published, without his permission, a video of him having sex, was certainly worth pursuing. But in the ordinary course of such matters, it would have been settled and life would have gone on. Instead, Hogan’s lawyers used secret Thiel money to push the suit all the way to its conclusion, with Gawker ultimately going bankrupt and shutting down. (The site has since been relaunched under new ownership.)

Unlike Hogan’s case, Palin’s libel suit against the Times is entirely lacking in merit. The Times published an editorial falsely tying Palin’s rhetoric to the 2011 shooting of then-congresswoman Gabby Giffords and the killings of six others. But there was zero evidence that the Times acted with “actual malice” (knowing falsehood or reckless disregard for the truth), which is the standard for public officials and public figures.

Palin’s suit shouldn’t have gotten as far as it did, and the devastating defeat she suffered this week ought to put an end to it. But if she’s backed by an endless stream of screw-you money, she can keep pushing, and perhaps get her case eventually heard by the U.S. Supreme Court — where Justices Clarence Thomas and Neil Gorsuch have indicated they’re prepared to overturn or pare back the libel standards that have protected the press since the landmark 1964 Times v. Sullivan decision.

Help local news? Sure. Force Google and Facebook to pay? Probably not.

Sen. Amy Klobuchar meets a fan in Iowa. Photo (cc) 2019 by Gage Skidmore.

For years now, news executives have been complaining bitterly that Google and Facebook repurpose their journalism without paying for it. Now it looks like they might have an opportunity to do something about it.

Earlier this week a Senate subcommittee chaired by Sen. Amy Klobuchar, D-Minn., heard testimony about the Journalism Competition and Preservation Act (JCPA), sponsored by her and Sen. John Kennedy, R-La. The bill would allow representatives of the news business to bargain collectively with Google and Facebook over a compensation package without running afoul of antitrust laws. If negotiations fell short, an arbitrator would impose a settlement.

“These big tech companies are not friends to journalism,” said Klobuchar, according to an account of the hearing by Gretchen Peck of the trade magazine Editor & Publisher. “They are raking in ad dollars while taking news content, feeding it to their users, and refusing to offer fair compensation.”

There’s no question that the local news ecosystem has fallen apart, and that technology has a lot to do with it. (So do the pernicious effects of corporate and hedge-fund ownership, which has imposed cost-cutting that goes far beyond what’s necessary to run a sustainable business.) But is the JCPA the best way to go about it?

The tech giants themselves have been claiming for years that they provide value to news organizations by sending traffic their way. True, except that the revenues brought in by digital advertising have plummeted over the past two decades. A lawsuit brought by newspaper publishers argues that the reason is Google’s illegal monopoly over digital advertising, cemented by a secret deal with Facebook not to compete.

Though Google and Facebook deny any wrongdoing, the lawsuit strikes me as a more promising strategy than the JCPA, which raises some serious questions about who would benefit. A similar law in Australia has mainly served to further enrich Rupert Murdoch.

Writing at Nieman Lab, Joshua Benton argues, among other things, that simply taxing the technology companies and using the money to subsidize local news would be a better solution. Benton cites one provision of the Build Back Better legislation — a payroll tax credit for hiring and retaining journalists.

In fact, though, the payroll provision is just one of three tax credits included in the Local Journalism Sustainability Act; the others would reward subscribers and advertisers. I have some reservations about using tax credits in a way that would indiscriminately reward hedge-fund owners along with independent operators. But I do think it’s worth a try.

Even though local news needs a lot of help, probably in the form of some public assistance, it strikes me that the Klobuchar-Kennedy proposal is the least attractive of the options now on the table.

Northeastern’s Myojung Chung and John Wihbey on attitudes about regulating social media

Myojung Chung

In the latest “What Works” podcast, Professors Myojung Chung and John Wihbey, colleagues from Northeastern University’s School of Journalism, share the findings from their new working paper, published by Northeastern’s Ethics Institute.

They and their colleagues examined attitudes about the regulation of social media in four countries: the U.K., Mexico, South Korea and the U.S. With Facebook (or Meta) under fire for its role in amplifying disinformation and hate speech, their research has implications for how the platforms might be regulated — and whether such regulations would be accepted by the public.

John Wihbey

In Quick Takes, Ellen Clegg and I kick around WBEZ Radio’s acquisition of the Chicago Sun-Times, which will result in the newspaper’s becoming a nonprofit organization. We also discuss an announcement that a new nonprofit news organization will be launched in Houston with $20 million in seed money. Plus a tiny Easter egg from country artist Roy Edwin Williams.

You can listen to our conversation here and subscribe through your favorite podcast app.

Antitrust suit brought by states claims Google and Facebook had a secret deal

Photo (cc) by Fir0002/Flagstaffotos

There’s been a significant new development in the antitrust cases being brought against Google and Facebook.

On Friday, Richard Nieva reported in BuzzFeed News that a lawsuit filed in December 2020 by Texas and several other states claims that Google CEO Sundar Pichai and Facebook CEO Mark Zuckerberg “personally signed off on a secret advertising deal that allegedly gave Facebook special privileges on Google’s ad platform.” That information was recently unredacted.

Nieva writes:

The revelation comes as both Google and Facebook face a crackdown from state and federal officials over antitrust concerns for their business practices. Earlier this week, a judge rejected Facebook’s motion to dismiss a lawsuit by the Federal Trade Commission that accuses the social network of using anticompetitive tactics.

The action being led by Texas is separate from an antitrust suit brought against Google and Facebook by more than 200 newspapers around the country. The suit essentially claims that Google has monopolized the digital ad marketplace in violation of antitrust law and has cut Facebook in on the deal in order to stave off competition. Writing in Business Insider, Martin Coulter puts it this way:

Most of the allegations in the suit hinge on Google’s fear of “header bidding,” an alternative to its own ad auctioning practices described as an “existential threat” to the company.

As I’ve written previously, the antitrust actions are potentially more interesting than the usual complaint made by newspapers — that Google and Facebook have repurposed their journalism and should pay for it. That’s never struck me as an especially strong legal argument, although such payments are starting to happen in Australia and Western Europe.

The antitrust claims, on the other hand, are pretty straightforward. You can’t control all aspects of a market, and you can’t give special treatment to a would-be competitor. Google and Facebook, of course, have denied any wrongdoing, and that needs to be taken seriously. But keep an eye on this. It could shake the relationship between the platforms and the publishers to the very core.

From COVID to our crisis of democracy, 2021 turned out to be a scant improvement over 2020

Photo (cc) 2021 by Blink O’fanaye

Previously published at GBH News.

Hopes were running high when we all turned the calendar to 2021. Would the worst 12 months in anyone’s memory give way to the best year of our lives?

Not quite. Yes, it was better than 2020, but 2021 was hardly a return to paradise. The joy of vaccinations gave way to the reality that COVID-19 is likely to be with us for a long time. The economy recovered rapidly — accompanied by the highest rate of inflation in 40 years. Worst of all, the end of the Trump presidency morphed into a crisis of democracy that is starting to look as ominous as the run-up to the Civil War.

During the past year, I’ve been struggling to make sense of the highs, the lows and the in-betweens through the prism of the media. Below are 10 of my GBH News columns from 2021. They’re in chronological order, with updates on many of the pieces posted earlier this year. If there’s a unifying theme, it’s that we’re in real trouble — but that, together, we can get through this.

The end of the Trump bump, Jan. 27. Even as he was denouncing journalists as “enemies of the people,” Donald Trump, both before and during his presidency, was very, very good for the media. Cable TV ratings soared. The New York Times and The Washington Post signed up subscribers by the bucketload. Several weeks after Trump departed from the White House, though, there were questions about what would happen once he was gone. We soon got an answer. Even though Trump never really left, news consumption shrank considerably. That may be good for our mental health. But for media executives trying to make next quarter’s numbers, it was an unpleasant new reality.

Local news in crisis, Feb. 23. The plague of hedge funds undermining community journalism continued unabated in 2021. The worst newspaper owner of them all, Alden Global Capital, acquired Tribune Publishing and its eight major-market papers, which include the Chicago Tribune, New York’s Daily News and, closer to home, the Hartford Courant. When the bid was first announced, there was at least some hope that one of those papers, The Baltimore Sun, would be spun off. Unfortunately, an epic battle between Alden and Baltimore hotel mogul Stewart Bainum resulted in Alden grabbing all of them. Bainum, meanwhile, is planning to launch a nonprofit website to compete with the Sun that will be called The Baltimore Banner.

The devolution of Tucker Carlson, April 15. How did a stylish magazine writer with a libertarian bent reinvent himself as a white-supremacist Fox News personality in thrall to Trump and catering to dangerous conspiracy theories ranging from vaccines (bad) to the Jan. 6 insurrection (good)? There are millions of possible explanations, and every one of them has a picture of George Washington on it. Carlson got in trouble last spring — or would have gotten in trouble if anyone at Fox cared — when he endorsed “replacement theory,” a toxic trope that liberal elites are deliberately encouraging immigration in order to dilute the power of white voters. A multitude of advertisers have bailed on Carlson, but it doesn’t matter — Fox today makes most of its money from cable fees. And Carlson continues to spew his hate.

How Black Lives Matter exposed journalism, May 26. A teenager named Darnella Frazier exposed an important truth about how reporters cover the police. The video she recorded of Minneapolis police officer Derek Chauvin literally squeezing the life out of George Floyd as he lay on the pavement proved that the police lied in their official report of what led to Floyd’s death. For generations, journalists have relied on law enforcement as their principal — and often only — source for news involving the police. That’s no longer good enough; in fact, it was never good enough. Frazier won a Pulitzer Prize for her courageous truth-telling. And journalists everywhere were confronted with the reality that they need to change the way they do their jobs.

The 24th annual New England Muzzle Awards, July 1. For 24 years, the Muzzle Awards have singled out enemies of free speech. The Fourth of July feature made its debut in The Boston Phoenix in 1998 and has been hosted by GBH News since 2013, the year that the Phoenix shut down. This year’s lead item was about police brutality directed at Black Lives Matter protesters in Boston and Worcester the year before — actions that had escaped scrutiny at the time but that were exposed by bodycam video obtained by The Appeal, a nonprofit news organization. Other winners of this dubious distinction included former Boston Mayor Marty Walsh, retired Harvard Law School professor Alan Dershowitz and the aforementioned Tucker Carlson, who unleashed his mob to terrorize two freelance journalists in Maine.

How to help save local news, July 28. Since 2004, some 2,100 newspapers have closed, leaving around 1,800 communities across the country bereft of coverage. It’s a disaster for democracy, and the situation is only growing worse. The Local Journalism Sustainability Act, a bipartisan proposal to provide indirect government assistance in the form of tax credits for subscribers, advertisers and publishers, could help. The bill is hardly perfect. Among other things, it would direct funds to corporate chains as well as to independent operators, thus rewarding owners who are hollowing out their papers. Nevertheless, the idea may well be worth trying. At year’s end, the legislation was in limbo, but it may be revived in early 2022.

Democracy in crisis, Sept. 29. As summer turned to fall, the media began devoting some serious attention to a truly frightening development: the deterioration of the Republican Party into an authoritarian tool of Trump and Trumpism, ready to hand the presidency back to their leader in 2024 through a combination of antidemocratic tactics. These include the disenfranchisement of Black voters through partisan gerrymandering, the passage of new laws aimed at suppressing the vote and the handing of state electoral authority over to Trump loyalists. With polls showing that a majority of Republicans believe the 2020 election was stolen, it’s only going to get worse in the months ahead.

Exposing Facebook’s depravity, Oct. 27. The social media giant’s role in subverting democracy in the United States and fomenting chaos and violence around the world is by now well understood, so it takes a lot to rise to the level of OMG news. Frances Haugen, though, created a sensation. The former Facebook executive leaked thousands of documents to the Securities and Exchange Commission and spoke out — at first anonymously, in The Wall Street Journal, and later on “60 Minutes” and before a congressional committee. Among other things, the documents showed that Facebook’s leaders were well aware of how much damage the service’s algorithmic amplification of conspiracy theories and hate speech was causing. By year’s end, lawyers for Rohingya refugees from Myanmar were using the documents to sue Facebook for $150 billion, claiming that Mark Zuckerberg and company had whipped up a campaign of rape and murder.

COVID-19 and the new normal, Nov. 17. By late fall, the optimism of June and July had long since given way to the reality of delta. I wrote about my own experience of trying to live as normally as possible — volunteering at Northeastern University’s long-delayed 2020 commencement and taking the train for a reporting trip in New Haven. Now, of course, we are in the midst of omicron. The new variant may prove disastrous, or it may end up being mild enough that it’s just another blip on our seemingly endless pandemic journey. In any case, omicron was a reminder — as if we needed one — that boosters, masking and testing are not going away any time soon.

How journalism is failing us, Dec. 7. Washington Post columnist Dana Milbank created a sensation when he reported the results of a content analysis he had commissioned. The numbers showed that coverage of President Joe Biden from August to November 2021 was as negative as, if not more negative than, coverage of then-President Trump had been during the same four-month period a year earlier. Though some criticized the study’s methodology, it spoke to a very real problem: Too many elements of the media are continuing to cover Trump and the Republicans as legitimate political actors rather than as what they’ve become: malign forces attempting to subvert democracy. The challenge is to find ways to hold Biden to account while avoiding mindless “both sides” coverage and false equivalence.

A year ago at this time we may have felt a sense of optimism that proved to be at least partly unrealistic. Next year, we’ll have no excuses — we know that COVID-19, the economy and Trumpism will continue to present enormous challenges. I hope that, at the end of 2022, we can all say that we met those challenges successfully.

Finally, my thanks to GBH News for the privilege of having this platform and to you for reading. Best wishes to everyone for a great 2022.

A $150 billion lawsuit over genocide may force Facebook to confront its dark side

Displaced Rohingya Muslims. Photo (cc) 2017 by Tasnim News Agency.

Previously published at GBH News.

How much of a financial hit would it take to force Mark Zuckerberg to sit up and pay attention?

We can be reasonably sure he didn’t lose any sleep when British authorities fined Facebook a paltry $70 million earlier this fall for withholding information about its acquisition of Giphy, an app for creating and hosting animated graphics. Maybe he stirred a bit in July 2019, when the Federal Trade Commission whacked the company with a $5 billion penalty for violating its users’ privacy — a punishment described by the FTC as “the largest ever imposed” in such a case. But then he probably rolled over and caught a few more z’s.

OK, how about $150 billion? Would that do it?

We may be about to find out. Because that’s the price tag lawyers for Rohingya refugees placed on a class-action lawsuit they filed in California last week against Facebook — excuse me, make that Meta Platforms. As reported by Kelvin Chan of The Associated Press, the suit claims that Facebook’s actions in Myanmar stirred up violence in a way that “amounted to a substantial cause, and eventual perpetuation of, the Rohingya genocide.”

Even by Zuckerberg’s standards, $150 billion is a lot of money. Facebook’s revenues in 2020 were just a shade under $86 billion. And though the price tags lawyers affix to lawsuits should always be taken with several large shakers of salt, the case over genocide in Myanmar could be just the first step in holding Facebook to account for the way its algorithms amplify hate speech and disinformation.

The lawsuit is also one of the first tangible consequences of internal documents provided earlier this fall by Frances Haugen, a former Facebook employee turned whistleblower who went public with information showing that company executives knew its algorithms were wreaking worldwide havoc and did little or nothing about it. In addition to providing some 10,000 documents to the U.S. Securities and Exchange Commission, Haugen told her story anonymously to The Wall Street Journal, and later went public by appearing on “60 Minutes” and testifying before Congress.

The lawsuit is a multi-country effort, as Mathew Ingram reports for the Columbia Journalism Review, and the refugees’ lawyers are attempting to apply Myanmar’s laws in order to get around the United States’ First Amendment, which — with few exceptions — protects even the most loathsome speech.

But given that U.S. law may prevail, the lawyers have also taken the step of claiming that Facebook is a “defective” product. According to Tim De Chant, writing at Ars Technica, that claim appears to be targeted at Section 230, which would normally protect Facebook from legal liability for any content posted by third parties.

Facebook’s algorithms are programmed to show you more and more of the content you engage with, which leads to the amplification of the sort of violent posts that helped drive genocide against the Rohingyas. A legal argument built on the algorithm-driven spread of that content, rather than on the content itself, would presumably find more favor in the U.S. court system.
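To make that distinction a little more concrete, here is a deliberately crude sketch, in Python, of what ranking purely for engagement looks like. It is an illustration of the critique, not Facebook’s actual code; every name, weight and number in it is invented.

```python
# Hypothetical sketch of engagement-only feed ranking -- not Facebook's actual code.
# Posts that provoke the strongest reactions rise to the top, regardless of accuracy.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Invented weights: shares and comments count for more because they spread content further.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Pure engagement ranking: nothing here asks whether a post is true or harmful.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local bake sale this weekend", likes=40, shares=2, comments=5),
    Post("Outrageous (false) claim about a rival group", likes=30, shares=60, comments=120),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```

The point of the toy example is that the harm comes from the ranking rule, not from any one post, which is why the refugees’ lawyers are framing the product itself as defective.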

“While the Rohingya have long been the victims of discrimination and persecution, the scope and violent nature of that persecution changed dramatically in the last decade, turning from human rights abuses and sporadic violence into terrorism and mass genocide,” the lawsuit says. “A key inflection point for that change was the introduction of Facebook into Burma in 2011, which materially contributed to the development and widespread dissemination of anti-Rohingya hate speech, misinformation, and incitement of violence—which together amounted to a substantial cause, and perpetuation of, the eventual Rohingya genocide.”

Facebook has previously admitted that its response to the violence in Myanmar was inadequate. “We weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” the company said in 2018.

The lawsuit at least theoretically represents an existential threat to Facebook, and no doubt the company will fight back hard. Still, its initial response emphasized its regrets and steps it has taken over the past several years to lessen the damage. A Meta spokesperson recently issued this statement to multiple news organizations: “We’re appalled by the crimes committed against the Rohingya people in Myanmar. We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw [the Burmese armed forces], disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content. This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.”

No doubt Zuckerberg and company didn’t knowingly set out to contribute to a human-rights disaster that led to a rampage of rape and murder, with nearly 7,000 Rohingyas killed and 750,000 forced out of the country. Yet this tragedy was the inevitable consequence of the way Facebook works, and of its top executives’ obsession with growth over safety.

As University of Virginia media studies professor and author Siva Vaidhyanathan has put it: “The problem with Facebook is Facebook.”

Maybe the prospect of being forced to pay for the damage they have done will, at long last, force Zuckerberg, Sheryl Sandberg and the rest to do something about it.

Our latest podcast features Rhema Bland, director of the Ida B. Wells Society

Rhema Bland

Our guest on the latest episode of the “What Works” podcast is Rhema Bland, the first permanent director of the Ida B. Wells Society for Investigative Reporting at the University of North Carolina school of journalism. She was appointed in October 2020 after working in higher education as an adviser to student media programs. She is a veteran journalist who has reported and produced for CBS, the Florida Times-Union, WJCT and the New York Daily News.

The Wells Society was co-founded by award-winning journalists Nikole Hannah-Jones, Ron Nixon and Topher Sanders. The society is named after the path-breaking Black journalist and activist Ida B. Wells, who fearlessly covered the lynching of Black men and was present at the creation of the NAACP. The society’s mission is essential to the industry: to “increase the ranks, retention and profile of reporters and editors of color in the field of investigative reporting.” Bland and her colleagues host training seminars for journalists across the country, focusing on everything from entrepreneurship to racial inequality to COVID-19.

Also in this episode, Ellen Clegg talks about Ogden Newspapers’ purchase of Swift Communications, which publishes community papers in western ski towns as well as niche agricultural titles like the Goat Journal. And I share news about federal antitrust lawsuits that are in the works against Google and Facebook by more than 200 newspapers.

You can listen here and sign up via Apple Podcasts, Spotify or wherever fine podcasts are found.

Antitrust legal actions against Google and Facebook spread to 200-plus newspapers

Some 200 newspapers are engaged in legal actions claiming that Google and Facebook exercise Godzilla-like dominance of digital advertising. Photo (cc) 2009 by Dr Zito.

A lawsuit filed by newspapers against Google and Facebook that claims the two tech giants violated antitrust laws is gaining momentum. Sara Fischer and Kristal Dixon of Axios report that more than 200 papers across the country have joined the effort, which is aimed at forcing Google and Facebook to compensate them for what they say are monopolistic practices that denied them advertising revenue.

I don’t see any New England newspapers on this list. But the papers involved in the lawsuits, in one way or another, represent about 30 different owners in dozens of states, according to Fischer and Dixon. About 150 papers owned by 17 different groups have actually filed suit so far.

What’s interesting about this is that it has nothing to do with the usual complaint about Google and Facebook — that they repurpose journalism from newspapers, and that the newspapers ought to be compensated. By contrast, the current lawsuits are aimed at practices that the plaintiffs claim are clearly illegal.

The Axios story doesn’t get into the weeds. But I did earlier this year shortly after the first lawsuit was filed by HD Media, a small chain based in West Virginia. Essentially, the argument is twofold:

  • Google is violating antitrust law by controlling every aspect of digital advertising. Paul Farrell, a lawyer for HD Media, put it this way in an interview with the trade magazine Editor & Publisher: “They have completely monetized and commercialized their search engine, and what they’ve also done is create an advertising marketplace in which they represent and profit from the buyers and the sellers, while also owning the exchange.”
  • Facebook is complicit because, according to a lawsuit filed by several state attorneys general, Google and Facebook are colluding through an agreement that Google has code-named Jedi Blue. The AGs contend that Google provides Facebook with special considerations so that Facebook won’t set up a competing ad network.

The two companies have denied any wrongdoing. But if the case against them is correct, then Google is profiting from a perfect closed environment: It holds a near-monopoly on search and the programmatic advertising system through which most ads show up on news websites. And it has an agreement with Facebook aimed at staving off competition.
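For readers trying to picture what a “closed environment” means in practice, here is a toy sketch of the plaintiffs’ theory: under header bidding, a publisher takes the best price from competing exchanges; under an auction confined to a single house-controlled exchange, it gets whatever that exchange pays. The names and numbers are invented, and this is an illustration of the argument, not real ad-tech code.

```python
# Hypothetical contrast between header bidding and a single closed exchange.
# Exchange names and bid amounts are invented for illustration only.

def header_bidding(bids: dict[str, float]) -> tuple[str, float]:
    # The publisher solicits bids from every exchange at once and keeps the best price.
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

def house_only_auction(bids: dict[str, float], house: str) -> tuple[str, float]:
    # In the scenario the plaintiffs describe, only the house exchange clears the impression.
    return house, bids[house]

bids = {"ExchangeA": 2.40, "ExchangeB": 2.10, "HouseExchange": 1.75}
print(header_bidding(bids))                       # ('ExchangeA', 2.4): publisher takes the best price
print(house_only_auction(bids, "HouseExchange"))  # ('HouseExchange', 1.75): publisher earns less
```

That gap between the two prices is, in simplified form, the revenue the newspapers say they were denied.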

“The intellectual framework for this developed over the last three to four years,” Doug Reynolds, managing partner of HD Media, told Axios.

The lawsuit also comes at a time when the federal government is beginning to rethink antitrust law. A generation ago, a philosophy developed by Robert Bork — yes, that Robert Bork, and yes, everything really does go back to Richard Nixon — held that there can be no antitrust violations unless consumers are harmed in the form of higher prices.

President Joe Biden’s administration, by contrast, has been embracing a more progressive, older form of antitrust law holding that monopolies can be punished or even broken up if they “undermine economic fairness and American democracy,” as The New Yorker put it.

The newspapers’ lawsuit against Google and Facebook is grounded in the Biden version of antitrust — Google and Facebook are charged with leveraging their monopoly to harm newspapers economically while at the same time hurting democracy, which depends on reliable journalism.

A tidal wave of documents exposes the depths of Facebook’s depravity

Photo (cc) 2008 by Craig ONeal

Previously published at GBH News.

How bad is it for Facebook right now? The company is reportedly planning to change its name, possibly as soon as this week — thus entering the corporate equivalent of the Witness Protection Program.

Surely, though, Mark Zuckerberg can’t really think anyone is going to be fooled. As the tech publisher Scott Turman told Quartz, “If the general public has a negative and visceral reaction to a brand then it may be time to change the subject. Rebranding is one way to do that, but a fresh coat of lipstick on a pig will not fundamentally change the facts about a pig.”

And the facts are devastating, starting with “The Facebook Files” in The Wall Street Journal at the beginning of the month; accelerating as the Journal’s once-anonymous source, former Facebook executive Frances Haugen, went public, testified before Congress and was interviewed on “60 Minutes”; and then exploding over the weekend as a consortium of news organizations began publishing highlights from a trove of documents Haugen gave the Securities and Exchange Commission.

No one can possibly keep up with everything we’ve learned about Facebook — and, let’s face it, not all that much of it is new except for the revelations that Facebook executives were well aware of what their critics have been saying for years. How did they know? Their own employees told them, and begged them to do something about it to no avail.

If it’s possible to summarize, the meta-critique is that, no matter what the issue, Facebook’s algorithms boost content that enrages, polarizes and even depresses its users — and that Zuckerberg and company simply won’t take the steps that are needed to lower the volume, since that might result in lower profits as well. This is the case across the board, from self-esteem among teenage girls to the Jan. 6 insurrection, from COVID disinformation to factional violence in other countries.

In contrast to past crises, when Facebook executives would issue fulsome apologies and then keep right on doing what they were doing, the company has taken a pugnacious tone this time around, accusing the media of bad faith and claiming it has zillions of documents that contradict the damning evidence in the files Haugen has provided. For my money, though, the quote that will live in infamy is one that doesn’t quite fit the context — it was allegedly spoken by Facebook communications official Tucker Bounds in 2017, and it wasn’t for public consumption. Nevertheless, it is perfect:

“It will be a flash in the pan,” Bounds reportedly said. “Some legislators will get pissy. And then in a few weeks they will move onto something else. Meanwhile we are printing money in the basement, and we are fine.”

Is Facebook still fine? Probably not. At the moment, at least, it is difficult to imagine that Facebook won’t be forced to undergo some fundamental changes, either through public pressure or by force of law. A number of news organizations have published overviews to help you make sense of the new documents. One of the better ones was written by Adrienne LaFrance, the executive editor of The Atlantic, who was especially appalled by new evidence of Facebook’s own employees pleading with their superiors to stop amplifying the extremism that led to Jan. 6.

“The documents are astonishing for two reasons: First, because their sheer volume is unbelievable,” she said. “And second, because these documents leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.”

LaFrance offers some possible solutions, most of which revolve around changing the algorithm to optimize safety over growth — that is, not censoring speech, but taking steps to stop the worst of it from going viral. Keep in mind that one of the key findings from the past week involved a test account set up for a fictional conservative mother in North Carolina. Within days, her news feed was loaded with disinformation, including QAnon conspiracy theories, served up because the algorithm had figured out that such content would keep her engaged. As usual, Facebook’s own researchers sounded the alarm while those in charge did nothing.

In assessing what we’ve learned about Facebook, it’s important to differentiate between pure free-speech issues and those that involve amplifying bad speech for profit. Of course, as a private company, Facebook needn’t worry about the First Amendment — it can remove anything it likes for any reason it chooses.

But since Facebook is the closest thing we have to a public square these days, I’m uncomfortable with calls that certain types of harmful content be banned or removed. I’d rather focus on the algorithm. If someone posts, say, vaccine disinformation on the broader internet, people will see it (or not) solely on the basis of whether they visit the website or discussion board where it resides.

That doesn’t trouble me any more than I’m bothered by people handing out pamphlets about the coming apocalypse outside the subway station. Within reason, Facebook ought to be able to do the same. What it shouldn’t be able to do is make it easy for you to like and share such disinformation and keep you engaged by showing you more, and more extreme, versions of it.

And that’s where we might be able to do something useful about Facebook rather than just wring our hands. Reforming Section 230, which provides Facebook and other internet publishers with legal immunity for any content posted by their users, would be a good place to start. If 230 protections were removed for services that use algorithms to boost harmful content, then Facebook would change its practices overnight.

Meanwhile, we wait with bated breath for word on what the new name for Facebook will be. Friendster? Zucky McZuckface? The Social Network That Must Not Be Named?

Zuckerberg has created a two-headed beast. For most of us, Facebook is a fun, safe environment to share news and photos of our family and friends. For a few, it’s a dangerous place that leads them down dark passages from which they may never return.

In that sense, Facebook is like life itself, and it won’t ever be completely safe. But for years now, the public, elected officials and even Facebook’s own employees have called for changes that would make the platform less of a menace to its users as well as to the culture as a whole.

Zuckerberg has shown no inclination to change. It’s long past time to force his hand.
