I have been singing the praises of Frontline reports a lot this year.
In August, it was “Documenting Hate: Charlottesville,” which tracked down white supremacists involved in alt-right rallies that took place in Virginia the year before and resulted in the death of a counterprotester. Last month, Frontline scored again with “Trump’s Showdown,” a richly textured look inside the investigation by Special Counsel Robert Mueller.
But “The Facebook Dilemma,” a two-part report airing tonight and tomorrow night on PBS, might be the best Frontline production of the year. If you want to understand the dark side of Facebook and its negative impact on global culture, this is a documentary not to miss.
As is often the case, Frontline’s timing is superb, with the report premiering on the eve of what is shaping up to be one of the most important midterm elections in the last 100 years. Facebook, you might recall, played a destructive role in the 2016 presidential election, allowing Russian and other foreign operatives to publish propaganda and disinformation dedicated to disrupting and possibly subverting the most important activity in our democracy: electing a president.
Facebook did that by abdicating any sense of gatekeeping responsibility. Remember Mark Zuckerberg less than a week after the 2016 election saying he thought it was a “pretty crazy idea” to imagine fake news on Facebook had any effect on the vote?
Part One efficiently takes viewers back to the founding of Facebook and into the thinking that established the culture of the institution — a culture of idealism that only a college student could believe in, yet one that Zuckerberg clings to publicly today.
Facebook’s primary goals, Zuckerberg has said from its earliest days, were to “give people the power to share and to make the world more open and connected.”
I am not sure what any of those vague words mean, but they sound good, don’t they?
Frontline shows you how disingenuous and hollow those words ring today by grounding its report in example after example of how Facebook’s real priorities, continuous growth and billion-dollar profits, have contributed to a world of partisan discord, warfare and exploitation of non-elites. And many of those non-elites are Facebook users.
Part One also includes a trip back to Egypt in the “Arab Spring” of 2011, a moment many in the mainstream media would rather forget, given their naive celebration of social media, particularly Facebook, as a great force for democracy.
Social media did play a role in the toppling of President Hosni Mubarak’s 30-year rule, but as Wael Ghonim, who created one of the Facebook pages credited with sparking protests, tells Frontline, the social media platform also drove the “Arab Winter,” which saw the rise of Muslim extremism and autocratic rule.
“What was happening in Egypt was polarization,” Ghonim says. “And all these voices started to clash. And the environment on social media breeded that kind of clash, rewarded that polarization.”
His explanation not only speaks to what happened in Egypt in 2011, but also to what is happening in America today with the toxic level of discourse not just on social media but throughout our civic life.
Here is the pernicious secret: “If you increase the tone of your posts against your opponents, you are going to get more distribution,” Ghonim explains. “Because we tend to be more tribal, if I call my opponents names, my tribe is happy and celebrating. ‘Yes, do it.’”
Then, he says, members of his tribe click “like, comment and share” on their Facebook newsfeed in response.
“So, more people wind up seeing it because the algorithm is going to say, ‘Oh, that’s engaging content. People like it. Show it to more people.’ The hardest thing for me was seeing this tool that brought us together tearing us apart,” Ghonim added.
Sound like American tribalism and toxic media rhetoric today?
The report shows how the exploitation of tribalism and warfare is baked into the Facebook newsfeed algorithm, which determines what will or won’t get promoted in the newsfeeds of billions of users. And that is at the heart of Facebook’s business model.
I repeat: That kind of tribal conflict is at the heart of their business model.
At the end of two hours of “The Facebook Dilemma,” it is clear that nothing mattered more to Zuckerberg in recent years than making sure that business model resulted in more growth and profits — social responsibility be damned.
Understanding that makes the explanations and pseudo-mea culpas from current Facebook executives who appear in the report as talking heads all the more maddening.
Their words break down into a couple of basic themes.
“We have been slow to really understand the ways in which Facebook might be used for bad things,” says Naomi Gleit, a senior executive. “We’ve been really focused on the good things.”
That’s the we-are-good-but-maybe-were-a-little-too-trusting line that is the company’s main talking point. It is then used to blame the public, not Facebook, for failing to act responsibly.
“We relied on what we thought were the public’s common sense and common decency to police the site,” says Tim Sparapani, former director of public policy.
Reporter James Jacoby, who is also one of the writers and producers, is an excellent interviewer, wading through the talking points to capture moments in which the Facebook executives unwittingly reveal the core of their arrogant corporate ideology. Anya Bourg and Dana Priest, a former Washington Post reporter who now teaches at the University of Maryland-College Park, also report.
By the end of the two hours, I wanted to scream at the TV each time I heard an executive refusing to take responsibility for the harm and evil Facebook contributed to in places like Ukraine, the Philippines and Myanmar.
And, yes, the good old U.S. of A. in the election of 2016.
And here we sit on the eve of what are expected to be landmark midterms with no real assurance that Facebook won’t fail democracy again.
The problem isn’t just Zuckerberg or Facebook or even social media. It’s all of us, in a way, who have come to worship at the altar of algorithms and metrics, who have come to value clicks over social conscience. In that regard, the media are full of sinners.