👍 😡 Three billion people using one website… what could go wrong?
Reconsidering Facebook at 20 through a book on the Facebook Files
Calling all local community stewards: We have a quick survey about your role in your community. See below for more info.
We’re hiring two new roles: Contract Researcher, Local, and Social Media Fellow
Facebook turned 20 this month, a fact that is huge and momentous, but perhaps not as huge and momentous as the platform’s influence on our culture and society. How do we assess the success of a platform with three billion users?
In some ways, Facebook has delivered on its promise of connection for users like Lola Omolala, who I spoke to here last month. “The blue app” has brought the internet to hundreds of millions and allows them to share stories and form new communities like never before. Facebook is “uniquely positioned for relationship-building,” as Lola put it.
And on the other hand… Facebook, and the age of social media it ushered in, has a lot to answer for. Just ask the 42 Attorneys General suing Meta over Instagram right now. Courtney Radsch takes stock of this legacy in an opinion piece in The Guardian:
… the spread of terrorism and violent extremism, mass violence and online harassment … human trafficking, drug trafficking and the illegal wildlife trade, along with the proliferation of child sexual abuse material and child exploitation … propaganda, disinformation and information warfare, undermining the integrity of our information ecosystems and elections around the world.
Are these impacts accidental, inevitable by-products of a well-meaning company connecting the world? Or the result of uncaring, profit-obsessed executives super-charging algorithmic ranking on a global scale? Which is true? Is it possible to know?
Ever since “The Social Network,” outside observers have craved the opportunity to understand the intent of Facebook’s C-suite, especially the wunderkind turned metaverse enthusiast and Hawaiian cattle farmer, Mark Zuckerberg.
A few years ago, the Facebook Files were leaked by former Facebook employee / whistleblower Frances Haugen and then written about in the Wall Street Journal, providing a rare glimpse into internal decision-making. Now, one of those writers, Jeff Horwitz, has expanded that reporting into a book, Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets, and guest writer Ashira Morris has a book report on it for us.
Horwitz has a strong take, for sure, but the book offers an interesting opportunity to reflect on two decades of Facebook and lessons we might learn for the future.
–Josh Kramer, New_ Public Head of Editorial
Decoding Facebook’s decision-making
Broken Code isn’t concerned with Facebook’s early days: it’s an examination of how the company dealt with power and influence once it had them. Jeff Horwitz moves through a decade of the platform’s scandals, culminating in the Jan. 6 insurrection in 2021. He circles around what content Facebook encouraged, allowed, or outright banned, and what mechanisms existed to establish and reinforce those norms.
In addition to the Journal’s own damning headlines, the Facebook Files were shared with a consortium of news outlets and led to a deluge of reporting. Haugen testified in front of Congress. Facebook’s oversight board reviewed its system prioritizing famous and powerful accounts. And the issues raised are still part of the ongoing public conversation about what, exactly, Facebook owes its users.
In Horwitz’s telling, in scenario after scenario, executives chose growth metrics over creating a positive community space online. Even when employees raised concerns, Zuckerberg and other higher-ups declined to make intentional changes to prevent the spread of toxic content.
“As much as the company talked about giving people a voice, the company had been built to make people use Facebook — and then repeatedly refined to ensure they used it more,” Horwitz writes.
It is impossible to know what Facebook’s leadership truly intended, but Horwitz shows how systems were created to reward high rates of engagement — with little regard to what, exactly, was being engaged with — and accelerate polarizing and negative content. Zuckerberg has said that this hands-off approach was about protecting free speech and trusting users. But, as Horwitz points out, Zuckerberg’s attempt at impartiality is a choice in and of itself, and it happens to be good for his company’s bottom line and stock price. Perhaps Zuckerberg genuinely believes in it, but either way, a lot of toxic, harmful content proliferated on Facebook for many years.
Growth over quality
Although Facebook’s metrics for marking success have changed over time, the recurring theme across Horwitz’s chapters is a drive toward growth. New product features, from the “like” button to emoji reactions, caused massive shifts in what types of user behavior were boosted across the platform. Horwitz describes a company where programmers were encouraged to update algorithms without observing long-term impacts or thinking beyond meeting a data-driven goal.
This impulse, to move fast, may be driven by the rationale that deep thinking about complex problems is slow, and that iterative changes can improve a platform effectively. As New_ Public engineer Rob Ennals has written:
When I was at Facebook, it was common for engineers to have suspicions that the changes they were shipping were actually making the product worse, but the cultural norm was to ship such changes anyway. The assumption is that it’s worth the cost of shipping some changes that make the product worse if it allows the company to iterate faster.
However, this type of thinking follows the company’s old motto to its logical conclusion: “move fast and break things.” And some features on Facebook were very broken, for years, for many millions of people. “Facebook was more cavalier in shaping a global communications and broadcasting platform than Netflix was about deciding to steer users toward The Great British Bake Off,” Horwitz writes.
By 2014, one of the primary metrics was how much time each user spent on the site. And while spending every waking hour on Facebook is hardly something a platform that cared about a functioning society would incentivize, Facebook rewarded accounts that were logged on 24 hours a day. Some of the most active accounts only went dark during Russian public holidays, a clear sign that foreign troll farms understood how to game the system.
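To make that pattern concrete, here is a toy sketch in Python. It is invented for this newsletter and is in no way Facebook’s actual detection logic; the thresholds and holiday dates are made up. It just shows the kind of heuristic that could flag an account whose only downtime lines up with a holiday calendar:

```python
from datetime import date

# Toy heuristic (not Facebook's actual logic): an account that is active nearly
# 24 hours a day, every day, and only goes quiet on a fixed holiday calendar is
# probably not a single human being. Dates and thresholds here are invented.
RUSSIAN_PUBLIC_HOLIDAYS = {date(2014, 1, 1), date(2014, 1, 7), date(2014, 5, 9)}

def looks_like_troll_farm(active_hours_by_day: dict) -> bool:
    """active_hours_by_day maps each date to hours of activity that day (0-24)."""
    if not active_hours_by_day:
        return False
    always_on = [d for d, hours in active_hours_by_day.items() if hours >= 20]
    quiet_days = {d for d, hours in active_hours_by_day.items() if hours < 20}
    # Suspicious if nearly every day shows round-the-clock activity...
    mostly_always_on = len(always_on) >= 0.9 * len(active_hours_by_day)
    # ...and the rare quiet days all fall on the holiday calendar.
    return mostly_always_on and quiet_days <= RUSSIAN_PUBLIC_HOLIDAYS
```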
This sort of maximalist approach plays out in other examples, like the Messenger app team defending a forwarding feature that sped the spread of viral hoaxes because it helped them hit their goals.
In 2018, Facebook introduced “meaningful social interactions” as a guiding growth metric. On its face, it might seem like the right kind of engagement to prioritize. However, Horwitz describes how staffers immediately recognized that “turbocharging comments, reshares, and emojis would have unpleasant effects”: it was already clear that the most prolific users were a small number of accounts posting edgier, more partisan content.
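To illustrate what that kind of weighting means in practice, here is a minimal, hypothetical sketch in Python. The interaction names and weights are invented for this example; the book reports only that comments, reshares, and emoji reactions counted for far more than a plain like.

```python
# Hypothetical ranking sketch: score a post by weighting different interaction
# types. The weights below are invented for illustration, not Facebook's
# actual values.
ENGAGEMENT_WEIGHTS = {
    "like": 1,
    "reaction": 5,   # emoji reactions such as "angry" or "love"
    "comment": 15,
    "reshare": 30,
}

def engagement_score(interaction_counts: dict) -> int:
    """Sum weighted interactions; higher-scoring posts get more distribution."""
    return sum(
        ENGAGEMENT_WEIGHTS.get(kind, 0) * count
        for kind, count in interaction_counts.items()
    )

# A divisive post that draws comments and reshares can easily outrank a
# pleasant one that only collects likes:
print(engagement_score({"like": 500}))                                   # 500
print(engagement_score({"comment": 40, "reshare": 25, "reaction": 60}))  # 1650
```

One of the counterexamples Horwitz cites later, not counting “angry” reactions toward virality ahead of the 2020 election, amounts to simply setting one of these weights to zero.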
To be fair, it’s possible that many people working at Facebook, perhaps even at times a majority of the executives, wanted the site to be a nice place to spend time and for those “meaningful social interactions” to truly be meaningful engagement with friends and family. But Horwitz argues that the company kept rewarding mayhem that increased engagement in the face of overwhelming evidence of toxicity and harm. He shows a cycle of employees pointing out the negative consequences of unrestrained virality and top leadership choosing growth and engagement over making the platform safer. Virality wins out.
While Horwitz doesn’t always draw clear connections between more engagement and more revenue for the company, he does note early on that “advertising was the beachhead for machine learning at Facebook.”
Ugly content wins
Given the emphasis on growth over quality, Facebook’s recommendations for groups to join, content to view, and accounts to follow were “bad from the start,” Horwitz writes. In one anecdote, a test account set up as a 21-year-old Indian woman was almost immediately fed “a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”
The executives calling the shots wouldn’t see these sorts of posts, though: their accounts were professionally managed and couldn’t be friended by ordinary users. They “struggled to grasp the range of what was happening on Facebook,” Horwitz writes, since they weren’t frequent users and rarely dove into the pages and groups that were becoming a primary part of the site.
On top of experiencing a very different Facebook than most users, executives didn’t invest in tools to understand what content was actually circulating on the platform. Horwitz argues that Facebook could have parsed its own data but chose not to for over a decade. The company eventually bought the social media monitoring tool CrowdTangle in 2016, and when the CrowdTangle team presented the top daily Facebook content to executives, “the unifying trait was that all of it was awful.”
Unfortunately, argues Horwitz, Facebook never proactively defined what content it wanted to host, only playing catch-up to remove what it didn’t. Given its gargantuan size and scale, this was an impossible task, even for tens of thousands of content moderators and trust and safety professionals. According to Horwitz, little could get in the way of virality and growth.
Horwitz cites a few counterexamples, like Facebook deciding not to count “angry” emoji reactions toward a post’s virality in the lead-up to the 2020 election, but these usually came years after staffers pointed out problems. Often, Horwitz shows, when a measure is put in place and shown to work, it’s removed as soon as the main event — like an election or media moment — passes, giving the small victories an air of ephemerality.
Hands-off monitoring
Calling the book Broken Code is apt because everything, including what’s boosted and what’s removed, comes back around to decisions made by Facebook’s leadership about what to put into the code. The company’s “engineering-minded leadership” reinforced a metrics-based approach to decision-making and leaned on machine learning to detect and remove unwanted content, like pornographic images. Even so, Facebook’s own engineers acknowledged that its tools to stop hate speech could only catch two percent of hateful posts.
And as much as Facebook was unable to keep up with content moderation in English, it performed even worse in other languages — when discussing the role Facebook played in inciting genocide in Myanmar, an employee describes the impossibility of using Google Translate to parse Burmese. The trauma that human moderators for Facebook have been exposed to is briefly noted in the book — a native Portuguese speaker crying while removing violent homophobic content in the aftermath of a Brazilian election — and well-documented in other reporting.
Facebook’s Civic Integrity team is often brought up in the book as the office Cassandra, aware of each problem before it got out of control. The team’s initial mandate was increasing civic participation, such as nudging users to vote in elections, and helping political activists use the platform. But its focus quickly shifted from encouraging democratic processes on Facebook to pointing out the ways Facebook was threatening them. One researcher found that 64 percent of the people who joined extremist groups did so because Facebook had recommended those groups.
Early on in Broken Code, Horwitz poses the question, “What were Facebook’s responsibilities to the constituencies of the online society it had shaped?” The book answers it through the voices of employees petitioning to slow the viral spread of misinformation and stop recommending QAnon groups. Horwitz shows again and again that Facebook employees recognized the potential for harm but weren’t given the influence or authority to act and prevent it.
Unfortunately, Horwitz doesn’t put forward a cohesive positive vision of what Facebook could be. He implies that Zuckerberg and other top executives never slowed down enough to think deeply about the healthiest version of the platform, let alone do the work to reach that better version. Since the period Horwitz covers here, Facebook has pivoted again, renaming the corporation Meta and even taking steps toward deprioritizing news content.
And it’s worth noting that in addition to reporting, we need far more scientific data to understand how these algorithms affect user behavior and ideology. It’s one thing to surface evidence, as Horwitz has, that some people on Facebook are being shown a lot of toxic content and propaganda. But, if that’s true, what do we know for sure about the effects of this content? Is this actually increasing polarization, and making people’s beliefs more extreme?
In a first-of-its-kind study, New_ Public Co-Director Talia Stroud and her research partners were given access to certain controls within Facebook and measured different factors in the three months before the 2020 election. They found, among many other results, that:
Reducing the prevalence of politically like-minded content in participants’ feeds during the 2020 U.S. presidential election had no measurable effects on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims.
Certainly, individuals’ beliefs and behavior are affected by more than what they see on Facebook, but it’s also undeniable that the platform has played a significant role in shaping the last 20 years. What about the next 20? Facebook already feels less all-encompassing than it did in the years covered by Broken Code.
Perhaps social media platforms centered on infinite growth and ad revenue can become artifacts of the past, and maybe the future will bring new online spaces that offer meaningful connection with new, different kinds of governance and accountability. And perhaps, thanks to reporting like this, we aren’t doomed to repeat Facebook’s mistakes.
New_ Public and the Center for Media Engagement want to know more about what excites online leaders, the role they play, and the challenges they face. Your insights will help us better support community leaders.
The survey takes around ten minutes. Five survey participants will be randomly selected to win $25 gift cards. Please feel free to share with anyone who you think may be interested! You can reach out with any questions about this research study here.
Thanks Ashira!
Still thinking about Usher’s halftime show a week later,
–Josh