Three billion people using one website… what could go wrong?
Reconsidering Facebook at 20 through a book on the Facebook Files
Calling all local community stewards: We have a quick survey about your role in your community. See below for more info.
We're hiring two new roles: Contract Researcher, Local, and Social Media Fellow
Facebook turned 20 this month, a fact that is huge and momentous, but perhaps not as huge and momentous as the platform's influence on our culture and society. How do we assess the success of a platform with three billion users?
In some ways, Facebook has delivered on its promise of connection for users like Lola Omolala, who I spoke to here last month. "The blue app" has brought the internet to hundreds of millions and allows them to share stories and form new communities like never before. Facebook is "uniquely positioned for relationship-building," as Lola put it.
And on the other hand… Facebook, and the age of social media it ushered in, has a lot to answer for. Just ask the 42 Attorneys General suing Meta over Instagram right now. Courtney Radsch takes stock of this legacy in an opinion piece in The Guardian:
… the spread of terrorism and violent extremism, mass violence and online harassment … human trafficking, drug trafficking and the illegal wildlife trade, along with the proliferation of child sexual abuse material and child exploitation … propaganda, disinformation and information warfare, undermining the integrity of our information ecosystems and elections around the world.
Are these impacts accidental, inevitable by-products of a well-meaning company connecting the world? Or the result of uncaring, profit-obsessed executives super-charging algorithmic ranking on a global scale? Which is true? Is it possible to know?
Ever since "The Social Network," outside observers have craved the opportunity to understand the intent of Facebook's C-suite, especially the wunderkind turned metaverse enthusiast and Hawaiian cattle farmer, Mark Zuckerberg.
A few years ago, the Facebook Files were leaked by former Facebook employee / whistleblower Frances Haugen and then written about in the Wall Street Journal, providing a rare glimpse into internal decision-making. Now, one of those writers, Jeff Horwitz, has expanded that reporting into a book, Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets, and guest writer Ashira Morris has a book report on it for us.
Horwitz has a strong take, for sure, but the book offers an interesting opportunity to reflect on two decades of Facebook and lessons we might learn for the future.
—Josh Kramer, New_ Public Head of Editorial
![An illustration with many small silhouette avatar images and warning symbols, a large eye with a growth chart in the iris, a squiggly line flourish, and an upward-pointing arrow.](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d520fe8-e531-4d2e-b776-0c7484613490_1457x1049.png)
Decoding Facebook's decision-making
Broken Code isn't concerned with Facebook's early days: it's an examination of how the company dealt with power and influence once it had them. Jeff Horwitz moves through a decade of the platform's scandals, culminating in the Jan. 6 insurrection in 2021. He circles around what content Facebook encouraged, allowed, or outright banned, and what mechanisms existed to establish and reinforce those norms.
In addition to the Journal's own damning headlines, the Facebook Files were shared with a consortium of news outlets and led to a deluge of reporting. Haugen testified in front of Congress. Facebook's oversight board reviewed its system prioritizing famous and powerful accounts. And the issues raised are still part of the ongoing public conversation about what, exactly, Facebook owes its users.
In Horwitz's telling, in scenario after scenario, executives chose growth metrics over creating a positive community space online. Even when employees raised concerns, Zuckerberg and other higher-ups declined to make intentional changes to prevent the spread of toxic content.
"As much as the company talked about giving people a voice, the company had been built to make people use Facebook — and then repeatedly refined to ensure they used it more," Horwitz writes.
It is impossible to know what Facebook's leadership truly intended, but Horwitz shows how systems were created to reward high rates of engagement — with little regard to what, exactly, was being engaged with — and accelerate polarizing and negative content. Zuckerberg has said that this hands-off approach was about protecting free speech and trusting users. But, as Horwitz points out, Zuckerberg's attempt at impartiality is a choice in and of itself, and it happens to be good for his company's bottom line and stock price. Perhaps Zuckerberg genuinely believes in it, but either way, a lot of toxic, harmful content proliferated on Facebook for many years.
Growth over quality
Although Facebook's metrics for success have changed over time, the recurring theme across Horwitz's chapters is a drive toward growth. New product features, from the "like" button to emoji reactions, caused massive shifts in what types of user behavior were boosted across the platform. Horwitz describes a company where programmers were encouraged to update algorithms without observing long-term impacts or thinking beyond meeting a data-driven goal.
This impulse to move fast may be driven by the rationale that deep thinking about complex problems is slow, and that iterative changes can improve a platform effectively. As New_ Public engineer Rob Ennals has written:
When I was at Facebook, it was common for engineers to have suspicions that the changes they were shipping were actually making the product worse, but the cultural norm was to ship such changes anyway. The assumption is that it's worth the cost of shipping some changes that make the product worse if it allows the company to iterate faster.
However, this type of thinking leads inevitably to the conclusion of the phrase: "move fast and break things." And some features on Facebook were very broken, for years, for many millions of people. "Facebook was more cavalier in shaping a global communications and broadcasting platform than Netflix was about deciding to steer users toward The Great British Bake Off," Horwitz writes.
By 2014, one of the primary metrics was how much time each user spent on the site. And while spending every waking hour on Facebook might not be something a platform that cared about a functioning society would incentivize, Facebook rewarded accounts that spent 24 hours logged on. Some of the most active ones only went dark during Russian public holidays — clearly foreign troll accounts understood how to game the system.
This sort of maximalist approach plays out in other examples, like the Messenger app team defending a forwarding feature that sped the spread of viral hoaxes because it was helping them hit their goals.
In 2018, Facebook introduced "meaningful social interactions" as a guiding growth metric. On its face, this might seem like the right kind of engagement to prioritize. However, Horwitz describes how staffers immediately recognized that "turbocharging comments, reshares, and emojis would have unpleasant effects" — it was already clear that the most prolific users were a small number of accounts that posted edgier, more partisan content.
To be fair, it's possible that many people working at Facebook, perhaps even at times a majority of the executives, wanted the site to be a nice place to spend time and for those "meaningful social interactions" to truly be meaningful engagement with friends and family. But Horwitz argues that the company kept on rewarding mayhem that increased engagement in the face of overwhelming evidence of toxicity and harm. And he shows a cycle of employees pointing out the negative consequences of unrestrained virality and top leadership choosing growth and engagement over making the platform safer. The virality wins out.
While Horwitz doesn't always draw clear connections between more engagement and more revenue for the company, he does note early on that "advertising was the beachhead for machine learning at Facebook."
![A very pixelated portrait of Mark Zuckerberg, made of colorful blocks.](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F754fc9a5-0467-44b1-afcc-4aefc6d8c5d6_2913x2096.png)
Ugly content wins
Given the emphasis on growth over quality, Facebook's recommendations for groups to join, content to view, and accounts to follow were "bad from the start," Horwitz writes. In one anecdote, a test account of a 21-year-old Indian woman was almost immediately fed "a near constant barrage of polarizing nationalist content, misinformation, and violence and gore."
The executives calling the shots wouldn't see these sorts of posts, though — their accounts were professionally managed and couldn't be added as friends by users. They "struggled to grasp the range of what was happening on Facebook," Horwitz writes, since they weren't frequent users and rarely dove into the pages and groups that were becoming a primary part of the site.
On top of experiencing a very different version of the platform, executives didn't invest in tools to understand what content was actually circulating on it. Horwitz argues that Facebook could have parsed its own data, but chose not to for over a decade. Eventually, in 2016, Facebook bought the social media monitoring tool CrowdTangle. When the CrowdTangle team gave a presentation of the top daily Facebook content to executives, "the unifying trait was that all of it was awful."
Unfortunately, argues Horwitz, Facebook never proactively defined what content it wanted to host, only playing catch-up to remove what it didn't. Given its gargantuan size and scale, this was an impossible task, even for tens of thousands of content moderators and trust and safety professionals. According to Horwitz, little could get in the way of virality and growth.
Horwitz cites a few counterexamples, like Facebook deciding not to count "angry" emoji reactions toward a post's virality in the lead-up to the 2020 election, but these usually came years after staffers pointed out problems. Often, Horwitz shows, when a measure is put in place and shown to work, it's removed as soon as the main event — like an election or media moment — passes, giving the small victories an air of ephemerality.
Hands-off monitoring
Calling the book Broken Code is apt because everything — including what's boosted and what's removed — comes back to decisions Facebook's leadership made about what to put into the code. The company's "engineering-minded leadership" reinforced a metrics-based approach to decision-making and used machine learning to detect and remove unwanted content, like pornographic images. But Facebook's own engineers acknowledged that its tools to stop hate speech could only catch two percent of hateful posts.
And as poorly as Facebook kept up with content moderation in English, it performed even worse in other languages — when discussing the role Facebook played in inciting genocide in Myanmar, an employee describes the impossibility of using Google Translate to parse Burmese. The trauma that human moderators for Facebook have been exposed to is briefly noted in the book — a native Portuguese speaker crying while removing violent homophobic content in the aftermath of a Brazilian election — and well-documented in other reporting.
Facebook's Civic Integrity team is often brought up in the book as the office Cassandra, aware of each problem before it got out of control. Its initial mandate was increasing civic participation — like nudges encouraging users to vote in elections — and helping political activists use the platform. But the team's focus quickly shifted from encouraging democratic processes on Facebook to pointing out the ways Facebook was threatening them. A researcher found that 64% of people who joined extremist groups did so because Facebook recommended those groups.
Early on in Broken Code, Horwitz poses the question, "What were Facebook's responsibilities to the constituencies of the online society it had shaped?" The book answers it through the voices of employees petitioning to slow the viral spread of misinformation and to stop recommending QAnon groups. Horwitz shows again and again that Facebook employees recognized the potential for harm but weren't given the influence or authority to prevent it.
Unfortunately, Horwitz doesn't put forward a cohesive positive vision of what Facebook could be. He implies that Zuckerberg and other top executives never slowed down enough to think deeply about the healthiest version of the platform, let alone do the work to reach that better version. Since the period Horwitz covers here, Facebook has pivoted again — renaming the corporation Meta and even taking steps toward deprioritizing news content.
And it's worth noting that in addition to reporting, we need far more scientific data to understand how these algorithms affect user behavior and ideology. It's one thing to surface evidence, as Horwitz has, that some people on Facebook are being shown a lot of toxic content and propaganda. But, if that's true, what do we know for sure about the effects of this content? Is it actually increasing polarization and making people's beliefs more extreme?
In a first-of-its-kind study, New_ Public Co-Director Talia Stroud and her research partners were given access to certain controls in Facebook, and they measured different factors in the three months before the last election. They found, among many other results, that:
Reducing the prevalence of politically like-minded content in participants' feeds during the 2020 U.S. presidential election had no measurable effects on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims.
Certainly, individuals' beliefs and behavior are affected by more than what they see on Facebook, but it's also undeniable that the platform has played a significant role in shaping the last 20 years. What about the next 20? Facebook already feels less all-encompassing than it did in the years covered by Broken Code.
Perhaps social media platforms centered on infinite growth and ad revenue can become artifacts of the past, and maybe the future will bring new online spaces that offer meaningful connection with new, different kinds of governance and accountability. And perhaps, thanks to reporting like this, we aren't doomed to repeat Facebook's mistakes.
— Ashira Morris
New_ Public and the Center for Media Engagement want to know more about what excites online leaders, the role they play, and the challenges they face. Your insights will help us better support community leaders.
The survey takes around ten minutes. Five survey participants will be randomly selected to win $25 gift cards. Please feel free to share with anyone you think may be interested! You can reach out with any questions about this research study here.
Thanks, Ashira!
Still thinking about Usher's halftime show a week later,
—Josh