How storytelling can defeat surveillance
Plus: a sneak peek at our launch next week
Welcome back to Civic Signals, where changes are on the horizon. We’ve emerged from the election crush completely exhausted but also with a renewed pep in our step. What feels possible to you that felt foreclosed just a few weeks ago? With forward momentum in mind, we bring you a fascinating conversation about fighting the Facebookization of public space, revive our tasty links in What’s Clicking, and enlist you in our launch next week.
The election is “over,” and we’re feeling excited to say Hello World next week (we know we’ve been promising this for a while—but IT’S HAPPENING).
But first, we bring you our conversation with Mutale Nkonde, a fellow at the Stanford Digital Civil Society Lab, about how we can keep the worst elements of digital spaces from creeping into our offline world—and what’s at stake.
Resisting the “Facebookization” of public space
In June, Nkonde and her colleagues at the Stanford Digital Civil Society Lab—Lucy Bernholz, Toussaint Nothias, and Tawana Petty—published an open letter in Stanford Rewired that called on people to resist what they called the “Facebookization” of public space. The group had initially planned to provide social media platforms with a toolkit of suggested changes to protect the U.S. election, but the police response to the protests that followed George Floyd’s killing in May prompted them to talk instead about surveillance. It’s a “Faustian bargain,” they wrote—the same platforms and technologies that are credited with giving life to movements like Black Lives Matter are part of a larger apparatus that spies on the public and makes us less free.
As a society, we have been slow to understand how our communications on digital platforms are controlled, monitored, and extracted for profit. For almost three decades, scholars and activists have debated the proper rules for online speech even as we, too, have voluntarily expanded our dependence on the companies and infrastructure at the source of the controversy. Even in the wake of Cambridge Analytica, we still believe in these digital platforms’ promises to document our humanity and empower social activism.
This might sound complicated, but the ways it plays out in real life are very clear. Nkonde cited the example of Derrick Ingram, a 28-year-old Black Lives Matter activist whom police arrested in August. Ingram, who was accused of shouting into a police officer’s ear with a bullhorn, was arrested after the NYPD used facial recognition technology to track him down via his Instagram account. Brandishing a “Facial Identification Section Informational Lead Report” with a photo taken from Ingram’s Instagram, the NYPD laid siege to his apartment. They closed off the street. They sent a helicopter to patrol overhead. They wore tactical gear and carried shields. Police, who did not have a warrant, waited in the hallway of Ingram’s building with dogs, banging on his door.
And Ingram? He streamed the siege live...on Instagram.
The weaponization of facial recognition technology against protesters via a seemingly innocuous platform like Instagram felt like a watershed, said Nkonde. Not only did it violate the NYPD’s policies for use of facial recognition—the image was neither from a surveillance video nor an arrest photo—it also showed how tools that are ostensibly meant to catch criminals can be turned against ordinary people.
“When we started talking about tech and society, we focused on the criminal justice system,” said Nkonde. That was a problem, because many people thought “well, I haven’t done anything wrong” and decided it didn’t affect them. “I want to see an understanding of the way technological systems mediate our everyday lives.”
Screen time is not the problem
We’re moving away from screens, said Nkonde, towards “ambient” technology, and it’s no longer useful to think of our digital and physical worlds as separate spaces. “You can go outside and walk into a surveillance system, just by being in public space.” Yet most Americans who don’t study or work in technology have little understanding of the extent to which machines already govern their “offline” lives. We simply can’t think about public space without considering digital technology at this point: the two are too closely intertwined.
As Nkonde and her colleagues wrote in June, the stakes couldn’t be higher:
Privatized digital control over our public spaces: this is what we are building with facial recognition technology, beacon-based advertising, geofencing, Stingray phone trackers, drones, and closed-circuit cameras….If we allow the “Facebook-ization” of our physical, public spaces to continue, we will have no democratically-ruled spaces left in which to speak, gather, mourn, or govern ourselves.
To resist surveillance, educate the public
Nkonde says her goal is for every American to understand the extent to which AI and digital surveillance already impact their everyday lives. “I want people to think about an area of their life that they’re really passionate about; an area of their life that they need a public service or a public good” and then imagine it being controlled by an algorithm, she said—because it probably already is. To this end, she’s started AI For The People, a nonprofit creative agency dedicated to helping people tell better stories about AI, with an understanding that we need better storytelling if we’re going to enact broader change.
Nkonde and her colleagues are optimistic about our ability to protect our public spaces. This historical moment, difficult as it is, offers an opportunity, they wrote:
Overpoliced Black communities have been forced to develop tools for resisting surveillance and methods for powerfully articulating its harmful impact. Meanwhile, scholars, technologists, and activists understand the regulations needed to grant people control over their data, prevent the repurposing of personal information, and mandate the destruction of medical surveillance systems when the viral threat passes.
If we bring this varied expertise together, they argue, we can protect communities who have historically been targeted by digital systems. The first step is to talk to each other.
What’s Clicking
Our co-director Eli Pariser talked to The New York Times about how Facebook and Twitter protected the election by disabling or changing key features of their products. “If you do this for U.S. elections, why not other countries’ elections? Why not climate change? Why not acts of violence?”
Twitter says they’re leaving their quote tweet prompt in place while they take more time “to study and further understand the impact.”
The world can’t get enough of 15-minute cities
“The CEOs of tech companies should try to do less harm to our democracy. But I didn’t elect them to fix it. No one elected them.” Jill Lepore and danah boyd talked about tech and democracy.
Retailers are pushing their minimum-wage employees to become TikTok influencers. Compensation includes “small perks, like gift cards.”
This piece on “The Digital City and the Analog City” is a useful framing device, and a hopeful one: we already know some things about how to make cities better.
Coming next week:
We’ve been promising something big was in the works for a while now, and it’s finally here! Next Thursday, we’re planning to announce the next phase of our project: this newsletter (which will undergo a bit of a makeover), an editorial offshoot with some amazing contributors, a community space where we can get to know each other a bit better—and a few more exciting things we can’t tell you about just yet.
We couldn’t be more excited and we’re going to need your help spreading the word. Watch this space—and if you’re not already following us on Twitter, Instagram, or LinkedIn, this would probably be a good time to do that. Or why not share the newsletter with your friends?
Thanks for being part of the Civic Signals community. Big changes are underway, and we can’t wait to take you with us.
See you in the future,
The Civic Signals Team
Illustrations by Josh Kramer
Civic Signals is a partnership between the Center for Media Engagement at the University of Texas, Austin, and the National Conference on Citizenship, and was incubated by New America.