🥵 🤬 🤩 Understanding how norms shape digital spaces
Research and recommendations for cultivating prosocial norms
Our friends at Sublime just released a zine exploring new possibilities for the internet that looks perfect for fans of New_ Public. You can check it out here and find a special discount code below.
As I wrote in introducing our Digital Spaces Directory, “Nearly every day we meet new people excited by the possibilities of prosocial digital public spaces.” There are so many developers, engineers, designers, and funders attempting to build new products and spaces right now.
One of the main requests we hear from these builders is for more insight into what works and what doesn’t. As we build our own tools and patterns for stewards of online communities, we’ve been doing a lot of reading and we want to share what we’ve learned.
Our researchers Angelica Quicksey and Mary Beth Hunzaker have been taking in dozens of sources, including a lot of social science research applicable to public-spirited product development. We’ll be releasing some full collections of research soon, but before that, we wanted to summarize pieces of it here.
Below we’ll explore insights from the scientific literature, provide links back to the studies and papers themselves,¹ and share actionable recommendations for how this research can be put to work in product design.
Tell us what research topics you’d like to read about in the future in the comments below.
This week: a roundup of research on norms — visibility, reinforcement, and intervention
Building on recent newsletters about community stewardship, let’s dive into another subject essential to understanding online groups: norms.
Norms are not just explicit “rules” or community guidelines; they are also implicit principles, formed by members’ own experiences and observations, that guide their understanding of acceptable behavior in a group.
Cultivating shared norms is crucial to the health and success of a community. Highly visible deviant behavior, unpunished norm violations, lack of support for positive behavior, and incompatible private behaviors can all lead to misperceptions about group norms. These misperceptions can in turn cause antisocial behavior, hesitancy to embrace norms, or reluctance to interact at all.
There are all sorts of norms, but we’re especially interested in norms that promote prosocial behavior: acts that generally benefit other people, like volunteering, public service, helping others directly, or civic behavior such as voicing dissent to advocate for beliefs. For online groups, prosocial behaviors include virtual help or generosity, positive interactions, and counteracting toxic behavior.
Let’s explore some of the key factors and concepts related to norms in digital spaces:
Visibility
Somewhat intuitively, the behavior that community members can see for themselves is crucial for norm development and maintenance. Unfortunately, on many platforms, engagement-based ranking promotes and rewards the most engaging content by default, and that content is often the most misleading and antisocial. And if antisocial behavior becomes the most visible content, it has the biggest impact on norms.
There are other issues related to visibility. When stewards intervene and react to antisocial behavior, it can seem important to make a public example, but researchers have shown that often the norm violator is more receptive to private sanctions than public ones. In one study, sanctions on WhatsApp were seen more positively than on Facebook because they were more private.
Highly ephemeral product features, such as Snapchat messages and Instagram Stories that disappear after a day, as well as livestreams, are somewhat less visible. With ephemeral content, there’s less accountability and norms are less persistent: they can evolve more quickly or be almost entirely absent.
Recommendations: Beyond what’s explicitly in the rules, be conscious of which behaviors are rewarded and which are most visible. Which behaviors are prominently promoted and highlighted? Which are ranked highly? Which actions are presented as defaults? Prioritize making the co-creation of norms and governance visible: give members the ability to debate policies and issues that affect everyone, or to offer input as rules are revised over time. The University of Washington’s Social Futures Lab is building tools to support this practice with PolicyKit.
Proactively rewarding or increasing the visibility of prosocial normative behavior is key. It may be worth experimenting with some of these visibility-centric design features:
Explicit signals: Showing rules, rule reminders, nudges, corrective messages or comments, public badges or labels
Implicit signals: Engagement and reactions, upranking, recommendations, prominent actions and default options
Features that hide norm violations: Content removals, demotions and downranking, content screening and warning interstitials
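To make the visibility ideas above concrete, here is a minimal, purely illustrative sketch of a feed-ranking function that blends engagement with a prosocial signal instead of ranking on engagement alone. All names (`Post`, `prosocial_score`, `rank_score`) and the weighting scheme are hypothetical assumptions, not any platform’s actual ranking system; a real prosocial signal would come from something like a helpfulness classifier or community feedback.

```python
from dataclasses import dataclass

@dataclass
class Post:
    engagement: float       # raw clicks/reactions, normalized to 0-1 (hypothetical)
    prosocial_score: float  # estimated helpfulness/positivity, 0-1 (hypothetical)
    violates_norms: bool    # flagged by moderation

def rank_score(post: Post, prosocial_weight: float = 0.6) -> float:
    """Blend engagement with a prosocial signal; hide flagged norm violations
    entirely ("features that hide norm violations")."""
    if post.violates_norms:
        return 0.0
    return (1 - prosocial_weight) * post.engagement \
        + prosocial_weight * post.prosocial_score

posts = [
    Post(engagement=0.9, prosocial_score=0.1, violates_norms=False),   # engaging but antisocial
    Post(engagement=0.5, prosocial_score=0.9, violates_norms=False),   # genuinely helpful
    Post(engagement=0.95, prosocial_score=0.0, violates_norms=True),   # flagged violation
]
ranked = sorted(posts, key=rank_score, reverse=True)
# With prosocial_weight=0.6, the helpful post outranks the engaging-but-antisocial one.
```

The design choice here mirrors the research summary: what a ranking system rewards by default becomes the most visible behavior, so shifting even part of the ranking weight toward prosocial signals changes which norms members observe.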
Reinforcement
It’s not enough to see prosocial behavior; users must also have the opportunity to be seen practicing it. People like to see others helping, and positive reinforcement is key to perpetuating prosocial norms. Community members are more likely to follow norms if they know their behavior is being observed by others who strongly support those norms.
Reinforcement is connected to how online behaviors, both positive and negative, spread throughout the social internet. In our conventional understanding, individual users with many thousands of weak connections serve as a nexus of viral lies and propaganda.
But according to researchers like Damon Centola, it’s more accurate to say that online behaviors require reinforcement through tightly-knit, clustered social networks. Intuitively, you’re more likely to trust something you’ve heard from multiple people who you trust, rather than a very popular stranger. These clustered connections of strong ties constantly reinforce and amplify each other. This helps explain collective group dynamics, viral marketing, and the spread of disinformation.
Recommendations: To perpetuate prosocial behaviors, consider developing features that reward and reinforce prosocial behavior (see the above recommendations). In developing strategies for stopping the spread of viral antisocial behavior, consider intervening at the network level — impacting the structure of how users are connected — rather than by censoring highly-connected individual users. Consider making some punishment interventions private.
Intervention
If there is no corrective punishment for violating a norm, that norm is vulnerable to being transformed or abandoned. Other users, community stewards, platform representatives, and even automated product features have a role to play when norms are tested. On some dominant social platforms, community members have limited options and may only be able to comment or report a problem. But there’s actually a wide menu of intervention types, not all of which are punitive:
Social corrections identify and clarify norms: If someone says something rude, another commenter might reply with an emoji reaction like 🙄 to signal conflict with a norm.
Social sanctions directly punish norm violations: If a user goes too far, they are banned or suspended.
Hiding violations decreases visibility: If there’s an inappropriate comment, stewards might delete it to turn down the temperature.
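The three intervention types above can be sketched as a simple escalation policy. This is an illustrative toy, not a real moderation system: the function name, thresholds, and action labels are all assumptions, chosen to reflect the research findings that private sanctions are often better received than public ones and that education should precede punishment.

```python
def choose_intervention(violation_count: int, severity: str) -> dict:
    """Pick an intervention for a norm violation (hypothetical escalation ladder):
    social correction first, private warnings next, sanctions and hiding last."""
    if severity == "severe":
        # Social sanction, delivered privately (private sanctions are
        # often received more positively than public ones).
        return {"action": "suspend", "visibility": "private"}
    if violation_count == 0:
        # Social correction: clarify the norm for a likely well-intentioned member.
        return {"action": "norm_reminder_reply", "visibility": "public"}
    if violation_count < 3:
        # Private warning before any punitive step.
        return {"action": "warn", "visibility": "private"}
    # Hiding the violation decreases its visibility without banning the member.
    return {"action": "hide_content", "visibility": "hidden"}

# First-time, mild violation: educate rather than punish.
print(choose_intervention(0, "mild"))
```

Note how the first branch for a mild, first-time violation is corrective rather than punitive, matching the recommendation that most violators are well-intentioned members who simply didn’t understand the norm.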
Recommendations: Emphasizing rehabilitation and education over punishment can lead to more prosocial outcomes. Not every norm violator is a terrible person who should be removed. In many cases, possibly a majority, the community member is well-intentioned and got it wrong simply because they didn’t understand the norm. After an appropriate intervention, well-meaning members may successfully avoid violating norms in the future. Having robust and diverse intervention options is key, and all three of the intervention types above have a place in platform design.
Further research
There’s more to norms that we hope to share with you soon. Two pieces of key research in this discussion are Damon Centola’s book How Behavior Spreads: The Science of Complex Contagions and the paper “A conceptual framework for the mutual shaping of platform features, affordances and norms on social media” by Nathalie Van Raemdonck and Jo Pierson.
Here are additional research references for exploring the topic of norms:
Matias: “Preventing harassment and increasing group participation through social norms in 2,190 online science discussions”
Batson: “Four forms of prosocial motivation: Egoism, altruism, collectivism, and principlism.”
Ceylan et al: “Sharing of misinformation is habitual, not just lazy or biased”
Katsaros et al: “Procedural Justice and Self Governance on Twitter: Unpacking the Experience of Rule Breaking on Twitter”
Penner et al: “Prosocial Behavior: Multilevel Perspectives”
Parker: “The Art of Gathering: How We Meet and Why It Matters”
Prosocial Design Network: “Prosocial Design Research Compendium”
Schoenebeck, Blackwell: “Reimagining Social Media Governance: Harm, Accountability, and Repair”
Zhang et al: “PolicyKit: Building Governance in Online Communities”
What subjects should we cover in future research roundups? Tell us in a comment:
The team behind Sublime, previously featured here, is out with a new publication called “Can You Imagine? A Library of Possibilities for Reimagining the Web.” Our own Sam Liebeskind has a piece in it, which you can read here, and you can buy your own copy of the digital or limited-edition print version. Use NewPublic20 for 20% off.
Coming up with a good norms joke about Norm from Cheers,
–Josh
¹ Here’s our guide on how to read and understand this type of research.