🪤 The internet was supposed to be democratic. Why isn’t it?
Media theorist Nathan Schneider’s new book on the implicit feudalism of digital communities
We’re hiring a Senior Software Engineer for our Public Spaces Incubator! Learn more and apply here.
We’re seeking admins or mods of online groups for local neighborhoods, towns, and cities. Think: Facebook groups, Subreddits, email lists, and forums. We’re looking to chat with people in the US who do this work, in exchange for a small stipend. Get in touch here.
A few years ago, Nathan Schneider found himself running a large online community — an email list of more than 500 people. Because he had started the group, the software put him in charge by default. He realized that, when someone was being a jerk, there were no policies in place for what to do. Not only was it entirely up to him, he was completely unaccountable — there were no tools for group members to challenge his authority or weigh in on decisions.
And around the same time, Nathan’s mother had just been elected president of her garden club, which had a 38-page manual, including bylaws. In his new book, Governable Spaces: Democratic Design for Online Life, Nathan writes:
Few online groups I had been part of could hold a candle to the simple and effective set of rules that had governed the garden club since the 1960s, rules unremarkable among countless similar organizations with a vast range of purposes. Few online groups will last so long.
In addition to his part-time volunteer role as email overlord, Nathan Schneider is a media theorist, founder of the Media Economies Design Lab at the University of Colorado Boulder, and a long-time friend of New_ Public.
Core to Governable Spaces is the idea of “implicit feudalism” on social media — we’ve all become subservient vassals of CEOs like Mark Zuckerberg. We recently explored this topic with Amy Zhang. Below, Nathan charts how, as Humphrey Obuobi described in the last newsletter, some of the earliest digital social spaces were BBSes, which necessarily had to be local and relied deeply on a “sysop,” or system operator. But over time, as the internet commercialized and social platforms arose, this dynamic of centralized control remained, greatly to the benefit of the tech companies.
In this excerpt, Nathan traces this progression of implicit feudalism and interrogates some of the mechanisms and nuances. But as the book goes on to argue, the dangers reach far beyond just the tech platforms. Nathan suggests that the lack of democratic habits in everyday online life is fueling the rise of authoritarianism in global politics.
The book couldn’t be more relevant to this moment: In advance of Reddit going public, the company actually offered shares to longtime moderators and users. It is a small step toward a goal Nathan has proposed for startups: “exit to community.” And while the US Senate considers what to do with the TikTok legislation passed by the House of Representatives, Nathan has suggested that this issue is also an opportunity to explore more democratic forms of ownership.
Though it’s complex, I love the richness of Nathan’s writing. Take yourself to school on platform governance by checking out some of the links below. Beyond this section, Governable Spaces has case studies and suggestions for new ways of organizing digital communities. And in the spirit of open-source resources, the ebook is available to download for free. Nathan begins this section by describing what might be considered the beginning of this era of social media: the 2016 presidential election.
–Josh Kramer, New_ Public Head of Editorial
The Rise of Platforms
Nathan Schneider
An excerpt from his new book, Governable Spaces
As Facebook’s public relations apparatus was beginning to come to terms with the platform’s contested role in the 2016 US election, Mark Zuckerberg issued a lengthy essay called “Building Global Community.” In it, he indicated a turn toward emphasizing “meaningful groups” over the user-curated political news that was making Facebook notorious. Recognizing the limits of the company’s regulatory capacity, he mused about the opportunity to “explore examples of how community governance might work at scale.” The essay contains various nods to US political pieties, including a quotation from Abraham Lincoln; at the time, some observers speculated that Zuckerberg might be considering a run for the presidency.
At least from a technical perspective, the rise of globe-spanning corporate networks presented an opportunity for departing from implicit feudalism. No longer was a community’s virtual space sitting in somebody’s house or on a university server; now, the infrastructure was in the hands of companies that described their product as “platforms.” The term bears a claim to neutrality, to simply providing an empty stage for users to fill. Seemingly, the platforms created a new layer of abstraction: compared to earlier systems, communities form at a greater remove from the servers. In 1996, the US Congress passed the Communications Decency Act, whose Section 230 protected platforms from most liability for user behavior. The companies could control the platform layer, while enabling communities to govern however they liked. Yet implicit feudalism persisted, even as platform founders preached democracy.
Facebook is the world’s largest private social-media network, with around 3 billion active users. It has enabled communities to form with its Groups feature since 2005, the year after the website first appeared. Reddit also began in 2005, and by 2008 the social-news platform came to be organized around user-created and user-governed groups known as “subreddits.” Reddit’s active-user population is an order of magnitude smaller than that of Facebook, which still places it among the top ten US networks. In many respects, the two platforms are quite different; Facebook emphasizes users’ “real names” and mutual connections, while Reddit tends to rely on individualized, pseudonymous identities marked with reputation-based “karma.” Both enable significant degrees of local control among user communities, in distinct ways. They have become spaces of tremendous creativity and democratic practice. Nevertheless, both adopt and further advance the pattern of implicit feudalism inherited from earlier networks like BBSes and email lists, despite lacking many of their predecessors’ technological constraints.
Why do feudal defaults persist on large platforms? A Facebook Group doesn’t reside in its creator’s house. A subreddit doesn’t consume the computing resources of its moderators, only that of Reddit itself. It is no longer so obvious that the founder of a community should have dictatorial say over it. The norms and design elements of implicit feudalism are no longer a matter of technical necessity. But they became a business model.
Managing online communities can be hard, thankless work, involving negotiations with an often tiny minority of disruptive users and reviewing potentially traumatic content so that others don’t have to. One of the first large commercial platforms, America Online, began appointing “community leaders” in the early 1990s to moderate its chat rooms and message boards in exchange for reduced cost of access, providing compensation for what was generally perceived as volunteering. But some of these people recognized that their efforts were generating real profits for the company and began to protest; the program drew scrutiny from the Department of Labor as under-compensated work. Since then, platforms have avoided such gray-area compensation. Instead, the allure of implicit feudalism has served as another kind of compensation to incentivize the labor of community management. Rather than criminally low wages, platforms offer moderators the perk of unchecked power.1
An exception that proves the rule among social platforms is Slashdot, an early social-news website with a tech-savvy user base. As Slashdot grew during the late 1990s, it developed a complex system of moderation (and “metamoderation”) based on a “karma” score — the term Reddit would later adopt. As users accrued karma from other users, they gained the power to moderate and evaluate others’ moderation decisions, producing a basically functional, Wikipedia-like culture of responsible voluntarism. Reputation became a kind of compensation. Slashdot thus employed a fluid system of mutual endorsement rather than a Debian-style electoral republic, but it similarly showed that an open, dynamic system of user empowerment could manage the content on a large platform in ways that generally satisfied its users. Perhaps such a model was even too responsible, failing to produce the kind of provocation and engagement that commercial social networks thrive on.
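To make that mechanism concrete, here is a minimal sketch of karma-gated moderation with metamoderation feedback. The thresholds, names, and data structures are hypothetical illustrations, not Slashdot’s actual implementation.

```python
# Illustrative sketch only: users whose community-earned karma crosses a
# threshold receive a small budget of moderation points, and other users
# later rate whether each moderation was fair, feeding back into karma.
from dataclasses import dataclass

KARMA_THRESHOLD = 25   # hypothetical cutoff for moderation eligibility
MOD_POINTS = 5         # hypothetical budget granted per cycle

@dataclass
class User:
    name: str
    karma: int = 0
    mod_points: int = 0

def grant_moderation_points(users):
    """Give a moderation budget to users with enough community-earned karma."""
    for u in users:
        if u.karma >= KARMA_THRESHOLD:
            u.mod_points = MOD_POINTS

def metamoderate(moderation_log, verdicts):
    """Rate past moderations as fair or unfair, adjusting moderators' karma."""
    for (moderator, post_id), fair in zip(moderation_log, verdicts):
        moderator.karma += 1 if fair else -1
```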
One mechanism of apparent self-governance that appears in both Facebook and Reddit is the ability for non-moderator users to evaluate fellow users’ posts — on Facebook with the Like button and its various affective sub-options and on Reddit with “upvotes” and “downvotes.” These tools allow users to mutually decide which content is more worth each other’s attention and thus which should rise to the top of the group’s feed. The platforms also allow users to add comments, which have amplifying effects as well. But the most definitive powers of amplification (elevating messages to the top of a group’s feed) and sanction (ejecting posts and users) are reserved for those with administrative roles, who gain their authority by appointment and succession deriving from the group’s founder. Interviews with admins on both platforms reveal that they rarely consult with non-admins on decisions about how to use these powers. Ordinary users’ evaluative tools thus seem to operate as assists on behalf of admins, as well as the companies’ business interests, more than as a means of shared governance. The strongest form of effective voice for ordinary users remains that of exit: to leave a given Facebook Group or subreddit for another or to start a new one.
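A rough sketch of that asymmetry, using hypothetical names rather than either platform’s actual API: any member can vote, which only reorders the feed, while removal and ejection stay with the admins.

```python
# Hypothetical model of the power asymmetry described above: voting is open
# to all members and only affects ranking; sanction powers are admin-only.
class Group:
    def __init__(self, admins):
        self.admins = set(admins)
        self.members = set(admins)
        self.posts = {}  # post_id -> vote score

    def vote(self, user, post_id, delta):
        """Any member may amplify (+1) or demote (-1) a post's rank."""
        if user in self.members:
            self.posts[post_id] = self.posts.get(post_id, 0) + delta

    def remove_post(self, user, post_id):
        """Only admins can eject content outright."""
        if user in self.admins:
            self.posts.pop(post_id, None)

    def ban(self, user, member):
        """Only admins can eject members; everyone else's recourse is exit."""
        if user in self.admins:
            self.members.discard(member)
```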
Facebook and Reddit advance implicit feudalism beyond earlier paradigms. For instance, rather than merely offering blank text fields for rule-making, as in MediaWiki and GitHub, these platforms have developed structured rule-making interfaces for group admins. Artificial intelligence tools, such as Facebook’s “false news” detector and Reddit’s programmable AutoModerator, offer to streamline the labor of moderating content. Analytics dashboards present admins with detailed reports on the activity of their groups, in effect gamifying the admin role toward maximizing user activity. Such tools add to the panopticism and potency of implicit feudalism’s repertoire.
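For a sense of what such automation does, here is a hypothetical keyword-rule moderator in the spirit of tools like AutoModerator. It is not Reddit’s real rule syntax, just an illustration of admin-authored rules acting without input from ordinary members.

```python
# Hypothetical rule-based auto-moderation pass (not AutoModerator's syntax):
# admins author pattern/action pairs, and posts are checked automatically.
import re

RULES = [
    {"pattern": r"(?i)\bfree crypto\b", "action": "remove"},
    {"pattern": r"(?i)\bgiveaway\b", "action": "flag_for_review"},
]

def apply_rules(post_text):
    """Return the first matching action, or None if the post passes."""
    for rule in RULES:
        if re.search(rule["pattern"], post_text):
            return rule["action"]
    return None

print(apply_rules("FREE CRYPTO inside!"))  # -> "remove"
```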
Feudal community governance has become a norm in how platform companies themselves are governed. This is most evident in the power Mark Zuckerberg retains over Facebook through its dual-class stock structure. To extend the metaphor of feudalism: if admins are ladies and lords, Zuckerberg acts as a monarch, who holds similarly absolutist powers over the rules by which his nobles operate, even without appearing to interfere in their fiefdoms directly. Zuckerberg also rebuffs shareholder proposals to put constraints on his authority. Yet Facebook has meanwhile engaged in “democracy theatre,” such as its 2009 user referendum on proposed changes to its terms of service. For users’ votes to be binding, the company stipulated that 30 percent of its over 1 billion users at the time would need to participate — a scale equivalent to the entire US population. As one might expect for an unprecedented process on a decision about complex legal language, turnout reached well under a single percentage point of that quorum. The company shrugged, called the vote “advisory,” and proceeded with the rule change as it saw fit.
Reddit’s corporate edifice has had its own brushes with a kind of democracy, such as in the 2015 “Reddit revolt,” when moderators galvanized by crackdowns on toxic behavior during the Gamergate controversy turned on the company. They switched their subreddits to private en masse, resulting in a widespread blackout of the platform’s content and the resignation of interim CEO Ellen Pao. With the victory, however, came heightened enforcement of site-wide policies that brought about more conformity between the platform’s policies and moderator policies at the subreddit level. The moderators can lord over their fiefdoms, but they face consequences if they try to band together against the monarchy.
Conway’s Law is a celebrated truism in software development: technical systems tend to resemble the communication structures of the organizations that create them. Among companies like Facebook and Reddit, the influence has seemed to go the other way. The communication structures of technical systems informed what seemed plausible and practical for the architecture of corporations. Implicit feudalism made its way from the server permissions and the online community to the boardroom.
The centrality of implicit feudalism to online experience has at times wavered, only to return again. In a follow-up missive to “Building Global Community,” Zuckerberg pivoted from a vision of Facebook as a community-oriented “meaningful” space to that of a “privacy-focused” platform for private chat and “intimate” group exchanges. It was a retreat from his aspirations two years earlier for “global community.” Already, Facebook-acquired platforms WhatsApp and Instagram were making gains against the company’s namesake product. The photo-sharing app Instagram did not initially enable persistent groups; WhatsApp permits them within the logic of chat, as opposed to Facebook’s forum-like threaded discussions. Zuckerberg appeared to be learning from China-based WeChat and TikTok in enshrining networked individuals rather than a network of communities as the rubric for platform society. TikTok in particular has shown the possibility of targeted advertising based on personal viewing habits alone, without need for a social graph. This shift trades feudalism—which presumes community, however hierarchical—for platform-mediated experiences, apparently detached from any particular kind of politics. But politics seemed likely to return with Zuckerberg’s next pivot in renaming the company as Meta, proposing to provide the infrastructure for entire immersive worlds. Meanwhile, ascendant community platforms such as Slack and Discord explicitly imitate the social software that gave rise to implicit feudalism—down to the “#” marking channel names following IRC and Discord’s “server” nomenclature for its virtual groups. As corporate teams and mutual-aid activists alike adopt these tools as the basis of their organizing, feudal designs continue to grow in influence.
Thanks Nathan!
The Plurality Institute and The Council for Tech and Social Cohesion are hosting a symposium on comment section R&D at the Internet Archive in San Francisco on May 2.
The Young Futures Academy is seeking to fund early-stage organizations and solutions focused on fostering meaningful social connection, teen belonging, and wellbeing in a tech-driven world. Applications due April 5.
Hoping that this year, “April showers bring May flowers” is more metaphorical than literal,
Josh
1. Wang et al., “Governing for Free,” presents a nuanced exploration of how different governance regimes generate motivating psychological effects for moderators, based on a survey of Reddit users. Top-down governance models provide some kinds of psychological benefits, particularly in large communities, while more democratic models can provide others.