🌄 MetaFilter’s rule-laden mini-utopia
What a high-quality social platform looks like
📏 What does “success” mean for a platform? Growth? Or something else?
👨🏽‍⚖️ How a deeply moderated site actually works: “not all posts pass muster”
🔗 Prizing connection, community and emotional intelligence above traffic
As we examined last week, social platforms come in all sorts of shapes and sizes. But one of the oldest and most replicated formats for socializing online is a forum: a place to share links and talk about them. Reddit is a notable, massive example, and there are many others, some of which are notoriously toxic breeding grounds for the very worst behavior online.
This week, we take a closer look at MetaFilter, a very old online forum that’s an exceedingly pleasant place to spend time online, with really high-quality content. For example, let’s say you have Covid, and want some advice on TV shows to stream, but you can’t really handle anything too intense. You could try Googling it, but you’re probably going to get better answers from other human beings. This kind of query is where a high-quality forum like MetaFilter really delivers. It has a ton of amazing features I’d be shocked (and thrilled!) to see larger platforms adopt, from the card club where strangers all over the world mail each other cards, to the one-click “Hide US politics posts” button.
What separates a forum like MetaFilter from Quora, Reddit, or even 8chan? The answer is culture — rules and expectations, developed over a long time. To be clear: MetaFilter isn’t good because it has a lot of old rules, it’s good because it has the right old rules. Below, I lay out some unique, interesting qualities that have developed at MetaFilter over the years and how they’ve contributed to a culture that is still thriving after two decades.
Slow, purposeful barriers to entry
Today, there’s an expectation that if you want to join a social media network, the experience should be “frictionless” — you should be able to start as easily and seamlessly as possible. The quicker a platform can get you comfortable and interested (and typically, at least casually addicted), the better. MetaFilter, which comes from an era of image-less forums, where you can’t even reply to a post in-line, takes a different approach. Anyone can read the site, but to post you must pay $5 and wait one week. As they explain on their new user page, these rules, and other waiting periods, are partly to combat spam and bots. But primarily, they want users to understand what MetaFilter is, how it works, and what’s expected of users before they jump in completely.
It may take a little time and effort to get into MetaFilter, but every decision about how the site works is available for any visitor to read on the MetaTalk subpage. This goes far beyond basic policies and terms of service. For example, the site is currently moving to a new moderation model, a process being chronicled by the users themselves.
The owner since 2017 has been Josh Millard, who goes by cortex on MetaFilter. Millard is now ceding control of the site to a transition team of longtime users who will decide how to proceed. In a recent candid post, Millard opened up about what it takes to run the site. “I've been especially aware of both the toll the job has been taking on me and the degree to which my burnout and mental health challenges have been preventing me from being as effective a manager, moderator, and business administrator as I want MetaFilter to have,” he wrote.
Here, the gulf between MetaFilter and the largest social platforms gets even wider. If Elon Musk’s stated goal of a Twitter with no restrictions on speech is one end of the spectrum, then MetaFilter might be the other end. It’s important to know that the amount of new content each day is extremely low (maybe 10 posts) and the site has a deep catalog of posts that mods do not want to be repeated. The FAQ could not be more clear: “MetaFilter is a moderated site and not all posts pass muster.”
Beyond the basic platform no-no’s (harmful, illegal, and/or low-quality content), MetaFilter mods will actually remove posts where they believe you are “axegrinding” (“you posted on a hot-button topic that you frequently post about and/or used heavy-handed editorializing language”) or doing a “stunt post” (“you were doing something cutesy or pointed with your post that was making some sort of statement, not linking to something neat on the web”). This is not a platform where anything goes.
“We've had a reputation from early on for having hands-on, engaged moderation. That shouldn't be remarkable!” Millard told me over email. He continues:
Paying people to do moderation costs money; it's most of MetaFilter's budget. It's important, and it's always been a priority, and it means we haven't seen the kind of race-to-the-bottom abuse nightmares and ad/cookie/etc design hellscape you get when you abdicate ethical responsibilities in exchange for profit.
MetaFilter’s guidelines are uniformly kind, even when warning you that running afoul of the rules may get your post deleted: “Please note: post deletions are a judgement of the post, not the poster.” The mods who care for the site are really invested in how users act, and how that behavior ultimately shapes the site:
Read a thread before commenting. Engage with what people are really saying. Respond appropriately to people's mood and investment in a topic. Refrain from making light jokes in a serious discussion.
In other words, read the room! I’ve never seen any platform guidelines that emphasize empathy and fairness as much as these.
This particular aspect of MetaFilter now seems so alien to how newer platforms operate that I think it’s worth homing in on: MetaFilter is a space of cooperation and collaboration, and it has firm rules against trying to shill for yourself. The new user page reads, “Metafilter is not a site for self-promotion, SEO, PR, marketing, or advertising,” and the guidelines say, “It is never okay to use MetaFilter as a promotional tool. Transparency and honesty are important to the community and we rely on users to abide by the guidelines and participate honestly.” For me, the closest recognizable analogue to this is probably Wikipedia’s rule that you can’t edit your own Wikipedia page.
Connection over engagement
Generally, when we talk about why social media platforms are “successful,” we’re assessing their size, prominence, and profits. But it’s probably clear by now that unlike most platforms, growth and engagement are not MetaFilter’s north star.
In 2014, then-owner Matt Haughey chronicled MetaFilter’s initial rise and decline in a Medium post, and said about 12,000-15,000 members were active on the site each day, with about 80 million readers annually. At that point, most of the site’s revenue came from ads; in the years since, the site has switched to a primarily membership- and donation-based model. Current membership numbers are difficult to find, so I asked Millard about the current state of MetaFilter:
The raw level of user activity on MeFi has fallen off considerably in the last ten years; I'd say daily user activity is more in the realm of say 3,000 logged-in folks dropping by vs. the 12K-15K Matt talked about in that blog post. It's measurably quieter and lower volume now, for worse and better, but remains active enough to feel like a fairly busy, lived-in space and we've put greater emphasis over the last several years toward prioritizing community care and justice within the site, which I think has led to it being a healthier and more welcoming space even as the raw numbers have shrunk. Trading a degree of excitable chaos for some somewhat more stable community intimacy as the web has changed around us.
So, even if the site is a lot smaller than it used to be, with fewer resources, I think it’s still a pretty remarkable place. MetaFilter, now over twenty years old, is possibly the nicest and kindest general-purpose message forum for strangers to interact with each other online.
There are so many ways to measure success of a platform besides growth, and we have a few of our own, but real-life, time-tested examples of digital public spaces that are actually flourishing are few and far between. I think MetaFilter is unreservedly one of them.
Community Cork Board
Earlier this month, we got a really interesting email from Joe Kosmin:
I own a small social media community that has built a site from scratch, mostly for people in the Houston Texas area. A lot of the principles sound great, many in line with my own thoughts. Recently my group struggled and suffered tremendously with hostilities surrounding Covid and the election.
Info security and keeping people and their info safe has been key to our group. Encourage humanization is a good way to crystalize what I’ve been working toward, but haven’t been able to clearly define until now.
Although we’ve done a lot to encourage a good community, I feel like we’re still failing.
Given that our site is coded from scratch and offers a lot of flexibility in trying new things, and we are small and fairly agile, I would definitely like to experiment with different ideas to improve the quality of our discussions/interactions.
I know there are others like me. Is there any group of social media platform owners who are interested in this? It might be really useful to have such a group, to collaborate and discuss the nitty gritty practical things that work or don’t work.
We think so too! It’s great to hear that our Signals research lines up with Joe’s real-world experience. I suggested to Joe that we run his letter here in the newsletter, because maybe some readers would have suggestions. Don’t let Joe down! If you have any ideas, or want to get in touch with Joe directly, you can email firstname.lastname@example.org with “FOR JOE” in the subject line and I’ll make sure he gets it. Also, send us a letter of your own and maybe we’ll publish it here!
Events reminder: Joi Rae, New_ Public Head of Operations and Partnerships, will be on the “Building Positive Digital Spaces” panel at the Responsible Tech Summit on May 20. On May 26, she’ll be at Digital Void Festival: Brooklyn.
Panel: The Institute for Rebooting Social Media at the Berkman Klein Center is hosting a panel with Joanne Cheung, who will discuss her research and open access article Real Estate Politik: Democracy and the Financialization of Social Networks on a panel next Tuesday, May 17 at 3pm EST. RSVP by Monday.
New job postings: In addition to our four internships and our COO, Head of Product, and Head of Community jobs, we’ve added two new opportunities. We are looking to hire a lead Researcher to support the ethnographic research required to build products that serve people and communities’ real needs. We’re also seeking a Chief of Staff to join our Co-directors and develop and manage the systems that underpin their leadership of New_ Public, and work alongside them in building and deepening partnerships. Apply here.
Screenshots via MetaFilter.
New_ Public is a partnership between the Center for Media Engagement at the University of Texas, Austin, and the National Conference on Citizenship, and was incubated by New America.