🗣️🗯️ Real conversations about tech, trust, and community
Where do you stand? Join complex debates on fact-checking, local connection, and platform control
This is a special, extra newsletter to highlight some great conversations. We’ve been asking questions on tools we’ve been testing out recently. Please feel free to dive in and create a new comment, or respond to someone else’s, to spark more discourse and try out these features. Thanks!
–Josh Kramer, Head of Editorial, New_ Public
Should social media platforms be responsible for fact-checking?
After Mark Zuckerberg announced Meta would no longer be using fact-checking on its platforms, I wrote at length about social media fact-checking and moderation.
In our conversation, “It’s complicated” emerged as a popular choice. Professor Jonathan Stray, whom I quoted in the newsletter, weighed in to argue that fact-checking requires credibility and trust, which Meta lacks. He also commented on Community Notes, the feature from Twitter/X that Meta is now working to incorporate:
Platforms can and absolutely should take steps to curb harm caused by false information. But they can't do so effectively without credibility and trust. Therefore the core of any effective program to combat misinformation is whether it can generate broad public legitimacy. Rightly or wrongly, the previous regime of content moderation and professional fact checking (two different programs that are often confused) was not trusted by the conservative half of the population. …
Crowdsourced systems like community notes are an interesting step in this direction, as, empirically, they are more trusted by both liberals and conservatives than expert fact checks. It's noteworthy that X makes all the code and data (crowdsourced labels and votes) available [publicly], so external parties can audit it for performance and bias.
Meanwhile, Matthew Starr, a tech policy expert and my friend IRL, reminds us that this may be a difficult job because of human nature:
Determining the lines of what is and is not "particularly harmful misinformation" is a complicated exercise, and ultimately there are (likely biased) humans involved in the process of making that call. Setting clear guidelines for how exactly this should operate is largely impossible.
Meta’s new policy went into effect on Facebook, Instagram, and Threads this week. What do you think?
How can the internet strengthen local communities?
I interviewed Richard Young, Founder & Executive Director of the Lexington civic health nonprofit CivicLex, and was struck by what he had to say about the limitations of strengthening local connection online. In part, he thinks that most digital communication is too dehumanizing to build community, and that social media spaces have significantly degraded since the pandemic.
We wanted to see what different kinds of people would think, and readers with different perspectives weighed in here. What do you think?
Laurel T, choosing the perspective of “Local organizer,” offered this:
When the LA fires were happening, there was so much unknown about when to evacuate, what to pack in a go bag, where to go, etc. But I was able to find that information on social media from friends in LA reposting from reputable sources. I learned about the invaluable Watch Duty app (which tracks fires and evacuation zones) because someone posted about it on Instagram. And when mutual aid efforts were happening, I learned where I could volunteer and drop off supplies from friends posting where they went on Instagram. If I didn’t have the internet and social media, and only consumed information about the fire from the news, I think living through the fires would’ve been a lot more confusing and lonely.
Should governments control social media?
In writing back and forth, Elle Griffin and I found we had different opinions about the government's role in social media. I came at it from a qualified and optimistic “yes,” and Elle took the opposite view, but we landed in a productive place with a couple of shared beliefs and ideas.
We opened up the conversation to readers of both our newsletters and found a wide range of thoughtful, sophisticated opinions. Aarjav Chauhan chose “It’s complicated” and invoked one of our favorite thinkers, the late Nobel-winning economist Elinor Ostrom:
I believe that social media platforms should be stewarded and self-governed by concerned and active community members. But, as we have seen [with] Twitter and Facebook, governing these online spaces is quite impossible without any form of interaction or influence by larger State governing bodies.
In thinking through this question, I think it might be useful to call [on] some of Ostrom's principles on the successful governance of the commons, in particular nested tiers of governance. If we are to redesign social media platforms as digital commons, then a useful step might be to envision ways by which small-scale and decentralized online social media spaces can democratically engage with the State and the private sector.
Some, like Josh Bruce, answered in the negative: “Unequivocally no, if ‘control’ means directing content on social media.” Whereas others, like Yvonne Danyluck, took the opposite stance: “Someone needs to step in, at least at first, to re-jig the balance of power. But if it's government, it needs to be an informed, people-first mandate.” Where do you land?
We are still seeking an Open Source Developer for Public Spaces Incubator and a Head of Engineering for Local Lab, and we have just started looking for a Social Media Fellow.
Announcing: we are convening a virtual event on the new ways people get their information locally. Please join us for “Hyperlocal Heroes: Building Community Knowledge in the Digital Age” on April 25, 2pm ET / 11am PT.
Really feeling the “April showers” this year,
–Josh