Content Policy on the Social Web

On Monday, Mark Zuckerberg, CEO of Meta, announced a new content policy for Meta on Threads. We are disappointed in these changes, which put vulnerable people both on and off Meta's platforms at greater risk of harassment. Ideas matter, and history shows that online misinformation and harassment can lead to violence in the real world. There are good analyses of the details of the policy changes at EFF, The Verge, and Platformer.

Meta is one of many ActivityPub implementers and a supporter of the Social Web Foundation. We strongly encourage Meta's executive and content teams to come back in line with best practices for a zero-harm social media ecosystem. Reconsidering this policy change would preserve the crucial distinction between political differences of opinion and dehumanizing harassment. The SWF is available to discuss Meta's content moderation policies and processes to make them more humane and responsible.

Distributed Moderation

What do these changes mean for the Fediverse? Through ActivityPub, Meta’s Threads network is connected to the social web, also called the Fediverse. This is a diverse network of independent social services using different codebases and different kinds of content. The network of 300M users of Threads can follow and be followed by people in tens of thousands of other communities. These services are operated by a variety of entities: corporations, universities, enterprise IT, cooperatives, non-profit organizations, and self-organized volunteers.

Theoretically, this distributed structure allows people to make choices about which platforms they want to use – based not only on technical features, but also on community composition and moderation policies. Users don’t need to give up on social connections they already have with friends and family; they can stay connected across services using ActivityPub. Different communities and services can have different content policies, but people in different communities can still stay connected. 

Ideally, having an account on a Fediverse service gives people the best of both worlds: they can stay connected to users and content they like, and filter out content and users that they don't want. When unwanted content from one community lands in the feeds of people in other communities, the receiving users or their moderators can react under their own local policy: removing individual text or image posts, blocking individual users, or blocking the entire sending community.
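The escalating reactions described above can be sketched as a simple decision function on a receiving server. This is an illustrative sketch only: the activity shape loosely follows ActivityPub's actor/object structure, but the `LocalPolicy` fields, the `Action` enum, and the `moderate` function are hypothetical names, not part of any real Fediverse implementation.

```python
# Hypothetical sketch of receiving-side moderation on a Fediverse server.
# All names here (LocalPolicy, Action, moderate) are illustrative.
from dataclasses import dataclass, field
from enum import Enum, auto

class Action(Enum):
    ACCEPT = auto()       # deliver the post normally
    DROP_POST = auto()    # remove an individual text or image post
    BLOCK_ACTOR = auto()  # block a single remote user
    DEFEDERATE = auto()   # block the entire sending community

@dataclass
class LocalPolicy:
    blocked_servers: set = field(default_factory=set)
    blocked_actors: set = field(default_factory=set)
    banned_terms: set = field(default_factory=set)

def moderate(activity: dict, policy: LocalPolicy) -> Action:
    """Decide how this server handles one incoming activity,
    checking the broadest rule (server block) first."""
    server = activity["actor"].split("/")[2]  # host part of the actor URL
    if server in policy.blocked_servers:
        return Action.DEFEDERATE
    if activity["actor"] in policy.blocked_actors:
        return Action.BLOCK_ACTOR
    content = activity.get("object", {}).get("content", "").lower()
    if any(term in content for term in policy.banned_terms):
        return Action.DROP_POST
    return Action.ACCEPT
```

The key design point the sketch captures is that every receiving server applies its own `LocalPolicy`, which is what gives different communities different content policies while keeping the network connected.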

Practically, though, there are limitations to this flexibility. Filtering on the receiving side requires orders of magnitude more effort than moderating at the source. If a single sending service delivers bad content to users on one hundred or one thousand receiving services, each moderator on the receiving end has to clean up the mess locally. Moderators get understandably frustrated with this kind of displacement of responsibility. A common response is to block servers that send bad content entirely.

In the case of Threads, though, there are complicating factors. Threads is much, much bigger than the typical Fediverse community, and it has many high-profile users in politics, media and technology. It’s also an easy onboarding service to the Fediverse for people who are used to Facebook or Instagram, meaning many of our friends, colleagues and family use it. Blocking the Threads service means blocking access for all users on the receiving service from all these important accounts.

Unfortunately, there’s not an easy answer for Fediverse moderators. We encourage trust and safety teams across the social web to use their best judgement and the tools available to keep users safe, connected, and informed, and also to minimize moderators’ stress and burnout. IFTAS Connect is a great community resource for connecting with other moderators to discuss these tradeoffs.

Improving Social Web Resilience

We see the challenge of a large service with poor local content policy as a chance to strengthen the social and technical infrastructure of the Fediverse. No single improvement will resolve the current problems immediately, but we hope that starting this work now will make the Fediverse more resilient in the future.

Ultimately, the safety and well-being of people around the world should not be in the hands of any single company. Moderation policies are a competitive advantage in an open social network. We continue to encourage the use of ActivityPub, and the distributed control that it brings.
