Meta’s Oversight Board takes its first Threads case

Meta’s Oversight Board has now extended its scope to include the company’s newest platform, Instagram Threads. Designed as an independent appeals body that hears cases and makes precedent-setting content moderation decisions, the board has to date ruled on matters including Facebook’s ban of Donald Trump, Covid-19 misinformation, and the removal of breast cancer photos.

Now the board has begun hearing cases emerging from Threads, Meta’s Twitter/X competitor.

This is an important point of differentiation between Threads and rivals like X, where Elon Musk and other users rely heavily on crowdsourced fact-checks from Community Notes to supplement the platform’s otherwise light moderation. It’s also very different from how decentralized networks like Mastodon and Bluesky handle moderation duties on their platforms. Decentralization allows community members to establish their own servers with their own moderation rules, as well as the option to defederate from other servers whose content runs afoul of their guidelines.

The startup Bluesky is also investing in stackable moderation, meaning community members can create and run their own moderation services, which can be combined with others to create a customized experience for each individual user.

Meta’s move to hand difficult decisions to an independent board that could overrule the company and its CEO Mark Zuckerberg was meant to be the solution to the problem of Meta’s centralized authority and control over content moderation. But as these startups have shown, there are other ways to do this that give users more control over what they see, without stepping on the rights of others to do the same.

Nevertheless, the Oversight Board on Thursday announced it would hear its first case from Threads.

The case involves a user’s reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party’s alleged underreporting of fundraising revenues. The post also included a caption criticizing him for tax evasion and used derogatory language, including a term for someone who wears glasses, as well as the phrase “drop dead” and hashtags calling for death. Because of the “drop dead” phrase and those hashtags, a human reviewer at Meta decided the post violated the company’s Violence and Incitement rule, despite it reading much like a run-of-the-mill X post these days. After their appeal was denied a second time, the user appealed to the Board.

The Board says it selected this case to examine Meta’s content moderation policies and enforcement practices around political content on Threads. That’s a timely move, considering not only that it’s an election year, but also that Meta has declared it would not proactively recommend political content on Instagram or Threads.

The Board’s case will be the first involving Threads, but it won’t be the last. The organization is already preparing to announce another bundle of cases tomorrow focused on criminal allegations based on nationality. These latter cases were referred to the Board by Meta, but the Board will also receive and weigh in on appeals from Threads users, as it did with the case concerning Prime Minister Kishida.

The decisions the Board renders will influence how far Threads goes to uphold users’ ability to express themselves freely, and whether it will moderate content more closely than Twitter/X does. That will ultimately help shape public opinion about the platforms and influence users to choose one or the other, or perhaps a startup experimenting with new ways to moderate content in a more personalized fashion.
