Meta's potential sweeping moderation changes

Meta’s Oversight Board, an independent body that judges high-stakes content moderation cases, just made a decision that could change how Meta moderates across its platforms.  

Catch-up: Back in January, Meta removed a music video clip by UK drill artist Chinx (OS) from Instagram at the request of London's Metropolitan Police, who claimed it contained a “veiled threat” and could increase the risk of retaliatory gang violence.

  • Drill, a rap subgenre noted for its particularly aggressive nature, has stirred moral panic in the UK and US for its alleged (and highly disputed) ties to gang violence.

What happened: The Oversight Board ruled that the clip should not have been taken down, citing insufficient evidence of a real threat from the Metropolitan Police and saying Meta should have given more weight to artistic expression.

  • The board recommended that Meta create a public log of content removal requests, a system that would hold government and law enforcement officials to higher standards.
     
  • It also recommended that Meta allow users to appeal such content removal decisions directly to the board, and that it clarify its guidelines by specifying what counts as a “veiled threat.”

Why it matters: If Meta accepts the measures, they could “transform how the tech giant… interacts with police and governments around the world,” per the FT, and, if the board is correct, reduce bias and discrimination in content removal requests from state actors.