The Oversight Board weighs in on Meta’s most-moderated word


The Oversight Board is calling on Meta to change the way it handles “shahid,” an Arabic term that has caused more content removals than any other word or phrase on the company’s platforms. Meta asked the board to help it develop new rules after its own attempt to update the policy internally stalled.

The Arabic word “shahid” is often translated as “martyr,” although the board notes that this translation is imprecise and that the word has “multiple meanings.” Meta’s current rules, however, rely solely on the “martyr” definition, which the company says implies praise. That has led to a “blanket ban” on the word when it is used in connection with people the company has designated as “dangerous individuals.”

However, the policy ignores the “linguistic complexity” of the word, which, “even when referring to dangerous individuals, is often used in reporting and neutral commentary, in academic discussion, in human rights debates and in yet more passive ways,” the Oversight Board says. “There is strong reason to believe that the multiple meanings of ‘shahid’ have resulted in the removal of a substantial amount of material not intended to glorify terrorists or their violent acts.”

In its recommendations to Meta, the Oversight Board says the company should end its “blanket ban” on using the word to refer to “dangerous individuals,” and should remove posts only when there are other clear “signals of violence” or when the content violates other policies. The board also wants Meta to better explain how it uses automated systems to enforce these rules.

If Meta accepts the Oversight Board’s recommendations, the change could have a significant impact on the platform’s Arabic-speaking users. The board notes that because the word is so common, it “results in more content removals than any other word or phrase under the Community Standards” across the company’s apps.

“Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all,” board co-chair and former Danish prime minister Helle Thorning-Schmidt said in a statement. “The board is especially concerned that Meta’s approach impacts journalism and civic discourse, because media organizations and commentators may shy away from reporting on designated entities to avoid having their content removed.”

This is hardly the first time Meta has been criticized for moderation policies that disproportionately affect Arabic-speaking users. A report commissioned by the company found that Meta’s moderators were less accurate when assessing Palestinian Arabic, resulting in “false strikes” against users’ accounts. The company apologized last year after Instagram’s automated translations began inserting the word “terrorist” into the profiles of some Palestinian users.

The opinion is also another example of how long it can take for the Oversight Board to influence the social network’s policies. The company first asked the board to weigh in on the rules more than a year ago (the board says it “suspended” publication of its opinion to ensure the guidance would “stand up” to the “extreme stress” of the October 7 attacks in Israel and the war in Gaza). Meta now has two months to respond to the recommendations, although actual changes to the company’s policies and practices could take additional weeks or months to implement.

“We want people to be able to use our platforms to share their ideas and have a set of policies in place to help them do so safely,” a Meta spokesperson said. “We aim to apply these policies fairly, but doing so at scale brings global challenges, which is why in February 2023 we sought guidance from the Oversight Board on how we treat the word ‘shahid’ when referring to designated individuals or organizations. We will review the board’s feedback and respond within 60 days.”


