Meta’s Oversight Board made just 53 decisions in 2023


The Oversight Board has published its latest report looking at its impact on Meta and its ability to change the policies that govern Facebook and Instagram. The board reports that in 2023 it received 398,597 appeals, the vast majority of which came from Facebook users. However, it took on only a small fraction of those cases, issuing a total of 53 decisions.

Still, the board suggests that the cases it does choose can have a major impact on Meta's users. For example, it credits its work with influencing improvements to Meta's strike system and an "account status" feature that helps users check whether their posts violate any of the company's rules.

Measuring the board's overall influence is more complicated. The group says it sent a total of 266 recommendations to Meta between January 2021 and May 2024. Of these, the company has fully or partially implemented 75 and reported "progress" on 81. The rest were declined, "omitted or reframed," or were claimed by Meta to have been implemented at some level without evidence being provided to the board. (Five recommendations are currently awaiting a response.) Those numbers raise questions about how much change Meta is actually willing to make in response to the board it created.

The Oversight Board's breakdown of how Meta has responded to its recommendations. (Image credit: Oversight Board)

Notably, the report contains no criticism of Meta and no analysis of the company's efforts (or lack thereof) to follow the board's recommendations. For example, the report cites a case in which the board recommended a former prime minister's suspension, noting that within six months the company had reversed its original decision to leave up a video that could have incited violence. But the report does not mention that Meta declined to suspend the former prime minister's account and refused to further clarify its rules for public figures.

The report also points to thorny topics the board may take on in the coming months. It notes that it wants to look into content that is "demoted," or what some Facebook and Instagram users might call "shadowbans" (a characterization Meta has denied, saying its algorithms don't deliberately penalize users for no reason). "One area we're interested in investigating is demoted content, where the platform limits the visibility of a post without telling the user," the Oversight Board writes.

It is not yet clear how the group will approach the issue. The board's powers currently allow it to weigh in on specific pieces of content that Meta removes or leaves up following a user appeal. But the board may find another way into the issue. An Oversight Board spokesperson notes that the group has already raised concerns about demoted content in its decision on content related to the Israel-Hamas war. "This is something the Board would like to look into further, as Meta's decisions regarding demotions are quite opaque," the spokesperson said.


