Facebook and Instagram’s algorithms facilitated child sexual harassment, state lawsuit claims


Last December, the state of New Mexico sued Meta for failing to protect children, alleging that Facebook and Instagram's algorithms recommend sexually explicit content to minors. Now an unredacted internal Meta presentation has surfaced in which the company's own employees estimated that 100,000 child users are sexually harassed every day, The Wall Street Journal reported.

According to a 2021 internal document, Facebook's "People You May Know" (PYMK) algorithm was singled out as a primary connector of children to predators. When staff reported these findings to Meta executives, the executives reportedly rejected recommendations to redesign the algorithm so it would stop recommending adults to minors.

According to one worker, the feature was responsible for 75 percent of all inappropriate contact with minors. "How have we not just turned off PYMK between adults and children?" one employee asked. "It's really, really sad," said another.

According to a 2020 internal memo, the problem was particularly acute on Instagram, where "sex talk" was 38 times more prevalent than on Facebook Messenger in the US. In one case, an Apple executive reported that his 12-year-old son had been solicited on Instagram. "This is the kind of thing that angers Apple to the point of threatening to remove us from the App Store," said an employee involved in addressing the issue.

New Mexico alleges that Meta failed to address large-scale predation on its platforms, particularly around its recommendation algorithms. State investigators created test accounts posing as children, often entering adult birth dates, since children commonly misstate their ages to access online services they shouldn't. The investigators then signaled that the accounts belonged to children, for example by mentioning a missing baby tooth or being in seventh grade. According to the lawsuit, the accounts were sent child sexual imagery and offers to pay for sex, among other things.

The state further alleges that Meta leaders took no action to limit adult predation on minors until late 2022, and have still not taken stronger measures recommended by safety staff. Instead, the company restricted recommendations only for adults who had previously shown questionable behavior toward children. Yet according to a Meta study, 99 percent of accounts disabled for grooming had never declared their age.

Meta recently rolled out new protections for teenage users on Instagram and Facebook, including blocking messages from users they don't follow and hiding offensive comments. In addition to the New Mexico complaint, Meta faces a lawsuit from 41 states alleging that it harms the mental health of its youngest users. A recently unsealed complaint in the 33-state lawsuit alleges that Meta "coveted and pursued" users under the age of 13 and was dishonest about how it handled underage users' accounts when they were discovered.



