New Mexico sues Snap over its alleged failure to protect kids from sextortion schemes


New Mexico's attorney general has filed a lawsuit against Snap, accusing the company of failing to protect children from sexual violence, sexual exploitation and other harms. The lawsuit alleges that Snapchat’s features “encourage the sharing of child sexual abuse material (CSAM) and facilitate the sexual exploitation of children.”

The state Department of Justice conducted a months-long investigation into Snapchat and uncovered “a vast network of dark web sites dedicated to sharing non-consensual sexual images” taken from Snap. It claims to have found more than 10,000 records connecting Snap to child sexual abuse material “in the last year alone,” and says Snapchat is “by far” the largest source of images and videos on the dark web sites it has investigated.

In its complaint, the state accused the app of being “a breeding ground for predators to collect explicit images of children and find, groom and extort them.” It states that “criminals are distributing sextortion scripts that contain instructions on how to victimize minors,” and claims that these documents are publicly available and actively used against victims, yet have not been blacklisted by Snapchat.

In addition, investigators found that multiple accounts on Snapchat that publicly shared and sold CSAM were linked to each other through the app’s recommendation algorithm. The suit alleges that “Snap specifically designed its platform to be addictive to young people, leading some of its users to develop depression, anxiety, insomnia, body dysmorphia and other mental health issues.”

The Snapchat complaint follows a similar child safety suit the state filed against Meta. Engadget has reached out to Snap for comment.

“Our undercover investigation found that Snapchat’s harmful design features created an environment where predators could easily target children through sextortion schemes and other forms of sexual abuse,” Attorney General Raul Torrez said in a statement. “Snap has assured users that photos and videos posted on their platform will disappear, but predators can capture that content permanently, and they’ve created a virtual yearbook of child sex images that are sold, traded and stored indefinitely. Through our lawsuits against Meta and Snap, the New Mexico Department of Justice will continue to hold these platforms accountable for putting more profit over the safety of children.”
