Snap calls New Mexico’s child safety complaint a ‘sensationalist lawsuit’


Snap asked a court to dismiss New Mexico’s lawsuit, accusing the state’s attorney general of deliberately seeking out adult users looking for sexually explicit content in order to make its app appear unsafe. In a filing shared by The Verge, the company questioned the veracity of the state’s claims. The Attorney General’s Office had said that a decoy account believed to belong to a 14-year-old girl was added by a user named Enzo (Nud15Ans), and that through this connection the app allegedly suggested more than 91 users, including adults looking for sexual content. In its motion to dismiss, Snap said those “allegations are patently false.”

The company says it was the state’s decoy account that searched for and added Enzo. The attorney general’s operatives were also the ones who searched for and added accounts with suspicious usernames like “nudenude_22” and “xxx_tradehot.” In addition, Snap accuses the office of “repeatedly [mischaracterizing]” its internal documents. The office apparently cited one such document when it claimed in its filing that the company “made a conscious decision not to retain images of child sexual abuse” and suggested that it did not report or provide those images to law enforcement. Snap has denied that this is the case, clarifying that it is not permitted to store child sexual abuse material (CSAM) on its servers and that it turns such material over to the National Center for Missing & Exploited Children.

The director of communications for the New Mexico Department of Justice was unimpressed by the company’s arguments. In a comment sent to The Verge, Lauren Rodriguez accused Snap of focusing on minor details of the investigation in an “attempt to distract from the serious issues raised in the state’s case.” Rodriguez also said that “Snap continues to put profits ahead of protecting children” instead of “addressing critical issues … by making real changes to its algorithms and design features.”

New Mexico concluded after a months-long investigation that Snapchat’s features “encourage the sharing of child sexual abuse material (CSAM) and facilitate the sexual exploitation of children.” It said it found “a vast network of dark web sites dedicated to sharing non-consensual sexual images stolen from Snap” and that Snapchat is the largest source of images and videos it has seen on dark web sites “to date.” The Attorney General’s Office called Snapchat a “breeding ground for predators to collect explicit images of children and find, groom and extort them.” According to the office’s lawsuit, Snap employees field some 10,000 reports of sextortion every month, yet the company declined to warn users so as to “not create fear” among them. The complaint also accused Snap’s senior management of ignoring former trust and safety officials who pushed for additional safety mechanisms.


