Microsoft joins coalition to scrub revenge and deepfake porn from Bing


Microsoft has partnered with StopNCII to help remove non-consensual intimate images, including deepfakes, from the Bing search engine.

When a victim opens a “case” with StopNCII, the service creates a digital fingerprint, called a “hash,” of an intimate photo or video stored on that person’s device, without the file ever leaving the device. The hash is then sent to participating industry partners, who search for matches to the original and remove them from their platforms if they violate their content policies. The process also applies to AI-generated deepfakes of a real person.
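The hash-and-match flow described above can be sketched in a few lines. This is an illustrative simplification: the sketch uses an exact-match cryptographic hash, whereas systems like StopNCII rely on perceptual hashing that also catches re-encoded or resized copies. The function names here are hypothetical, not part of any real StopNCII API.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a fingerprint of a media file locally. Only this hash,
    never the image or video itself, would be shared with the service."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_case(upload_path: str, case_hashes: set[str]) -> bool:
    """A participating platform checks an upload against the shared
    list of case hashes before deciding whether to remove it."""
    return fingerprint(upload_path) in case_hashes
```

The key privacy property is that the matching database holds only opaque fingerprints: partners can detect copies of a reported image without anyone transmitting or storing the image itself.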

Several other tech companies have agreed to work with StopNCII to clean up intimate images shared without permission. Meta built the tool and uses it on its Facebook, Instagram, and Threads platforms; other services cooperating with the effort include Reddit, Snap, Niantic, OnlyFans, PornHub, Playhouse, and Redgifs.

Strangely enough, Google is not on this list. The tech giant has its own tools for reporting objectionable images, including non-consensual ones. However, by not participating in one of the few centralized channels for removing revenge porn and other private images, it arguably forces victims into a piecemeal approach to restoring their privacy.

In addition to efforts like StopNCII, the US government has taken some steps this year to address the harm caused by deepfake intimate imagery. A group of senators moved to protect victims, introducing a new bill on the subject in July.

If you believe you have been a victim of non-consensual intimate image sharing, you can open a case with StopNCII and Google; if you are under 18, you can file a report with NCMEC.
