Microsoft announced today that it has partnered with StopNCII to proactively remove harmful intimate images and videos from Bing using digital hashes people create from their sensitive media.

StopNCII is a project operated by the Revenge Porn Helpline that lets people create digital hashes of their intimate pictures and videos without the media ever leaving their phone. StopNCII then adds these hashes to a database used to find the same or similar images online, which partner platforms, including Facebook, TikTok, Reddit, Pornhub, Instagram, OnlyFans, and Snapchat, then remove.
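The privacy-preserving flow described above can be sketched roughly as follows. This is a simplified illustration, not StopNCII's actual implementation: StopNCII uses perceptual hashes (PDQ and PhotoDNA), while plain SHA-256 is used here only as a stand-in and would match identical bytes only. All function names are hypothetical.

```python
import hashlib

# Shared database of fingerprints; partner platforms check uploads against it.
hash_database: set = set()

def hash_on_device(image_bytes: bytes) -> str:
    """Computed locally, so the image itself is never uploaded anywhere."""
    return hashlib.sha256(image_bytes).hexdigest()

def submit_hash(image_hash: str) -> None:
    """Only the fingerprint, not the media, is sent to the database."""
    hash_database.add(image_hash)

def platform_check(candidate_bytes: bytes) -> bool:
    """A partner platform flags content whose hash is in the database."""
    return hash_on_device(candidate_bytes) in hash_database

private_photo = b"...raw image bytes..."
submit_hash(hash_on_device(private_photo))  # the photo stays on the device
platform_check(private_photo)               # the platform can match it later
```

The key property is that the database operator and the partner platforms only ever see fingerprints, never the sensitive media itself.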

In March, Microsoft shared its PhotoDNA technology with StopNCII, improving how people can create digital hashes without their images or videos ever leaving their device.

"Much like the hashing technology that StopNCII.org already uses (PDQ), PhotoDNA is an additional process that enables identified harmful images to be hashed into a digital fingerprint, which then can be shared with industry platforms to identify and remove any non-consensual intimate image abuse material," explains the March announcement.

How StopNCII.org works

Microsoft announced today that it has been piloting the use of StopNCII's hash database to remove intimate images from the Bing search index. Using this database, Microsoft says it has taken action on 268,899 images through the end of August.

The rise of artificial intelligence has also fueled the generation of deepfake nude images from non-intimate photos shared online. While these images are fake, they can be just as distressing to the people being exploited.

A 2019 report by DeepTrace, now Sensity, found that 96% of deepfake videos on the internet were pornographic, and almost all featured the non-consensual use of a woman's likeness. Many of these images are uploaded as "revenge porn," used for extortion, or posted by unscrupulous sites to generate revenue.

Unfortunately, AI-generated images are harder to match against PhotoDNA hashes, since a newly generated image may not closely resemble any image that has been hashed. In these cases, those impacted should manually report the images to Microsoft, Google, and other online media companies.

Microsoft says that impacted people can use its Report a Concern page to request that real or synthetic images be removed from the Bing search index.

While Google has not joined the initiative, it also provides guidelines and a process for removing intimate images from its search index.
