Google is partnering with StopNCII, a UK-based nonprofit, to strengthen its fight against the spread of non-consensual intimate images (NCII), also known as revenge porn.
The search giant will use StopNCII's hashes, digital fingerprints of images and videos, to proactively identify and remove non-consensual intimate images from Search results.
StopNCII helps adults prevent their intimate images from being shared online by letting them create identifiers, or hashes, of those images. The hashes are then provided to partner platforms such as Facebook, which can automatically identify and remove matching content from their services.
Notably, the private image itself never leaves the person's device; only the hash is uploaded to StopNCII's system.
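To make the mechanism concrete, here is a minimal sketch of hash-based image matching, assuming the open-source Python "imagehash" library (pip install ImageHash). StopNCII's actual hashing technology is different, and the file names and threshold below are illustrative only; the sketch simply shows the general idea that a compact fingerprint is computed locally and only fingerprints are compared.

```python
# Illustrative sketch of perceptual-hash matching; NOT StopNCII's real system.
import imagehash
from PIL import Image

MATCH_THRESHOLD = 5  # illustrative Hamming-distance cutoff, an assumption


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash locally; the image itself is never uploaded."""
    with Image.open(path) as img:
        return imagehash.phash(img)


def is_match(known_hash: imagehash.ImageHash, candidate_path: str) -> bool:
    """Compare a candidate image's hash against a hash from the shared list."""
    distance = known_hash - fingerprint(candidate_path)  # Hamming distance
    return distance <= MATCH_THRESHOLD


# Example: a platform checks uploaded content against a user-submitted hash.
submitted = fingerprint("private_photo.jpg")     # computed on the user's device
print(is_match(submitted, "uploaded_copy.jpg"))  # True if visually the same image
```

Because perceptual hashes change little under resizing or recompression, matching within a small distance threshold can catch copies of an image without the platform ever seeing the original.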
“Our existing tools allow people to request the removal of NCII from Search, and we continually improve our rankings to reduce the visibility of this type of content,” Google wrote in a blog post. “But we’ve also heard from survivors and advocates that, given the size of the open web, there’s more to be done to alleviate the burden on those affected by it.”
Google is a relatively late adopter of the StopNCII system; the partnership comes about a year after Microsoft integrated the tool into Bing. Other companies working with StopNCII include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, X, and more.
The partnership with the nonprofit is the search giant's latest move to combat non-consensual intimate imagery. Last year, Google made it easier to remove non-consensual deepfake intimate images from Search and adjusted its rankings to make such content harder to find.
