Google partners with StopNCII to fight non-consensual intimate imagery

Google has announced a partnership with the UK nonprofit StopNCII: Google will use StopNCII’s image hashes to help keep non-consensual intimate imagery (NCII), often called revenge porn, out of search results and off participating platforms.

How it works

  • Users (adults) select intimate images/videos on their device and StopNCII creates a digital fingerprint (hash) of the file. The image itself never leaves the user’s device.
  • StopNCII uploads only the hash to its system and shares it with participating platforms and search providers.
  • If someone uploads a matching file, platforms using real-time hash-matching can block the upload or remove the content automatically.
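The hash-and-match flow above can be sketched in a few lines of Python. This is an illustration only: the `registry` set and function names are hypothetical, and StopNCII is reported to use perceptual hashing (so re-encoded copies still match), whereas this sketch uses SHA-256, which matches only byte-identical files.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Digest computed locally; only this hash ever leaves the device.
    (SHA-256 is a simplified stand-in for the perceptual hash StopNCII
    is reported to use.)"""
    return hashlib.sha256(data).hexdigest()

# On the user's device: hash the image and submit only the hash.
original = b"...raw image bytes..."       # placeholder content
registry = {file_hash(original)}          # hypothetical shared hash set

# On a participating platform: check each upload's hash in real time.
def should_block(upload: bytes) -> bool:
    return file_hash(upload) in registry
```

Because a cryptographic hash changes completely if even one byte differs, real deployments favor perceptual hashes that survive resizing and re-encoding; the trade-off is a small chance of near-match false positives.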

Limitations

  • The system only works for known images (you must have a copy to create a hash).
  • It does not cover AI-generated images, audio, or text chats.
  • It only protects content on participating platforms.

How to create a case

If you are 18+ and want to pre-register intimate images you don’t want shared, visit StopNCII’s site and follow their case-creation flow. The tool creates hashes locally; the organization says the image never leaves your device. Note: the service covers nude, semi-nude, or sexual-act images only.

Have questions or experiences to share? Leave a comment below — what more should platforms do to stop NCII?
