Google partners with StopNCII to fight non-consensual intimate imagery (NCII)
Google has announced a partnership with the UK nonprofit StopNCII: it will use StopNCII's image hashes to help keep non-consensual intimate imagery (NCII), often called revenge porn, out of search results and off participating platforms.
How it works
- Users (adults) select intimate images/videos on their device and StopNCII creates a digital fingerprint (hash) of the file. The image itself never leaves the user’s device.
- StopNCII uploads only the hash to its system and shares it with participating platforms and search providers.
- If someone uploads a matching file, platforms using real-time hash-matching can block the upload or remove the content automatically.
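The flow above can be sketched in a few lines. This is an illustrative simplification, not StopNCII's actual implementation: a SHA-256 digest of the file bytes stands in for the fingerprinting step (a production system would use a perceptual hash so that resized or re-encoded copies still match), and the function and parameter names are invented for the example. The key property shown is that only the hash, never the image itself, is shared for matching.

```python
import hashlib


def hash_image_locally(path: str) -> str:
    """Create a fingerprint of an image file on the user's device.

    Simplified stand-in: a SHA-256 digest of the raw bytes. The file is
    read locally and only the hex digest would ever be uploaded.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def is_blocked(upload_path: str, blocked_hashes: set) -> bool:
    """Platform-side check: hash the incoming upload and block it if the
    hash matches one registered with the service."""
    return hash_image_locally(upload_path) in blocked_hashes
```

A participating platform would run the `is_blocked` check against the shared hash list at upload time, which is why the system can act in real time without any platform ever receiving the original image.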
Limitations
- The system works only for known images (you must have a copy of the file to create a hash).
- It does not cover AI-generated images, audio, or text chats.
- It only protects content on participating platforms.
Confirmed partners
- In addition to Google, partners referenced in coverage include Meta, Snap, Pornhub, RedGIFs, and Microsoft Bing. See the StopNCII FAQ and related coverage for more details.
How to create a case
If you are 18+ and want to pre-register intimate images you don’t want shared, visit StopNCII’s site and follow their case-creation flow. The tool creates hashes locally; the organization says the image never leaves your device. Note: the service covers nude, semi-nude, or sexual-act images only.
Sources & further reading
- StopNCII FAQ & official site: https://stopncii.org/faq/
- Google announcement: https://blog.google/products/search/stopncii-program-partnership/
- Additional reporting and press references (industry coverage): search the web for “StopNCII partners” for the latest partner list and rollout dates.
Have questions or experiences to share? Leave a comment below — what more should platforms do to stop NCII?