Google is partnering with a UK nonprofit to fight non-consensual intimate imagery (NCII). (You may know it better as revenge porn.) Over the coming months, the company will begin using StopNCII's hashes. These user-uploaded digital fingerprints can block individuals' unwanted intimate content from appearing in search results.

StopNCII has a pretty neat system to combat revenge porn. Say you have some images you most definitely don't want surfacing online. Select the picture on your device, and StopNCII will create a digital fingerprint of the file. That hash is uploaded to the service; the photo itself never leaves your device. The organization then shares the hash (again, not the spicy pic) with participating platforms.

Then, if an asshole ex ta [...]
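The flow above can be sketched in a few lines of Python. One big caveat: StopNCII actually uses a perceptual hash (Meta's PDQ), which can match visually similar copies of an image; the `sha256` used here is a simplified stand-in that only catches exact byte-for-byte duplicates, and all the function names are illustrative, not StopNCII's real API.

```python
# Simplified sketch of a StopNCII-style hash-matching flow.
# Assumptions: sha256 stands in for the real perceptual hash (PDQ),
# and these function/variable names are hypothetical.
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Computed on the user's own device; only this hash is ever uploaded."""
    return hashlib.sha256(image_bytes).hexdigest()


# StopNCII's role: hold user-submitted hashes and share them with platforms.
submitted_hashes: set[str] = set()


def user_submits(image_bytes: bytes) -> None:
    # The photo itself never leaves the device -- only its fingerprint does.
    submitted_hashes.add(fingerprint(image_bytes))


def platform_should_block(upload_bytes: bytes) -> bool:
    """A participating platform checks new uploads against the shared hash list."""
    return fingerprint(upload_bytes) in submitted_hashes


# Demo: the victim registers a photo; a later re-upload of it gets flagged.
private_photo = b"\x89PNG...stand-in image bytes..."
user_submits(private_photo)
print(platform_should_block(private_photo))  # True: exact copy is blocked
print(platform_should_block(b"unrelated"))   # False: no match
```

The design point worth noticing is that the matching service never sees the image, only an irreversible digest of it, which is why users can safely participate without handing their photos to anyone.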
Security teams are buying AI defenses that don't work. Researchers from OpenAI, Anthropic, and Google DeepMind published findings in October 2025 that should stop every CISO mid-procurement. Thei [...]
The US House of Representatives has passed the Take It Down Act, a bipartisan bill that criminalizes the "publication of non-consensual, sexually exploitative images," including AI-generated [...]
The price for a PS4 copy of the relatively obscure Star Wars racing game Star Wars Racer Revenge has dramatically increased in the last few days because of the game’s use in the latest PlayStatio [...]
It's refreshing when a leading AI company states the obvious. In a detailed post on hardening ChatGPT Atlas against prompt injection, OpenAI acknowledged what security practitioners have known fo [...]
The week has been a mixed bag for Apple. First, it launched a new iPhone app for organizing events and being actually social; then, it had to contend with a third-party app store offering a porn app i [...]
Later this month, Italian citizens will have one extra step to go through before getting on porn sites. On Friday, Italy's regulatory agency for communications, known as AGCOM, announced an age v [...]