X has to prove it wasn't negligent when removing CSAM from its site

X isn't off the hook yet when it comes to a significant legal case about child sex abuse content on its platform. On Friday, a circuit judge from the US Court of Appeals ruled that X Corp. has to again face claims that it was negligent in taking down child sex abuse content and didn't have an effective reporting infrastructure for these offenses.

This ruling from Judge Danielle Forrest is the latest step in a lawsuit filed in 2021 against Twitter, before it was rebranded to X. The suit lists two underage boys as the plaintiffs and alleges Twitter, now X, "slow-walked its response to reports about, and did not immediately remove from the platform, pornographic content that a trafficker had coerced plaintiffs into producing."

A previous decision wi [...]



Elon Musk's Grok AI posted CSAM image following safeguard 'lapses'

Elon Musk's Grok AI has been allowing users to transform photographs of women and children into sexualized and compromising images, Bloomberg reported. The issue has created an uproar among users [...]

Match Score: 104.66

Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from

The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that cont [...]

Match Score: 58.93

West Virginia is suing Apple alleging negligence over CSAM materials

The office of the Attorney General for West Virginia announced Thursday that it has filed a lawsuit against Apple alleging that the company had "knowingly" allowed its iCloud platform " [...]

Match Score: 58.55

xAI is being sued by teens who say Grok created CSAM using their photos

xAI, which is already facing multiple investigations around the world over widespread reports that Grok repeatedly created sexualized images of children, is now facing a class action lawsuit. Three te [...]

Match Score: 51.33

EU backs away from requiring tech companies to scan and remove CSAM

EU member states have agreed on a position regarding online child protection legislation that doesn't force global tech companies to identify and remove child sexual abuse materials (CSAM). This [...]

Match Score: 51.20

UK regulator Ofcom opens a formal investigation into X over CSAM scandal

The UK’s media regulator has opened a formal investigation into X under the Online Safety Act. "There have been deeply concerning reports of the Grok AI chatbot account on X being used to creat [...]

Match Score: 50.43

California is investigating Grok over AI-generated CSAM and nonconsensual deepfakes

California authorities have launched an investigation into xAI following weeks of reports that the chatbot was generating sexualized images of children. "xAI appears to be facilitating the large- [...]

Match Score: 43.98

Malaysia and Indonesia are the first to block Grok following CSAM scandal

Malaysia and Indonesia are the first countries to block Grok, claiming that X’s chatbot does not have sufficient safeguards in place to prevent explicit AI-generated deepfakes of women and children [...]

Match Score: 43.18

Where to sell your used and unwanted gadgets

Springtime is a period of renewal – and that often includes new iterations of all your favorite gadgets. Laptops and phones and even game consoles are all getting a new look on the outside and new s [...]

Match Score: 37.42