X isn't off the hook yet in a significant legal case over child sex abuse content on its platform. On Friday, a circuit judge on the US Court of Appeals ruled that X Corp. must once again face claims that it was negligent in taking down child sex abuse content and lacked an effective infrastructure for reporting these offenses.

The ruling from Judge Danielle Forrest is the latest step in a lawsuit filed in 2021 against Twitter, before it was rebranded as X. The suit lists two underage boys as plaintiffs and alleges that Twitter, now X, "slow-walked its response to reports about, and did not immediately remove from the platform, pornographic content that a trafficker had coerced plaintiffs into producing."

A previous decision wi [...]
The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that cont [...]
xAI, which is already facing multiple investigations around the world over widespread reports that Grok repeatedly created sexualized images of children, is now facing a class-action lawsuit. Three te [...]
The UK’s media regulator has opened a formal investigation into X under the Online Safety Act. "There have been deeply concerning reports of the Grok AI chatbot account on X being used to creat [...]
California authorities have launched an investigation into xAI following weeks of reports that the chatbot was generating sexualized images of children. "xAI appears to be facilitating the large- [...]
Malaysia and Indonesia are the first countries to block Grok, claiming that X’s chatbot does not have sufficient safeguards in place to prevent explicit AI-generated deepfakes of women and children [...]