We've seen artificial intelligence give some pretty bizarre responses to queries as chatbots become more common. Today, Reddit Answers is in the spotlight after a moderator flagged the AI tool for providing dangerous medical advice that they were unable to disable or hide from view. The mod saw Reddit Answers suggest that people experiencing chronic pain stop taking their current prescriptions and take high-dose kratom, an unregulated substance that is illegal in some states. The user said they then asked Reddit Answers other medical questions. They received potentially dangerous advice for treating neonatal fever alongside some accurate guidance, as well as suggestions that heroin could be used for chronic pain relief. Several other mods, particularly from health-focus [...]
Dozens of subreddits have opted to block links to X in their communities over the last 24 hours in a movement that appears to be gaining momentum across Reddit. Hundreds more appear to be actively dis [...]
A group of researchers covertly ran a months-long "unauthorized" experiment in one of Reddit’s most popular communities using AI-generated comments to test the persuasiveness of large lang [...]
Reddit is suing the companies SerpApi, Oxylabs, AWMProxy and Perplexity for allegedly scraping its data from search results and using it without a license, The New York Times reports. The new lawsuit follo [...]
Two days after releasing what analysts call the most powerful open-source AI model ever created, researchers from China's Moonshot AI logged onto Reddit to face a restless audience. The Beijing-b [...]
Reddit has filed a lawsuit against Anthropic, alleging that the AI company behind the Claude chatbot has been using its data for years without permission. The lawsuit comes after Reddit has increasing [...]
Reddit has temporarily banned the subreddit r/WhitePeopleTwitter after Elon Musk complained about the community. The subreddit is currently inaccessible with a message from Reddit stating that the com [...]