Broadcom is teaming up with CAMB.AI to bring on-device audio translation to a chipset. Devices built on the SoC could handle translation, dubbing and audio description tasks without dipping into the cloud. In other words, it could massively improve accessibility for consumers.

The companies promise ultra-low latency and enhanced privacy, since all processing stays local to the user's device. Wireless bandwidth use should also drop drastically.

As for the audio description piece, there's a demo video of the tool being used on a clip from the film Ratatouille. The AI can be heard describing the scene in various languages, while a written translation appears on-screen. This looks incr [...]
Anthropic on Tuesday announced Project Glasswing, a sweeping cybersecurity initiative that pairs an unreleased frontier AI model — Claude Mythos Preview — with a coalition of twelve major technolo [...]
Summary: Google is in talks with Marvell Technology to develop two new AI chips – a memory processing unit and an inference-optimised TPU – adding a third design partner alongside Broadcom and Med [...]
OpenAI's custom AI chip project with Broadcom has hit a funding wall. Broadcom won't finance production unless Microsoft commits to buying 40 percent of the chips, and Microsoft hasn't [...]
Voice AI is moving faster than the tools we use to measure it. Every major AI lab — OpenAI, Google DeepMind, Anthropic, xAI — is racing to ship voice models capable of natural, real-time conversat [...]
The premise seems simple enough. LG promises that you can set its Sound Suite speakers anywhere and Dolby’s home theater tech will make them perform well. The soundbar, subwoofer and speakers don’ [...]