2025-05-15
The Ray-Ban Meta glasses are getting an upgrade to better help the blind and low-vision community. The AI assistant will now provide "detailed responses" regarding what's in front of users. Meta says it'll kick in "when people ask about their environment." To get started, users just have to opt in via the Device Settings section in the Meta AI app.
The company shared a video of the tool in action in which a blind user asked Meta AI to describe a grassy area in a park. It quickly hopped into action and correctly pointed out a path, trees, and a body of water in the distance. The AI assistant was also shown describing the contents of a kitchen.
2025-04-07
I appreciate devices that don’t try to do too much. There are too many products throwing too many features at the consumer in the hope one or two sticks. I’m reminded of the recently revived Pebbl [...]
2025-04-23
Meta AI, the most interesting thing you can do with Ray-Ban Meta glasses, will soon be available to more people. The company's Live Translation feature is rolling out to all the product's ma [...]
2025-01-21
Mark Gurman at Bloomberg has released a report about Meta's next steps in hardware, crediting sources familiar with the company's work. According to these insiders, Meta is developing at lea [...]
2025-05-07
Diminished tech privacy appears to be another ripple effect from Trump 2.0. The Information reported on Wednesday that Meta has changed its tune on facial recognition. After considering but ultimately [...]
2025-05-20
One of the biggest reveals of Google I/O was that the company is officially back in the mixed reality game with its own prototype XR smart glasses. It's been years since we've seen anything [...]
2025-05-09
Apple is developing a chip for smart glasses, according to Bloomberg's Mark Gurman, and it's based on the chip used for the Apple Watch. The company's silicon group has reportedly remov [...]