Meta’s Smart Glasses Gain Live AI and Live Translation
Meta today added new features to its Ray-Ban smart glasses, including live translation and live AI.

With live AI, the Meta smart glasses can see whatever the wearer sees thanks to the built-in camera, and can hold real-time conversations. According to Meta, the glasses can provide hands-free help with meal prep, gardening, exploring a new neighborhood, and more. Questions can be asked without saying the "Hey Meta" wake word, and the AI maintains context between requests, so prior queries can be referenced. Meta says that eventually, the AI will be able to "give useful suggestions before you even ask."

Along with live AI, there's a new live translation feature that can translate in real time between English and Spanish, French, or Italian. When someone is speaking in one of those three languages, the glasses will translate what they say into English through the speakers or on a connected smartphone, and vice versa.

The Meta glasses are now able to use Shazam to identify songs, so if you …