Meta, one of the leading tech giants, has equipped its newest Ray-Ban smart glasses with Meta AI, which can process the wearer's surroundings and answer questions about the environment.
Meta is finally set to let people try the AI features of its Ray-Ban smart glasses, though in an early access test to start. The features, which use the glasses' built-in cameras to capture images and interpret them with generative AI, were not originally expected to launch until 2024.
The company has moved to introduce these features far sooner than expected, although the early-access rollout is still a beta. A new update also adds Bing-powered search to the Ray-Bans, extending the glasses' existing voice-enabled capabilities, so the glasses are quickly gaining a number of new abilities.
One of the flashiest additions is the new 'Look and Ask' feature, which allows the glasses to observe and interpret the surroundings using the built-in cameras. With this feature, users can ask Meta AI not only to describe an object in front of them but also to offer suggestions based on visual input, using the simple voice command "Hey Meta, look..".
Back in September, Meta CEO Mark Zuckerberg discussed the multimodal AI features coming to the glasses in an interview with The Verge. He said people would talk to the Meta AI assistant throughout the day about the different questions they have, suggesting it could answer questions about what wearers are looking at or where they are.