Meta on Wednesday announced an early access program for Ray-Ban Meta smart glasses users in the US, giving them a chance to trial new multimodal AI-powered capabilities built into the shades.
According to the company's press release, users will be able to talk to their glasses, which will also be able to interpret what the wearer is looking at through the built-in camera.
Users can ask Meta AI to help write a caption for a photo taken on a hike, or to describe an object they are holding, the company said.
In an Instagram reel, Meta Platforms CEO Mark Zuckerberg showcased the update by asking the glasses to suggest pants that would go with a shirt he was holding. In response, the glasses described the shirt and offered a few matching recommendations.
The AI assistant also apparently provided an accurate description of a lit-up, California-shaped wall sculpture in a video featuring CTO Andrew Bosworth.
Bosworth also walked through additional features, such as asking the assistant for photo captions, translations, and summaries — functionality commonly found in other AI products from companies like Microsoft and Google.
Meta is also rolling out the ability for Meta AI on the glasses to access real-time information, powered in part by Bing. Users can ask about sports scores or details on local landmarks, restaurants, stocks, and more.
The Ray-Ban Meta smart glasses, unveiled in September alongside other Meta products such as the Meta Quest 3, are built on the Qualcomm Snapdragon AR1 Gen 1 platform and come equipped with a 12-megapixel camera, an LED light, and 32GB of built-in storage.
Pricing varies based on the choice of frames and lenses, starting at $299.