Last year, Ray-Ban Meta smart glasses gained AI-powered visual search capabilities that were impressive (and a bit disturbing), but the newest one in the latest beta looks genuinely useful. It can identify landmarks in various locations and tell you more about them, acting as a sort of tour guide for travelers, Meta CTO Andrew Bosworth wrote in a Threads post.
Bosworth shared sample images showing the glasses explaining why the Golden Gate Bridge is orange (to make it easier to see in fog), the history of San Francisco’s “painted ladies” houses, and more. In those examples, the descriptions appeared as text beneath the images.
Meanwhile, Mark Zuckerberg demonstrated the new capabilities on Instagram with several videos shot in Montana. This time, the glasses use audio to verbally describe the history of Big Sky Mountain and the Roosevelt Arch, and to explain (in the style of a caveman) how snow is formed.
Meta previewed the feature at its Connect event last year as part of new “multimodal” capabilities that let the glasses answer questions based on your surroundings. That, in turn, became possible once all of Meta’s smart glasses gained access to real-time information powered in part by Bing Search (rather than being limited to a 2022 knowledge cutoff, as before).
The feature is part of Meta’s Google Lens-like functionality, which lets users show the glasses what they’re looking at and ask the AI questions about it, such as a piece of fruit or foreign-language text that needs translating. It’s available to anyone in Meta’s early access program, which still admits only a limited number of people. “For those who don’t have access to the beta yet, you can add yourself to the waiting list while we work to make it available to more people,” Bosworth wrote in his post.