Meta’s smart glasses can now tell you where you parked your car

Meta is rolling out new AI-powered features for its Ray-Ban smart glasses to users in the US and Canada. CTO Andrew Bosworth said on Threads that today’s update brings more natural language recognition, meaning the clunky “Hey Meta, look and tell me” commands should disappear. Users will be able to engage the AI assistant without the “look and” part of the prompt.

Most of the other AI tools showcased at last month’s Connect event are also making their way to the frames today, including voice prompts, timers, and reminders. The glasses can also be used to have Meta AI call a phone number or scan a QR code. On his Instagram feed, CEO Mark Zuckerberg showed off the new reminder feature by using it to find his car in a parking garage. One notable omission from this update is live translation, and Bosworth didn’t share a timeline for when that feature will arrive.

Meta’s smart glasses had already made headlines once today, after two Harvard students used them to identify strangers in real time. Their combination of facial recognition technology and a large language model was able to surface addresses, phone numbers, family member details, and even partial Social Security numbers.


