Adrianna Nine
2024-03-13 16:15:00
www.extremetech.com
If you forgot about Ray-Ban Meta after its debut last year, you’re not alone. Smart glasses have always been more of a novelty than a practical everyday accessory. When Meta first showed off its pair in September, the company inadvertently raised concerns about privacy and distraction. It also took months for Meta to introduce what it calls “multimodal AI,” or the ability for Ray-Ban Meta to answer questions about whatever the user is looking at. Because this is arguably one of the biggest reasons a person would buy smart glasses, Ray-Ban Meta’s first months on the market were lackluster.
Now, Meta is trying to expand on and enhance its glasses’ contextual query capabilities. In a Threads post on Tuesday, Meta CTO Andrew Bosworth said the company is improving Ray-Ban Meta’s multimodal AI features “across performance and domains.” Starting now, members of Meta’s early access program can test the glasses’ ability to identify and describe popular landmarks.
Bosworth’s post offers the example of the Painted Ladies, a famous row of colorful Victorian houses in San Francisco. A screenshot from Meta View—Ray-Ban Meta’s companion app—shows Bosworth’s view of the Painted Ladies from across the street. Underneath is a short paragraph about the landmark, including when and by whom the houses were built. Other San Francisco examples from Bosworth include the Golden Gate Bridge and the Coit Memorial Tower.
Credit: Andrew Bosworth
In an Instagram post, Mark Zuckerberg also demonstrated Ray-Ban Meta’s ability to identify the Roosevelt Arch at the entrance of Yellowstone National Park. Using the prompt, “Hey Meta, look and tell me about the history of this monument,” Zuckerberg got the glasses to describe the landmark’s origin. His glasses also identified Lone Mountain in Big Sky, Montana, and were able to state the mountain’s peak elevation.
Meta’s executives will only post examples that Ray-Ban Meta identified correctly, so it’s hard to tell how reliable this new multimodal AI feature might be. Based on some earlier user experiences, the bar is on the floor. Engadget’s Karissa Bell, for instance, found the glasses’ grasp of real-time information to be “shaky at best” when she tried their multimodal AI features earlier this year. She even said she received “completely inaccurate information in response to simple questions.”
So far, it appears Ray-Ban Meta’s landmark ID feature is only available to testers in the United States. Whether this is a matter of Meta’s beta rollout cadence or the feature’s inability to identify landmarks outside of the US is unclear. Those interested in trying the feature can join an early access program waitlist using their glasses’ serial number.