Instead, the glasses resisted engaging with any books, figures, prints, or toys I showed them.
The only time I've experienced a similar level of disconnect was in typical, awkward conversations with my father.
More updates are coming down the road that should add real-time translation and real-time video scanning.
© Kyle Barr
That feature should allow the AI to comment on what you're seeing live.
But I wouldn't trust that AI to describe my room's decor accurately, let alone supermarket items.
Meta granted me a pair of the new Headliner transition lens glasses.
I can say with certainty they look much better than my current pair of old, yellowed shades.
I'd consider them a solid option for personal audio when lounging on the beach.
Does Meta's AI Not Have Any Nerdy Information in Its Training Data?
I pointed the glasses at my metal print of a scene from the 2019 RPG Disco Elysium.
Its best guess was Borderlands. For some reason, it thought the faithful detective Kim Kitsuragi was Claptrap.
Harry Du Bois, AKA Tequila Sunset, was one of the vault hunters.
I asked it to identify what my gaming setup included.
I tried it with memorabilia, both more and less esoteric.
My statue of Marv from the Sin City comics was, according to Meta, The Hulk.
Like my parents, the glasses seem to think anything nerdy is probably a character from the Marvel movies.
Even when the glasses got things right, the AI struggled to be specific or accurate.
Yes, Dad, I play Warhammer 40K.
No, Dad, these books have nothing to do with it.
But hey, the gadget knew who Luigi was.
Nintendo's reach obviously extends beyond the bounds of my little nerd bubble.
Still, you'd think an AI could tell a Pokémon apart from a Legend of Zelda Korok.
It will look at a bottle of pomegranate molasses in my cupboard and tell me it's soy sauce.
Remember when Google's first attempt at on-rig AI lied about the Webb Telescope?
Meta's AI model used for the Ray-Bans will lie to your face, to your own eyes.
The answers it does get correct are often short and largely unhelpful.
That number is very much quantifiable.
We have yet to experience Meta's Llama 3.2 multimodal models.
Meta's AI says it still uses Llama 3.1 70B, but that LLM may not be suited for mundane queries.
The glasses don't have access to location data (which is probably for the best).
The wearable AI couldn't tell me the nearest boba tea place near Union Square.
There are two within a three-block radius.
The AI models are purposefully limited in other ways for privacy, just not your own.
Meta's AI won't describe any face or person it sees.
Despite Meta's efforts, the Ray-Bans still have heavy privacy implications.
A group of university students hacked the Ray-Ban glasses to add facial recognition.
The group posted a video to Twitter last week showing just how well their glasses worked.
This isnt what the Ray-Ban Metas were designed for.
Meta positions its Ray-Bans for the influencer crowd that wants to drop its pictures on Instagram.
In their current state, I wouldn't use the AI functions for anything more than a party trick.