The glasses had a microphone, camera, and speakers to gather as much information as possible.
The glasses themselves run Android XR, Google's homegrown OS for extended reality devices.
Google product manager Nishtha Bhatia showed the crowd what the glasses could see and display through the built-in camera.
Google VP Shahram Izadi wearing Google’s prototype AR glasses. © Jason Redmond / TED
Gemini can translate text it sees into different languages, though Izadi suggested that feature may produce mixed results.
That same camera could also parse text and graphs and turn them into more digestible soundbites.
It’s akin to what Google DeepMind demoed with Project Astra last year.
The glasses should also be able to connect with your smartphone and access your phone's apps.
Yet the real killer feature is the connection with your other Google apps.
Bhatia asked the glasses to look at a record from singer Teddy Swims and play a track.
The glasses opened up YouTube Music and played the requested song.
Curiously, the pair shown at TED may not be sold under Google’s own banner.
Google also showed off Samsung’s prototype headset on the TED2025 stage.
It would instead rely on a microphone for speech and a camera for gesture controls.