As usual, Google I/O 2024 is an absolute whirlwind of news and announcements.
I say semi because it's evident after the demo that this part of Gemini is in its infancy.
It can answer questions about the environment around you by identifying objects, faces, moods, and textiles.
The Project Astra sign in the Google I/O AI sandbox. Photo: Florence Ion / Gizmodo
It can even help you remember where you last placed something.
There were four different demonstrations to choose from for Project Astra.
The demo I got was a version of Free-Form on the Pixel 8 Pro.
A preview of Project Astra’s Pictionary mode, another offering inside the demo room. Photo: Florence Ion / Gizmodo
Then, it correctly identified that he was carrying his phone.
In a follow-up question, our group asked about his clothes.
It gave a generalized answer, saying he appeared to be wearing casual clothing.
Gemini correctly identified the phone in the person’s hand. Photo: Florence Ion / Gizmodo
I took hold of the Pixel 8 Pro for a quick minute.
I got Gemini to identify a pot of faux flowers correctly.
Gemini noticed they were also colorful.
Gemini correctly identified that the faux flowers were tulips. Photo: Florence Ion / Gizmodo
From there, I wasn't sure what else to prompt it, and then my time was up.
I left with more questions than I had going in.
With Google's AI, it still feels like a leap of faith. But that's not what this demonstration was about. It was meant to showcase the capabilities of Project Astra and how we'll interact with it.
My biggest question is: Will something like Project Astra replace Google Assistant on Android devices?
I couldn't get an answer from the few Google folks I did ask.