Did Google Sneakily Reveal Google Glass In an AI Demo?

Google held its annual I/O 2024 developer conference in the early hours of this morning, with this year’s event stuffed with AI from wall to wall. Among the AI tools and features shown off, Google debuted Project Astra: a fast, responsive AI chatbot that can tap into your camera.

Astra is part of Google’s initiative to build AI agents, these being AI models that can be helpful in everyday life – to which I say, “Isn’t that the point of this to begin with?” But I digress: the demo that Google showed off at I/O 2024 for Project Astra was seriously impressive. Here’s the video from the event.

It’s worth mentioning that this was a pre-recorded demo. Yesterday, OpenAI debuted extremely similar tech with GPT-4o, but that was a live demo. Take both events with a grain of salt, since these are big PR events for these companies after all, but Google’s demo was certainly interesting.

In the video, Google shows a person pointing a smartphone (a Pixel 8 Pro) around a room, with the AI identifying objects in the room and responding to verbal prompts from the user about said objects.

“Building on Gemini, we’ve developed prototype agents that can process information faster by continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall,” Google DeepMind CEO Demis Hassabis said on the Google Blog.

“By leveraging our leading speech models, we also enhanced how they sound, giving the agents a wider range of intonations. These agents can better understand the context they’re being used in, and respond quickly, in conversation.”
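For a rough sense of what that description means in practice, here’s a minimal, purely hypothetical Python sketch of a timeline that caches encoded video frames alongside speech transcripts for later recall. None of the names below come from Google; they’re illustrative stand-ins for the components Hassabis describes.

```python
import time
from dataclasses import dataclass, field
from typing import Any, List

# Hypothetical sketch: an agent timeline that interleaves encoded video
# frames and speech, caching them so the agent can recall recent context.

@dataclass
class TimelineEvent:
    timestamp: float
    kind: str        # "frame" or "speech"
    payload: Any     # e.g. a frame embedding or a transcript string

@dataclass
class AgentTimeline:
    events: List[TimelineEvent] = field(default_factory=list)

    def add_frame(self, frame_embedding: Any) -> None:
        # Continuously encode incoming video frames into the timeline.
        self.events.append(TimelineEvent(time.time(), "frame", frame_embedding))

    def add_speech(self, transcript: str) -> None:
        # Fold the user's speech into the same time-ordered record of events.
        self.events.append(TimelineEvent(time.time(), "speech", transcript))

    def recall(self, since_seconds: float) -> List[TimelineEvent]:
        # "Efficient recall": hand back recent cached events for the model to reason over.
        cutoff = time.time() - since_seconds
        return [e for e in self.events if e.timestamp >= cutoff]
```

The real system presumably does far more (model-side encoding, compression, retrieval), but the core idea Hassabis describes is a shared, time-ordered cache of multimodal context.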

In my opinion, the most impressive part of the demo was when the camera was pointed directly at a segment of code and the AI provided a brief description of what that code did. It was also supposedly able to identify the user’s location when the camera was pointed out the window.
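You can approximate that party trick today with Google’s publicly available Gemini API, which accepts an image alongside a text prompt. To be clear, this is not the Astra pipeline; it’s just a rough sketch, assuming you have the google-generativeai Python package, an API key, and a photo of some code saved locally (the model name and prompt are my own choices):

```python
import google.generativeai as genai
from PIL import Image

# Rough approximation of the demo's "describe this code" moment using the
# public Gemini API, not Astra itself. Assumes a valid API key and a local
# photo of a screen of code.
genai.configure(api_key="YOUR_API_KEY")

model = genai.GenerativeModel("gemini-1.5-flash")  # model choice is an assumption
frame = Image.open("code_on_screen.jpg")

response = model.generate_content(
    [frame, "Briefly describe what this code does."]
)
print(response.text)
```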

Again, this is a scripted demo, and it’s too early to say whether any of it will turn into an actual Google product down the line. Wait. 1 minute, 28 seconds in. I know what I see. That’s Google Glass, baby.

Or, at least, a pair of AR glasses that may or may not be linked to the failed and much-memed Google Glass project, potentially now rejigged in the direction of AI. To be fair, the glasses shown in the demo look nothing like Google Glass: they have a thick black frame and none of the futuristic Glass UI. Rather, Google is using these bulky-looking glasses as a vehicle to show off what it wants from AI. The video description refers to them as a prototype.

The glasses supposedly have cameras fitted to them, through which the AI can identify what the wearer is looking at and describe it.

Anyway. Google says some of these AI capabilities will be coming later this year.

As for a new pair of AR glasses from Google? Who knows. Could Glassholes make a comeback? I hope not. Reports from 2022 indicated that the company was intending to build a headset, and at I/O 2022, the company teased a feature for smart glasses that could bridge language barriers. Time will tell.

Catch up on Google I/O 2024 here.

Image: David Paul Morris/Bloomberg (Getty Images), Google

