Google teases AI glasses and upgraded Gemini Live in TED Talk Demo

22 April 2025 8:46 PM IST

In a bold preview of what’s next in wearable technology, Google unveiled its latest AI-powered glasses prototype during a live demonstration at a recent TED Talk. The new device, built around Google’s Gemini AI platform, blends real-time vision, memory, and voice interaction into a discreet, everyday form factor.

A Glimpse Into Google’s Wearable AI Future

Shahram Izadi, Vice President and General Manager of Android XR at Google, took the stage to introduce the AI Glasses — smart eyewear that looks like regular prescription glasses but is packed with cutting-edge features. Equipped with built-in cameras, speakers, and a subtle display, the glasses can “see” what the wearer sees and interact via Gemini AI.

In one live example, the glasses composed a haiku inspired by the expressions of audience members — a striking showcase of real-time, contextual creativity.

Introducing Visual Memory to Wearables

The demo also highlighted a memory feature, first developed in Google’s Project Astra. This allows Gemini to remember objects and scenes for up to 10 minutes after they’ve left the user’s view, adding a new dimension to AI-assisted living and interaction.

Collaboration with Samsung and the XR Ecosystem

Google initially teased the idea of XR (Extended Reality) glasses back in December 2024, in collaboration with Samsung. According to the company, Android XR integrates years of AI, AR, and VR advancements to create immersive, helpful experiences for both headsets and glasses.

Gemini Live Gets Smarter

Beyond wearables, Google also hinted at significant updates coming to Gemini Live — its real-time, two-way voice assistant. In a separate interview with 60 Minutes, Demis Hassabis, CEO of Google DeepMind, confirmed that contextual memory could soon become part of Gemini Live, enabling it to recall previous conversations and visual inputs during interactions.

Hassabis also teased the possibility of social-awareness features, such as personalised greetings when a user activates the assistant.

What’s Next?

While still a prototype, early tests suggest the AI Glasses could eventually handle more complex tasks like online transactions and deeper AI-powered interactions. Although no official release date has been announced, these developments mark Google’s re-entry into the AI wearable space — more than a decade after the original Google Glass.
