Friday, 18 April, 2025 UTC


Summary

At TED2025, Google showed off sleek smart glasses with a HUD, though the company described them as "conceptual hardware".
Shahram Izadi, Google's Android XR lead, took to the TED stage earlier this month to show off both the HUD glasses and Samsung's upcoming XR headset, and the 15-minute talk is now publicly available to watch.
A supercut of the TED2025 demo.
The glasses feature a camera, microphones, and speakers, like the Ray-Ban Meta glasses, but also have a "tiny high resolution in lens display that's full color". The display appears to be monocular, with refracted light visible in the right lens from certain camera angles during the demo, and it has a relatively small field of view.
The demo focuses on Google's Gemini conversational AI system, including the Project Astra capability which lets it remember what it sees via "continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall".
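Google hasn't shared implementation details beyond that description, but it implies a bounded timeline cache of encoded frames paired with transcribed speech, searched when the wearer asks a question. Here's a minimal, hypothetical sketch of that idea in Python; the class, the method names, and the similarity scoring are illustrative assumptions, not Google's actual system:

```python
from collections import deque
from dataclasses import dataclass
import time

@dataclass
class TimelineEvent:
    timestamp: float
    frame_embedding: list[float]  # encoded representation of a video frame
    transcript: str               # speech heard around the same moment

class RollingContextMemory:
    """Hypothetical rolling cache of recent multimodal events, bounded for cheap recall."""

    def __init__(self, max_events: int = 600):
        # Oldest events fall off automatically as new ones arrive.
        self.events = deque(maxlen=max_events)

    def observe(self, frame_embedding: list[float], transcript: str) -> None:
        """Continuously fold incoming frames and speech into the timeline."""
        self.events.append(TimelineEvent(time.time(), frame_embedding, transcript))

    def recall(self, query_embedding: list[float], top_k: int = 3) -> list[TimelineEvent]:
        """Return the cached events most relevant to a query like 'the white book on the shelf'."""
        def score(event: TimelineEvent) -> float:
            return sum(a * b for a, b in zip(event.frame_embedding, query_embedding))
        return sorted(self.events, key=score, reverse=True)[:top_k]
```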
Here's everything Izadi and his colleague Nishtha Bhatia show off in the demo:
  • Basic Multimodal: Bhatia asks Gemini to write a haiku based on what she's seeing, while looking at the audience, and it responds: "Faces all aglow. Eager minds await the words. Sparks of thought ignite."
  • Rolling Contextual Memory: after looking away from a shelf holding several objects, including a book, Bhatia asks Gemini for the title of "the white book that was on the shelf behind me", and it answers correctly. She then tries a harder question, asking simply where her "hotel keycard" is, without giving the clue about the shelf. Gemini correctly answers that it's to the right of the music record.
  • Complex Multimodal: holding open a book, Bhatia asks Gemini what a diagram means, and it answers correctly.
  • Translation: Bhatia looks at a Spanish sign, and without telling Gemini what language it is, asks Gemini to translate it to English. It succeeds. To prove that the demo is live, Izadi then asks the audience to pick another language, someone picks Farsi, and Gemini successfully translates the sign to Farsi too.
  • Multi-Language Support: Bhatia speaks to Gemini in Hindi, without needing to change any language "mode" or "setting", and it responds instantly in Hindi.
  • Taking Action (Music): as an example of how Gemini on the glasses can trigger actions on your phone, Bhatia looks at a physical album she's holding and tells Gemini to play a track from it. It starts the song on her phone, streaming it to the glasses via Bluetooth.
  • Navigation: Bhatia asks Gemini to "navigate me to a park nearby with views of the ocean". When she's looking directly forwards, she sees a 2D turn-by-turn instruction, while when looking downwards she sees a 3D (though fixed) minimap showing the journey route (a rough sketch of this gaze-dependent switching follows this list).
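Google hasn't said how that navigation mode switch works, but the behavior suggests the renderer simply picks what to draw from head pitch. A minimal, hypothetical sketch, assuming a pitch threshold and names that are purely illustrative:

```python
from dataclasses import dataclass

# Assumed threshold: pitch angles below this (looking down) trigger the minimap.
LOOK_DOWN_THRESHOLD_DEG = -20.0

@dataclass
class NavigationFrame:
    mode: str      # "turn_by_turn" or "minimap"
    content: str

def select_nav_display(head_pitch_deg: float, next_instruction: str) -> NavigationFrame:
    """Choose what the HUD renders based on where the wearer is looking."""
    if head_pitch_deg <= LOOK_DOWN_THRESHOLD_DEG:
        # Looking down: show the fixed 3D minimap of the route.
        return NavigationFrame(mode="minimap", content="3D route overview")
    # Looking ahead: show a single 2D turn-by-turn instruction.
    return NavigationFrame(mode="turn_by_turn", content=next_instruction)

print(select_nav_display(-30.0, "Turn left in 100 m").mode)  # minimap
print(select_nav_display(5.0, "Turn left in 100 m").mode)    # turn_by_turn
```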
This isn't the first time Google has shown off smart glasses with a HUD, nor even the first time such a demo has focused on Gemini's Project Astra capabilities. At Google I/O 2024, almost one year ago, the company showed a short prerecorded demo of the technology.
Last year's glasses were notably bulkier than what was shown at TED2025, however, suggesting the company is actively working on miniaturization with the goal of delivering a product.
However, Izadi still describes what Google is showing as "conceptual hardware", and the company hasn't announced any specific product or timeline.
In October The Information's Sylvia Varnham O'Regan reported that Samsung is working on a Ray-Ban Meta glasses competitor with Google Gemini AI, though it's unclear whether this product will have a HUD.
If it does have a HUD, it won't be alone on the market. In addition to the dozen or so startups which showed off prototypes at CES, Mark Zuckerberg's Meta reportedly plans to launch its own smart glasses with a HUD later this year.
Like the glasses Google showed at TED2025, Meta's glasses reportedly have a small display in the right eye, and a strong focus on multimodal AI (in Meta's case, the Llama-powered Meta AI).
Unlike Google's glasses, which appeared to be controlled primarily by voice, Meta's HUD glasses will reportedly also be controllable via finger gestures, sensed by an included sEMG neural wristband.
Apple too is reportedly working on smart glasses, with apparent plans to release a product in 2027.
All three companies are likely hoping to build on the initial success of the Ray-Ban Meta glasses, which recently passed 2 million units sold and will see production vastly increased.
Expect competition in smart glasses to be fierce in coming years, as the tech giants battle for control of the AI that sees what you see and hears what you hear, and the ability to project images into your view at any time.