How Google's new AI glasses feel

Stephan Scheuer

Mar 05, 2026

Barcelona. Google has chosen one of the biggest tech stages of the year for its next attempt in the smart glasses business. The company is showing prototypes of its new Android XR platform at the Mobile World Congress (MWC) in Barcelona.

For Google, this is more than just a product demo. The company is preparing its re-entry into a market that it shaped more than a decade ago, and in which it is also one of the most prominent failed pioneers. A version with a speaker and camera is due to launch this year. An improved version with a display in the lens is due to follow next year.

Handelsblatt was able to try out one of the early prototypes at the exhibition centre.

A quick self-test at the Google stand

Google has set up a small room for the demonstration. A table with books, a record, a work of art on the wall. The glasses are supposed to show what they see.

The prototype looks like a normal pair of glasses. A small display appears in front of my right eye. I look at a picture on the wall. The camera recognises the subject. Shortly afterwards, the glasses display information.

Then a record on the table: the glasses identify the album and seconds later start the appropriate song via the integrated speakers.

Interaction takes place via voice or a button on the temple. While the Gemini AI assistant responds, a small speech bubble appears on the display: as text in the field of vision and simultaneously as audio.

