Google Glasses Are Coming Again: Here’s What to Expect


Last December, I wore Google Glasses in several forms while they were still under development. Soon you’ll be able to get your hands on the final versions. When, exactly, and for how much? We may find out more in just a handful of days.

While Meta has been the biggest tech company aiming for a place on your face in glasses form, it’s far from the only one. Google’s about to enter the race with a whole range of smart glasses, the company’s first return to everyday face tech since Google Glass in 2013.

This time, the focus is almost entirely on AI. Gemini will be the core of what makes Google's Android XR glasses work, but they'll come in a wide range of designs: Warby Parker, Gentle Monster, Kering Eyewear and Samsung are all expected to have their own models. Xreal, a maker of display glasses, will have an additional plug-in mixed reality device called Project Aura, too.


This year’s Google I/O developer conference is just around the corner on May 19, and we should hear a lot more about Google’s smart glasses strategy. But we already know a lot, since Google talked about and demoed these glasses last year. Now that we’re in 2026, all these glasses should finally arrive, and if you’ve even been half-thinking about getting a pair of smart glasses, you’ll want to see what they’re all about.


All about Gemini

Google, Samsung and Qualcomm have been collaborating on Android XR, a new OS for a whole range of mixed reality headsets, AI glasses, display-enabled glasses and eventually augmented reality glasses. The first product of this collaboration, Samsung Galaxy XR, arrived last fall. 

Galaxy XR is very much a VR headset, but also a mixed reality computer, similar to the Apple Vision Pro and the Meta Quest 3. It runs Android apps via its Android XR OS, and it has a Gemini AI assistant that can respond to voice and run live, seeing whatever's on the device's screen and in the real world via its external cameras.


That on-tap Gemini assistant will be the key app for the next wave of smart glasses. Much like Meta's Ray-Ban and Oakley glasses, which use Meta AI, Google's glasses will use Gemini, along with related Gemini apps like Nano Banana and NotebookLM.

Pop-up information on the display-enabled glasses will offer contextual details, like live map data. (Image: Google)

The display-free glasses will use microphones and built-in speakers to respond to AI prompts, handle live language translation, or play music and phone calls. A camera can take photos and videos, or activate a Gemini Live mode for continuous recording and AI awareness about the world. 


An additional line of display-enabled glasses, with a color display in one lens, will show snapshots taken on the glasses, show phone notifications, play videos or even provide live assistive captioning or translation. Certain apps will also work on the glasses as extensions of what you’re doing on your phone: Google Maps can show directions and maps displayed on the ground in front of you with a head tilt, or Uber can show driver status.

CNET’s Patrick Holland trying on a prototype model of the glasses last year, also at Google I/O. (Photo: Lexy Savvides)

Three (or more) design partners

Warby Parker, Korean fashion eyewear brand Gentle Monster and European eyewear brand Kering are already official Android XR glasses partners, meaning all three will launch lines of Android XR glasses. Expect lots of designs and fashion riffs, much like how Meta’s glasses partner EssilorLuxottica makes many frame designs under its Oakley and Ray-Ban brands.


Gucci smart glasses are expected via Kering, and there are sure to be more surprises. Also, Samsung is likely in the mix. Even though it's already a partner helping make all these other glasses (likely by providing camera and display components), Samsung is reportedly going to announce its own Android XR glasses at some point, too. 

Add to the mix Xreal, a manufacturer of USB-tethered display-enabled glasses, which is making its own Android XR mini-computer called Project Aura (more on that below).

Much like Google’s many partnerships with watch brands years ago via Android Wear, more glasses brands could come aboard. 


Project Aura, made by Xreal and Google, is a set of display glasses that can run Android XR apps like a full mixed reality headset. It's just part of what's coming next year. (Image: Google)

A separate sort of AR glasses experience, Project Aura

The Xreal-made glasses work differently from the other smart glasses, acting more like a mini VR headset than an all-day set of eyewear. Project Aura is a specialized set of Xreal glasses with a larger display and extra cameras that plug into a processing puck the size of a phone. Wearing them (which I did last year), you can run apps and 3D experiences and even use hand tracking like a VR headset.

Project Aura runs the same apps as the Galaxy XR and uses the same chipset. It’s truly a sort of shrunken-down mixed reality experience, aiming to serve as a development tool for future Google AR glasses that might connect directly to phones as well as an actual product. But it’s not meant to be worn all day. Instead, like Xreal’s other glasses, it’s a sort of “headphones for your eyes” wearable display with audio that can extend displays out around you on the go.

The big difference: How well they’ll work with Google and Android

Google’s big advantage with Android XR should be how well these devices work with AI apps you might already use or with apps on your phone. On Android phones, these should feel more deeply integrated with phone controls and apps, like a smartwatch. With iOS, they should also work with Gemini services.


There still haven’t been everyday smart glasses that connect deeply with the phones in our pockets, and Google’s should be the first. Apple might follow next year with glasses of its own.

Google’s already said phone notifications should appear as interactive widgets on the glasses, but will more apps build deeper hooks? And will AI services beyond Gemini be allowed? For now, Google has said Gemini is the primary AI service for its glasses. The glasses will also work with Wear OS watches.

Will you know who’s wearing these glasses, and how comfortable will the AI privacy policies feel? (Photo: Scott Stein/CNET)

Will Google solve the privacy and social acceptance issues?

Meta has repeatedly run into trouble over its handling of users’ personal data, and inappropriate public use of its smart glasses cameras has led to social media backlash. Meta’s AI privacy policies are murky, and Meta’s not a company that’s respected for social media safety or privacy, with very good reason.

Will Google do better? It’s considered more reputable, but it’s also a company that already blends ads into our personal data and is increasingly swallowing up more data, like health and fitness, for its connected AI services. Google will have to explain how responsible it’ll be with glasses going forward, and overcome public acceptance factors. Will the “Glasshole” moniker come back to bite it?

Price and release date unknown

We have no idea when these glasses are coming, other than “sometime in 2026.” But expect more news starting at Google I/O on May 19. I’ll be there, and we’ll be reporting on all the AI and smart glasses news as it happens. We should know more then.

