
Last December, I wore Google's glasses in several forms while they were still in development. You'll soon be able to get the final versions. When exactly, and for how much? We may find out more in a few days.
Meta may be the biggest tech company aiming for a place on your face in the form of glasses, but it's not the only one. Google is about to enter the race with a full lineup of smart glasses, the company's first return to everyday face-worn technology since Google Glass in 2013.
This time, the focus is almost entirely on artificial intelligence: Gemini will be the biggest reason for, and function behind, Google's Android XR glasses. Xreal, a maker of display glasses, will also have an additional mixed reality device called Project Aura.
This year's Google I/O developer conference on May 19 is approaching, and we should hear a lot more about Google's smart glasses strategy. But we already know plenty, since Google talked about and showed off these glasses last year. Now that it's 2026, all of these glasses should finally be arriving, and if you're even partially considering getting a pair of smart glasses, you'll want to know what they're all about.
Google, Samsung and Qualcomm are collaborating on Android XR, a new operating system for a whole range of mixed reality headsets, AI glasses, display-enabled glasses and, eventually, augmented reality glasses. The first product of this collaboration, the Samsung Galaxy XR, arrived last fall.
The Galaxy XR is essentially a virtual reality headset, but it's also a mixed reality computer, similar to the Apple Vision Pro and Meta Quest 3. It runs Android apps via Android XR.
Gemini's on-tap assistance will be the main application for the next wave of smart glasses. Just like Meta's Ray-Ban and Oakley glasses, which use Meta AI, Google's glasses will use Gemini and related Gemini apps, such as Nano Banana and NotebookLM.
Pop-up information on the display-enabled glasses will provide contextual details, such as live map data.
The display-free glasses will use built-in microphones and speakers to respond to AI prompts, handle live language translation, or play music and phone calls. The camera can take photos and videos, or activate Gemini Live mode for continuous recording and AI awareness of the world around you.
An additional line of display-enabled glasses, with a color display in one lens, will show footage captured on the glasses, surface phone notifications, play videos, or even provide live captions or subtitles. Some apps will also work on the glasses as an extension of what you do on your phone: Google Maps can show directions, and a tilt of the head brings up a map that appears on the ground in front of you, while Uber can show a driver's status.
CNET’s Patrick Holland demoed a prototype of the glasses last year, also at Google I/O.
Warby Parker, Korean eyewear brand Gentle Monster and European luxury group Kering are already official partners for Android XR glasses, meaning all three will be launching Android XR glasses lines. Expect plenty of designs and styles, much as Meta's eyewear partner, EssilorLuxottica, creates several frame designs under its Oakley and Ray-Ban brands.
Gucci smart glasses are expected via Kering, and there are sure to be more surprises. Samsung will likely be in the mix, too: although Samsung is already a partner helping make all of these other glasses (most likely by supplying camera and display components), it will announce its own Android XR glasses at some point as well.
Add to the mix Xreal, the maker of USB-connected display-enabled glasses, which is making its own Android XR-powered glasses called Project Aura (more on that below).
Just as Google has had numerous partnerships with watch brands for years via Wear OS, more could be coming from eyewear brands.
Project Aura, made by Xreal with Google, is a set of display glasses that can run Android XR apps like a full mixed reality headset. It's just part of what's coming over the next year.
Xreal's glasses work differently from other smart glasses, acting more like a mini VR headset than a pair of all-day glasses. Project Aura is a specialized set of Xreal glasses with a larger display and additional cameras that tether to a phone-sized processing puck. Wearing it (which I did last year), you can run apps and 3D experiences, and even use hand tracking like on a VR headset.
Project Aura runs the same apps as the Galaxy XR and uses the same chipset. It's really a kind of shrunken-down mixed reality experience, intended to serve both as a development tool for future Google AR glasses that might connect directly to phones and as an actual product. But it's not meant to be worn all day. Instead, like other Xreal glasses, it's a wearable display with audio, a sort of "headphones for your eyes," that can expand the view around you while you're on the move.
The big advantage Google has with Android XR should be how well these devices work with AI apps you may already use, and with the apps on your phone. On Android phones, these elements should feel more integrated with the phone's controls and apps, much like a smartwatch. On iOS, they should also work with Gemini services.
There are still no everyday smart glasses that connect deeply with the phones in our pockets, and Google's glasses could be the first. Apple may follow next year with its own glasses.
Google has already said that phone notifications should appear as interactive widgets on the glasses, but will more apps build deeper hooks? Will AI beyond Gemini be allowed? For now, Google says Gemini is the core AI service for its glasses. These glasses will also work with Wear OS watches.
Will people know when someone is wearing these glasses, and how comfortable will the AI privacy policies feel?
Meta has repeatedly run into problems over its handling of users' personal data, and inappropriate public use of its smart glasses cameras has led to backlash on social media. Meta's AI privacy policies are vague, and Meta is not a well-respected company when it comes to safety or privacy on social media, for very good reason.
Will Google do a better job? It's better liked, but it's also a company that already weaves ads into our personal data and is increasingly ingesting more of it, such as health and fitness information, for related AI services. Google will have to explain how it's handling responsibility as it moves glasses forward, and overcome public acceptance hurdles. Will the old "Glasshole" nickname come back to bite it?
We have no idea when these glasses will debut, other than "sometime in 2026." But expect more news starting at Google I/O on May 19. I'll be there, reporting on all the AI and smart glasses news as it happens. We should know more then.