
Teased at Google I/O, Project Aura is a collaboration between Xreal and Google. It's my second Android XR device (the first being Samsung's Galaxy XR headset), and it's expected to launch in 2026. Wearing it, I understand why the term "smart glasses" doesn't quite fit.
Is it a headset? Smart glasses? Both? Those were the questions on my mind as I held Project Aura in my hands for a recent demo. It looks like a pair of large sunglasses, except for the wire hanging off the left side, leading to a battery pack that doubles as a trackpad. When I asked, Google representatives told me they consider it a headset masquerading as glasses. They have a term for it, too: wired XR glasses.
It can connect wirelessly to a laptop and create a giant virtual desktop in my own space, with a 70-degree field of view. My first task is to launch Lightroom on the virtual desktop while opening YouTube in another window. Next, I play a 3D tabletop game where I can pinch and drag the board to zoom in and out. I look at a piece of art on the wall and invoke Circle to Search; Gemini tells me the name of the artwork and the artist.
I've done all this before on the Vision Pro and Galaxy XR. This time, though, my head isn't stuffed inside a bulky headset. If you wore these in public, most people wouldn't notice. But this isn't augmented reality, which overlays digital information onto the real world. It's much more like using the Galaxy XR, with apps floating in front of you and around you.
A Google representative told me that everything I tried on Project Aura was originally developed for the Galaxy XR. None of the apps, features, or experiences had to be redesigned for Project Aura's form factor. That's huge.
XR has a big app problem. Take the Meta Ray-Ban Display and the Vision Pro: both launched with few third-party apps, giving consumers little reason to wear them. Developers, meanwhile, have to pick and choose which of these platforms to invest in building apps for. That leaves little room for small companies with big ideas to compete or experiment.
That's what makes Android XR compelling. Smaller players like Xreal get access to apps developed for Samsung's headset. Those Android apps will also work on the AI glasses launching next year from Warby Parker and Gentle Monster.
"I think this is probably the best thing for all developers. You don't see fragmentation anymore. I think there will be more and more devices coming together. That's the whole goal of Android XR," says Chi Xu, CEO of Xreal.
Sliding on Google's latest prototype AI glasses, I played around with an Uber demo in which a fictional version of me lands on a flight at JFK Airport. A representative calls an Uber on a phone, and I see the Uber widget pop up on the glasses' display, showing the estimated pickup time, my driver's license plate, and the car model. When I look down, a map of the airport appears with real-time directions to the concourse.
All of this is powered by the Uber Android app, meaning Uber didn't have to code an Android XR app from scratch. In theory, users can just pair the glasses and start using the apps they already have.
When I'm prompted to ask Gemini to play some music, the YouTube Music widget pops up, displaying a funky jazz mix title and media controls. It, too, runs off the YouTube Music app on a paired Android phone.
I'm asked to tell Gemini to take a picture with the glasses. A preview appears on the display and on a paired Pixel Watch. The idea is that smartwatch integration gives users more options. Say someone opts for audio-only glasses with a camera: they can still snap a photo and see how it looks on their wrist. This will work with any Wear OS-compatible watch.
I also try live translation, where the glasses detect which language is being spoken. I make a video call via Google Meet. I have Nano Banana Pro add K-pop elements to a photo I've taken. I try out a second prototype with displays in both lenses, which allows a larger field of view. (That one isn't launching next year.) I watch a 3D video on YouTube.
It's all impressive. I hear plenty of sweet talk about how Gemini is the true killer app. But what really surprised me was learning that, next year, Android XR glasses will support iOS.
"The goal is to get this capability of having multimodal Gemini in your glasses to as many people as possible. If you're an iPhone user and you have the Gemini app on your phone, that's great news. You'll get the full Gemini experience there," says Juston Payne, Google's director of product management for XR.
Payne points out that this will broadly hold true across Google's iOS apps, like Google Maps and YouTube Music. The restrictions on iOS will mostly involve third-party apps, but even there, Payne says, the Android XR team is exploring workarounds. At a time when wearable ecosystem lock-in is at an all-time high, this is a breath of fresh air.
Google leaning on its existing Android ecosystem is a smart move, one that opens its APIs to developers rather than walling them off. It also ratchets up pressure on Apple, which has fallen behind in both AI and glasses. Making things interoperable across device form factors? Honestly, that's the only way a device from a smaller player like Project Aura stands a chance.
"I know we can make these glasses smaller and smaller in the future, but we don't have that ecosystem," Xu adds. "There are only two companies in the world right now that can have a real ecosystem: Apple and Google. Apple won't work with others. Google is the only option for us."
Google is trying to avoid the mistakes of its past. It's deliberately partnering with other companies to build the hardware. It's moving away from the divisive design of the original Google Glass. It's leaning on apps that already exist. Its prototypes explore multiple form factors: audio-only, and displays in one or both lenses.
Payne doesn't quibble when I ask the big cultural question: how do you discourage glassholes?
"There's a very bright, pulsing light whenever it's recording anything," Payne says. "So if the sensor fires with the intent of capturing anything, it'll signal that to everyone around you." That includes any Gemini queries involving the camera. The on/off switch will have clear red and green markings, so users can prove to bystanders that the glasses aren't recording. The permission frameworks, privacy policies, encryption, data-retention rules, and security safeguards found in Android and Gemini will also apply, Payne says.
"There's going to be a whole process to get certain access to the sensors, so we can avoid the things that could happen if someone decides to use the camera in a bad way," Payne says, noting that Google is taking a conservative approach to giving third parties camera access.
On paper, Google is making smart moves that address many of this category's inherent challenges. It all sounds good, but it's easy to sound good before the glasses actually launch. A lot can change between now and then.