
Samsung’s new Galaxy phones lay the foundation for the headsets and glasses to come


Samsung and Google are working on an Apple Vision Pro-like mixed reality headset powered by Android XR and Google Gemini. We already knew that, and even got a demo of it late last year. But Samsung also revealed more about its phone focus at its winter Unpacked event. Specifically, the joint Google-Samsung AI partnership could be the missing piece that ties it all together. This AI-infused experience will arrive on next-gen VR/AR headsets this year, but it’s also expected to run on the Galaxy S25 phones and the glasses that will follow.

In a sense, I actually got a preview of what the future holds at the end of last year.

A collection of Samsung products on a chart, with AI layers underneath

Samsung’s vision is for its products to all be connected through AI, and for that AI to be consistent across them.

Samsung

AI that sees, in real time

Samsung only briefly touched on its upcoming VR/AR headsets and glasses at its most recent Unpacked event, but we pretty much knew about them already. What matters is that Samsung’s real-time AI, which can see things on your phone or through cameras, is exactly the direction these devices are headed, and it’s expected to arrive over the course of 2025.

Project Moohan (meaning “infinity” in Korean) is a VR headset with passthrough cameras that blends virtual and real worlds, much like the Vision Pro or Meta Quest 3. The design looks a lot like the discontinued Meta Quest Pro, but with far better specs. The headset has hand and eye tracking, runs Android apps via the Android XR operating system that will be fully detailed later this year, and uses Google Gemini AI as an assistive layer throughout. Google’s Project Astra technology, which enables real-time assistance on glasses, phones and headsets, is debuting on the Samsung Galaxy S25 series. But I already saw it on my face.

In my demos last year, Gemini could assist me while I walked around a room, watched YouTube videos, or did just about anything else. The headset’s Live AI mode had to be turned on for this to work, and then it could see and hear whatever I was looking at or listening to. There were also pause modes to temporarily stop the live assistance.

Samsung showed off what looked like similar real-time AI functionality on the Galaxy S25 phones, and promised more. I expect it to be able to work while watching YouTube videos, just like it did in my Android XR demo. According to Samsung and Google executives working on Android XR, it could even be used for live assistance while playing games.

A dog being recognized by an AI service on a Samsung phone’s screen

Gemini’s quick visual recognition skills should start to feel the same between glasses and phones.

Samsung

Better battery life and processing… for glasses?

Samsung and Google have also confirmed they’re working on smart glasses, also using Gemini AI, to compete with Meta’s Ray-Bans and a wave of other emerging eyewear. Augmented reality glasses appear to be in the works as well.

While Project Moohan is a standalone VR headset with its own battery pack and processors, much like Apple’s Vision Pro, the smaller smart glasses Google and Samsung are working on — and any glasses after that — will rely on connected phones for processing to work. That’s how smart glasses like Meta’s Ray-Bans already function.

But more features will likely demand more intensive phone processing. Live AI could become an increasingly used feature, with phones relied on to constantly work in support of these glasses. Better processing and graphics, and more importantly improved battery life and cooling, seem like ways to make these phones better pocket computers for eventual glasses.

A key image from Samsung shows people surrounded by webs of personal data

Personal data clouds are what Samsung and Google will rely on to power smarter AI assistants on glasses and phones.

Samsung

A personal dataset that these AI tools will need

Samsung also announced a fuzzy-sounding personal data engine that will leverage Google and Samsung AI, storing your personal data in a place where AI can develop richer inferences and connections across all the things that are part of your life.

How this will be implemented or secured, or what its limits are, remains very unclear. But it appears to act as a repository of personal data that Samsung and Google’s AI can learn from and use across an extended range of connected products, including watches, rings and glasses.

Camera-equipped AI wearables are only as good as the data they can draw on, which is why many of these devices still feel limited and awkward to use, including Meta’s Ray-Bans in their AI modes. These AIs usually hit a wall when it comes to knowing what your existing apps already know best. Clearly, Google and Samsung are trying to fix that.

Do I want to trust this process to Google, Samsung, or anyone else? How will these future phones and glasses make the relationship between AI and our data clearer and more manageable? It feels like we’re seeing one piece of the puzzle here, with more likely to emerge at the Google I/O developer conference, which should discuss the advances of Android XR and Gemini in more depth.

Samsung is making Project Moohan its first headset, with glasses to follow after that. Expect Google to hammer out more details with Samsung at its developer-focused I/O conference around May or June, and perhaps a fuller rundown in the summer at Samsung’s expected next Unpacked event. By then, we may know a lot more about why this seemingly boring new wave of Galaxy S25 phones is laying infrastructure whose purpose will come into clearer focus by the end of the year… or even later.
