Samsung’s New Galaxy Phones Lay Groundwork for Headsets and Glasses to Come
Samsung and Google are working on a mixed reality headset similar to Apple's Vision Pro, running Android XR and Google Gemini. We already knew this, and even got a demo last year. But Samsung's phone-centric winter Unpacked event also revealed more. Specifically, Google's partnership with Samsung's AI ecosystem could be the missing piece that brings it all together. This AI-infused experience will appear on next-generation VR and AR headsets this year, but it's also expected on the Galaxy S25 phones and the glasses that will connect to them.
In a sense, I got a glimpse of that future at the end of last year.
Visual AI that works in real time
Samsung gave a brief look at its upcoming VR/AR headsets and glasses at its latest Unpacked event, but that's basically all we know so far. Still, what stood out to us was Samsung's demonstration of real-time AI that can see what's on the phone's screen or through its camera, expected to arrive in 2025.
Project Moohan (meaning "infinity" in Korean) is a VR headset with passthrough cameras that blend the virtual and the real, much like the Vision Pro or Meta's Quest 3. The design feels a lot like Meta's now-discontinued Quest Pro, but with better specs. The headset features hand and eye tracking, runs Android apps via the Android XR operating system that fully launches later this year, and layers Google Gemini AI throughout. Google's Project Astra, the technology that enables real-time assistance on glasses, phones and headsets, is coming soon to the Samsung Galaxy S25 series of phones. But I've already seen it working on my face.
In last year's demo, Gemini assisted me as I looked around a room, watched a YouTube video and did just about anything else. The assistant has to be started in a live mode before it can see and hear what I see and hear; there's also a pause mode that temporarily stops the live assistance.
Samsung showed off similar real-time AI capabilities on its Galaxy S25 phones and promised more. I expect it to work while watching YouTube videos, much like my Android XR demo did. According to executives at Samsung and Google working on Android XR, it could even provide real-time help while playing games.
Better battery life and processing power…for glasses?
Samsung and Google also confirmed that they're developing smart glasses that use Gemini AI, competing with Meta's Ray-Ban glasses and an emerging wave of other eyewear. AR glasses are apparently in development too.
While Project Moohan is a standalone VR headset with its own battery pack and processor, much like Apple's Vision Pro, the smaller smart glasses Google and Samsung are developing, and any glasses that come after them, will rely on a connected phone for processing assistance. That's how smart glasses like Meta's Ray-Bans work now.
But more features could mean more intensive phone processing. Real-time AI may become an increasingly used feature, with phones constantly working to assist these glasses. Better processing, better graphics and, most important, improved battery life and cooling sound to me like the way to make these phones the ultimate pocket-sized PCs for glasses.
The personal data sets these AI devices require
Samsung also announced a vague-sounding Personal Data Engine that Samsung and Google's AI will use to store personal data: a connected place where the AI could potentially draw richer conclusions and make associations across everything in your life.
How this works, how it will be secured and what its limits are all remain very unclear. But it sounds like a repository of personal data that Samsung and Google's AI can train on and use across connected extensions, including watches, rings and glasses.
Camera-enabled AI wearables are only as good as the data that assists them, which is why so many of them now feel clunky and weird to use, including Meta's Ray-Ban glasses in their AI modes. These AI devices often have trouble understanding things that existing apps already know better. Google and Samsung are clearly working to solve that problem.
Do I want to trust Google, Samsung or anyone else with that process? How will these phones and future glasses make the relationship between AI and our data clearer and more manageable? Google's I/O developer conference will likely discuss Android XR and Gemini advances in more depth, but it feels like we're seeing one shoe drop here, with others to follow.
Samsung built Project Moohan as its first headset, with glasses to follow later. Google and Samsung are expected to share more details at the developer-focused Google I/O conference around May or June, and a fuller picture will likely emerge at Samsung's next expected Unpacked event in the summer. By then, we should know more about why this seemingly boring new wave of Galaxy S25 phones may be building an infrastructure that plays out in clearer detail by the end of the year, and perhaps well beyond.