I Witnessed the Future of Smart Glasses at CES. And It’s All About Gestures
In a corner of the bustling CES 2025 show floor, I felt like an orchestra conductor. As I moved my arms subtly from side to side, notes sounded from the cello displayed on the big screen in front of me. The faster I moved my arm, the faster the bow slid across the strings. I even earned a round of applause from other booth attendees after one particularly fast performance.
That’s what it’s like to use the Mudra Link wristband, which lets you operate your devices with gesture controls. Motion control is nothing new; I remember trying devices like the Myo armband back in 2014. The difference now is that such devices have a greater reason to exist thanks to the arrival of smart glasses, which seem to be everywhere at CES 2025.
Startups and big tech companies have been trying to develop smart glasses for more than a decade. But the arrival of artificial intelligence models that can handle both speech and visual input makes them feel more relevant than ever. After all, digital assistants can be more helpful if they can see what you’re seeing and answer questions in real time, which is the idea behind Google’s Project Astra prototype glasses. Smart glasses shipments are expected to grow by 73.1% in 2024, according to a September IDC report, a further sign that tech-equipped glasses are starting to catch on.
Last fall, Meta showed off its prototype AR glasses, called Orion, which are controlled with hand gestures and neural input via a wristband. Other startups demonstrated similar experiments at last year’s Augmented World Expo.
At CES, it’s clear that companies are putting a lot of thought into how we’ll navigate these devices in the future. In addition to the Mudra Link band, I found a few other wearables designed to work with glasses.
Take Afference’s ring, for example, which applies neural haptics to your fingers to provide tactile feedback when using gesture controls. It’s meant to work with devices like smart glasses and headsets, but I tried a prototype paired with a tablet to see how the technology works.
In one demo, I played a simple mini-golf game that required me to pull my arm back to wind up a shot and then release it to launch the ball. The further I pulled back, the stronger the sensation on my fingers became. Adjusting the brightness and audio sliders felt similar; as I turned the brightness up, the sensation on my finger grew more pronounced.
It was a simple demo, but it helped me understand how companies might apply haptic feedback to menus and apps in mixed reality. Afference didn’t mention any specific partners it’s working with, but it’s worth noting that Samsung Next participated in Afference’s seed funding round. Samsung launched its first health-tracking smart ring last year and announced in December that it’s building the first headset to run on the newly announced Android XR platform for upcoming mixed reality devices.
The Mudra Link wristband works with the newly announced TCL RayNeo X3 Pro glasses, which will launch later this year. I briefly tried using the Mudra Link to scroll through the app menu on the RayNeo glasses, but the software hasn’t been finalized yet.
I spent most of my time using the wristband to manipulate graphics on a large presentation screen. The cello example was the most eye-catching demo, but I was also able to grab a cartoon character’s face, stretch it and move it around the screen by waving my hand and pinching my fingers.
Halliday’s smart glasses, which work with an included navigation ring, were also unveiled at CES. While I didn’t get a chance to try the ring, I briefly used the glasses to translate speech in real time, and the translated text appeared in my field of vision almost instantly, even on a noisy show floor.
Without gestures, there are typically two main ways to interact with smart glasses: on-device touch controls and voice commands. The former is great for quick interactions like swiping through menus, launching apps or answering calls, while the latter is best for summoning and commanding virtual assistants.
Gesture controls make it easier to navigate the interface without having to raise your hands to your face, speak out loud or hold an external controller. Still, there’s a certain awkwardness to using gestures to control a screen that’s invisible to everyone but the person wearing the glasses. I can’t imagine waving my hands around in public without any context.
Meta is already moving toward gesture-controlled glasses; its chief technology officer, Andrew Bosworth, recently told CNET that any future display-enabled glasses would likely require gestures.
If CES is any indication, 2025 is shaping up to be a big year for smart glasses, and gesture controls will undoubtedly play a role in how we navigate these new spatial interfaces.