
Meta Ray-Bans' New Live AI and Translation, Hands-On: Signs of AR Glasses to Come

I activated the Meta Ray-Bans' new real-time AI capabilities on a morning walk through Manhattan. It was a strange experience. The white LED in the corner of my eye stayed on as my glasses watched my life along with me. I asked awkward questions: about the pigeons, about the construction workers, about whether it knew what cars were nearby or who owned the trucks across the street. The answers I got were hit-or-miss, and sometimes no answers at all. Then my connection died, thanks to spotty Bluetooth in the city.

My first steps with an always-aware AI companion felt stranger and more sci-fi than anything I've experienced in the past year. Much like Google's recent demo of always-on Gemini-powered glasses, Meta's Ray-Bans – which are already widely available – are taking the next step toward becoming an always-aware assistant, or agent, as the AI field now calls it. Live AI and live translation, once turned on, keep running. The assumption is that the AI can see what you see, and maybe help you do something you don't know how to do.

Watch this: Meta Ray-Bans' real-time translation and Live AI, demonstrated

But the features also look like a preview of a new set of Meta glasses coming next year, which may have their own displays and maybe even gesture-control wristbands, judging from hints Mark Zuckerberg dropped on Threads last week after a story by Joanna Stern of The Wall Street Journal.

For now, Live AI feels like a strange glimpse of a more always-on, more intrusive AI future, and in my early attempts it felt more like a companion than a helper. Once translation gets going, however, it feels surprisingly helpful… even if it works with a bit of lag.

Meta's Ray-Ban glasses next to a phone showing the Meta AI settings page

Live AI mode is part of the Early Access feature set, and it's turned on and off separately.

Scott Stein/CNET

Live AI: A constantly listening, constantly watching companion

Turning on Live AI means starting continuous video recording. While the video isn't saved for you to watch later, it's processed by Meta's AI via your phone and relayed back to the glasses. The LED stays on to notify people that it's recording, but in my experience people don't notice the LED much, or don't seem to care. Everything you say can be interpreted by Meta AI, so forget about holding conversations with other people while it's on. Around the office I looked like a weirdo, talking to myself or appearing to talk to others (people would start talking to me, then realize I wasn't talking to them). Live AI can be paused by tapping the side of the glasses, though.

Ending Live AI can be done by saying "Stop Live AI," but sometimes Meta AI thought I was asking whether it was Live AI – a "Who's on first?" moment. I had to shout the command several times before it stopped.
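For readers curious about the plumbing, here's a rough mental model of what an always-on loop like this might look like. Meta hasn't published how the Live AI pipeline actually works, so everything below – the class, the one-frame-per-second cadence, the tap-to-pause flag, the stand-in glasses and assistant objects – is a hypothetical sketch of glasses streaming video through a phone to a multimodal model, not Meta's real code or APIs.

```python
import time

# Hypothetical sketch of an always-on "Live AI" loop: the glasses stream
# camera frames and mic audio through the phone to a multimodal model,
# and spoken replies come back through the glasses' speakers. The
# glasses and assistant objects are stand-ins, not Meta's actual APIs.

class LiveAISession:
    def __init__(self, glasses, assistant, fps=1.0):
        self.glasses = glasses        # camera + mic + speaker + LED
        self.assistant = assistant    # phone-hosted multimodal model
        self.interval = 1.0 / fps     # low frame rate to spare battery
        self.paused = False

    def run(self):
        self.glasses.led_on()         # the white LED signals recording
        try:
            # "Stop Live AI" (voice) ends the session; a side tap pauses it.
            while not self.glasses.stop_requested():
                if self.glasses.tapped():
                    self.paused = not self.paused
                if self.paused:
                    time.sleep(self.interval)
                    continue
                frame = self.glasses.capture_frame()  # not saved to disk
                speech = self.glasses.read_mic()      # may be empty
                # The model keeps rolling context of what it has seen, so
                # it can field questions like "what street am I on?"
                reply = self.assistant.observe(frame=frame, utterance=speech)
                if reply:
                    self.glasses.speak(reply)
                time.sleep(self.interval)
        finally:
            self.glasses.led_off()    # LED goes dark when the session ends
```

One consequence of a loop like this is obvious from my testing: a camera, mic, and model running on every tick is exactly the kind of workload that cuts battery life from hours down to about 30 minutes.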

CNET's Scott Stein takes a self-portrait wearing Meta Ray-Ban smart sunglasses

With Meta's Ray-Ban glasses on, it's hard for anyone to tell you're wearing smart tech… or talking to an AI.

Scott Stein/CNET

The challenge with Live AI is figuring out what to use it for. I walked around the office asking about the placement of the furniture and was told that everything seemed fine: "The room looks well designed and functional, and no noticeable changes are needed." I asked about a story I was writing on my laptop, and it said: "The text seems to be a cohesive and well-structured piece, without any unnecessary parts." I kept trying to get constructive feedback, and it was hard to get anything that wasn't generic, although it did point out a few things worth noting and summarized my points.

When I walked outside, it told me which street I was on, but it was wrong – I corrected it, and it just acknowledged the correction and moved on. It knew which Chase bank I was looking at and told me the bank's hours, and it knew Joe's Pub when I was standing at the entrance to the Public Theater, but it couldn't tell me what was playing that night. It recognized ordinary pigeons, mistook a car parked on the side of the road for a Mercedes (it was a Lincoln), and for some reason recommended a bar down the street that, according to Meta AI, now "no longer exists."

Live AI is still in early beta, and I'm still figuring out what I'd use it for. The combination of an early-beta feel and an unclear purpose can seem ridiculous. Or unexpectedly profound. Either way, keeping it running takes a toll on battery life: 30 minutes of use instead of the hours Ray-Bans normally last.

Real-time translation mode on your phone, next to your Meta Ray-Ban glasses

Real-time translation requires downloading a separate language pack to work.

Scott Stein/CNET

Translation: Useful, works in several languages

Real-time translation works much the same way, starting with a request. But you need to download a language pack for each specific language pair you want to translate: Spanish to English, for example. Only Spanish, French, Italian, and English are currently supported, which is disappointing.
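As a rough mental model of why packs exist at all, a per-pair download presumably lets translation run without a network round trip for every sentence. Here's a small hypothetical sketch of that flow – the pair names, the pack store, and the transcribe/translate calls are all illustrative assumptions, not Meta's actual software.

```python
# Hypothetical sketch of the language-pack flow for live translation.
# The pair codes, the pack store, and the model calls below are
# illustrative assumptions, not Meta's actual software.

SUPPORTED_PAIRS = {"es-en", "fr-en", "it-en"}  # Spanish/French/Italian <-> English

class TranslationSession:
    def __init__(self, store):
        self.store = store  # on-device store of downloaded language packs

    def ensure_pack(self, pair: str) -> None:
        """Download the pack once, before the conversation starts."""
        if pair not in SUPPORTED_PAIRS:
            raise ValueError(f"language pair {pair!r} isn't supported yet")
        if not self.store.has(pair):
            self.store.download(pair)  # one-time fetch per language pair

    def translate_utterance(self, pair: str, audio: bytes) -> str:
        self.ensure_pack(pair)
        model = self.store.load(pair)    # stand-in speech translation pack
        text = model.transcribe(audio)   # what the other person said
        return model.translate(text)     # spoken back in your ear

# Usage: set up Spanish-to-English before the chat, then feed the session
# whatever the glasses' mics pick up.
# session = TranslationSession(store=LocalPackStore())
# session.ensure_pack("es-en")
```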

I chatted with my CNET colleague Danny Santana in bustling Astor Place, near our New York office. He spoke in Dominican Spanish and I spoke in English. The translated replies arrived in my ears within seconds, and over the course of our chat I felt like I understood him. It wasn't perfect: the translation AI didn't seem to catch some phrases or idioms. And the lag made it hard to know when a translation was finished or whether more was coming, which made it hard to time my replies while Danny waited patiently across the table.

Meta also shows a live transcript of the conversation in the Meta View phone app, which you can refer to while wearing the glasses to show the person you're talking with, or to clarify what was said.

Translation on the Ray-Bans feels more immediately useful than Live AI, but that's also because I haven't figured out what Live AI is best for yet. Maybe I could turn it on while I'm cooking, assembling IKEA furniture, or playing a board game? I don't know. Help me out here, Meta. Also, with no heads-up display, Live AI leaves me guessing what the glasses can actually see.

Of course, you could also just use Google Translate on your phone. Meta's glasses handle translation much the way a pair of earbuds would. The glasses can also see and translate written text, but that's separate from the conversational, real-time translation mode.

Wearing Meta's Orion AR glasses and wristband

Meta's moonshot AR glasses, Orion, have their own neural-input wristband and 3D heads-up display. When will these features trickle down to Ray-Ban glasses?

Celso Burgatti/CNET

What's next: displays or gestures? Or both?

Meta's year-old Ray-Ban glasses have now gained multiple major AI features, each changing the equation in surprising ways. The latest Live AI features seem to push the limits of the hardware, though, shortening battery life. I wish I had a better way of knowing what the AI could see, or could point with my hand at what I want to ask about.

Future glasses may well evolve in that direction, with both heads-up displays and gesture recognition. Meta's CTO, Andrew Bosworth, acknowledged in an end-of-year conversation with me that these are the next steps – though the timeline is unclear. Meta's Orion glasses, which the company showed off earlier this year, are ambitious future glasses with 3D displays and a wrist-worn gesture tracker that recognizes finger taps and pinches, and they're still a few years away from reality. But Meta's wrist-worn neural band could arrive sooner, perhaps as a way for camera-equipped glasses to recognize gestures. As for displays in smart glasses, Meta could explore smaller heads-up displays for glanceable information before moving on to larger, more immersive AR displays. Bosworth pointed to next-generation AR glasses in a recent blog post, but will any of that land in next year's generation of Ray-Ban-like glasses?

"Gesture-based control requires a downward-facing camera and maybe some lighting," Bosworth said of future Meta glasses. "You can do it in the current Ray-Ban Metas – in Live AI, we played with it – but you just have to do it in the camera's field of view." He acknowledged that an EMG wristband could be added to work with the glasses sooner or later: "Now you're adding a device that has to be charged, which is extra cost, it's extra weight, but it's so convenient." But Bosworth suggested an EMG band only becomes truly useful once the glasses have a display – something the Ray-Bans currently lack. When Ray-Bans do get some kind of heads-up display, an input band will likely debut alongside it. I've already seen similar ideas attempted in other products.
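Bosworth's point about staying in the camera's field of view suggests a simple approach: spot a pointing hand in the frame and crop the image toward whatever it points at, so the AI answers about that object instead of the whole scene. Here's a hypothetical sketch of that idea – the hand detector is a stand-in for any off-the-shelf hand-pose model, the frame is assumed to be an H x W x 3 image array, and the distance constants are arbitrary illustration values, not anything Meta has shipped.

```python
# Hypothetical: narrow a Live AI query to whatever the wearer points at,
# using only the forward camera (no EMG band). detect_hand is a stand-in
# callback returning a fingertip position and unit pointing direction.

def pointed_region(frame, detect_hand, reach_px=200, half_box=150):
    hand = detect_hand(frame)      # fingertip + direction, or None
    if hand is None:
        return frame               # no gesture: ask about the whole view
    (x, y), (dx, dy) = hand.fingertip, hand.direction
    # Step a fixed distance along the pointing direction, then crop a box
    # around that spot so the model mostly sees the pointed-at object.
    h, w = frame.shape[:2]
    cx = max(0, min(w - 1, int(x + reach_px * dx)))
    cy = max(0, min(h - 1, int(y + reach_px * dy)))
    return frame[max(0, cy - half_box):cy + half_box,
                 max(0, cx - half_box):cx + half_box]
```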

Then there's the issue of battery life: how will these more always-on glasses run for more than a few hours at a time? And how much will all of this drive up the cost of the next generation of glasses?

At the same time, Meta's AI may also move into areas like fitness, which would bridge into VR, where Meta has another version of Meta AI. "It would be highly unusual if, a year from now, the AI you're using to track your steps through the world and advise you doesn't also realize you're doing these exercises (in VR)," Bosworth said.

As Live AI continues to evolve, better ways to add gestures may become an absolute necessity. Bosworth sees pointing at things as a key way to train AI to get better in the future. "As artificial intelligence gets better, the need for these simpler, more intuitive gestures has actually increased significantly," he said.

Meta's Ray-Bans don't let me point at things right now, which makes Live AI a little confusing to use at times. It may take newer hardware, with gestures and displays added, to make the next leap.
