New Ray-Ban Meta Smart Glasses Have Llama 2-Based AI Voice Assistant


The Ray-Ban Meta smart glasses will have the newly announced Meta AI assistant onboard, though for US buyers only at first.

At Meta Connect today the company announced Ray-Ban Meta as the successor to Ray-Ban Stories, the camera glasses launched in 2021 that let you capture hands-free first-person photos and videos, take phone calls, and listen to music.

The new Ray-Ban Meta smart glasses have improved camera quality, more microphones, water resistance, and better weight distribution for improved comfort. They can also go live on Instagram. However, just like the original, they do not have a display of any sort, only an internal LED above the right eye.

But arguably their most interesting new feature relates to another thing Meta announced today: Meta AI. On mobile, Meta AI will be a text-based conversational assistant, available in Messenger and WhatsApp. But in the Ray-Ban Meta smart glasses you’ll be able to talk to the assistant by saying “Hey Meta”, and you’ll hear a verbal response, completely hands-free.

Meta AI should be much more advanced than the current Alexa, Siri, or Google Assistant because it's powered by Meta's Llama 2 large language model, the same kind of technology that powers ChatGPT.

ChatGPT is fairly verbose by default, though. Meta says it has optimized Meta AI on smart glasses for brevity, keeping responses short and to the point.

Meta AI will be available on Ray-Ban Meta smart glasses in the US at launch, and Meta says it will expand availability over time.

After a software update sometime next year, Meta says the assistant on the glasses will also be able to answer questions about what you’re currently looking at, by feeding the camera input to a future multi-modal version of Llama.