Aria Gen 2 adds head tracking, eye tracking, hand tracking, and audio to Meta’s research glasses, and it’s already being tested to help vision-impaired people navigate indoors.
Meta’s Aria glasses are not products, nor are they prototypes of future products. They’re displayless glasses designed to support AR, AI, and robotics research by providing rich first-person visual data from onboard cameras.
The original Project Aria glasses from 2020 featured a somewhat similar camera suite: a color camera, two wide-angle outward-facing greyscale cameras, and two inward-facing eye-tracking cameras, as well as a microphone array. None of this sensor data was processed on-device, though. Instead, the glasses simply recorded it all to onboard flash memory for later processing on PCs and servers by researchers.
As well as upgrading these cameras and adding an extra one, Aria Gen 2 includes a highly efficient custom chip, developed by Meta, that processes their output on-device while drawing very little power. This enables onboard positional tracking, eye tracking, and hand tracking, as well as speech recognition.
Aria Gen 2 also adds audio output via open-ear speakers, a heart rate sensor, and an additional microphone that lets the system distinguish the wearer’s voice from other sounds.
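Meta hasn’t detailed how this own-voice discrimination works. One plausible approach, sketched below purely as an illustration, is to compare levels between a microphone strongly coupled to the wearer and the outward-facing array: the wearer’s own speech should register much louder at the former. All names and thresholds here are hypothetical.

```python
import numpy as np

def is_wearer_speech(wearer_mic_frame, array_frame, threshold_db=6.0, eps=1e-8):
    """Crude own-voice detector (illustrative only): the wearer's speech
    should be much louder at a microphone coupled to the wearer than at
    the outward-facing array. Frames are 1D arrays of audio samples;
    the 6 dB threshold is a made-up placeholder, not a real spec.
    """
    wearer_rms = np.sqrt(np.mean(wearer_mic_frame ** 2)) + eps
    array_rms = np.sqrt(np.mean(array_frame ** 2)) + eps
    level_difference_db = 20 * np.log10(wearer_rms / array_rms)
    return level_difference_db > threshold_db
```

A real system would almost certainly combine a cue like this with a trained voice activity model, but the level comparison captures why a dedicated extra microphone helps at all.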
Meta introducing the Aria Gen 2 glasses.
The addition of the custom Meta chip and audio output enables Aria Gen 2 to run simple applications on-device.
Meta says the glasses are capable of 6 to 8 hours of “continuous use” and weigh 75 grams, only 25 grams more than Ray-Ban Meta glasses, which lack any kind of tracking.
One such application comes from Envision, Meta’s first outside partner for Aria Gen 2. Envision sells modified Google Glass Enterprise Edition 2 glasses with custom software that helps vision-impaired people perceive the world by reading out text and describing what it sees on demand, and it also offers a free phone app that does the same.
Leveraging Aria Gen 2’s built-in world-scale positional tracking and precise spatial audio, Meta and Envision are experimenting with helping vision-impaired people navigate indoor environments by guiding them with a spatial “beacon” sound.
Envision and Meta testing Aria Gen 2 for helping vision-impaired people see.
The companies stress that this application is still in the “exploratory and research” phase, but it points to a future where smart glasses with tracking and AI can make the lives of people with vision impairment easier.
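Neither company has published implementation details, but the core idea is straightforward: with the wearer’s 6DoF pose tracked on-device, the system can compute the direction to the next waypoint and render the beacon so it appears to come from that direction. The sketch below illustrates the concept using simple equal-power panning between the left and right speakers; all names, coordinates, and parameters are hypothetical, and a real system would use proper spatialization (HRTFs) rather than panning.

```python
import numpy as np

SAMPLE_RATE = 48_000  # audio sample rate in Hz (assumed for this sketch)


def bearing_to_target(position, yaw, target):
    """Angle (radians) from the wearer's facing direction to the target.

    position, target: (x, z) ground-plane coordinates in meters.
    yaw: wearer's heading in radians, 0 = facing +z, positive = turned right.
    """
    dx, dz = target[0] - position[0], target[1] - position[1]
    world_angle = np.arctan2(dx, dz)                 # target direction in the world frame
    relative = world_angle - yaw                     # rotate into the head frame
    return (relative + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)


def render_beacon(bearing, duration=0.15, freq=880.0):
    """Render a short stereo beacon tone panned toward `bearing`.

    Crude equal-power panning: full left at -90 degrees, full right at
    +90 degrees. Real spatial audio would use HRTFs instead.
    """
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * freq * t) * np.hanning(t.size)  # windowed beep
    pan = np.clip(bearing / (np.pi / 2), -1.0, 1.0)           # -1 = left, +1 = right
    left = tone * np.sqrt((1 - pan) / 2)
    right = tone * np.sqrt((1 + pan) / 2)
    return np.stack([left, right], axis=1)


# Example: wearer at the origin facing +z, waypoint 3 m ahead and 2 m to the
# right, so the beacon should sound from the front-right.
chunk = render_beacon(bearing_to_target((0.0, 0.0), yaw=0.0, target=(2.0, 3.0)))
print(chunk.shape)  # (7200, 2): 0.15 seconds of stereo audio at 48 kHz
```

As the wearer turns and walks, re-rendering the beacon from the latest head pose makes the sound appear anchored in the room, which is what lets it serve as a navigation target rather than just a tone in the ear.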
Researchers interested in leveraging Meta’s Aria Gen 2 glasses can sign up here, and the company says it will share more about external availability in the coming months.