Meta recently opened its smart glasses platform to third-party developers, and within months, one CES exhibitor is demonstrating how that openness can enable novel accessibility tools.
Hapware's Aleye is a haptic wristband designed to pair with Ray-Ban Meta smart glasses, letting users perceive the facial expressions and nonverbal cues of conversation partners. The company says the technology helps people who are blind, have low vision, or are neurodivergent by conveying parts of a conversation that would otherwise be inaccessible to them.
Worn like a chunky band, Aleye delivers localized vibrations to the wrist that correspond to the other person's facial expressions and gestures. It uses the glasses' camera to stream video of the interaction to a companion app, where software detects and classifies the expressions and movements.
The app lets users select which expressions and gestures to track and includes training to help them distinguish the vibration patterns. Hapware CEO Jack Walters says testers learned several signals in just a few minutes during early trials. The team designed the haptics to feel intuitive, Walters explained, such as a vibration that mimics a dropping jaw for surprise or side-to-side pulses that suggest a wave.
The app can also use Meta AI to announce expressions aloud, though Hapware CTO Dr. Bryan Duarte noted that the constant narration can disrupt conversations. Duarte, who has been blind since age 18 after a motorcycle accident, prefers Aleye to alternatives like Meta's Live AI, which he said will only report that a person is nearby, not whether they're smiling, unless the user keeps asking for details.
Hapware is taking preorders for Aleye, starting at $359 for the band alone or $637 bundled with 12 months of access to the app (which is required, and costs $29 per month thereafter). The Ray-Ban Meta glasses are sold separately, and Meta is developing several first-party accessibility features for them as well.