In Facebook’s original presentation of its Project Aria prototype glasses, the company showed off a thumb clicker for driving the specs. Today, Meta revealed a wrist-based neural interface to free your hands from that task.
Neural interfaces are one of several input methods Meta is exploring for controlling augmented reality headsets. The company revealed a prototype bracelet during its Facebook Connect conference, showing a possible future where wrist movements alone are all that’s required to navigate an interface in the headset.
As in the image Facebook originally teased back in March of this year, the wristband for the Project Aria glasses looks an awful lot like the 2015-era Myo armband from Thalmic Labs, the company that later became North. The wristband reads neural signals via EMG sensors and translates them into commands for the glasses. The example given Thursday: moving your wrist to select things on a virtual screen, or pinching your fingers together to click on something you see in some future set of glasses.
Coupled with this demonstration was a look at how Project Aria will use onboard cameras to identify real-world household items and make them virtual. A coffee table, couch, television and more can be brought from your real living room into a virtual space, allowing you to either reach out and touch real things while in a VR headset or better display AR projections on your physical surfaces.
Like Project Aria itself, this wristband is nowhere near ready for public use. Both are part of Meta’s long-term strategy of building VR and AR hardware in service of its larger metaverse ambitions. But in the same presentation, Facebook founder Mark Zuckerberg admitted the plans are at least a year away from their next stage of readiness. For now, though, it’s a clear look at where Meta thinks the future of VR and AR is headed, and a good indication of what future products could be capable of.