Facebook has a team of engineers working on next-generation computing devices and interfaces. The group on Thursday provided a first look at its latest prototype: a wrist-based controller that uses a combination of artificial intelligence and input from a wearer’s nervous system to interact with VR and AR environments.
Initially, the prototype of Facebook’s AR controller provides simple gesture-based input that is the equivalent of a button click (e.g., the pinch and release of the thumb and forefinger). That enables such applications as shooting a virtual bow and arrow: Using wrist-based haptics, the device can approximate the sensation of pulling back the string of a bow.
Facebook Reality Labs researchers say that someday their wearable AR controller will provide more advanced capabilities, such as being able to touch virtual interfaces and objects — and pick up virtual objects at a distance. The tech will eventually let you type on a virtual keyboard on a table or your lap, possibly even at higher speed than is possible with a physical keyboard, according to Facebook.
“Neural interfaces, when they work right — and we still have a lot of work to go here — feel like magic,” Thomas Reardon, director of neuromotor interfaces at Facebook Reality Labs, says in a video accompanying the announcement.
Why is Facebook investing R&D dollars into next-generation human-to-machine interfaces? The social giant still sees VR and AR as major new areas of growth (see: its 2014 acquisition of Oculus VR), and it wants to be at the forefront of creating the enabling technology for the way people use computing platforms for the next decade and beyond.
The Facebook Reality Labs team also is developing a contextually aware, AI-powered interface for AR glasses. At last year’s Facebook Connect conference, the company announced a new line of AR smart glasses, starting with Ray-Ban models scheduled to launch sometime in 2021.
“AR glasses will enable us to be present and connected — how we communicate with this new device will be critical,” Andrew “Boz” Bosworth, head of Facebook Reality Labs, tweeted on March 18, 2021. “Building this interface demands advances from numerous technological areas and I’m proud of our research teams and the progress we’ve made.”
Later this year, Facebook says, it will pull the curtain back on its work in “soft robotics” to build “comfortable, all-day wearable devices” as well as provide an update on its haptic glove research.
Facebook emphasized that the approach it employs with the wrist-based AR controller “is not akin to mind reading.” Rather, the controller relies on electromyography, or EMG: sensors translate the electrical motor nerve signals that travel through the wrist to the hand into digital commands you can use to control a device’s functions.
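To make the EMG idea concrete, here is a minimal, purely illustrative sketch of how a stream of muscle-activity samples might be reduced to a binary “click” event — the wrist-based equivalent of the pinch gesture described above. Facebook has not disclosed its signal-processing pipeline; the window size, threshold, and RMS approach below are all invented assumptions for the sake of the example.

```python
WINDOW = 8          # samples per analysis window (assumed)
THRESHOLD = 0.6     # RMS activation level treated as an intentional pinch (assumed)

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def detect_clicks(samples):
    """Slide over the signal and emit one 'click' each time the RMS
    amplitude crosses the threshold from below (a rising edge)."""
    clicks = []
    active = False
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        level = rms(samples[start:start + WINDOW])
        if level >= THRESHOLD and not active:
            clicks.append(start)   # rising edge: register one click
            active = True
        elif level < THRESHOLD:
            active = False         # muscle relaxed; re-arm the detector
    return clicks

# Quiet baseline, one burst of muscle activity, then quiet again:
signal = [0.05] * 8 + [0.9] * 8 + [0.05] * 8
print(detect_clicks(signal))  # one click, detected at sample index 8
```

A real EMG decoder would use machine learning over multi-channel sensor data rather than a single threshold, but the shape of the problem — continuous nerve-signal input in, discrete digital commands out — is the same.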
“What we’re trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate your hand and finger muscles,” according to Reardon, who joined the FRL team when Facebook acquired CTRL-labs in 2019.
Watch a video of the new Facebook wrist-based controller prototype: