Last year, Facebook unveiled Project Aria, a research project that will help the company build augmented reality glasses. At the time, Facebook presented a vision in which such a device could eventually take over many of the functions we currently use smartphones for, such as calling friends or getting directions. Now the company is offering new insight into how users might eventually control AR: via their wrists.
The idea, according to researchers at Facebook’s Reality Labs, is to use a technique known as electromyography, or EMG, which detects the nerve signals passing through the wrist. A wrist-worn device with specialized sensors would interpret those signals and translate them into “digital commands” that can then be used to control an AR device or interface.
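To make the idea concrete, here is a minimal, purely illustrative sketch of that signal-to-command pipeline. This is not Facebook's actual system: the function names, the synthetic signal, the window size, and the RMS-threshold gesture detector are all assumptions chosen for clarity, standing in for the far more sophisticated machine-learning decoders such a wristband would really use.

```python
# Illustrative sketch (not Facebook's pipeline): turning a raw EMG-like
# signal into discrete "digital commands" by windowing the signal,
# measuring its RMS amplitude, and treating bursts of muscle activity
# as events such as a "click".
import math


def rms(window):
    """Root-mean-square amplitude of one window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))


def decode_commands(samples, window_size=10, threshold=0.5):
    """Slide a fixed-size window over the signal and emit one 'click'
    per burst of activity (rising edge above the threshold only)."""
    commands = []
    active = False
    for start in range(0, len(samples) - window_size + 1, window_size):
        level = rms(samples[start:start + window_size])
        if level > threshold and not active:
            commands.append("click")
            active = True
        elif level <= threshold:
            active = False
    return commands


# Synthetic signal: quiet, burst, quiet, burst -> two clicks.
signal = [0.01] * 20 + [0.9, -0.8] * 10 + [0.02] * 20 + [0.7, -0.9] * 10
print(decode_commands(signal))  # -> ['click', 'click']
```

A real decoder would classify many gestures from multi-channel sensor data rather than thresholding a single channel, but the overall shape is the same: continuous signal in, discrete commands out.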
“It’s not like mind reading,” Facebook explains in a blog post. “Think of it like this: you take many photos and choose to share only a few. Likewise, you have many thoughts, and you choose to act on only some of them. When that happens, your brain sends signals to your hands and fingers telling them to move in specific ways in order to perform actions like tapping and swiping. This is about decoding those signals at the wrist – the actions you’ve already decided to take – and translating them into digital commands for your device.”
According to Facebook, one advantage of such a system is that EMG is so precise that it can detect finger movements of just a millimeter. Eventually, it may not even be necessary to move a finger at all, as long as there is an “intention” to do so. That precision could also make navigating AR interfaces a much faster experience than how we currently interact with technology. For example, Facebook researchers say EMG could allow people to type on a virtual keyboard faster than is possible on a mechanical one.
For now, however, Facebook is still polishing the basics of interacting via EMG. The company showed off an interaction it calls “smart click,” which lets users “click” through a menu by subtly moving their fingers. The interface can also adapt based on contextual information and what it knows about you, like queuing up a playlist when you’re about to go for a run. “The system will be able to make in-depth inferences about what you might want to do in various situations, based on the information you choose to share about yourself and your environment,” says Sean Keller, research director at Facebook Reality Labs.
Keller and other researchers stressed that this work is still at a very early stage, and that any kind of consumer device is still years away. There are also issues the company will need to address beyond the mechanics of how wrist-controlled AR works. Namely, the enormous privacy concerns that accompany an always-on AR platform run by Facebook. Still, the work offers an intriguing glimpse into how Facebook thinks about the future of augmented reality, and what might one day be possible.