Imagine playing a computer game with your intentions or typing just by thinking about a word. While these concepts might seem like the basis for a dystopian sci-fi novel, the reality is that these technologies are already in development.
At a virtual press call last week, Facebook's Reality Labs division demonstrated a prototype of its "mind-reading" wristband.
During the call, Facebook demoed the "force actions" the wristband is capable of, with gestures like pinching shown being used to manipulate digital objects in augmented reality, according to CNBC. Facebook also demoed an AR-based keyboard that lets users type on virtually any surface. The company unveiled haptics as well: the "Bellowband" iteration of the wristband uses pneumatic bellows that squeeze the wrist to produce different sensations, feedback that makes interacting with digital objects in AR that much more intuitive.
Facebook's bands use a technique called electromyography (EMG). Through EMG, the wristbands interpret the nerve signals your brain sends to your wrist when you intend to act. Say you want to type a letter: the wristbands pick up those nerve signals and translate them into a keystroke in AR, effectively allowing you to type without moving your fingers.
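To make the idea concrete, here is a minimal sketch of that signal path in Python. Everything in it is hypothetical: Facebook has not published its model, so the channel-to-key table, threshold, and function names below are illustrative stand-ins, not the company's actual method.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EMGSample:
    """One windowed reading of rectified EMG amplitude, one value per electrode."""
    channels: List[float]

def classify_keystroke(sample: EMGSample, threshold: float = 0.5) -> Optional[str]:
    """Map the most active EMG channel to a key.

    A real system would use a trained, per-user model with intent prediction;
    this stand-in just picks the strongest channel from a fixed lookup table.
    """
    key_table = ["a", "s", "d", "f"]  # hypothetical channel-to-key mapping
    peak = max(sample.channels)
    if peak < threshold:
        return None  # signal too weak: no intentional movement detected
    return key_table[sample.channels.index(peak)]

# A strong signal on the second electrode reads as the key "s"...
print(classify_keystroke(EMGSample([0.1, 0.9, 0.2, 0.3])))
# ...while uniformly weak activity produces no keystroke at all.
print(classify_keystroke(EMGSample([0.1, 0.2, 0.1, 0.1])))
```

The threshold is the interesting design choice: it is what separates an intentional "keystroke" from the background muscle activity of a resting hand.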
Facebook's prototypes hold much promise. They amount to a brain-computer interface that doesn't require invasive probes to function. And because people already wear both watches and glasses, adopting Facebook's wristband and AR glasses should feel familiar.
However, wristbands do not amount to a "complete solution on their own," according to FRL Director of Research Science Hrvoje Benko. They "need to be assisted with intent prediction and user modeling that adapts to you and your particular context in real-time," the company wrote in a blog post.
Facebook's ultimate vision for its AR platform is to develop it into an "ultra-low-friction" means of human-computer interaction. "Rather than constantly diverting your attention back to your device, the interface should simply come in and out of focus when you need it," said Facebook Reality Labs Research Science Manager Tanya Jonker in the blog post.
With more refined "user modeling," Facebook's glasses should eventually be able to read your immediate surroundings and present you with options related to your environment. No more taking out your phone to start your favorite playlist before you set out on your morning jog.
Facebook's glasses will recognize where you are and what you are wearing and offer you the option to start your playlist in augmented reality. The wristband will then interpret your hand's subtle movements and start the playlist, all without you ever touching a device.
Of course, the future implications of Facebook's tech are even more interesting. Imagine typing the next great American novel with less than a flick of your wrist, or belting out the Moonlight Sonata on a piano in augmented reality. Soon, all this could be as simple as putting on a watch and a pair of Ray-Bans.