The future of gesture control may not be in a camera such as the Leap Motion or Kinect. Rather, it could be embedded in an armband worn on your forearm — at least Microsoft thinks so.
There’s no denying that wearable tech and motion controllers are going to play a major role in our future mobile devices, and Microsoft is patenting a device that combines both of those technologies. The Windows maker has filed a patent for an armband worn on the forearm that lets you control electronic devices. Unlike current gesture controllers such as the Leap Motion or Kinect, Microsoft’s armband would detect gestures by monitoring the muscles in your arm rather than using a camera to track movement. This could allow for more accurate gesture detection, as Microsoft acknowledges.
“Most implement-free or hands-free interaction approaches have involved speech and computer vision,” the patent says. “While improving, these technologies still have drawbacks. They can be unreliable, sensitive to interference from environmental noise, and can require a person to make obtrusive motions or sounds.”
Microsoft’s armband would come with EMG sensors embedded inside the strap, and the company notes that the placement of these sensors would be important for detecting movement. Although the armband would sit on your forearm near your elbow, it would be able to pick up finger movements by detecting the contractions in your forearm muscles. Microsoft’s patent comes with a set of images and descriptions that demonstrate compatible gestures, including tapping or curling any finger, lifting or curling your index finger, and so on.
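The patent doesn’t spell out how the signal processing would work, but the general approach it describes — mapping patterns of muscle activity across several sensors to specific finger gestures — can be illustrated with a toy sketch. Everything below is an assumption for illustration: the channel layout, the RMS-amplitude feature, the nearest-profile matching, and the calibration numbers are all invented, not taken from Microsoft’s patent.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one sensor's sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify(channels, templates):
    """Match the observed per-channel RMS profile to the closest
    calibrated gesture profile (squared Euclidean distance)."""
    observed = [rms(ch) for ch in channels]
    def dist(profile):
        return sum((o - p) ** 2 for o, p in zip(observed, profile))
    return min(templates, key=lambda name: dist(templates[name]))

# Hypothetical calibration profiles: expected RMS per channel for two
# gestures. Real EMG profiles would come from a per-user training step.
templates = {
    "index_tap": [0.8, 0.2, 0.1],
    "fist": [0.9, 0.9, 0.9],
}

# Fake sample windows from three forearm sensors; only the first
# channel shows strong activity, so this should match "index_tap".
reading = [[0.7, -0.9, 0.8], [0.1, -0.2, 0.2], [0.05, -0.1, 0.1]]
print(classify(reading, templates))  # prints "index_tap"
```

A real system would filter the raw signal and use a trained classifier rather than hand-set templates, but the core idea is the same: different finger movements produce distinguishable activation patterns across the sensor ring.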
While this seems like a relatively new concept, other companies are actively exploring gesture-control wearables. Startup Thalmic Labs is working on a Bluetooth armband known as the MYO that does much the same thing as the device described in Microsoft’s patent. The $149 MYO, which is currently up for preorder and is slated for a 2014 launch, comes with integrated EMG sensors that let you manipulate a computer screen via arm and finger movements.
“With camera-based [gesture control], the camera has to be able to see you,” said Thalmic Labs’ 23-year-old CEO Stephen Lake when we saw a prototype back in June. “And that seems obvious, but it means you’re limited to applications where you’re directly in front of the camera.”
There’s no telling whether Microsoft’s device will ever come to market, but it’s another indicator that computing is shifting toward wearable tech and perceptual interaction.