It’s not the same as using The Force, but gesture control is probably as close as we’ll come to “Vadering” electronic devices for quite some time. The technology has advanced greatly since the days of “Put-That-There” and shows no signs of slowing down.
From the Power Glove, to EyeToy, to Kinect, gesture control has been prominently featured in video game devices for a long time. The next big thing for video games seems to be the MYO armband. By sensing muscle movements to determine gestures, MYO will eliminate the need for handheld controls, sensors, and cameras. Though full video game integration is a future endeavor for MYO, computers are where the armband will first make its name. The device, compatible with Windows and Mac OS, is due to ship to pre-order customers in late 2013. It will have competition, though.
HP has teamed with Leap Motion to embed gesture-capture technology directly into PCs. Built into the machine itself, it will offer MYO-like control without the need to wear an armband. Not only is this an attempt to boost PC sales, but the hope is also to solidify Leap's technology as mainstream software. Leap is even looking beyond this partnership, wanting to integrate its technology into smartphones, watches, and glasses. Don't expect Apple to stand idly by, though. The company applied for a patent on motion gesture technology in 2011. Could we see this used in the iWatch or a new iPhone? Or, perhaps, deep down in a secret lab, an iTV prototype is in the works.
Not only will we be hand swiping our way into the future, we will also be using our eyes.
The Eye Tribe is releasing its eye control technology and SDK for Android in June. Once users calibrate the software, their eyes do all the work when using apps and playing games. The claim is that it will work for logging in, scrolling, and gaming because of how precise the eye tracking is. According to the Eye Tribe, this is only the beginning for the software.
We would be remiss to talk about eye technology and not mention Google Glass. With early-adopting "Glass Explorers" picking up their headsets, it probably won't be long before we see people walking down the street with the prism millimeters from their eye. Inspectors of the app code believe users will be able to take photos by simply winking. But winking appears to be only the simplest form of eye control people see in the device. Rumor has it a Google Glass app could be created that would allow users to control their electric wheelchairs. Though we may look odd walking down the street with headsets on, or demonic while staring wide-eyed trying to beat a level of Angry Birds, this seems to be a technology that is here to stay.
It will be interesting to see what other platforms integrate gesture control in the future. While fun, interactive applications will appeal to the masses, hopefully a larger focus will be on helping those with special needs. Much as eye control could let people steer their electric wheelchairs, this technology could help people with disabilities use computers and other devices with greater ease.
It’s hard to imagine, but our world is becoming more hands-free by the day. The next question is, what else can we possibly use for gesture control? Perhaps it’s time for someone to develop technology that can read our lips, rather than our voice, and turn it into text.