Computer Interactions Are Going to Change

The mouse caused huge changes in how people interacted with computers. Voice input has long been predicted to do the same, yet it has had limited impact in mainstream applications. The arrival of what appears to be a solid gesture system promises an impact greater than voice, and possibly greater than the mouse.

The Kinect device is currently tied to Microsoft’s Xbox game console, but it has been hacked to work with a personal computer as well. Microsoft has stated that it will release an official SDK so developers can start using the device as part of their own computer systems.

If you’ve not looked at a Microsoft Kinect device, you should. The ability to recognize gestures offers great promise. In addition to depth and spatial recognition, it also recognizes voice commands. As one Microsoft person commented to me this past week, the Kinect not only recognizes voice commands, but it can also differentiate between multiple people speaking.

Consider combining this voice and gesture recognition in a simple solution. You could create a system that lets you turn on the lights in a room, or the radio, simply by saying “Lights on.” You could then adjust the light level or the volume by bringing your hands together to decrease it, or separating your hands to make the light brighter or the sound louder. More importantly, with the Kinect’s ability to recognize you, the system could ignore other people’s attempts to change your light levels or your radio.
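
To make that idea concrete, here is a minimal sketch (in Python) of how hand separation might be mapped to a brightness or volume level. The hand coordinates and calibration values are hypothetical placeholders, standing in for whatever skeleton-tracking data a gesture SDK would actually provide.

```python
# A minimal sketch of mapping hand separation to a light or volume level.
# The hand positions below are hypothetical -- assume they come from
# whatever skeleton-tracking source your gesture SDK provides.
import math

MIN_SEPARATION = 0.10  # hands 0.10 m apart -> level 0   (assumed calibration)
MAX_SEPARATION = 0.90  # hands 0.90 m apart -> level 100 (assumed calibration)

def hand_distance(left_hand, right_hand):
    """Euclidean distance between two (x, y, z) hand positions, in meters."""
    return math.dist(left_hand, right_hand)

def separation_to_level(distance):
    """Map a hand separation distance to a 0-100 brightness/volume level."""
    clamped = max(MIN_SEPARATION, min(MAX_SEPARATION, distance))
    return round(100 * (clamped - MIN_SEPARATION) / (MAX_SEPARATION - MIN_SEPARATION))

if __name__ == "__main__":
    # Example: hands roughly half a meter apart -> mid-range level.
    left = (-0.25, 1.0, 2.0)   # hypothetical tracked position (x, y, z) in meters
    right = (0.25, 1.0, 2.0)
    level = separation_to_level(hand_distance(left, right))
    print(f"Set lights/volume to {level}%")
```

The same mapping would work whether the level drives a dimmer or a volume control; only the calibration range and the output sink change.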

A better example of the Kinect in use was given at Microsoft’s TechEd 2011 conference. Surgeons must keep their hands sterile during surgery, so if a doctor wants to review MRI scans, x-rays, or other documents in the middle of an operation, they can’t touch a computer keyboard or mouse without having to rescrub. With a Kinect device attached to the computer, a doctor can look at the screen and manipulate images by moving their hands in the air. They can “grab,” hold, and turn an image, as well as use gestures to zoom in and out.

These are just two examples of incorporating a Kinect. They represent something of a new paradigm, and with new paradigms come new ideas and approaches. Over the next few years, the Kinect has more potential to change how we interact with computers than anything else that has come along in a long time.

For developers, just as I suggest avoiding coding to specific screen sizes, I now also recommend considering gestures, touch, and voice as potential ways for your users to interact with your applications. Don’t assume a mouse and keyboard. You need to build for today, but there is no reason you can’t also plan for tomorrow!
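
One way to plan for that today is to route all input, whatever its source, through a device-neutral command layer. The sketch below is only illustrative; the command names and handlers are hypothetical, not from any particular SDK.

```python
# A sketch of decoupling application commands from specific input devices.
from typing import Callable, Dict

class CommandRouter:
    """Routes named commands to handlers, regardless of how they were triggered."""

    def __init__(self):
        self._handlers: Dict[str, Callable[[], None]] = {}

    def register(self, command: str, handler: Callable[[], None]) -> None:
        self._handlers[command] = handler

    def dispatch(self, command: str) -> None:
        handler = self._handlers.get(command)
        if handler:
            handler()

# The same "zoom_in" command can be raised by a keyboard shortcut,
# a pinch gesture, or the spoken phrase "zoom in".
router = CommandRouter()
router.register("zoom_in", lambda: print("Zooming in"))

router.dispatch("zoom_in")   # from a keyboard handler
router.dispatch("zoom_in")   # from a gesture recognizer
router.dispatch("zoom_in")   # from a voice recognizer
```

Because the application only sees commands, adding a new input modality later means writing one new recognizer, not touching the application logic.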
