Building the Right Environment to Support AI, Machine Learning and Deep Learning
Hard as it might be to believe, there was a time when most computers didn't have a mouse, nor did they need one. DOS gave way to Windows, and interfaces evolved to the point where the mouse became a critical input device. Of course, it took until the third release of Windows before the mouse, along with Windows itself, took off.
Just a few years ago, people were saying that interacting with your computer via touch had no practical value in applications. The mobile industry has since shown that touch can be a primary form of input for computing devices. With the release of Microsoft Windows 8, touch has been moving to desktops as well. Most systems above the lowest-end models include touch screens and accept touch input. Although it doesn't yet dominate over the mouse and keyboard, touch is starting to make headway. Its use should continue to increase as people become more familiar with touch gestures and as better touch features are built into applications. Just as it took three releases of Windows before the mouse paradigm took off, we are still a version of Windows or two away from touch becoming mainstream on the desktop. On mobile devices, it is clear that touch already dominates. The evolution to touch, however, is well on its way.
Although touch isn't completely evolved, the next evolution has already begun. A few years ago, Microsoft released Kinect for Windows. More recently, Leap Motion released its own device. Both devices can capture a user's gestures and voice; the Kinect captures video as well. A few computers have even started to incorporate the Leap Motion technology. Even so, the use of gestures has not gained mass appeal. Thanks to Intel, that is likely to begin changing this year.
Perceptual Computing
The next evolution is dubbed "perceptual computing." Simply put, this is the capturing of what is happening near and around the computer. It is generally done using depth sensing along with sound and distance recording. Depth perception allows gestures to be captured by registering where actions are occurring. In addition to registering movements and physical items, sound can also be detected and triangulated. When you combine all of this, you end up with information on where something is, what movements it is making, and more.
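To make the triangulation idea concrete, here is a minimal sketch of one small piece of it: estimating the direction of a sound source from the time difference of arrival between two microphones. The function name, the two-microphone setup, and the far-field (planar wavefront) approximation are my own illustrative assumptions, not part of any particular SDK.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 degrees C

def bearing_from_tdoa(delta_t: float, mic_spacing: float) -> float:
    """Estimate the bearing of a sound source, in degrees (0 = straight
    ahead), from the time-difference-of-arrival between two microphones.

    delta_t: arrival-time difference in seconds (positive = nearer mic first)
    mic_spacing: distance between the two microphones in metres
    """
    # Far-field approximation: treat the wavefront as planar, so
    # sin(theta) = (speed_of_sound * delta_t) / mic_spacing.
    ratio = (SPEED_OF_SOUND * delta_t) / mic_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

print(bearing_from_tdoa(0.0, 0.1))  # source dead ahead -> 0.0
```

A real system would estimate `delta_t` by cross-correlating the two microphone signals; with more than two microphones the same idea yields a position, not just a bearing.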
By using depth perception, you can begin identifying items, including hands, fingers, faces, and more. Once you recognize a hand, you can start recognizing gestures made with that hand. Once you recognize a face, you can start recognizing changes to its features, which can then be translated into the emotions being displayed. Once you recognize a body, you can recognize subtle movements; for example, the slight in-and-out movement of a person's chest can be translated into a heartbeat. As the resolution of perceptual computing devices improves, so will the level of detail you can recognize. Each of the examples I've given is already possible.
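As a rough illustration of the heartbeat example, the sketch below estimates a periodic rate from a displacement signal (such as chest motion sampled by a depth camera) by counting rising crossings of the signal's mean level. The function and the synthetic signal are hypothetical; a real implementation would need filtering and far more robust signal processing.

```python
import math

def estimate_rate_bpm(displacement, sample_rate_hz):
    """Estimate a periodic rate, in beats per minute, from a displacement
    signal -- e.g. chest motion sampled from a depth camera.

    Counts rising crossings of the signal's mean level; each crossing
    marks the start of one full cycle.
    """
    mean = sum(displacement) / len(displacement)
    centred = [x - mean for x in displacement]
    crossings = sum(1 for a, b in zip(centred, centred[1:]) if a < 0 <= b)
    duration_s = len(displacement) / sample_rate_hz
    return crossings * 60.0 / duration_s

# Synthetic chest signal: a 1.2 Hz oscillation (72 bpm) sampled at 30 fps
# for 10 seconds, with an arbitrary phase offset.
fps = 30
signal = [math.sin(2 * math.pi * 1.2 * t / fps + 0.3) for t in range(fps * 10)]
print(round(estimate_rate_bpm(signal, fps)))  # -> 72
```

On real depth data the displacement would be noisy, so a band-pass filter around plausible heart rates would come before the crossing count.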
Even though this is possible, it is clear that the Kinect and Leap Motion devices have not gone mainstream. As such, just as with trackballs, this might seem like a niche. Things are about to change.
Intel RealSense Perceptual Computing
Intel has been working on its own perceptual computing system, dubbed RealSense. Intel just released the RealSense SDK and will soon release hardware to support it. Although the Kinect and Leap Motion systems came before Intel's RealSense, neither has gained the mainstream appeal that RealSense could generate.
Why might Intel succeed where Microsoft and Leap Motion have only made minor inroads? There are a number of things worth pointing out regarding Intel's RealSense, including price, functionality, promotion, and distribution.
Although the Kinect is a fantastic device for perceptual computing, the Kinect for Windows device is a bit large not only in size but also in price. The Leap Motion device comes in at a much more reasonable price of under $80. Intel's RealSense camera also comes in at just under $100, should you choose to buy the stand-alone device.
Although the Leap Motion is the cheapest of the stand-alone devices, the Intel RealSense will include the ability to work with sound, depth, and more.
Some of the specific functionality RealSense will support includes:
- Face Tracking
- Hand Tracking
- Speech Recognition and Synthesis
- User Segmentation
- Object Tracking
- Touchless Controller
- Emotion Detection
- 3D Camera
- And more
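To show how an application might consume events from modules like these, here is a minimal, hypothetical event dispatcher. The module names echo the feature list above, but the API is illustrative only; it is not the actual RealSense SDK interface.

```python
from typing import Callable, Dict, List

class PerceptualEventBus:
    """Hypothetical dispatcher routing events from perceptual-computing
    modules (face, hand, speech, ...) to application callbacks."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, module: str, handler: Callable[[dict], None]) -> None:
        """Register a callback for one module's events."""
        self._handlers.setdefault(module, []).append(handler)

    def emit(self, module: str, event: dict) -> None:
        """Deliver an event to every handler registered for its module."""
        for handler in self._handlers.get(module, []):
            handler(event)

bus = PerceptualEventBus()
seen = []
bus.on("hand", lambda e: seen.append(e["gesture"]))
bus.emit("hand", {"gesture": "swipe_left"})
bus.emit("face", {"emotion": "smile"})  # no handler yet; silently ignored
print(seen)  # ['swipe_left']
```

The value of a layer like this is that the application subscribes to *meanings* ("a hand swiped left") rather than polling raw sensor frames, which keeps application code independent of whichever SDK ultimately supplies the events.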
Many of these features are common to all three offerings. My expectation is that the Kinect will forge ahead with the best of the features, but at the higher price.
One step Intel is taking to help it succeed where the others have not is promotion. Granted, the others have had great word-of-mouth discussion, as well as promotion of their own. If all things stayed equal, Intel would likely get a little less traction on the public relations front simply because it is the third major player in the market.
To gain an advantage, Intel has put a few marketing dollars behind their RealSense program. In fact, they put a million dollars behind a contest to get developers to build applications. Although not every developer is motivated solely by the chance to win money, many are likely to be. As such, this promotion should help prime the pump to get developers building solutions that tap into RealSense.
When it comes to adoption, price doesn't matter if the technology isn't distributed. Although you can buy a stand-alone RealSense camera, the game changer will be Intel's distribution of the technology.
RealSense technology will be built into a number of computer systems. In fact, Intel's RealSense is expected to ship in systems produced later this year (2014) by Acer, Asus, Dell, Fujitsu, HP, Lenovo, and NEC. Just as cameras have become standard in computing devices, you can expect perceptual computing hardware to become standard in the future. The fact that seven major device manufacturers are planning to integrate the technology into their devices is an indicator that they see its potential.
The RealSense SDK
Although the technology isn't mainstream today, it is coming. As a developer, you can wait until it is mainstream and only then consider the impact on your applications. Alternatively, you can be proactive and begin to plan and adjust your applications today so that you are ready when the market is. Each of the devices mentioned previously includes its own SDK for development, and even though the Intel RealSense hardware hasn't been released yet, Intel has already released the RealSense SDK.
Making applications more intuitive and easier to use will generally help increase adoption and usage. Your applications will be able to watch your users, and you'll be able to act on that feedback. More importantly, just as the mouse changed how applications are built, once you start considering the addition of gestures and voice, it is easy to see that applications are going to evolve again. It is only a matter of time.
My biggest rant used to be about making sure your applications could support any screen size. That was last decade. My new rant is that you need to make sure your applications can also support any input type – whether that's typing, mouse, touch, voice, gesture, or even perception of what is happening around them.
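One way to prepare for any input type is to normalize every device into device-independent intents, so application logic never branches on the hardware. The sketch below is a hypothetical illustration; the event shapes, function names, and gesture-to-action mappings are my own assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Intent:
    """A device-independent user intention, e.g. 'select' or 'back'."""
    action: str
    payload: dict = field(default_factory=dict)

# Hypothetical translators: each maps a raw device event to an Intent,
# so the rest of the application only ever sees Intents.
def from_mouse(event: dict) -> Intent:
    action = "select" if event["button"] == "left" else "menu"
    return Intent(action, {"x": event["x"], "y": event["y"]})

def from_gesture(event: dict) -> Intent:
    mapping = {"swipe_left": "back", "swipe_right": "forward",
               "pinch": "zoom_out", "spread": "zoom_in"}
    return Intent(mapping.get(event["name"], "ignore"))

def from_voice(utterance: str) -> Intent:
    words = utterance.lower().split()
    if "back" in words:
        return Intent("back")
    return Intent("ignore", {"utterance": utterance})

# The same application action can now arrive from entirely different devices.
print(from_gesture({"name": "swipe_left"}).action)  # back
print(from_voice("go back please").action)          # back
```

The design choice here mirrors what the mouse did for applications decades ago: code written against "select" and "back" keeps working whether the user clicks, swipes, or speaks.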
# # #