Using cameras and projectors, Microsoft Research is incorporating input and display into an entire room. The LightSpace technology can display onto a normal table, walls, the floor, or even your body.
Additionally, using depth-sensing cameras, the technology can respond to distance and movement. For example, a menu can be projected as a beam onto the floor: by placing your hand into the beam and raising or lowering it, you change menu options, which are displayed on your hand. You can also move virtual items around. For example, with a video displayed on a table, you can motion to grab it with your hand and then move it to a wall display, or elsewhere.
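To get a feel for the beam-menu idea, here is a minimal sketch of how hand height might be mapped to a menu option. This is purely illustrative: the function name, height range, and menu entries are assumptions, not Microsoft's actual implementation.

```python
# Hypothetical sketch of depth-based menu selection, loosely inspired by
# LightSpace's beam menu: the height of a hand above the floor picks an
# option. The names, ranges, and menu items are illustrative only.

MENU = ["Open", "Move", "Copy", "Delete"]

def select_option(hand_height_m, min_h=0.5, max_h=1.5, options=MENU):
    """Map a hand height (meters above the floor) to a menu option.

    The height is clamped to [min_h, max_h], then divided into equal
    bands, one band per option.
    """
    h = max(min_h, min(max_h, hand_height_m))
    band = (max_h - min_h) / len(options)
    index = min(int((h - min_h) / band), len(options) - 1)
    return options[index]

# Raising or lowering the hand moves through the options:
print(select_option(0.6))   # near the floor -> first option
print(select_option(1.4))   # near the top   -> last option
```

Raising the hand through the beam steps through the bands, one option at a time, which matches the interaction described above.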
This seems like a more robust version of what is coming to the Xbox with the Microsoft Kinect next month. Of course, it is hard to describe the LightSpace technology in words. You're better off seeing the video with Andy Wilson and Hrvoje Benko, who are researchers at Microsoft:
It is always interesting to see what is being played with at Microsoft Research. LightSpace will be an interesting project to watch evolve.