Intel aims to make interfaces open ended and fluid
In announcing Perceptual Computing at this year's Consumer Electronics Show, Intel isn't just making a play for a Kinect "me-too" product. Its Perceptual Computing SDK is potentially much more ambitious, in that it aims to let users switch seamlessly between gesture, touch, voice and traditional keyboard-and-mouse interactions with devices in a context-sensitive way.

One thing that tends to happen with new interactive technologies like touch or gesture is that they are not suitable for all kinds of interaction. Waving frantically at your Xbox or Windows PC isn't always a rewarding or useful act, and everyone knows about the limitations of touch-screen keyboards (even if they are potentially getting better).

With Perceptual Computing, Intel isn't championing any one of these input methods; it's saying that we should be smartly switching between them depending on the task.

Watch this video from Intel's Achin Bhowmik explaining the concept.