HP Labs India

Intuitive Multimodal and Gestural Interaction

Dexterous human hands have historically driven most input modalities for computer systems, including the dominant keyboard and mouse. However, with the availability of additional sensors such as touch sensors and cameras, new and more natural input modalities such as pen, touch, and hand gestures are becoming mainstream and redefining human-computer interfaces for personal systems.

The Intuitive Multimodal and Gestural Interaction project at HP Labs India explores new and compelling user experiences and supporting technologies for personal systems that are natural and context aware, adapt with use, and achieve multimodal integration.

Our research threads include:

  • Design of compelling human-computer interaction experiences
  • Robust interpretation of pen, touch and visual hand gestures
  • Awareness of user context using technologies such as face detection and recognition (see the sketch after this list)
  • Multimodal integration and adaptation
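
As an illustration of the context-awareness thread above, the following sketch infers user presence from a single webcam frame with an off-the-shelf face detector. It is a minimal example assuming Python and the OpenCV package (cv2) with its bundled Haar cascade; it is not the detection technology developed in this project.

  # Minimal sketch: detect whether a user's face appears in a camera frame,
  # one possible cue for user-context awareness. Illustrative only; not the
  # project's actual detection pipeline.
  import cv2

  # Frontal-face Haar cascade bundled with the opencv-python package.
  detector = cv2.CascadeClassifier(
      cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

  def user_present(frame) -> bool:
      """Return True if at least one frontal face is detected in the frame."""
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
      return len(faces) > 0

  if __name__ == "__main__":
      cap = cv2.VideoCapture(0)  # default webcam
      ok, frame = cap.read()
      print("user present" if ok and user_present(frame) else "no user detected")
      cap.release()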

This research draws on new and existing techniques in the following areas:

  • Experience design, user research, and ethnography
  • Human-computer interaction
  • Computer vision and image analysis
  • Pattern recognition
  • Sensor fusion and decision combination

Some current activities:

Projects we have worked on in the past:

Publications

Click here for a listing of recent publications.


This page was last updated on July 27, 2010