The workshop will take place in the Maple Room at the Sheraton.
2nd Workshop on Eye Gaze in Intelligent Human Machine Interaction
Stanford University, Palo Alto, California, USA
February 13, 2011
In interactive systems, eye-gaze and attentional information have great
potential to improve communication between the user and the system.
For instance, when combined with situational and linguistic information,
the user's focus of attention helps in interpreting the user's intentions.
Eye-gaze also serves as a nonverbal signal in mediated communication using
avatars as well as during interaction with humanoid autonomous agents.
Moreover, recent studies have shown that eye gaze can be estimated from
brain activity, and such eye-tracking technologies provide new opportunities
for designing novel attention-based intelligent user interfaces.
The first eye-gaze workshop held at IUI 2010 covered various research issues
concerning eye-gaze: eye-tracking technologies, analyses of human eye-gaze
behaviors, multimodal interpretation, user interfaces using an eye-tracker,
and presenting gaze behaviors in humanoid interfaces. This year's workshop
aims to continue exploring this important topic by bringing together researchers
from fields including human sensing, intelligent user interfaces, multimodal
processing, and communication science, with the long-term goal of establishing
a strong interdisciplinary research community in "attention-aware interactive
systems."
Topics of interest
- Technologies for sensing human attentional behaviors in IUI
A user's attentional behaviors can be tracked by technologies sensing a variety
of bodily motions, including pupil movements, head movements, and torso
orientation, as well as brain activity. What issues must be tackled when using
each of these technologies to track attentional behaviors in intelligent user
interfaces?
- Interpreting attentional behaviors as communicative signals in IUI
How can attentional information be incorporated into multimodal understanding
to recognize a variety of user behaviors and states, such as intentions, attitudes
toward the system, grounding, and engagement in conversational interactions?
- Gaze model for generating eye-gaze behaviors by conversational humanoids
How should appropriate eye-gaze behaviors be selected for virtual agents and
communication robots? How do users interpret the attentional signals presented
by these humanoids? Does eye-gaze expressiveness differ between virtual agents
and communication robots?
- Analysis of human attentional behaviors in interaction with computer systems
as well as in dyadic and multiparty face-to-face conversations
How can sensing technologies be used to explore face-to-face conversation?
What are the implications for IUI design?
- Evaluation of gaze-based IUI
How can attention-aware IUIs best be evaluated? How should user studies be
designed to identify the real impact of gaze-based information in IUI?