
    An Eye-gaze Oriented Context Based Interaction Paradigm Design

    HCI 2018

    Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI 2018)

    Belfast, UK, 4 - 6 July 2018

    AUTHORS

    Hao He, Yingying She & Jun Li

    ABSTRACT

    http://dx.doi.org/10.14236/ewic/HCI2018.229

    The human eye's state of motion and content of interest can express a person's cognitive status in a given situation. When observing the surroundings, the eyes make different movements to interact with the observed objects, and these movements reflect people's attention and interest intentions. Currently, most eye-gaze interactions lack context information from the environment. To investigate people's cognitive awareness when they perform eye-gaze interactions with their surroundings, we analyse the composition of the environment and divide its essential factors into the interactive subject, the interactive object and the context.
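    As a rough illustration of this subject/object/context decomposition (not the authors' formal definition), the three essential factors might be represented as a simple data structure; every class and field name below is a hypothetical assumption.

    ```python
    from __future__ import annotations
    from dataclasses import dataclass, field

    @dataclass
    class InteractiveSubject:
        """The person performing eye-gaze interaction (hypothetical fields)."""
        gaze_point: tuple[float, float]   # current gaze coordinates on the display
        fixation_duration_ms: float       # length of the current fixation
        saccade_velocity: float           # speed of the most recent saccade

    @dataclass
    class InteractiveObject:
        """An observed object in the environment (hypothetical fields)."""
        object_id: str
        features: dict[str, float]        # e.g. colour, size, motion salience

    @dataclass
    class Context:
        """Environmental context surrounding the interaction (hypothetical fields)."""
        scene_type: str                   # e.g. "game", "desktop", "robot"
        task: str                         # task the subject is currently engaged in

    @dataclass
    class EyeGazeScene:
        """Bundle of the three essential factors described in the abstract."""
        subject: InteractiveSubject
        objects: list[InteractiveObject] = field(default_factory=list)
        context: Context | None = None
    ```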

    An eye-object movement attention model and an eye-object feature preference model are constructed to understand how people's attention and preferences vary when they perform eye-gaze interaction with different interactive objects in different contexts, and furthermore to predict their behavioural intentions. An eye-gaze oriented, context based interaction paradigm is then designed to explain the relationships among eye movement, eye-gaze interaction and people's behavioural intentions when they perform eye-gaze interactions in different environments. The paradigm captures eye-gaze interaction patterns and people's cognitive behavioural intentions in different context based environments, can dynamically adapt the intention prediction results to interact appropriately with multiple interfaces, such as games, PC systems, social robots and other HCI and HRI applications, and can serve as one of the computable modalities of cognitive computing.
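    A minimal sketch of how the two models and the paradigm might fit together, building on the hypothetical data structures above and assuming the attention and preference models expose simple scoring functions; every name here is an assumption, not the paper's implementation.

    ```python
    # Hypothetical sketch: combine an attention score and a preference score
    # per object, then adapt the predicted intention to the target interface.

    def attention_score(subject, obj, context) -> float:
        """Stand-in for the eye-object movement attention model (hypothetical)."""
        # e.g. weight fixation duration by how salient the object is in this context
        return subject.fixation_duration_ms * obj.features.get("salience", 0.0)

    def preference_score(subject, obj, context) -> float:
        """Stand-in for the eye-object feature preference model (hypothetical)."""
        # e.g. how well the object's features match the subject's learned preferences
        return obj.features.get("preference_match", 0.0)

    def predict_intention(scene) -> str:
        """Pick the object the subject most likely intends to interact with."""
        best = max(
            scene.objects,
            key=lambda o: attention_score(scene.subject, o, scene.context)
                          + preference_score(scene.subject, o, scene.context),
        )
        return best.object_id

    def adapt_to_interface(intention: str, interface: str) -> dict:
        """Map a predicted intention onto a concrete interface action (hypothetical)."""
        actions = {
            "game": {"action": "select_target", "target": intention},
            "pc_system": {"action": "focus_window", "target": intention},
            "social_robot": {"action": "orient_towards", "target": intention},
        }
        return actions.get(interface, {"action": "highlight", "target": intention})
    ```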

    PAPER FORMATS

    PDF Version of this Paper (531kb)