
    Self-Previewing Gestures and the Gesture-and-Effect Model: Experimentation with Responsive Visual Feedback for New and Unlearned Interactions

    HCI 2017 - Digital make-believe

    Proceedings of the 31st International BCS Human Computer Interaction Conference (HCI 2017)

    University of Sunderland, St Peter’s campus, Sunderland, UK, 3 - 6 July 2017


    Jacques Chueke, George Buchanan, Stephanie Wilson & Luis Anunciação



    Multi-touch gestures embedded in touch-based interfaces and devices present challenges: it can be difficult for users to discover the available gestures and understand their effects. The study reported in this paper hypothesises that presenting automatic visual prompts, termed self-previewing gestures (SPGs) in this research, that depict touch and preview gesture execution will mitigate the problems that users encounter with unfamiliar gestural interfaces. A within-subjects experiment (n=45) is reported in which an iPad application was created with two alternative gestural designs and five alternative user interface versions (one industry, two research baselines, and two SPGs), with the purpose of making the available gestures evident. A rating system is proposed that adapts Norman's Theory of Action to touch-based interactions by drawing on known principles within interaction design (perceptible affordances, feedforward and feedback). The system was used to assess participants' perceptions of and interactions with the SPGs, and the results revealed positive and negative aspects of the designs and UI versions.


    PDF Version of this Paper (906kb)