
    GestureNet: A Common Sense Approach to Physical Activity Similarity

    Electronic Visualisation and the Arts (EVA 2014)

    London, UK, 8 - 10 July 2014


    Angela Chang, Selene Mota and Henry Lieberman



    Generalising knowledge about physical movement typically requires large amounts of captured data. Even after the considerable effort of collecting and processing activity examples, such systems can still fail to classify movements for many reasons. Our system, GestureNet, uses a very small dataset of activity templates to produce useful query results for a generalised set of movements. As a result, many more movement profiles can be generated for activity recognition systems and gesture synthesis algorithms.

    We demonstrate a system that supports a larger set of computer animations based on a small set of base animations. A user can input any motion word recognised by GestureNet, and the system responds with the closest animation match, along with a description of how similar the new activity is to the template profiles. For example, if the user inputs "baseball", the system shows the animation for Run: the commonsense database associates baseball with jogging, which is a type of running. Although the example gesture matrix is small, we demonstrate that our techniques can extend the system to describe variations of these activities (e.g. sitting and squatting) that are not currently represented. We expect this solution to be useful in application domains where sensor data capture and activity profiles are costly to acquire (e.g. activity classification, animation and visualisation).
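    The abstract does not give implementation details, but the "baseball" example suggests a lookup that walks commonsense associations from a query word to the nearest animation template. As a minimal illustrative sketch (the graph contents, template set, similarity decay, and all function names here are hypothetical, loosely inspired by ConceptNet-style assertions):

```python
from collections import deque

# Hypothetical miniature commonsense graph: each concept maps to
# related concepts (e.g. baseball is associated with jogging).
ASSOCIATIONS = {
    "baseball": ["jogging", "throwing"],
    "jogging": ["running"],
    "squatting": ["sitting"],
}

# The small set of base animation templates the system holds.
TEMPLATES = {"running": "Run", "walking": "Walk", "sitting": "Sit"}

def closest_template(word, decay=0.5):
    """Breadth-first search from the query word to the nearest
    template. Returns (animation, similarity), where similarity
    decays with each association hop; 1.0 is an exact match."""
    seen = {word}
    queue = deque([(word, 1.0)])
    while queue:
        concept, score = queue.popleft()
        if concept in TEMPLATES:
            return TEMPLATES[concept], score
        for neighbour in ASSOCIATIONS.get(concept, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, score * decay))
    return None, 0.0  # no template reachable from this word

# "baseball" -> "jogging" -> "running": two hops from a template
print(closest_template("baseball"))  # ('Run', 0.25)
```

    The decaying score mirrors the paper's idea of reporting the degree to which a new activity resembles a template profile: a direct template match scores 1.0, while words linked only through several association hops score lower.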




    Print copies of EVA 2014 (ISBN 978-1-78017-285-9, RRP £85) are available from the BCS bookshop.