Tool Use as Gesture: new challenges for maintenance and rehabilitation

People and Computers XXIV: Games are a Serious Business

Proceedings of HCI 2010
The 24th British HCI Group Annual Conference
University of Abertay, Dundee, UK

6 - 10 September 2010

AUTHORS

Manish Parekh and Chris Baber

ABSTRACT

There are many ways to capture human gestures. In this paper, consideration is given to an extension of the growing trend of using sensors to capture movements and interpret them as gestures. However, rather than placing sensors on people, the focus is on attaching sensors (i.e., strain gauges and accelerometers) to the tools that people use. By instrumenting a set of handles, which can be fitted with a variety of effectors (e.g., knives, forks, spoons, screwdrivers, spanners and saws), it is possible to capture both the variation in grip force applied to the handle as the tool is used and the movements made with the handle. These data can be sent wirelessly (using ZigBee) to a computer, where distinct patterns of movement can be classified. Different approaches to the classification of activity are considered. This provides a means of combining the use of real tools in physical space with the representation of actions on a computer. The approach could be used to capture actions during manual tasks, for example in maintenance work, or to support the development of movements, for example in rehabilitation.
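The abstract describes a pipeline of instrumented handles streaming grip-force and accelerometer data to a computer for activity classification, but does not specify the classifier used. The sketch below is purely illustrative: it assumes fixed-length windows of strain-gauge and accelerometer samples, summarises each window with a few hand-picked features, and labels it with a simple nearest-centroid rule. All function and class names here are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of classifying tool-use gestures from windows of
# grip-force (strain gauge) and accelerometer samples. The classifier
# (nearest centroid over simple window features) is an assumption made
# for illustration, not the authors' method.
import numpy as np


def extract_features(grip, accel):
    """Summarise one window of sensor data as a small feature vector.

    grip  -- 1-D array of strain-gauge readings (grip force)
    accel -- 2-D array, shape (n_samples, 3), of accelerometer readings
    """
    accel_magnitude = np.sqrt((accel ** 2).sum(axis=1))
    return np.array([
        grip.mean(),                 # average grip force over the window
        grip.std(),                  # variability of grip force
        accel_magnitude.mean(),      # mean acceleration magnitude
        accel.std(axis=0).mean(),    # average per-axis movement variability
    ])


class NearestCentroidClassifier:
    """Assign a window to the activity whose mean feature vector is closest."""

    def fit(self, feature_vectors, labels):
        self.centroids = {
            label: np.mean(
                [f for f, l in zip(feature_vectors, labels) if l == label], axis=0)
            for label in set(labels)
        }
        return self

    def predict(self, feature_vector):
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(feature_vector - self.centroids[lbl]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def fake_window(force, vibration):
        # Synthetic stand-in for one second of handle data (no real hardware).
        grip = force + 0.1 * rng.standard_normal(100)
        accel = vibration * rng.standard_normal((100, 3))
        return extract_features(grip, accel)

    # Two made-up activities: "sawing" (high force, high vibration)
    # and "stirring" (low force, low vibration).
    train = [fake_window(5.0, 2.0) for _ in range(10)] + \
            [fake_window(1.0, 0.3) for _ in range(10)]
    labels = ["sawing"] * 10 + ["stirring"] * 10

    clf = NearestCentroidClassifier().fit(train, labels)
    print(clf.predict(fake_window(4.8, 1.9)))   # expected: "sawing"
```

In practice the choice of features and classifier would depend on the tools, effectors and activities of interest; the paper itself compares different approaches to classification.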

PAPER FORMATS

PDF Version of this Paper (970kb)