Gesture recognition from mobile phones

The component uses the mobile phone as a sensor: it extracts semantic and emotional cues from the way the user holds and moves the phone, mapping types of movement characterised by expressivity parameters (e.g. graceful movement, fast tempo) to different emotions.
Expressivity parameters combined with audio analysis can further improve the precision of estimating a user's mood, which is relevant, for example, in public installations. The recognition algorithm is lightweight and imposes only a small memory load on the phone, so the feature calculation can run as background processing.
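The original description does not publish the algorithm itself, but the idea of deriving expressivity parameters from phone movement can be sketched roughly as follows. This is a hypothetical illustration, not VTT's implementation: it assumes a stream of 3-axis accelerometer samples, estimates two simple expressivity cues (motion energy and movement tempo from zero crossings of the detrended magnitude), and maps them to a coarse mood label with illustrative thresholds.

```python
import math

def expressivity_features(samples, rate_hz):
    """Compute simple expressivity cues from 3-axis accelerometer samples.

    samples: list of (x, y, z) acceleration tuples; rate_hz: sampling rate.
    Returns (energy, tempo_hz): mean motion energy and a tempo estimate
    taken from zero crossings of the detrended magnitude signal.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    detrended = [m - mean for m in mags]          # remove gravity/offset
    energy = sum(d * d for d in detrended) / len(detrended)
    crossings = sum(1 for a, b in zip(detrended, detrended[1:]) if a * b < 0)
    # Each full oscillation of the hand produces two zero crossings.
    tempo_hz = crossings * rate_hz / (2.0 * len(detrended))
    return energy, tempo_hz

def coarse_mood(energy, tempo_hz, energy_thresh=0.5, tempo_thresh=2.0):
    """Map expressivity cues to a coarse mood label (thresholds are
    illustrative placeholders, not calibrated values)."""
    if energy > energy_thresh and tempo_hz > tempo_thresh:
        return "agitated"
    if energy > energy_thresh:
        return "energetic"
    if tempo_hz > tempo_thresh:
        return "nervous"
    return "calm"
```

Because the features are a single pass over the sample window with no buffering beyond the window itself, this kind of calculation is cheap enough to run continuously in the background, in line with the memory-load claim above.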


Availability of the component: the reference contact is Elena Vildjiounaite, VTT Technical Research Centre of Finland.


Last Updated on Monday, 17 May 2010 12:28