
Video-Based Gesture Expressivity Features Extraction

This video-based component detects and tracks the user's hands in order to extract and transmit expressivity feature values such as overall activation, spatial extent, temporality, fluidity and power. The processing is designed for low-resolution videos and identifies gestures representative of spontaneous emotional behaviour. The component handles issues related to changes in environment, context and lighting conditions. Playback and pre-recorded-input capabilities, as well as the set of extracted features, can be configured to support offline processing of affectively enriched human behaviour.
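The expressivity features listed above can be thought of as simple kinematic statistics of the tracked hand trajectory. The sketch below is an illustrative approximation only, not the component's actual algorithm: the function name, the per-frame (x, y) input format and every formula are assumptions made for the example.

```python
import math
import statistics

def expressivity_features(track, dt=1.0 / 25):
    """Illustrative expressivity features from one tracked hand trajectory.

    `track` is a list of (x, y) hand-centroid positions, one per video
    frame, sampled every `dt` seconds. The formulas are simplified
    stand-ins for the component's internal definitions.
    """
    # Frame-to-frame velocities, speeds and acceleration magnitudes.
    vel = [((x2 - x1) / dt, (y2 - y1) / dt)
           for (x1, y1), (x2, y2) in zip(track, track[1:])]
    speed = [math.hypot(vx, vy) for vx, vy in vel]
    acc = [math.hypot((v2x - v1x) / dt, (v2y - v1y) / dt)
           for (v1x, v1y), (v2x, v2y) in zip(vel, vel[1:])]

    xs, ys = zip(*track)
    return {
        # Overall activation: total amount of movement over the gesture.
        "activation": sum(speed) * dt,
        # Spatial extent: area of the bounding box swept by the hand.
        "spatial_extent": (max(xs) - min(xs)) * (max(ys) - min(ys)),
        # Temporality: mean speed as a proxy for gesture tempo.
        "temporal": sum(speed) / len(speed),
        # Power: peak acceleration reached during the gesture.
        "power": max(acc),
        # Fluidity: inverse of speed variation (smoother -> closer to 1).
        "fluidity": 1.0 / (1.0 + statistics.pstdev(speed)),
    }
```

A uniform rightward sweep at constant speed, for instance, yields maximal fluidity and zero power, while a jerky back-and-forth motion in a small region yields the opposite profile.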

Availability of the component: The reference contact is Amaryllis Raouzaiou of the Institute of Communication and Computer Systems - National Technical University of Athens.
Notes: Interested parties can request the component, which will be made available for non-commercial use.

Besides functional tests, the Video-Based Gesture Expressivity Features Extraction component has been extensively experimented with in CALLAS in:
  • the Musickiosk: the component is used in this affective edutainment Proof-of-Concept to let the user control the movements of the boy.
  • the Common Touch: a preliminary experiment with the component is under evaluation for this Scientific Showcase, which proposes interaction with a multitouch surface, to test the component's ability to receive the coordinates of the hands touching the surface and to produce a separate expressivity feature set for every touch session.
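Producing a separate feature set per touch session, as in the Common Touch experiment, amounts to grouping incoming touch coordinates by session and running the extractor once per group. The event format `(session_id, x, y)` and the function names below are hypothetical, chosen only to illustrate the idea.

```python
from collections import defaultdict

def features_per_session(touch_events, extractor):
    """Group touch coordinates by session and extract one feature set
    per touch session.

    `touch_events` is an iterable of (session_id, x, y) tuples, a
    hypothetical stand-in for the multitouch surface's event stream;
    `extractor` maps a list of (x, y) points to a feature set.
    """
    sessions = defaultdict(list)
    for session_id, x, y in touch_events:
        sessions[session_id].append((x, y))
    # One extractor call (hence one expressivity feature set) per session.
    return {sid: extractor(points) for sid, points in sessions.items()}
```

With this shape, the same trajectory-based extractor used for video hand tracks could be reused unchanged on each touch session's point sequence.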

Last Updated on Monday, 17 May 2010 12:20