
Human Glove Wearable Interface for Motion Capture

Based on a data glove device as its sensing unit, the component captures motion data from sensors to record full-body motion, helping to preserve the spontaneity of gestures when users interact with virtual environments.
Processing of finger posture covers flexion (and optionally ad/abduction), together with the ad/abduction and flexion/extension of the wrist and the acceleration and angular velocity of the forearm, acquired from the instrumented wireless Humanglove wearable device, which provides up to 22 high-accuracy joint-angle measurements.
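As an illustration only, the following sketch shows how one sample from such a device might be represented in software; the field names, units and ordering are assumptions made for the example and do not reflect the actual driver API.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class GloveSample:
        """One sample from the wearable glove (hypothetical field layout)."""
        timestamp: float            # seconds since the start of the capture
        joint_angles: List[float]   # up to 22 joint-angle measurements, degrees
        wrist_flexion: float        # wrist flexion/extension, degrees
        wrist_abduction: float      # wrist ad/abduction, degrees
        forearm_accel: List[float]  # forearm acceleration [ax, ay, az], m/s^2
        forearm_gyro: List[float]   # forearm angular velocity [wx, wy, wz], deg/s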
Kinematic data about the arm/forearm movement and its acceleration are integrated to analyze the user’s activity (either directly from the kinematic data or by differentiating the acceleration to obtain the motion jerk): the distance of the hand from the body, the hand speed, the hand acceleration and the hand jerk are used to detect implicitly communicated affect. This information can be mapped into the PAD (Pleasure-Arousal-Dominance) space and fused with other modalities.
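A minimal sketch of how these features could be derived is shown below; it assumes NumPy arrays of hand positions (relative to the body) and accelerations sampled at a fixed interval, and it is not the component's actual implementation.

    import numpy as np

    def hand_motion_features(positions, accel, dt):
        """Compute hand-to-body distance, speed, acceleration magnitude and jerk.

        positions : (N, 3) hand positions relative to the body, metres
        accel     : (N, 3) hand/forearm accelerations, m/s^2
        dt        : sampling interval, seconds
        """
        positions = np.asarray(positions, dtype=float)
        accel = np.asarray(accel, dtype=float)

        distance = np.linalg.norm(positions, axis=1)   # hand-to-body distance per sample
        velocity = np.gradient(positions, dt, axis=0)  # numerical derivative of position
        speed = np.linalg.norm(velocity, axis=1)
        accel_mag = np.linalg.norm(accel, axis=1)
        jerk = np.gradient(accel, dt, axis=0)          # differentiating acceleration gives jerk
        jerk_mag = np.linalg.norm(jerk, axis=1)
        return distance, speed, accel_mag, jerk_mag

For instance, high speed, acceleration and jerk values could be mapped to higher arousal in the PAD space; the specific mapping and fusion scheme used by the component is not detailed here.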
The component can be used for virtual agent animation both in real-time live performances and in off-line processing and rendering (see poster). The resulting information can be used for puppeteering, for biomedical research, as an advanced human-computer interface, and to some extent to extract emotions from gestures and kinematic features.


Availability of the component: Reference contact is Andrea Scoglio of Humanware srl.
Notes: The Humanglove hardware device is a patented product owned by Humanware srl; it can be purchased by contacting the manufacturer directly. Software and drivers are open source.


Besides functional tests, experimentation with the component was carried out in CALLAS in the Emotional Character, a Proof-of-Concept application where the component is used to record a large user data corpus from puppeteering sessions.
Watch the video with real actors on stage interacting with the audience through the glove with the Emotional Character EUCLIDE.

The component was also evaluated by the University of Augsburg for use in conjunction with its Alfred virtual puppet demonstrator. See also the reference paper:

Simplified Facial Animation Control Utilizing Novel Input Devices: A Comparative Study (abstract)