Classification of human motion based on affective state descriptors


Cimen G., İlhan H., Capin T., Gürçay H.

COMPUTER ANIMATION AND VIRTUAL WORLDS, vol.24, pp.355-363, 2013 (SCI-Expanded)

  • Publication Type: Article / Editorial Material
  • Volume: 24
  • Publication Date: 2013
  • DOI Number: 10.1002/cav.1509
  • Journal Name: COMPUTER ANIMATION AND VIRTUAL WORLDS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.355-363
  • Hacettepe University Affiliated: Yes

Abstract

Human body movements and postures carry emotion-specific information. Motivated by this observation, the objective of this study is to analyze this information in the spatial and temporal structure of motion capture data and to extract features that are indicative of certain emotions in terms of affective state descriptors. Our contribution comprises identifying descriptors that relate directly or indirectly to emotion classification in human motion and conducting a comprehensive analysis of these descriptors (features), which fall into three categories: posture descriptors, dynamic descriptors, and frequency-based descriptors. We measure the performance of each category with respect to predicting the affective state of an input motion. The classification results demonstrate that no single category is sufficient by itself; the best prediction performance is achieved when all categories are combined. Copyright (c) 2013 John Wiley & Sons, Ltd.
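To make the three descriptor categories concrete, the sketch below shows one hypothetical way such features could be computed from motion capture data and combined for classification. The specific feature definitions (bounding-box extent, mean joint speed and acceleration, low-frequency energy of the root trajectory) and the SVM classifier are illustrative assumptions, not the descriptors or classifier used in the paper.

```python
# Hypothetical sketch: posture, dynamic, and frequency-based descriptors
# combined into one feature vector for affect classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def posture_descriptors(motion):
    # motion: array of shape (frames, joints, 3) with joint positions.
    # Example posture cue: mean spatial extent (bounding-box volume) of the pose.
    extent = motion.max(axis=1) - motion.min(axis=1)        # (frames, 3)
    return np.array([extent.prod(axis=1).mean()])

def dynamic_descriptors(motion, fps=30.0):
    # Example dynamic cues: average joint speed and acceleration magnitude.
    vel = np.diff(motion, axis=0) * fps
    acc = np.diff(vel, axis=0) * fps
    return np.array([np.linalg.norm(vel, axis=2).mean(),
                     np.linalg.norm(acc, axis=2).mean()])

def frequency_descriptors(motion, n_bins=4):
    # Example frequency cues: energy in the low-frequency bands of the
    # root trajectory (joint 0 assumed to be the root).
    root = motion[:, 0, :]
    spectrum = np.abs(np.fft.rfft(root - root.mean(axis=0), axis=0))
    energy = (spectrum ** 2).sum(axis=1)
    return energy[1:1 + n_bins]                              # skip the DC component

def combined_features(motion):
    # The paper's main finding is that combining all categories works best,
    # so the feature vector concatenates the three groups.
    return np.concatenate([posture_descriptors(motion),
                           dynamic_descriptors(motion),
                           frequency_descriptors(motion)])

# Usage sketch with synthetic data standing in for real motion capture clips.
rng = np.random.default_rng(0)
clips = [rng.normal(size=(120, 20, 3)) for _ in range(40)]   # 40 clips, 20 joints
labels = rng.integers(0, 4, size=40)                         # 4 affective states
X = np.stack([combined_features(c) for c in clips])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)
print(clf.predict(X[:5]))
```

Dropping any one of the three helper functions from `combined_features` would mimic the single-category baselines that the abstract reports as insufficient on their own.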