Classification of human motion based on affective state descriptors



Cimen G., İlhan H., Capin T., Gürçay H.

COMPUTER ANIMATION AND VIRTUAL WORLDS, vol.24, pp.355-363, 2013 (SCI-Expanded)

  • Publication Type: Article / Letter to the Editor
  • Volume: 24
  • Publication Date: 2013
  • DOI: 10.1002/cav.1509
  • Journal Name: COMPUTER ANIMATION AND VIRTUAL WORLDS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.355-363
  • Hacettepe University Affiliated: Yes

Abstract

Human body movements and postures carry emotion-specific information. Motivated by this observation, the objective of this study is to analyze this information in the spatial and temporal structure of motion capture data and to extract features that are indicative of certain emotions in terms of affective state descriptors. Our contribution comprises identifying descriptors that are directly or indirectly related to emotion classification in human motion and conducting a comprehensive analysis of these descriptors (features), which fall into three categories: posture descriptors, dynamic descriptors, and frequency-based descriptors, in order to measure their performance in predicting the affective state of an input motion. The classification results demonstrate that no single category is sufficient by itself; the best prediction performance is achieved when all categories are combined. Copyright (c) 2013 John Wiley & Sons, Ltd.
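To make the pipeline described in the abstract concrete, the sketch below shows one way to combine posture, dynamic, and frequency-based descriptors extracted from motion capture clips and feed them to a classifier. This is a minimal illustration, not the authors' implementation: the specific descriptor definitions, the SVM classifier, the 30 fps frame rate, and the synthetic data are all assumptions made for the example.

```python
# Illustrative sketch only: placeholder descriptors in the three categories
# named in the paper (posture, dynamic, frequency-based), concatenated and
# classified with an SVM. None of these feature definitions are the paper's.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score


def posture_descriptors(motion):
    # motion: (frames, joints, 3) joint positions.
    # Placeholder: mean pose plus overall bounding-box extent.
    mean_pose = motion.mean(axis=0).ravel()
    extent = motion.max(axis=(0, 1)) - motion.min(axis=(0, 1))
    return np.concatenate([mean_pose, extent])


def dynamic_descriptors(motion, dt=1.0 / 30.0):
    # Placeholder: mean speed and mean acceleration magnitude per joint.
    vel = np.diff(motion, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    speed = np.linalg.norm(vel, axis=2).mean(axis=0)
    acc_mag = np.linalg.norm(acc, axis=2).mean(axis=0)
    return np.concatenate([speed, acc_mag])


def frequency_descriptors(motion, n_bins=4):
    # Placeholder: low-frequency FFT magnitudes of each joint trajectory.
    spectrum = np.abs(np.fft.rfft(motion, axis=0))
    return spectrum[1:1 + n_bins].ravel()


def combined_features(motion):
    # Combining all three categories, as the paper's results suggest
    # works best.
    return np.concatenate([
        posture_descriptors(motion),
        dynamic_descriptors(motion),
        frequency_descriptors(motion),
    ])


# Synthetic stand-in data: 40 clips of 120 frames x 20 joints,
# with 4 affect classes (hypothetical, for demonstration only).
rng = np.random.default_rng(0)
clips = rng.normal(size=(40, 120, 20, 3))
labels = np.tile(np.arange(4), 10)

X = np.array([combined_features(c) for c in clips])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("mean CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

On real motion capture data one would replace the placeholder functions with the paper's actual descriptors and compare per-category performance against the combined feature vector, mirroring the comparison reported in the abstract.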