Probabilistic detection of pointing directions for human-robot interaction



Shukla D., Erkent Ö., Piater J.

International Conference on Digital Image Computing: Techniques and Applications, Adelaide, Australia, 23-25 November 2015, pp. 601-608

  • Publication Type: Conference Paper / Full-Text Paper
  • Volume:
  • DOI: 10.1109/dicta.2015.7371296
  • City of Publication: Adelaide
  • Country of Publication: Australia
  • Pages: pp. 601-608
  • Hacettepe University Affiliated: No

Abstract

Deictic gestures, i.e. pointing at things in human-human collaborative tasks, constitute a pervasive, non-verbal means of communication, used for example to direct attention towards objects of interest. In a human-robot interaction scenario, a key requirement for delegating tasks from a human to a robot is to recognize the pointing gesture and estimate its pose. Standard approaches rely on full-body or partial-body posture estimation to detect the pointing direction. We present a probabilistic, appearance-based object detection framework that detects pointing gestures and robustly estimates the pointing direction, without assuming any human kinematic model. We propose a functional model of pointing that incorporates two pointing types: finger pointing and tool pointing, i.e. pointing with an object held in the hand. We evaluate our method on a new dataset with 9 participants pointing at 10 objects.
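The abstract gives no implementation details, so the following is only a generic illustration of the last step such a pipeline needs, not the authors' method: once a pointing origin and direction have been estimated, candidate objects can be scored probabilistically by how far they deviate angularly from the pointing ray. The function name, the von Mises-style angular weighting, and the concentration parameter `kappa` are all assumptions made for this sketch.

```python
import math

def point_to_object_posterior(origin, direction, objects, kappa=10.0):
    """Score candidate objects against an estimated pointing ray.

    Illustrative sketch (not the paper's algorithm): each object's
    likelihood decays with the angle between the pointing direction
    and the origin->object vector, via a von Mises-style weight
    exp(kappa * cos(angle)); scores are normalised to a posterior.
    """
    # Normalise the pointing direction to a unit vector.
    norm = math.sqrt(sum(d * d for d in direction))
    d = [x / norm for x in direction]

    scores = []
    for obj in objects:
        # Vector from the pointing origin to the candidate object.
        v = [o - p for o, p in zip(obj, origin)]
        vnorm = math.sqrt(sum(x * x for x in v))
        # Cosine of the angle between the ray and the object direction.
        cos_angle = sum(a * b for a, b in zip(d, v)) / vnorm
        scores.append(math.exp(kappa * cos_angle))

    # Normalise so the scores form a distribution over the candidates.
    total = sum(scores)
    return [s / total for s in scores]
```

With an origin at the hand and a direction along the forearm or tool axis, the object with the highest posterior would be taken as the pointing target; larger `kappa` concentrates the distribution more sharply on objects close to the ray.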