
The acoustics of eye contact – Detecting visual attention from conversational audio cues

Contributing authors from JOANNEUM RESEARCH:
Authors:
Eyben, Florian; Weninger, Felix; Schuller, Bjoern; Paletta, Lucas
Abstract:
An important aspect of short dialogues is attention, as manifested by eye contact between subjects. In this study we provide a first analysis of whether such visual attention is evident in the acoustic properties of a speaker's voice. We thereby introduce the multi-modal GRAS2 corpus, which was recorded for analysing attention in short daily-life human-to-human interactions with strangers in public places in Graz, Austria. The corpus contains recordings of four test subjects equipped with eye-tracking glasses, three audio recording devices, and motion sensors. We describe how we robustly identify speech segments from the subjects and other people in an unsupervised manner from the multi-channel recordings. We then discuss correlations between the acoustics of the voice in these segments and the point of visual attention of the subjects. A significant relation is found between the acoustic features and the distance between the point of view and the eye region of the dialogue partner. Further, we show that automatic binary classification of eye contact vs. no eye contact from acoustic features alone is feasible with an Unweighted Average Recall of up to 70%.
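
For reference, the Unweighted Average Recall (UAR) reported in the abstract is the arithmetic mean of the per-class recalls, so the less frequent class (e.g. eye-contact segments) counts as much as the frequent one. The following is a minimal illustrative sketch of the metric, not code from the paper; the labels are hypothetical.

```python
# Minimal sketch (not from the paper): UAR = mean of per-class recalls,
# so both classes (eye contact / no eye contact) are weighted equally
# regardless of how often they occur in the data.
from collections import defaultdict

def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recalls over all classes present in y_true."""
    correct = defaultdict(int)   # correctly predicted samples per class
    total = defaultdict(int)     # total samples per class
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    recalls = [correct[c] / total[c] for c in total]
    return sum(recalls) / len(recalls)

# Hypothetical labels: 1 = eye contact, 0 = no eye contact
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1]
print(unweighted_average_recall(y_true, y_pred))  # (0.75 + 0.5) / 2 = 0.625
```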
Title:
The acoustics of eye contact – Detecting visual attention from conversational audio cues
Pages:
7-12
Publication date:
2013-12

Publication series:

Address:
Sydney, Australia
Proceedings:
Proc. 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction (GAZE-IN 2013), held in conjunction with ACM ICMI 2013

