P45
Session 1 (Thursday 11 January 2024, 15:35-18:00)
Co-speech and listening gestures during casual conversation in a noisy situation
In face-to-face communication, people gesture while they speak, but less is known about gestures during listening. In the current work, we test whether body movement behavior during conversation reflects the difficulties caused by an increased noise level. We developed a categorization system for conversational body movements that characterizes movements on a physical level in terms of head, arm, leg, and trunk movements. We conducted an experiment in which groups of three held casual face-to-face conversations while standing in a noisy audio-visual scene of an underground station, created by the real-time Simulated Open Field Environment (rtSOFE). The full-body movements of one of the three participants were recorded with a motion capture system, and the participants' speech was reverberated and recorded. In a preliminary analysis, the motion capture data were labeled by one observer. We observed an increase in palm swipe gestures and complex hand gestures while the participant was speaking, and an increase in contractive postures while they were not speaking. Nodding head movements increased slightly during listening. The presence of noise (72 dB SPL) had only a slight influence on the relative occurrence of movement categories compared with the No Noise condition. Body posture and head nods are typical backchannel cues during conversation, which is in line with our statistical analysis. The present categorization approach to conversational body movements can complement existing linguistic tools for analyzing personal communication from an audiological perspective.
Funding: This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – Project ID 352015383 – SFB 1330, Project C5, and by the Bundesministerium für Bildung und Forschung (BMBF) (grant number 01 GQ 1004B).