Perception of the self-voice and other voices during speech motor control
Processing of self-generated auditory feedback is known to play a critical role in speech motor control, as demonstrated using the altered auditory feedback paradigm: speakers exposed to a predictable, sustained perturbation of their real-time speech auditory feedback (e.g., a shift in the first or second formant) gradually adapt to it, for example by shifting their produced formant frequencies in the direction opposite to the perturbation. Less is known, however, about the impact of simultaneously perceiving other voices on such speech motor adaptation, and about whether adaptation is robust across different speaking contexts. In particular, although speech is typically a social act, few previous studies have examined speech adaptation in contexts involving a social element. In this talk, I will first present some of my work on speech motor control during synchronous speech: the act of speaking in time with another speaker. Through a series of studies, this research has shown that (1) synchronous speech induces vocal convergence (that is, it causes a participant's voice to become acoustically more similar to that of their synchronisation partner), (2) synchronous speech affects speech motor adaptation to simultaneous formant perturbations, and (3) the effect of synchronous speech on speech motor adaptation depends on whether convergence aligns or conflicts with that adaptation, and not simply on auditory masking.

In the second part of my talk, I will present my ongoing work investigating speech motor control and explicit reports of perceptual experience of the self-voice when speaking with degraded auditory feedback. This degradation is implemented using real-time noise vocoding, a technique that allows the level of spectral detail in the speech signal to be varied, providing an (incomplete) simulation of speech perception in individuals with cochlear implants.
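To illustrate the noise-vocoding technique mentioned above, the following is a minimal offline sketch, not the actual real-time implementation used in this work; the function name, band count, and filter settings are illustrative assumptions. The signal is split into frequency bands, each band's amplitude envelope is extracted, and the envelopes are used to modulate band-limited noise, so that fewer bands means less spectral detail:

```python
# Hypothetical sketch of noise (channel) vocoding, not the author's code.
import numpy as np
from scipy.signal import butter, sosfilt, sosfiltfilt

def noise_vocode(x, fs, n_bands=4, lo=100.0, hi=7000.0):
    """Replace spectral detail in x with band-limited noise, keeping only
    the per-band amplitude envelopes (n_bands controls spectral detail)."""
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(len(x))
    # Band edges spaced logarithmically between lo and hi Hz.
    edges = np.geomspace(lo, hi, n_bands + 1)
    # Low-pass filter (30 Hz) used to smooth the amplitude envelopes.
    env_lp = butter(2, 30.0, btype="low", fs=fs, output="sos")
    out = np.zeros_like(x)
    for f1, f2 in zip(edges[:-1], edges[1:]):
        band = butter(4, [f1, f2], btype="band", fs=fs, output="sos")
        env = sosfiltfilt(env_lp, np.abs(sosfilt(band, x)))  # amplitude envelope
        carrier = sosfilt(band, noise)                        # noise in the same band
        out += np.clip(env, 0.0, None) * carrier
    return out

# Toy input: an amplitude-modulated 200 Hz tone standing in for speech.
fs = 16000
t = np.arange(fs) / fs
speechlike = np.sin(2 * np.pi * 200 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
vocoded = noise_vocode(speechlike, fs, n_bands=4)
```

In a real-time system the same idea is applied to short buffers of microphone input with causal envelope filters, so the speaker hears the vocoded version of their own voice with minimal delay.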
This work will investigate whether speech motor compensation for formant perturbations remains intact when speaking with noise-vocoded feedback, as well as the effects of both the expectedness and the clarity of speech feedback on explicit reports of perceptual experience of the self-voice. I will discuss the implications for the relationship between implicit motor control and conscious perception of self-produced speech, and for speech motor control in individuals with cochlear implants.