Perception & Psychophysics
1991, 50 (6), 524-536

Integrating speech information across talkers, gender, and sensory modality: Female faces and male voices in the McGurk effect

KERRY P. GREEN
University of Arizona, Tucson, Arizona

and

PATRICIA K. KUHL, ANDREW N. MELTZOFF, and ERICA B. STEVENS
University of Washington, Seattle, Washington

Studies of the McGurk effect have shown that when discrepant phonetic information is delivered to the auditory and visual modalities, the information is combined into a new percept not originally presented to either modality. In typical experiments, the auditory and visual speech signals are generated by the same talker. The present experiment examined whether a discrepancy in the gender of the talker between the auditory and visual signals would influence the magnitude of the McGurk effect. A male talker's voice was dubbed onto a videotape containing a female talker's face, and vice versa. The gender-incongruent videotapes were compared with gender-congruent videotapes, in which a male talker's voice was dubbed onto a male face and a female talker's voice was dubbed onto a female face. Even though there was a clear incompatibility in talker characteristics between the auditory and visual signals on the incongruent videotapes, the resulting magnitude of the McGurk effect was not significantly different for the incongruent as opposed to the congruent videotapes. The results indicate that the mechanism for integrating speech information from the auditory and the visual modalities is not disrupted by a gender incompatibility even when it is perceptually apparent. The findings are compatible with the theoretical notion that information about voice characteristics of the talker is extracted and used to normalize the speech signal at an early stage of phonetic processing, prior to the integration of the auditory and the visual information.
Over the past four decades, extensive research has been done on the psychological processes underlying the perception and production of spoken language. Much of this research has focused on how the listener processes the acoustic structure of speech in order to arrive at the intended meaning of an utterance.

Although speech perception has primarily been considered an auditory process, recent studies have shown that visual information provided by movements of a talker's mouth and face strongly influences what an observer perceives (Green & Kuhl, 1989, 1991; Green & Miller, 1985; MacDonald & McGurk, 1978; Massaro & Cohen, 1983; McGurk & MacDonald, 1976; Reisberg, McLean, & Goldfield, 1987; Summerfield & McGrath, 1984). A par-

Author note: This research was supported by National Institutes of Health Grant NS-26475 to Kerry P. Green and National Institutes of Health Grant HD-18286 to Patricia K. Kuhl. We would like to thank Virginia Mann and an anonymous reviewer for helpful comments on a previous version of the manuscript. A portion of these data were presented at the spring meeting of the Acoustical Society of America, State College, Pennsylvania, 1990. Correspondence concerning the article should be addressed to Kerry P. Green, Cognitive Science, Psychology Building,