dc.contributor.author	Ho, Danyuan
dc.date.accessioned	2016-05-11T02:02:20Z
dc.date.available	2016-05-11T02:02:20Z
dc.date.issued	2016
dc.identifier.uri	http://hdl.handle.net/10356/67032
dc.description.abstract	The role of visual information in speech perception has not been adequately addressed in the current exemplar-based approach, despite evidence showing that speech perception is multimodal in nature. This study investigates the role of visual information in speech perception by examining the co-encoding of visual and auditory information. Through a lexical decision paradigm, participants were repeatedly exposed to audio-visual targets consisting of sound tokens co-presented with non-linguistic visual cues. The exemplar-based approach predicts an interaction between the auditory and visual cues due to the concurrent processing of both modalities. However, the visual cues showed no effect in the experiment, suggesting that there was little to no co-encoding of visual and auditory information. The lack of perceptual salience and the linguistic remoteness of the visual stimuli are postulated as likely factors behind the absence of a visual-cue effect. Taken together, an attentional mechanism that filters information is proposed to refine the current model.	en_US
dc.format.extent	99 p.	en_US
dc.language.iso	en	en_US
dc.subject	DRNTU::Humanities::Linguistics::Phonology	en_US
dc.subject	DRNTU::Humanities::Linguistics::Psycholinguistics	en_US
dc.title	Seeing sounds around you : non-linguistic visual information and speech perception	en_US
dc.type	Thesis
dc.contributor.supervisor	James Sneed German	en_US
dc.contributor.school	School of Humanities and Social Sciences	en_US
dc.description.degree	Master of Arts	en_US

