Visual Speech Benefit in Auditory-Visual Speech Perception: Infants, Children, Hearing Impairment

PhD Student Project

Jessica is interested in investigating how visual speech information influences infants’ speech perception.

Visual speech cues, such as a speaker's mouth movements, help children and adults decode the speech stream more effectively, especially in noisy environments. Individuals with hearing impairment rely more heavily on visual cues and are better at processing visual information than their peers with normal hearing. Jessica's thesis examines how infants integrate auditory and visual cues in speech perception, and how visual speech information augments the ability to segment words from fluent speech, particularly in infants with hearing impairment.

Jessica's project has two main aims:

  1. to investigate if and how visual speech cues from a speaker's talking face may augment infants' and children's speech perception.
  2. to examine if the visual speech benefit differs between those with normal hearing and those with hearing impairment.

Jessica is recruiting 5-month-old, 7.5-month-old, and 4-year-old monolingual children, both with normal hearing and with hearing impairment, to participate in this study. Please register or contact Jessica directly to take part.


Jessica Tan

Academic Supervisors

Professor Denis Burnham

Dr Benjawan Kasisopa

Partner / Funding Body

Human Research Ethics Committee Approval Number: H11517

For more information or to register your interest, please contact:
Name: Jessica Tan
Phone: +61 2 9772 6535