Area:
Auditory system, speech motor system, speech pathology
Website:
http://scai.io/
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to study how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Shanqing Cai is the likely recipient of the following grants.
Years: 2011 — 2012
Recipients: Perkell, Joseph; Cai, Shanqing
Activity code: N/A (no activity code was retrieved; click on the grant title for more information)
Title: Doctoral Dissertation Research: Online Control of Multisyllabic Speech Articulation Based On Auditory Feedback @ Massachusetts Institute of Technology
For normal adult talkers, articulating speech feels effortless and requires little, if any, conscious thought. However, this subjective impression belies the complexity of the neural processes underlying speech production. During speech, the articulators, including the tongue, the lips and the jaw, move in highly complex ways to achieve the rapidly alternating sequences of desired speech sounds. Our current knowledge of the brain mechanisms for the control of speech articulation remains primitive. This project aims to advance this knowledge by focusing on an important component of the speech system: auditory feedback. Auditory feedback refers to the sound of speech heard by the talker him/herself when speaking. It has been shown to participate in correcting articulatory deviations online (with very short delays) during the production of single phonemes. However, it remains unclear how auditory feedback is used by the speech motor system to control the production of multisyllabic connected speech, the most frequently occurring type of speech in real-life communication. This is the main issue that this project will address.

A computer setup based on fast digital audio processing will be used to introduce subtle perturbations of the auditory feedback of specific acoustic properties of vowel sequences while the subject utters a simple sentence. Two types of perturbations will be used to investigate separately the spatial (frequency-related, or 'spectral') and temporal aspects of feedback-based articulatory control. The compensatory changes in the subjects' articulation will be measured indirectly by extracting changes in acoustic parameters from audio recordings of the subjects' speech. In addition to characterizing the role of auditory feedback in the online control of the spatiotemporal trajectories of articulation, the project will also use functional MRI to map the brain regions involved in this feedback control process.
The knowledge gained from this project will expand our understanding of the normal process of speech production, and will also provide clues for future research into the mechanisms of speech production disorders, such as stuttering.
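The spectral ('frequency-related') perturbation described in the abstract can be caricatured offline in a few lines. This is a toy sketch only, not the project's real-time feedback apparatus: the function `perturb_spectrum`, the magnitude-interpolation approach, and the 10% upward shift are illustrative assumptions, and real systems shift specific formants with millisecond-scale latency rather than the whole spectrum of a buffered frame.

```python
import numpy as np

def perturb_spectrum(frame: np.ndarray, shift_ratio: float, sr: int = 16000) -> np.ndarray:
    """Toy spectral perturbation: scale the frequencies of one audio frame
    by shift_ratio (e.g. 1.1 raises them by ~10%). Hypothetical stand-in
    for real-time formant perturbation."""
    n = len(frame)
    spec = np.fft.rfft(frame)
    freqs = np.fft.rfftfreq(n, 1 / sr)
    # Sample the original magnitude spectrum at f / shift_ratio, which
    # moves spectral energy upward by a factor of shift_ratio.
    shifted_mag = np.interp(freqs / shift_ratio, freqs, np.abs(spec))
    # Reuse the original phase; magnitude-only manipulation keeps it simple.
    new_spec = shifted_mag * np.exp(1j * np.angle(spec))
    return np.fft.irfft(new_spec, n)

# A 200 Hz tone perturbed upward by 10% should peak near 220 Hz.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 200 * t)
out = perturb_spectrum(tone, 1.1, sr)
peak_hz = np.argmax(np.abs(np.fft.rfft(out))) * sr / len(out)
```

In the actual paradigm the talker hears the perturbed signal through headphones while speaking, and compensation is read out from formant trajectories in the recorded speech.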
Matching score: 1
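The matching score in the table comes from the site's matching algorithm, whose details are not published here. A minimal stand-in, assuming a simple normalized name-similarity measure, might look like the following; the function name and scoring rule are illustrative only:

```python
from difflib import SequenceMatcher

def name_match_score(grant_pi: str, researcher: str) -> float:
    """Hypothetical grant-to-researcher matcher: a fuzzy similarity
    between two name strings, normalized for ordering and punctuation."""
    def normalize(name: str) -> str:
        # Lowercase, drop commas, and sort the name parts so that
        # "Cai, Shanqing" and "Shanqing Cai" normalize identically.
        parts = name.lower().replace(",", " ").split()
        return " ".join(sorted(parts))
    return SequenceMatcher(None, normalize(grant_pi), normalize(researcher)).ratio()

print(name_match_score("Cai, Shanqing", "Shanqing Cai"))    # 1.0
print(name_match_score("Perkell, Joseph", "Shanqing Cai"))  # well below 1.0
```

A production system would additionally weigh institution, grant year, and co-investigator overlap before assigning a high-probability match.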