We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Amy L. Hubbard is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score
2007 — 2008 | Hubbard, Amy Lynn | F31 (Activity Code Description: To provide predoctoral individuals with supervised research training in specified health and health-related areas leading toward the research degree, e.g., Ph.D.)
Multisensory Integration of Speech and Gesture in the Normal and Autistic Brain @ University of California Los Angeles
DESCRIPTION (provided by applicant): Engaging in face-to-face communication involves the integration of co-occurring auditory and visual cues displayed by the entire bodies of speakers. The proposed research will examine how the brain integrates two of these cues, speech and beat gesture (an important subcategory of hand gesture that represents speech prosody), during natural social communication in typically developing children, normal adults, and children with autistic spectrum disorder (ASD). First, normative patterns in the typically developing brain, developmental endpoints, and multisensory integration sites of speech and beat gesture will be characterized. Typically developing children and normal adults will undergo functional magnetic resonance imaging (fMRI) while viewing extracts of a video recording where speech is accompanied by naturally-occurring beat gesture, random meaningless movements with speech, and a still body with speech. Comparing brain activation in response to viewing these three audiovisual combinations will reveal the brain areas that respond more significantly to speech when it is accompanied by naturally-occurring beat gesture, as well as the neural areas underlying multisensory integration of speech-related audio and visual information. Second, children with ASD and typically developing matched controls will undergo fMRI while viewing the same gesture stimuli. Comparisons between these groups will allow us to assess whether children with ASD, who have well-documented deficits in social communication and production of naturalistic speech prosody, utilize abnormal networks and/or demonstrate significant differences in activation when integrating speech and beat gesture. As the first fMRI study of natural speech-accompanying beat gesture, the proposed research will have important implications for public health.
These findings will not only lead to a better understanding of the neural foundations of ASD but will also inform developmental interventions for ASD. In addition, the findings will impact speech therapy techniques for a variety of language and communication disorders.
Matching score: 0.942
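The matching score shown with each grant comes from the site's algorithm, whose internals are not documented here. As a purely illustrative sketch (the function name, normalization, and use of string similarity below are assumptions, not the actual method), a score like this could be derived from fuzzy similarity between the grant's recipient name and the researcher's name:

```python
# Hypothetical sketch of a grant-to-researcher name-matching score.
# The site's real algorithm is not documented here; this only
# illustrates the idea of a similarity score in [0, 1].
from difflib import SequenceMatcher


def name_similarity(a: str, b: str) -> float:
    """Return a similarity score in [0, 1] between two names.

    Names are normalized (lowercased, commas dropped, whitespace
    collapsed) before comparison, so "Hubbard, Amy Lynn" and
    "amy lynn hubbard" compare more fairly.
    """
    def norm(s: str) -> str:
        return " ".join(s.lower().replace(",", "").split())

    return SequenceMatcher(None, norm(a), norm(b)).ratio()


# Compare the grant's listed recipient against the researcher profile.
score = name_similarity("Hubbard, Amy Lynn", "Amy L. Hubbard")
```

A production matcher would likely also weigh institution, award years, and co-author overlap, and treat scores above some threshold as "high-probability" matches.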