Area:
Brain-computer interface
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Emily M. Mugler is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score
2016 — 2017 |
Mugler, Emily M |
F32 (Activity Code Description: To provide postdoctoral research training to individuals to broaden their scientific background and extend their potential for research in specified health-related areas.) |
The Cortical Dynamics of Motor Activity During Speech Production @ Northwestern University At Chicago
Speech is critical for human communication, but the cortical control of the fundamental movements of speech production is not clearly understood. Current knowledge primarily stems from functional imaging and lesion studies, which lack the ability to observe rapid changes in activity. Advances in electrocorticography (ECoG) now enable investigation of rapid changes over multiple cortical regions. Early ECoG studies have found ventral motor cortex (M1) activity broadly related to large vocal pitch change and loosely correlated with production of speech sounds, or phonemes. I have demonstrated that even single instances of phonemes can be identified during word production from patterns of activity in M1. However, motor control studies show M1 activity primarily represents movements. Correspondingly, the movements of speech, such as those of the laryngeal musculature involved in vocal pitch, are well understood. Similarly, phonology literature suggests the basic units of speech production are simple articulator movements, or gestures, such as tongue tip closure. Despite strong evidence for movement representation in M1, speech movements have yet to be investigated with ECoG. The objective of this proposal is to elucidate how the motor cortices control speech movements of articulators and vocal pitch. The research aims of this proposal are to determine the cortical representation of articulatory gestures and vocal pitch during speech and non-speech movements. I will decode high-gamma activity (70-200 Hz) of M1, premotor cortex (PM) and inferior frontal gyrus (IFG) to estimate representation of these movements. Decoding accuracy will quantify the extent to which gestures are represented in each cortical area during speech and non-speech movements. Preliminary results reveal distinct cortical signatures in M1 related to gesture production. Completion of these goals will result in a model of cortical representation of speech movements.
This work will enable substantiation of neurophysiological theories, lead to practical applications for patients, and enable investigations of motor activity during higher-order processes of speech. Moreover, we enable comparison of speech to other motor control processes. Results will improve understanding of the cortical representation of the larynx, which remains poorly understood in human physiology research.
|
0.967 |