Area:
Speech Pathology, Cognitive Psychology, Psychobiology, Physiological Psychology
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to study how funding patterns influence mentorship networks and vice versa, which has significant implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Susan Shaiman is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score
2005 — 2007 | Shaiman, Susan | R03 | Kinematic Comparison of Speech and Nonspeech Oral Tasks @ University of Pittsburgh at Pittsburgh | 1

Activity Code Description (R03): To provide research support specifically limited in time and amount for studies in categorical program areas. Small grants provide flexibility for initiating studies which are generally for preliminary short-term projects and are non-renewable.

DESCRIPTION (provided by applicant): Nonspeech oral movements are commonly used in the evaluation and treatment of motor speech disorders. This approach is based on the assumption that the behavior of these structures provides insight into the underlying motor control mechanisms and neurophysiological dysfunction; however, the use of nonspeech movements is a contentious issue. While often criticized, there is little basic neuromotor evidence to either support or refute this assumption. The long-term objective of this research is to evaluate the hypothesis that speech movements and volitional nonspeech oral movements share underlying motor control mechanisms.

Previous tasks selected to examine the relationship between speech and nonspeech oral motor control have exhibited substantive differences in organization and complexity that limit the extent to which a legitimate comparison can be made. This research attempts to more closely equate speech and nonspeech oral gestures at comparable levels of organization and complexity. A motor learning paradigm of nonspeech tasks will be implemented that considers the goals of movement (e.g., intraoral air pressure, labial displacement), the overlapping nature of sequential gestures, and the degree of learning and automaticity. Articulatory kinematic characteristics of speech and nonspeech gestures will be compared both prior to and subsequent to implementation of the motor learning paradigm. Additionally, generalization of the highly learned nonspeech tasks to other nonspeech tasks of varying similarity will be evaluated.

Once the associations and disparities between speech and nonspeech oral movements have been explored, future research will evaluate whether complex, coordinated nonspeech tasks trained to automaticity generalize to speech tasks. Transfer of learning from nonspeech behaviors to speech production would implicate a common or related control process for the two motor behaviors.