Area:
Visual-vestibular interaction
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Paul Ryan MacNeilage is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score
2017 — 2020 |
Macneilage, Paul Ryan |
P20
Activity Code Description: To support planning for new programs, expansion or modification of existing resources, and feasibility studies to explore various approaches to the development of interdisciplinary programs that offer potential solutions to problems of special significance to the mission of the NIH. These exploratory studies may lead to specialized or comprehensive centers. |
Project 8: Role of Motor Signals For Perception During Self-Motion @ University of Nevada Reno
PROJECT SUMMARY/ABSTRACT
Role of Motor Signals for Perception during Self-Motion

Sensing and reconstructing self-motion is essential for normal everyday function of most mobile organisms. In humans, visual and vestibular systems provide valuable sensory information for self-motion processing, but motor signals also contribute. In particular, oculomotor, neck-motor, and locomotor signals provide information about movement of the eye in the head, the head on the body, and the body through space. These signals serve multiple purposes: they allow transforming sensory estimates between eye, head, and body reference frames; they can be used to calibrate sensory signals; and once signals are calibrated, they can be integrated to yield improved sensory-motor estimates of self-motion. Research proposed here will advance our understanding of these sensory-motor interactions during self-motion in the context of the following specific aims.

Aim 1 is to characterize principles of sensory-motor comparison and adaptation for head movements. Sensory-motor conflict will be introduced by rotating the body under the head during active head turns to create a mismatch between the predicted and observed vestibular sensory signal. Perceptual sensitivity to the conflict and the rate of adaptation to the perturbed signal will be measured and compared with predictions of a probabilistic model.

Aim 2 is to examine how retinal speed signals are combined with eye and head movement signals to estimate object motion in the world. Prior psychophysics and modeling work has shown that perceived speed is reduced during pursuit and passive head movement, suggesting reduced perceptual gain on eye movement and vestibular signals. This project will extend these approaches to investigate the role of neck-motor signals during active head movement and interactions among neck-motor, oculomotor, and vestibular signals.

Aim 3 is to measure and model the impact of locomotor signals on perceptual thresholds. This project will test the hypothesis that previously observed suppression of vestibular signals during locomotion depends on head movement variability, and thus on the reliability of locomotor efference copy signals. Head movement variability will be measured and used to predict locomotion-induced changes in perceptual sensitivity to vestibular and other sensory stimuli at different speeds and at different time points during walking on a treadmill.
|
0.932 |
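The abstract's idea of combining calibrated sensory signals to yield improved estimates, with weighting that depends on each signal's reliability, is usually formalized as maximum-likelihood (inverse-variance weighted) cue integration. A minimal sketch of that standard model in Python; the function name and all numeric values below are hypothetical illustrations, not taken from the project:

```python
def integrate_cues(est_a, var_a, est_b, var_b):
    """Reliability-weighted combination of two independent Gaussian cues.

    Each cue is weighted by its inverse variance, so the less variable
    (more reliable) cue dominates, and the combined estimate is always
    at least as reliable as either cue alone.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    combined = w_a * est_a + (1.0 - w_a) * est_b
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return combined, combined_var

# Hypothetical visual and vestibular heading estimates (degrees):
heading, var = integrate_cues(10.0, 4.0, 14.0, 12.0)  # ≈ (11.0, 3.0)
```

Note that the combined variance (3.0) is smaller than either single-cue variance (4.0 or 12.0), which is the sense in which integration "yields improved estimates"; conversely, down-weighting an unreliable signal is one way to describe the suppression hypothesized in Aim 3.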