We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks, and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and to correct any errors in the links between grants and researchers.
High-probability grants
According to our matching algorithm, Julie D. Golomb is the likely recipient of the following grants.
Years: 2015 — 2021
Recipients: Golomb, Julie D
Code: R01 (Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.)
Title: Neural and Perceptual Mechanisms of Spatial Stability Across Eye Movements
DESCRIPTION (provided by applicant): The overarching goal of this research is to better understand the human visual system, and how objects and their locations are perceived and represented in the brain. The proposal investigates a fundamental challenge for our visual systems: Visual information is coded relative to the eyes, but the eyes are constantly moving. How, then, do we achieve spatial stability? The world does not appear to jump with each eye movement, but this seamless percept belies a complicated computational process. Prior work by the PI has made significant first steps in understanding how spatial attention and memory are represented (or remapped) across eye movements. The current proposal makes a critical advance over prior studies in two key ways: by taking into account (1) 3D depth information, and (2) feature/object recognition. The questions of spatial remapping, object recognition and depth perception are central to our understanding of human perception and brain functioning. These topics are typically investigated separately; the proposal takes the novel approach of trying to integrate them into a single theory of visual stability. In Aim 1 we investigate how remapping accounts for 3D spatial information: first by establishing a clearer picture of how 3D spatial information is represented in the brain (Aim 1.1), and then building off this knowledge to ask how depth information is remapped across eye movements (Aim 1.2). Aim 2 investigates how remapping interacts with feature and object recognition: first asking what type of location information is bound to object features/identity (Aim 2.1) and then testing if (and when) feature content is remapped across eye movements (Aim 2.2). The interdisciplinary research plan combines several techniques - behavioral, eye-tracking, and neuroimaging (fMRI and EEG) - to achieve a more thorough and extensive understanding of spatial stability across eye movements.
The research proposed here will have an immediate impact on our understanding of typical visual functioning in healthy human populations. These advances could also have a longer-term impact on a variety of clinical applications, informing our knowledge and assessment of visual disorders resulting from eye disease, injury, brain damage, and development/aging.
Matching score: 1
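To give a sense of what a grant-to-researcher matching score could look like, here is a minimal sketch in Python. It is purely illustrative and is not the site's actual matching algorithm: it only normalizes the two name formats (the grant database's "Last, First M" and the profile's "First M. Last") and compares them with a standard string-similarity measure, ignoring the other signals (institution, field, co-authors) a real matcher would use.

```python
from difflib import SequenceMatcher

def name_match_score(pi_name: str, researcher_name: str) -> float:
    """Crude similarity between a grant's PI name and a researcher profile name.

    Illustrative only; not the site's actual matching algorithm.
    """
    def normalize(name: str) -> str:
        # Convert "Golomb, Julie D" to "julie d golomb"; strip periods,
        # lowercase, and collapse whitespace so formats become comparable.
        if "," in name:
            last, first = name.split(",", 1)
            name = f"{first} {last}"
        return " ".join(name.lower().replace(".", "").split())

    # Ratio in [0, 1]: 1.0 means the normalized names are identical.
    return SequenceMatcher(None, normalize(pi_name), normalize(researcher_name)).ratio()

# The record above would score as an exact match after normalization:
score = name_match_score("Golomb, Julie D", "Julie D. Golomb")  # 1.0
```

In practice, a score near 1 would land a grant in the "high-probability" list shown above, while ambiguous names (common surnames, missing middle initials) would produce lower scores and be hidden behind the sign-in prompt for manual review.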