Area:
Face perception, Cortical reorganization, Auditory motion, fMRI
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore partial. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
High-probability grants
According to our matching algorithm, Fang Jiang is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score
Years: 2013 — 2017
Recipient: Jiang, Fang
Code: K99/R00
K99 Activity Code Description: To support the initial phase of a Career/Research Transition award program that provides 1-2 years of mentored support for highly motivated, advanced postdoctoral research scientists.
R00 Activity Code Description: To support the second phase of a Career/Research Transition award program that provides 1-3 years of independent research support (R00), contingent on securing an independent research position. Award recipients will be expected to compete successfully for independent R01 support from the NIH during the R00 research transition award period.
Title: Cross-Modal Motion Responses in Blind and Deaf Humans @ University of Washington
Project Summary: Although cross-modal plasticity as a result of sensory deprivation has been documented across a wide variety of brain areas for a wide range of tasks, we still have very little idea of how or why cross-modal plasticity develops as it does. It is generally supposed that, when deprived of normal input, brain areas continue to fulfill their functional role, but use input from other modalities. In support of this, it has been shown that MT+, a brain region highly selective for visual motion in normally sighted individuals, responds to auditory and tactile motion after early blindness. Similarly, primary auditory and surrounding cortex (PAC+), which is sensitive to apparent movement of synthesized sounds, can be activated by visual motion in profoundly deaf subjects. However, it is not clear whether these cross-modal responses carry any behavioral relevance, or whether such responses share common architecture with normal functions. The goal of this proposed project is to address these questions using a combination of BOLD multivoxel pattern analysis and adaptation techniques in both deaf and blind individuals.
Matching score: 1
Years: 2020
Recipient: Jiang, Fang
Code: P20
P20 Activity Code Description: To support planning for new programs, expansion or modification of existing resources, and feasibility studies to explore various approaches to the development of interdisciplinary programs that offer potential solutions to problems of special significance to the mission of the NIH. These exploratory studies may lead to specialized or comprehensive centers.
Title: Multisensory Temporal Processing in Early Deaf Adults @ University of Nevada Reno
Project Summary: The loss of one sense early in life can lead to an enhancement of the remaining senses. These behavioral enhancements are often accompanied by cortical reorganization, in which sensory-deprived auditory regions are recruited for processing visual and tactile stimuli. To date, studies of compensatory plasticity in early deaf individuals have tended to focus on unisensory (either tactile or visual) spatial processing and the recruitment of deprived auditory areas. Combining behavioral and neuroimaging measurements, the goals of this proposed project are to examine the effects of auditory deprivation on multisensory temporal processing and to characterize the neural mechanisms underlying these effects. Specifically, we will determine whether early deafness impairs both the precision and the malleability of multisensory temporal integration and whether these impairments are spatially modulated. Furthermore, we will characterize the neural correlates and neural dynamics of multisensory temporal processing in deaf individuals, focusing on the right superior temporal sulcus (rSTS), a multisensory region that is the likely candidate for mediating behavioral alterations in multisensory temporal functions following auditory deprivation. Taken together, this work will provide insights into the brain mechanisms underlying multisensory temporal perception in deaf individuals.
Matching score: 1