
Jeffrey M. Yau, Ph.D. - US grants
Affiliations: Neuroscience, Baylor College of Medicine, Houston, TX
Area: sensory systems, touch, multisensory processing, body representations
Website: https://yaulab.com

We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the NIH Research Portfolio Online Reporting Tools and the NSF Award Database. The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please sign in and mark grants as correct or incorrect matches.
High-probability grants
According to our matching algorithm, Jeffrey M. Yau is the likely recipient of the following grants.

Years | Recipients | Code | Title / Keywords | Matching score |
---|---|---|---|---|
2016 — 2020 | Yau, Jeffrey M | R01. Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. | Supramodal Human Brain Networks For Temporal Frequency Processing @ Baylor College of Medicine. PROJECT SUMMARY/ABSTRACT: Temporal frequency is a fundamental sensory domain that is critically important to how we communicate (e.g., speech processing by audition) and interact with objects in our environment (texture processing by touch). Our auditory and tactile senses redundantly signal temporal frequencies spanning tens to hundreds of cycles per second. This overlap enables audition and touch to interact, which can be beneficial because the information available by combining across independent sensory cues is more accurate than that provided by a single sensory cue. Despite this background, we do not have a clear understanding of the relationship between auditory and tactile frequency processing mechanisms. Previous investigations of the neural substrates supporting audition and touch have traditionally focused on a single sensory modality. Here we will test the hypothesis that common brain regions and neural mechanisms, termed supramodal, support auditory and tactile frequency processing. We will develop a computational model of how sensory neurons may combine auditory and tactile frequency information and how these neurons may be changed by adaptation. We will compare the model's predictions to behavioral data acquired in human psychophysical experiments. Using blood oxygen-level dependent functional magnetic resonance imaging (BOLD fMRI) and sensory adaptation, we will localize brain regions whose response patterns are consistent with neural adaptation, and we will use this approach to test whether brain regions represent (i.e., adapt to) both auditory and tactile frequency information. Using fMRI and multivariate pattern analysis (MVPA), we will identify the brain regions from which auditory and tactile frequency information can be decoded. We will determine whether regions support decodable frequency representations for both senses. Our preliminary modeling, psychophysics, and imaging results suggest that multiple regions in perisylvian cortex, including areas classically defined as unimodal, display frequency-selective responses to both auditory and tactile stimulation. This pattern suggests that perisylvian areas may serve as a supramodal network for frequency processing. We hypothesize that attention to vibration frequency enhances the functional connectivity in this frequency network. We will causally probe functional connectivity between somatosensory cortex and auditory cortex by combining transcranial magnetic stimulation (TMS) with fMRI (in concurrent TMS-fMRI experiments) and behavior (in psychophysical experiments). According to our hypothesis, we predict that neural changes caused by TMS of somatosensory cortex should propagate to auditory cortex when subjects attend to vibration frequency. This propagation should modulate auditory cortex activity and auditory perception. These predicted results would support the notion that classically defined somatosensory and auditory areas collaborate to process temporal frequency information as a supramodal network. Supramodal networks may support other fundamental sensory operations like shape and motion processing. | 1 |
2020 — 2023 | Yau, Jeffrey | N/A (no activity code was retrieved) | Visuospatial Modulation of Bimanual Touch Perception in Real and Virtual Environments @ Baylor College of Medicine. In order to use our hands to manipulate objects, the nervous system must be able to coordinate and interpret a large amount of disparate sensory information about the position of our hands, the arrangement and positioning of the fingers, and what each is contacting. The computations are complicated by the fact that the information from the two hands must be interpreted relative to their spatial position. For example, the left and right index fingers occupy adjacent spaces when the hands are held together, but the fingers can occupy vastly different spaces when the arms are spread apart. In these different situations, we can conceivably rely on our sense of vision to help interpret our sense of touch. Vision can provide information about where our hands are in space as well as what objects we touch and manipulate. Despite the obvious importance of vision for interpreting touch information from the hands, the specific ways in which the visual influence occurs remain poorly understood. This research is aimed at investigating the effects of simple and complex visual cues on touch information from the two hands in bimanual tasks. Understanding bimanual touch processing will facilitate effective nonverbal physical communication between people and intelligent machines, and it will support the development of highly dexterous robotic systems, advanced neuroprosthetics, and sensorimotor rehabilitation strategies. Establishing how vision influences our perception of our body in real and virtual environments has clear implications for society's use of immersive and interactive technologies. The project will also promote the participation and education of young women in neuroscience and related STEM fields. | 0.915 |
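
The first abstract notes that estimates formed by combining redundant, independent sensory cues are more accurate than estimates from either cue alone. As a point of reference only, the sketch below illustrates the standard maximum-likelihood cue-combination result that this claim rests on; the vibration frequency and cue noise levels are illustrative assumptions, not figures from the grant, and the code is not the investigators' computational model.

```python
import math

# Illustrative sketch only (assumed values; not the grant's model).
# Standard maximum-likelihood cue combination: two independent, unbiased
# estimates of the same quantity are combined with inverse-variance weights,
# so the combined estimate's variance is never larger than either cue's.

def combined_variance(var_a: float, var_b: float) -> float:
    """Variance of the inverse-variance-weighted combination of two independent cues."""
    return (var_a * var_b) / (var_a + var_b)

# Hypothetical example: a 200 Hz vibration judged by audition (sd 10 Hz) and touch (sd 15 Hz).
var_auditory = 10.0 ** 2
var_tactile = 15.0 ** 2
var_combined = combined_variance(var_auditory, var_tactile)

print(f"auditory sd: {math.sqrt(var_auditory):.1f} Hz")
print(f"tactile sd:  {math.sqrt(var_tactile):.1f} Hz")
print(f"combined sd: {math.sqrt(var_combined):.1f} Hz")  # ~8.3 Hz, smaller than either cue alone
```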