2001 — 2005 |
Romanski, Lizabeth M |
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Auditory & Integrative Processes in Prefrontal Cortex @ University of Rochester
The integration of auditory and visual stimuli is crucial for recognizing objects by sight and sound, communicating effectively, and navigating through our complex world. While it is not clear where in the brain information from these two modalities is combined and processed to govern specific behaviors, the frontal lobes have been identified as a region associated with memory and language, both of which depend on cross-sensory integration. Moreover, recent analysis of the inferior convexity (IFC) of the prefrontal cortex in the macaque monkey has revealed juxtaposed and overlapping areas of visual and auditory processing, indicating a possible role in multimodal sensory integration. Previous research on the prefrontal cortex has focused on its role in visual memory processing. The goal of this project is to obtain a fundamental understanding of how the frontal lobes process complex auditory, visual and combined stimuli which serve meaningful communication and object recognition. Our experiments will focus on the neurophysiological and anatomical analysis of the IFC in the macaque monkey. Specifically, experiments will characterize the selectivity, specificity and organization of auditory and visual responses within the IFC. In addition, we will ascertain whether auditory or visual responsive neurons within the IFC are, in fact, multimodal (i.e., respond to combined visual and auditory stimuli). Moreover, we will compare the modulation of unit responsiveness to multimodal stimuli presented passively or in an associative memory task. Finally, we will elucidate the neuronal circuitry underlying auditory, visual and multimodal responses in the frontal lobes by determining the anatomical connections of physiologically characterized regions of the IFC. Findings from the present study will help to reveal the circuitry and cellular mechanisms underlying communication processes relevant to speech perception.
Our findings will also have implications for understanding neurological disorders which affect communication and language, including schizophrenia and autism, in which frontal lobe dysfunction has been implicated.
|
2007 — 2011 |
Romanski, Lizabeth M |
R01 |
Auditory and Integrative Processes in Prefrontal Cortex @ University of Rochester
The integration of auditory and visual stimuli is crucial for recognizing objects by sight and sound, communicating effectively, and navigating through our complex world. While auditory and visual information are combined in many sites of the human brain, the frontal lobes have been identified as a region associated with memory and language, which depend on multisensory integration of complex auditory and visual stimuli. Previous studies in non-human primates have revealed juxtaposed and overlapping areas of visual and auditory processing in the ventral prefrontal cortex, and recently we have observed neurons that respond to combined face and vocal stimuli, indicating a possible role in multimodal sensory integration. The goal of this project is to obtain a fundamental understanding of how the ventral prefrontal cortex processes complex auditory, visual and combined audio-visual stimuli which serve meaningful communication and object recognition. Our experiments will focus on the neurophysiological and anatomical analysis of the primate ventral prefrontal cortex. In Aim 1 we will characterize the selectivity, specificity and organization of prefrontal single unit electrophysiological responses to face, vocalization and combined face-vocalization stimuli. Our studies will utilize both static (picture of a face) and dynamic (short movie) natural stimuli combined with vocalizations. In Aim 2 we will examine the cellular mechanisms which underlie sensory integration in a cross-modal memory task and the cellular changes which take place during the learning of audio-visual associations in this task. Finally, we will elucidate the neuronal circuitry underlying auditory, visual and multimodal responses in the frontal lobes by determining the afferent and efferent connections of auditory, visual and multimodal regions of the ventral prefrontal cortex.
Since our studies are aimed at determining the prefrontal neuronal mechanisms underlying the perception of complex communication stimuli and their integration, our findings will have implications for understanding neurological disorders which affect communication, language and sensory integration. These disorders include schizophrenia and autism in which disturbance of prefrontal cortical function has been described.
|
2012 — 2017 |
Romanski, Lizabeth M |
R01 |
Auditory and Integrative Functions of the Prefrontal Cortex @ University of Rochester
DESCRIPTION (provided by applicant): The integration of facial gestures and vocal signals is an essential process in communication and involves many brain regions. One brain region that has been shown to play a crucial role in the processing of face and vocal information is the ventral frontal lobe, home to our language processing areas. While previous studies have described the role of the frontal lobes in working memory, decision-making, and goal-directed behavior with regard to visual information, few have examined the cellular basis of auditory and integrative processes in the primate prefrontal cortex. The goal of the experiments in this proposal is to understand how neurons in the ventral prefrontal cortex process and integrate complex auditory and visual information, especially face and vocalization information that is relevant to social communication. We have previously shown that single neurons which respond to faces, to vocalizations, and to their integration are found in the ventral lateral prefrontal cortex (VLPFC), a region homologous to language regions in the human inferior frontal gyrus. However, vocalization responsive neurons, which also respond to faces, are mainly found in anterior VLPFC. This suggests that portions of VLPFC may be specialized to process and integrate social stimuli, such as faces and vocalizations, or speech and gestures in the human brain. We will assess neuronal activity in primate prefrontal cortex during audiovisual memory tasks which will determine: 1) Is vocalization processing a multisensory process; 2) What factors contribute to multisensory processing and integration of social versus non-social stimuli; and 3) Is the ventral prefrontal cortex necessary in remembering a face-vocalization stimulus?
If we can understand the neural processes that normally occur when a facial gesture is integrated with, or enhanced by, vocal information, we may begin to understand the problems that occur when this process is disrupted, as it may be in language disorders and in autism spectrum disorders. PUBLIC HEALTH RELEVANCE: The integration of face and voice information, as well as other audiovisual information, is a basic part of many cognitive functions including communication and recognition. Audiovisual integration relies on several brain regions, including language regions in the ventral frontal lobe. In this proposal we will investigate how neurons in the frontal lobe bring together audiovisual information, whether the stimuli are social or non-social.
|
2017 — 2021 |
Romanski, Lizabeth M |
R01 |
Auditory & Integrative Functions of the Prefrontal Cortex @ University of Rochester
SUMMARY The integration of facial gestures and vocal signals is an essential process in communication and relies on an interconnected circuit of brain regions. The neural basis of face and vocal integration is a topic of great importance since sensory integration is essential to speech perception and object recognition and is impaired in autism spectrum disorders (ASD) and other intellectual disabilities. Our previous investigations have shown that primate VLPFC, a proposed homologue of human IFG, has neurons that are responsive to species-specific faces and vocalizations and that it integrates these stimuli. Furthermore, VLPFC neurons are active during a crossmodal working memory task, with neurons demonstrating increased delay activity, match suppression & enhancement, and activity that is tied to both task and stimulus factors. Importantly, VLPFC neuronal activity is also critical for audiovisual working memory, since temporary inactivation results in a decrease in working memory performance. In this proposal we will focus on the role of two additional cortical areas that are connected to VLPFC. Both the dorsolateral prefrontal cortex (DLPFC) and the medial prefrontal cortex (MPFC) are connected to VLPFC and have been shown to support cognitive functions that are a part of audiovisual working memory. Furthermore, we will address the lingering question of which features of face and vocal stimuli are integrated by VLPFC neurons during audiovisual processing, and we will expand our focus to ensembles of VLPFC neurons in order to appropriately address this in Aim 3. We hypothesize that the connected cortical areas VLPFC, DLPFC and MPFC play different but contributing roles in the processing, integration and remembering of face and vocal information during social communication.
|
2018 — 2019 |
Romanski, Lizabeth M |
R21 Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Audiovisual Processing in Temporal-Prefrontal Circuits @ University of Rochester
SUMMARY The integration of facial gestures and vocal signals is an essential process in communication and relies on an interconnected circuit of brain regions. The neural basis of face and vocal integration is a topic of great importance since sensory integration is essential to speech perception and object recognition and is impaired in autism spectrum disorders (ASD) and other intellectual disabilities. Neuroimaging studies have highlighted the ventral frontal lobe and the cortex surrounding the superior temporal sulcus region (STS) as locations where face and vocal information are brought together, and anatomical studies have described their robust connections. Thus, a network in which temporal lobe regions send unisensory and multisensory information to the ventral prefrontal cortex (VLPFC) for integration and maintenance in working memory likely underlies social communication. In this proposal we will identify the regions of the STS where neurons integrate face and vocal stimuli (Aim 1a). We will also determine whether STS neurons demonstrate selectivity in their integrative responses to social (face and vocal stimuli) vs. non-social (object) pairs of stimuli. Moreover, we will establish whether STS neurons contribute to social cognition by demonstrating their activity during an audiovisual memory paradigm (Aim 1b). Finally, we will demonstrate STS-VLPFC connectivity in audiovisual working memory by revealing changes in STS neuronal responses while VLPFC is inactivated during the performance of audiovisual working memory (Aim 2). We have previously shown that inactivation of VLPFC disrupts audiovisual memory (Plakke et al., 2015), and we will extend these studies to quantify how STS neurons are affected by VLPFC inactivation. These studies will investigate which regions of the STS are affected by VLPFC inactivation and are necessary in the evaluation of audiovisual stimuli.
We hypothesize that VLPFC inactivation will not affect STS neuronal responses during the sample period of the audiovisual memory task but will affect STS activity during the decision phase of the task when STS-VLPFC interaction is needed to compare the stimulus at hand (STS perception from bottom up afferents) with the sample stimulus held in memory (VLPFC top down influence). The identification of connected STS-VLPFC regions will help us to better understand audiovisual processing, integration and memory, in both health and disease.
|
2020 |
Romanski, Lizabeth M |
R01 |
Auditory & Integrative Functions of the Prefrontal Cortex Diversity Supplement @ University of Rochester
SUMMARY In this diversity supplement we are proposing a year of research and training for the candidate to perform research in the laboratory of Dr. Lizabeth Romanski. Three experiments are proposed. In supplemental Experiment 1, the candidate will examine the role of the medial prefrontal cortex in performance monitoring and error detection. To do this he will train subjects to perform a more difficult version of the audiovisual working memory task, which will allow for the examination of neural activity during errors and increased difficulty. In Experiment 2, the candidate will use the Oddball task to determine how ventral prefrontal ensembles detect incongruent audiovisual pairs of stimuli. Use of this easy-to-train task will allow the testing of many more stimulus pairs than traditional working memory paradigms. In Experiment 3, the temporal binding of face-vocalization communication stimuli will be examined by altering the stimulus onset asynchrony (SOA) of the auditory and visual components and recording the changes in ventral prefrontal neuronal responses to these asynchronies. The results of these studies will inform us of the role of prefrontal cortex in detecting and integrating specific features of communication information that are directly related to the contextual relevance of stimuli and the importance of the temporal relationship between vocal information and associated mouth movements. The experiments are logical extensions of the parent grant and are an efficient approach to gaining more information from valuable resources already in place in our laboratory.
|
2020 — 2021 |
Majewska, Anna K Romanski, Lizabeth M |
T32 Activity Code Description: To enable institutions to make National Research Service Awards to individuals selected by them for predoctoral and postdoctoral research training in specified shortage areas. |
Graduate Training in Neuroscience @ University of Rochester
The goal of this Neuroscience Training Program is to generate highly creative and productive neuroscientists who are broadly trained in neuroscience, well trained in their research specialty, and equipped to address tomorrow's important neuroscience questions in a broad range of neuroscience-related careers. The University of Rochester has recently made neuroscience the number one priority in its strategic plan, allowing growth in neuroscience research resources and an expansion of training opportunities. This application takes advantage of this opportunity to expand and enhance graduate training in neuroscience, creating a catalytic and diverse cohort of trainees. Training support is requested for 6 predoctoral students in each of 5 years. This support will be used exclusively for broad and basic support of students in their first two years of graduate study. A diverse and interactive faculty of basic and translational researchers, dedicated to excellence in teaching and collaborative research and committed to neuroscience training, fosters a nurturing training environment. These experienced training faculty, with strong records of extramural funding, offer research training opportunities in a wide variety of neuroscience disciplines. Core elements of our training program include: 1. Core and elective coursework as well as discussion of current literature in required weekly journal clubs and seminars; 2. Training in oral and written communication; 3. Training in the ethical conduct of research; 4. Training in experimental design and statistical analysis; 5. A strong focus on mentoring from thesis advisors and committees, as well as other faculty and students; 6. Career development leveraging the program Broadening Experiences in Scientific Training (UR BEST) and the Center for Professional Development to provide students with opportunities to follow their own career path.
We propose to further enhance training by developing a new curriculum that allows students to develop quantitative skills through a required summer-long course on data analysis and computation using MATLAB as a platform, and a separate annual workshop that addresses issues in rigor, reproducibility and ethics in an ongoing, hands-on fashion. We will continue to strengthen our already successful strategies to recruit and nurture minority and women students and to provide all of the trainees with the essential skills to become independent scientists with an appreciation for the ethical conduct of research. These trained scientists will provide the next generation of neuroscientists who will further advances in basic science and translational studies, teach future generations, set science policy, and alter the landscape of neuroscience through many different paths.
|