1980 — 1984
Kutas, Marta (co-PI); Hillyard, Steven
N/A
Semantic Processing and Event-Related Brain Waves @ University of California-San Diego
1985 — 2008
Hillyard, Steven A
R01; R37 (MERIT Award)
Electrophysiological Studies of Selective Perception @ University of California San Diego
This project aims to investigate mechanisms of human selective attention by means of recording event-related brain potentials (ERPs) from the scalp in normal volunteer subjects. The ERPs are small voltage fluctuations in the ongoing EEG that are time-locked to sensory and cognitive events and reflect the summated activity of neuronal populations engaged in the processing of information. Specific ERP components have been identified with processes of selective attention in both auditory and visual modalities. A series of seven experiments is proposed to increase our understanding of the physiological mechanisms that underlie these attention-sensitive ERPs and the anatomical pathways that are involved. Experiments in both auditory and visual modalities will test for attention-related modulations in early evoked components that have been tentatively localized to subcortical or early cortical levels of processing. Inferences about the localization of neural generators will be supported by detailed topographical mappings of the voltage fields on the scalp. An overall goal in these experiments is to find out whether selective attention acts to gate or modulate sensory processing at an early (perhaps subcortical) level. Experiments with auditory attention will follow up our recent observations of a very early (20-50 msec) ERP change during dichotic listening to tone sequences. For visual-spatial attention, ERP recordings will be used to address the controversy of whether the improved processing of flashes at attended locations results from an early sensory facilitation or a subsequent change in decision criteria. The spatial extent of the "attentional spotlight" of improved processing will also be evaluated. This research relates to important mental health problems, since disturbances of selective attention are a characteristic of many clinical disorders including schizophrenia, attention deficit disorder and learning disability. The ERP studies proposed here should lead to an improved understanding of the basic mechanisms of both normal and disordered attention.
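The ERP measure described above rests on time-locked averaging: because the ERP is small relative to the ongoing EEG, epochs surrounding many repetitions of an event are averaged so that activity not synchronized to the event cancels out. A minimal sketch of that averaging step is given below in Python/NumPy; the function name, sampling rate, and epoch window are illustrative assumptions, not details taken from the grant.

import numpy as np

def average_erp(eeg, event_samples, sfreq=250.0, tmin=-0.1, tmax=0.5):
    # eeg           : 1-D array, continuous EEG from one channel
    # event_samples : sample indices of stimulus onsets
    # sfreq         : sampling rate in Hz (assumed value, for illustration)
    # tmin, tmax    : epoch window around each event, in seconds
    pre = int(round(-tmin * sfreq))      # samples kept before each event
    post = int(round(tmax * sfreq))      # samples kept after each event
    epochs = []
    for onset in event_samples:
        if onset - pre < 0 or onset + post > len(eeg):
            continue                     # skip epochs that run off the recording
        epoch = eeg[onset - pre : onset + post].astype(float)
        epoch -= epoch[:pre].mean()      # baseline-correct on the pre-stimulus interval
        epochs.append(epoch)
    # Averaging attenuates activity not time-locked to the events, leaving the ERP
    return np.mean(epochs, axis=0)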
2010 — 2015
Hillyard, Steven
N/A
Neural Basis of Cross-Modal Influences On Perception @ University of California-San Diego
Many of the objects and events that we encounter in everyday life, such as a barking dog or a honking car, are both seen and heard. A basic task our brains must carry out is to bring together the sensory information that is received concurrently by our eyes and ears, so that we perceive a world of unitary objects having both auditory and visual properties. With funding from the National Science Foundation, Dr. Steven A. Hillyard and colleagues, of the University of California, San Diego, are investigating when and where in the brain the visual and auditory signals are combined and integrated to produce coherent, multi-dimensional perceptions of objects in the environment. The sensory inputs from the eyes and ears are projected initially to separate regions of the brain specialized for perceiving the visual and auditory modalities, respectively. The timing and anatomical localization of neural interactions between auditory and visual inputs are being analyzed by means of scalp recordings of brain potentials that are triggered by these sensory events, together with magnetic resonance imaging of stimulus-induced brain activity patterns. A major aim is to analyze the brain interactions that cause a stimulus in one modality (auditory or visual) to alter the perception of a stimulus in the other modality. Three types of such auditory-visual interactions are being studied: (1) the brightness enhancement of a visual event when accompanied by a sound, (2) the ventriloquist illusion, which is a shift in perceived sound location towards the position of a concurrent visual event, and (3) the double-flash illusion that is induced when a single flash is interposed between two pulsed sounds. In each case, the precisely timed sequence of neural interactions in the brain that underlies the perceptual experience will be identified, and the influence of selective attention on these interactions will be determined. The overall aim is to understand the neural mechanisms by which stimuli in different sensory modalities are integrated in the brain to achieve unified perceptions of multi-modal events in the world.
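One common way to quantify such audio-visual interactions in ERP data is the additive model: the response to the combined audiovisual stimulus is compared with the sum of the responses to the auditory and visual stimuli presented alone, and any residual difference wave is taken as evidence of a cross-modal interaction. The abstract does not spell out this analysis, so the sketch below is an illustrative assumption rather than the project's stated method.

import numpy as np

def audiovisual_interaction(erp_av, erp_a, erp_v):
    # erp_av, erp_a, erp_v : 1-D arrays of equal length, the averaged ERPs to the
    # combined audiovisual stimulus and to each unisensory stimulus alone.
    # Under the null hypothesis of no interaction, AV is approximately A + V, so any
    # residual in the difference wave reflects an audio-visual interaction.
    erp_av, erp_a, erp_v = (np.asarray(x, dtype=float) for x in (erp_av, erp_a, erp_v))
    return erp_av - (erp_a + erp_v)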
Because much of our everyday experience involves recognizing and reacting to the sights and sounds of our surroundings, understanding the principles by which these auditory and visual inputs are synthesized in the brain is important. The ability to combine auditory and visual signals effectively is particularly important in teaching and learning situations, where spoken words must be put together with a variety of pictorial, graphic, and written information in order to understand the material. This research program can contribute to the development of more efficient learning environments and teaching techniques, and lead to improved designs for communication media such as audio-visual teaching tools, information displays, and critical warning signals. These studies are exploring the role of selective attention in synthesizing auditory and visual signals, research that can lead to improved teaching techniques that emphasize the training of attention. By studying the brain systems that enable auditory and visual inputs to be combined into perceptual wholes, this research can also help to understand what goes wrong in the brains of patients who suffer from abnormalities of perception, including those with learning disabilities, attention deficit disorder, autism, and schizophrenia.