1999 — 2001 |
Smotherman, Michael S |
F32 Activity Code Description: To provide postdoctoral research training to individuals to broaden their scientific background and extend their potential for research in specified health-related areas. |
Audiomotor Integration in Midbrain @ University of California, Riverside
The long-term goal of this proposal is to elucidate the neural substrate underlying audio-vocal integration in mammals. Echolocating bats are ideal for studying audio-vocal integration because they continually modify the parameters of their vocalizations based upon characteristics of the returning echo. In the horseshoe bat, the midbrain paralemniscal tegmentum (PL) is believed to be centrally involved in the coordination of auditory input with vocalization. Afferent projections from several places along the main auditory pathways converge within the PL, and PL efferents are known to influence vocalization. The PL is composed of at least five subpopulations distinguishable by their response properties and cytoarchitecture, and each is potentially involved in the modulation of different motor patterns. The goal of this proposal is to identify the pharmacological nature of the afferent and efferent projection patterns of each PL subdivision, and the contributions of each region to audio-motor integration. Specifically, biotin-conjugated ibotenic acid (an NMDA agonist) and muscimol (a GABA agonist) will be injected into each subdivision while monitoring vocalizations in the awake, behaving animal. Spontaneously vocalizing bats will be presented with artificial echoes, and the parameters of succeeding vocalizations will be analyzed. Afterwards, the brain will be examined histologically. The contributions of each PL subdivision to audio-vocal integration will be assessed, along with the pharmacology, a specific mapping of injection sites, cytoarchitectures, and projection patterns. Ultimately this work should lead to improved speech therapies for hearing-impaired humans.
|
0.951 |
2006 — 2010 |
Smotherman, Michael S |
P41 Activity Code Description: Undocumented code; click on the grant title for more information. R03 Activity Code Description: To provide research support specifically limited in time and amount for studies in categorical program areas. Small grants provide flexibility for initiating studies which are generally for preliminary short-term projects and are non-renewable. |
A Neural Interface For Coordinating Speech and Breathing @ Texas A&M University System
DESCRIPTION (provided by applicant): The neurological basis of many common speech and language disorders remains elusive, largely because very little is known about how vocal motor patterns are generated in the brain. One of the key questions yet to be addressed is how speech patterns come to be seamlessly interwoven with breathing patterns. Where and how in the brain speech and breathing interact remains unknown. Breathing patterns must be altered to accommodate the highly variable speech patterns utilized by a speaking or singing human, yet the temporal flow of speech is naturally constrained by the need to breathe regularly. This process betrays the presence of a complex neural substrate that regulates the descending flow of vocal motor commands into the central respiratory rhythm centers using current information obtained from pulmonary and laryngeal somatosensory feedback. Although human speech is uniquely complex, echolocating bats compare favorably within the context of vocal-respiratory dynamics. Echolocation behavior relies upon a precise, yet highly dynamic set of distinct respiratory modes which can be selectively evoked in the lab by computer-generated auditory stimuli. This application will combine in vivo electrophysiological methods with computer-generated auditory stimuli to investigate the role of a newly discovered neuronal subtype in the lateral region of the parabrachial nucleus that appears essential for the normal changes in breathing patterns associated with vocalizing. It is hypothesized that these vocal-respiratory integration neurons are responsible for creating phase-shifts in breathing rhythms to accommodate vocalizing. If so, these neurons would represent an important interface between the vocal motor pathway and the neural circuitry regulating respiratory rhythms.
The parabrachial nucleus is a highly conserved region of the vertebrate brain known to exert profound influence over breathing patterns, and is considered an essential component of the vocal motor pathway in mammals. Any pathological or developmental disruption of the vocal-respiratory integration mechanisms would unavoidably degrade the normal flow of speech. Thus, a more thorough understanding of how and where in the brain this is achieved could reveal the neurological basis of some human speech disorders, such as stuttering, dysarthria and apraxia.
|
1 |
2014 — 2017 |
Smotherman, Michael |
N/A Activity Code Description: No activity code was retrieved; click on the grant title for more information |
Networking Strategies Used by Bats to Improve Social Sonar @ Texas A&M University Main Campus
The biosonar behavior of echolocating bats provides an example of how natural and artificial active sensing systems may be optimized to meet rapidly changing social and environmental conditions. Bats and artificial sonar, radar and communication systems all face a similar suite of problems arising from mutual interference. Explaining how bats minimize interference with one another's sonar while flying in dense swarms or within noisy crowded roosts may help improve a wide range of artificial sensing and wireless communications systems. Bats are highly social, and their exceptional resilience to jamming by conspecifics far exceeds the strategies currently employed in artificial systems. This proposal will investigate how groups of bats manage this by applying recent advances in computer networking to sonar/radar theory. The experiments outlined will characterize the behavioral strategies used by bats to improve their signaling efficiency when echolocating in groups of different sizes and across tasks of varying navigational demands. Computer simulations will be used to identify optimum algorithm parameters and explore how these behavioral strategies manifest across a wider range of biosonar conditions. By deciphering the behavioral algorithms by which bats can cooperatively share the same acoustic space, this proposal may lead to better and more efficient strategies for optimizing artificial communications network usage to support an increasingly information-based economy. This proposal will also offer unique exposure, training and educational opportunities for STEM students interested in exploring biological systems as a source of inspiration for artificial systems. In partnership with local science teachers, this proposal creates an inquiry-based dual language (Spanish/English) teaching website that will offer access for 5th/6th grade science classes to remotely interact with live animals.
Interference problems arise in artificial systems such as sonar and radar, wireless communications and computer networks when many users attempt to transmit simultaneously over a single shared channel. Wireless communications networks partially mitigate this by employing protocols that constrain user transmission rates during periods of high channel traffic to minimize interference between users. This proposal tests the hypothesis that echolocating bats adapt their sonar pulse emission rates to optimize sonar performance in social situations, following protocols similar to those defined in international wireless communications standards. The proposal uses a combination of behavioral studies with bats in the lab and computer simulations to decode the bat's retransmission delay algorithm. First, trained bats are flown alone and in small groups of 2-5 through a sensor maze to quantify the bats' navigational performance and measure whether they systematically change pulse emission rates when performing the same task in a social context. Second, bats will be flown through the maze in the presence of artificial acoustic stimuli mimicking the sounds of other bats passing through the maze, to test whether bats can realize a net improvement in sonar performance by decreasing their own pulse emission rates in proportion to population density, counterintuitively reaping a net increase in information flow by emitting fewer pulses. Computational models of the behavioral algorithm will be developed to simulate the behavior, establish optimal parameters and precisely quantify its benefits to sonar performance in social contexts. The bat's sonar algorithms will be made available through scientific journals, and Matlab-based biosonar simulation tools developed for the project will be made freely available via multiple sources on the Internet.
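The backoff-style hypothesis above (each bat lowering its pulse emission rate in proportion to group size, trading fewer pulses for fewer collisions) can be illustrated with a toy simulation. This is a minimal sketch under stated assumptions, not the project's actual model: pulse times are drawn as a Poisson process, a pulse counts as "jammed" if it overlaps a pulse from another bat, and the adaptive rule divides the base rate by group size. All function names and parameter values are illustrative.

```python
import random

def simulate_pulse_interference(n_bats, base_rate_hz, adapt=True,
                                duration_s=10.0, pulse_dur_s=0.005, seed=0):
    """Toy model of sonar pulse collisions on a shared acoustic channel.

    With adapt=True, each bat emits at base_rate_hz / n_bats (a backoff-style
    rule loosely analogous to wireless channel-access protocols); otherwise
    every bat emits at the full base_rate_hz. Returns (clean, total) pulse
    counts across the whole group.
    """
    rng = random.Random(seed)
    rate = base_rate_hz / n_bats if adapt else base_rate_hz

    # Draw pulse start times for each bat as a Poisson process.
    pulses = []
    for bat in range(n_bats):
        t = 0.0
        while True:
            t += rng.expovariate(rate)
            if t >= duration_s:
                break
            pulses.append((t, bat))
    pulses.sort()

    # A pulse is "jammed" if another bat's pulse starts within one pulse
    # duration of it; only nearby neighbors in time need checking.
    clean = 0
    for i, (t, bat) in enumerate(pulses):
        window = pulses[max(0, i - 5):i + 6]
        jammed = any(bat2 != bat and abs(t - t2) < pulse_dur_s
                     for t2, bat2 in window)
        if not jammed:
            clean += 1
    return clean, len(pulses)
```

Running this for a group of five bats shows the intended trade-off: the adaptive group emits far fewer pulses, but a larger fraction of them arrive collision-free, which is the "net increase in information flow by emitting fewer pulses" the proposal tests behaviorally.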
|
0.915 |
2021 |
Smotherman, Michael |
N/A Activity Code Description: No activity code was retrieved; click on the grant title for more information |
I-Corps: Hand-Held Assistive Mobility Device For the Visually Impaired Using Sensors to Feel Obstacles From a Distance and Track Their Movements
The broader impact/commercial potential of this I-Corps project is the development of a hand-held assistive mobility device for the visually impaired. The Centers for Disease Control and Prevention (CDC) estimates that approximately 12 million people over the age of 40 suffer from visual impairment in the United States. Vision impairment and loss of sight are among the top 10 disabilities for individuals over the age of 18 and can have a substantial social and economic toll including, but not limited to, significant loss of productivity and diminished quality of life. Further, the annual economic impact of major vision problems in those 40 and older is estimated to be $145 billion. The proposed device holds the potential to significantly improve mobility, accessibility, and quality of life for the millions of people with no or low vision who currently use an assistive mobility device such as the widely used “white cane.” The technology also offers potential benefits to people working in low-visibility or disaster conditions, such as emergency first-responders or military personnel.
This I-Corps project is based on the development of a hand-held device that uses remote sensing technology to control a dynamic tactile display on the hand, allowing the visually impaired to detect and classify obstacles in their environment sooner and more broadly than is possible using current assistive technology devices. Unlike existing technologies that involve touching or poking items with a white cane or rely on the individual to follow auditory cues, this device may enable individuals to feel the presence of obstacles and targets from a distance, to sense the relative locations of multiple obstacles, and to track the relative movements of nearby obstacles and targets. The proposed technology is an extension of research into optimized graphical displays of information derived from Sound Navigation and Ranging (SONAR) and Light Detection and Ranging (LIDAR)-based sensor systems. People in low-vision situations may be better equipped to rapidly process and exploit multi-channel spatial information when it is delivered via the hand instead of the ear. The proposed technology exploits the natural active-sensing behaviors used by humans when they reach out to explore their environment.
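One way to picture the sensor-to-display idea described above is a mapping from a sweep of range readings (SONAR/LIDAR) to discrete vibration intensities on a row of tactile actuators, with closer obstacles felt more strongly. The sketch below is purely illustrative: the function name, the linear inverse mapping, and parameters such as `max_range_m` and `levels` are assumptions for this example, not the device's actual interface.

```python
def distances_to_tactile(distances_m, max_range_m=4.0, levels=8):
    """Map range readings from a sensor sweep to tactile intensity levels.

    Each reading drives one actuator: 0 means no vibration (no obstacle in
    range), levels-1 means strongest vibration (obstacle very close).
    Readings of None or beyond max_range_m map to 0.
    """
    out = []
    for d in distances_m:
        if d is None or d >= max_range_m:
            out.append(0)
        else:
            # Linear inverse mapping: 0 m -> strongest, max_range_m -> off.
            level = round((1.0 - d / max_range_m) * (levels - 1))
            out.append(max(1, level))  # any in-range obstacle is felt
    return out
```

For example, a sweep reading `[0.0, 2.0, 4.0, 5.0]` meters would drive the four actuators at intensities `[7, 4, 0, 0]`: the nearest obstacle vibrates strongest, the mid-range one moderately, and out-of-range directions stay silent, letting the user feel several obstacles' relative locations at once.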
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
0.915 |