We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to study how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and correct any errors in linkage between grants and researchers.
High-probability grants
According to our matching algorithm, Daniel J. Thengone is the likely recipient of the following grants.
Years: 2018–2020
Recipient: Thengone, Daniel James
Code: F32 (Activity Code Description: To provide postdoctoral research training to individuals to broaden their scientific background and extend their potential for research in specified health-related areas.)
Title: Auditory Brain-Computer Interface For Communication
Project Summary
A fundamental end-goal of brain-computer interfaces (BCIs) is to enable communication in individuals with severe motor paralysis. BCIs decode neural signals and accomplish the intended goal via an effector, such as a computer cursor or a robotic limb. The BCI user relies on real-time feedback of the effector's performance to modulate their neural strategy for controlling the external device. To date, this feedback is predominantly visual. However, patients with the most severe paralysis, resulting from amyotrophic lateral sclerosis (ALS), some forms of stroke, and traumatic brain injury, can have severe visual impairments, including oculomotor fatigue, nystagmus, and ophthalmoparesis, that make reliable use of a visual BCI impossible. This puts a premium on developing novel solutions that leverage sensory modalities that remain intact. In this research, I will develop and test the feasibility of an auditory interface to establish BCI control in motor-impaired patients with severe neurological insults.
In Aim 1, I propose to implement a novel paradigm using auditory cues in lieu of visual signals, and to test its feasibility for controlling an effector (i.e., a computer cursor) in a cued target-acquisition task in healthy participants. This will validate the range of parameter values of the four tested auditory input signals: 1) frequency, 2) amplitude, 3) spatial azimuth, and 4) spatial elevation. This approach is distinct from most binary-class auditory BCI solutions, since it relies both on the natural ability of humans to localize sounds and on the ability to associate new tones with a virtual space, thus allowing a truly multi-class auditory approach.
In Aim 2, I propose to implement the auditory interface in the real-time xPC used for visual presentation in clinical-trial participants with intracortical BCIs, and to test their performance on the cued target-acquisition task. Although much success has been demonstrated in this task using visual feedback, the auditory approach will permit BCI use by people whose visual impairments further compound their paralysis.
Finally, in Aim 3, I will test whether BrainGate BCI users can use an auditory BCI speller to perform copy-typing and free-typing tasks. Accomplishing the goals of this research will be a critical step toward enabling severely paralyzed individuals with visual impairments to re-establish communication independently, continuously, and reliably.
Matching score: 1
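To make the idea of a matching score concrete, here is a minimal sketch of how a grant's listed recipient name might be scored against candidate researcher names. This is a hypothetical illustration, not the site's actual algorithm: the token-matching rule and the 0.8 threshold are assumptions chosen so that name-order and middle-initial variations (e.g. "Thengone, Daniel James" vs. "Daniel J. Thengone") still match.

```python
# Hypothetical sketch of grant-to-researcher name matching; NOT the site's
# actual algorithm. Token-based comparison with initial handling, so that
# "Thengone, Daniel James" can match "Daniel J. Thengone".

def tokens(name: str) -> list[str]:
    """Split a name into lowercase tokens, dropping commas and periods."""
    return [t.strip(".").lower() for t in name.replace(",", " ").split()]

def name_match_score(a: str, b: str) -> float:
    """Fraction of tokens in the shorter name that match a token in the other.
    A single-letter token (an initial) matches any name starting with it."""
    ta, tb = tokens(a), tokens(b)
    if len(ta) > len(tb):
        ta, tb = tb, ta
    matched, used = 0, set()
    for t in ta:
        for i, u in enumerate(tb):
            if i in used:
                continue
            if t == u or (len(t) == 1 and u.startswith(t)) or (len(u) == 1 and t.startswith(u)):
                matched += 1
                used.add(i)
                break
    return matched / len(ta) if ta else 0.0

def best_match(recipient: str, researchers: list[str], threshold: float = 0.8):
    """Best-scoring researcher at or above the (made-up) threshold, else None."""
    scored = [(r, name_match_score(recipient, r)) for r in researchers]
    best = max(scored, key=lambda rs: rs[1])
    return best if best[1] >= threshold else None
```

Under this toy scoring, `best_match("Thengone, Daniel James", ["Daniel J. Thengone", "Daniel Thompson"])` returns `("Daniel J. Thengone", 1.0)`, while `"Daniel Thompson"` scores only 0.5 and would not be linked.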