2006 — 2009
Fortune, Eric (co-PI); Cowan, Noah
N/A Activity Code Description: No activity code was retrieved.
ASM: Multi-Sensory Control of Tracking Behavior in Weakly Electric Fish @ Johns Hopkins University
Project Abstract
PI: Noah J. Cowan, Dept. of Mechanical Engineering, Johns Hopkins University
Co-PI: Eric S. Fortune, Dept. of Psychological and Brain Sciences, Johns Hopkins University
Intellectual Merit: A tightly integrated and multidisciplinary approach using a uniquely suited model system will help answer a fundamental question in sensorimotor integration: how is information processed by the nervous system to control locomotion? In the model system, weakly electric fish robustly and naturally swim back and forth to stabilize visual and/or electrosensory images, just as humans smoothly track moving objects with their eyes to stabilize visual images. This collaborative work builds on the strengths of the PI, a modeler of sensorimotor locomotion systems, and the Co-PI, an organismal sensory neurobiologist. The approach incorporates mathematical modeling, behavioral experiments, and neurophysiological analyses. A mathematical model of the tracking behavior makes specific, testable predictions of both behavior and neural processing. The model's predictions of behavior will be tested by systematically varying the visual and electrosensory information available to the fish for tracking a moving sensory image. The model's predictions will also be tested using central nervous system recordings in awake, behaving animals. The stimuli will include signals identical to those used for behavioral experiments, whose input-output relations are predictable from the model.
Broader Impacts: Undergraduate and graduate trainees will receive multidisciplinary training in neuroscience, experimental design, data collection and analysis, and computational modeling. Further, we will co-teach a new undergraduate course in sensor-based animal locomotion and a companion graduate seminar.
Finally, the study of sensorimotor animal behaviors has great potential to inspire novel strategies for autonomous control in artificial systems.
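The closed-loop structure described in this abstract lends itself to a compact illustration. The sketch below is a hypothetical first-order tracking model (the controller form and gain values are illustrative assumptions, not the grant's actual model): the animal's velocity command is taken to be proportional to the error between the moving refuge and its own position.

```python
import math

def simulate_tracking(freq_hz=0.5, gain=5.0, dt=0.01, duration=20.0):
    """Euler simulation of a first-order closed-loop tracker: the fish's
    velocity command is proportional to the error between the refuge
    position r(t) and its own position x(t)."""
    n = int(duration / dt)
    x = 0.0
    xs, rs = [], []
    for i in range(n):
        t = i * dt
        r = math.sin(2 * math.pi * freq_hz * t)  # moving refuge (sensory image)
        x += dt * gain * (r - x)                 # velocity command = gain * error
        xs.append(x)
        rs.append(r)
    return xs, rs

xs, rs = simulate_tracking()
```

Even this caricature yields quantitative, falsifiable predictions of the kind the abstract describes: raising the loop gain should bring the tracking amplitude closer to the refuge amplitude and reduce phase lag.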
2006 — 2007
Hager, Gregory; Cowan, Noah
SGER: Vision-Based Control of Mechanical Systems Via Spatial Sampling Kernels @ Johns Hopkins University
Vision-based control exemplifies problems in which complex high-dimensional data must be transformed into meaningful action, a fundamental problem in control systems theory. In typical vision-based control systems, information in the visual signal is abstracted and sent to a control algorithm as a geometric measurement. This framework, predicated on the a priori collapse of information-rich visual signals to a few image coordinates, relies on infallible visual tracking and perfect feature correspondence. While conceptually convenient, this approach is fundamentally limited. This project aims to create a unified approach to vision-based control that directly utilizes the underlying visual signals, not simply visual geometry. The approach combines spatial sampling kernels with Lyapunov stability theory. These techniques will facilitate the design of provably stable vision-based control systems for wide classes of visual signals, geometric motions, and system dynamics.
How does one convert high-dimensional data into useful action? The answer lies in the ability to distill massive amounts of data (for example, a video stream) into low-dimensional, task-specific information (for example, robotic movements) in real time. This project seeks to develop a scientific basis for the efficient reduction of complex data streams into effective action. In particular, this project uses vision-based control to investigate new mathematical methods for collapsing complex signals in a task-specific way that admits clear analysis and provable performance. Central to the research is the idea that the connection between information processing and control need not be ad hoc; instead, mathematical theorems that link information to action can provide performance guarantees under real-world circumstances.
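One concrete reading of a "spatial sampling kernel" is an inner product between the raw image and a smooth kernel, yielding a few scalar measurements instead of tracked feature coordinates. The sketch below is an illustrative assumption (Gaussian kernels at hand-picked centers), not the project's actual formulation:

```python
import numpy as np

def kernel_measurements(image, centers, sigma=5.0):
    """Collapse an image to a few scalars: inner products with a small
    bank of normalized Gaussian sampling kernels."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = []
    for cy, cx in centers:
        k = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
        k /= k.sum()                          # unit-mass kernel
        feats.append(float((k * image).sum()))
    return np.array(feats)

# A bright spot excites the kernel centered on it far more than a distant one
img = np.zeros((64, 64))
img[20, 20] = 1.0
m = kernel_measurements(img, centers=[(20, 20), (40, 40)])
```

Because the measurement is a smooth functional of the whole image rather than a tracked point, no feature correspondence step is needed; a control law can act on these scalars directly.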
2007 — 2011
Okamura, Allison (co-PI); Cowan, Noah; Webster, Robert
Active Cannulas For Bio-Sensing and Surgery @ Johns Hopkins University
0651803 Cowan
The goal of this research is to develop a dexterous, small-diameter, cannula-type robot for Minimally Invasive Surgery (MIS) and bio-sensor delivery applications, based on coaxially assembled elastic tubular structures. The movement and control of the cannula are based on the storage and release of elastic energy via translational and rotational motions induced by the user, either locally or remotely via tele-operation systems. The research will provide a better understanding and characterization of how actuation can be encoded in flexible structures and how it can be controlled. The device mimics natural structures such as snakes, octopus tentacles, and elephant trunks, and hence can be considered biologically inspired. Issues such as miniaturization, complex tissue-device interfacing, and tele-operation add to the scientific challenges of the project. This activity integrates research with educational opportunities for undergraduate and high school students in a coherent fashion. At the same time, it seeks to develop an improved technology for minimally invasive surgery that could benefit many people through improved quality of health care.
2007 — 2010
Okamura, Allison (co-PI); Hager, Gregory; Cowan, Noah
Manipulating and Perceiving Simultaneously (MAPS) For Haptic Object Recognition @ Johns Hopkins University
This SGER project is a feasibility study of algorithms for manipulation and recognition using haptic information. The Manipulation-and-Perceiving-Simultaneously (MAPS) paradigm borrows key ideas from recent advances in simultaneous localization and mapping (SLAM) and visual object recognition with modifications necessary to adapt them to haptic exploration.
In MAPS, exploration is driven by the objective of locating "interest areas" that carry high information content about the object. The haptic properties of these interest points will be encoded, together with location information, in a coordinate-system independent fashion. During a learning phase, a large set of such points will be acquired and stored in a database. During recognition, newly acquired points will be compared with those in the database to generate plausible model hypotheses. The final recognition decision will also incorporate the geometric constraints implicit in the location and orientation of the surface structure.
The key contributions of this project will be:
- The development of active exploration strategies to extract, localize, and map distinctive surface features;
- The development and evaluation of haptics-based object recognition algorithms using both spatial and haptic "appearance" information;
- The development of an experimental environment to implement and evaluate MAPS algorithms.
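The learn-then-recognize pipeline described above can be illustrated with a toy nearest-neighbor scheme. The two-dimensional descriptors and object names below are hypothetical stand-ins for the haptic "interest area" encodings, not the project's actual representation:

```python
import numpy as np

def build_database(models):
    """Flatten per-object interest-point descriptors into one searchable table."""
    feats, labels = [], []
    for name, descriptors in models.items():
        for d in descriptors:
            feats.append(d)
            labels.append(name)
    return np.array(feats), labels

def recognize(query_descriptors, feats, labels):
    """Vote for the object whose stored descriptors best match the query set."""
    votes = {}
    for q in query_descriptors:
        i = int(np.argmin(np.linalg.norm(feats - q, axis=1)))  # nearest neighbor
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

models = {
    "mug":  [np.array([0.9, 0.1]), np.array([0.8, 0.2])],   # e.g. (curvature, texture)
    "book": [np.array([0.1, 0.9]), np.array([0.2, 0.8])],
}
feats, labels = build_database(models)
guess = recognize([np.array([0.85, 0.15])], feats, labels)  # → "mug"
```

A full MAPS system would additionally weigh the geometric consistency of the matched points' locations and orientations, which this sketch omits.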
2007 — 2012
Okamura, Allison; Hager, Gregory; Taylor, Russell (co-PI); Cowan, Noah; Kazanzides, Peter (co-PI)
MRI: Development of Infrastructure For Integrated Sensing, Modeling, and Manipulation With Robotic and Human-Machine Systems @ Johns Hopkins University
Proposal #: CNS 07-22943
PI(s): Okamura, Allison M.; Cowan, Noah J.; Hager, Gregory D.; Kazanzides, Peter; Taylor, Russell H.
Institution: Johns Hopkins University, Baltimore, MD 21218-8268
Title: MRI/Dev.: Infrastructure for Integrated Sensing, Modeling, and Manipulation with Robotic and Human-Machine Systems
Project Proposed:
This interdisciplinary project develops infrastructure for sensorimotor integration that fosters new studies and enhances existing work in the area of manipulation for robotics and human-machine systems, aiming to offer a publicly available systems framework including software, mechatronics, and integrated hardware. Following a general development approach that can be easily extended to other robotic and human-machine applications, two complementary robotics platforms will be built, addressing two different application domains:
- A bimanual dexterous manipulation system with integrated environment sensing and the capability for modeling rigid objects commonly found in human environments; and
- A teleoperated surgical robotic system with integrated sensors that can acquire patient-specific deformable tissue models.
Moreover, via the project's website, dissemination is planned for:
- Open-source software for real-time system control and sensor/model/manipulation/display integration;
- The design of a complementary mechatronic FireWire controller board that includes A/D, D/A, encoders, amplifiers, and low-level control capabilities via FPGAs; and
- Detailed descriptions of hardware integration, including WAM arms, Barrett hands, a tactile sensing suite from Pressure Profile Systems, surgical robots, cameras, ultrasound, OCT, a vision-based tracking system, visual and haptic displays, and more.
Broader Impact: Integrated robotic systems that fuse multimodal sensory information to enhance models and manipulate the environment can positively impact human lives, particularly in health care, safety, and human assistance. The research also impacts related disciplines, including neuroscience, rehabilitation, and surgery. Researchers will have access to open software and designs. The platforms clearly impact education, reaching people at many career stages, from high school students to senior faculty. The system, and detailed directions on how to reproduce it, will be made available. The design framework will be used in undergraduate classes in conjunction with existing educational hardware, and the experimental platforms will be used for course projects. Visiting students and faculty (including WISE girls) will make use of the integrated testbeds. There is an ongoing collaboration with Morgan State University. Outreach is well planned, involving REU and RET participants. The dissemination plan includes an active website; the software will be made available via an open-source repository. Furthermore, the website documents all hardware and respective vendors.
2008 — 2012
Cowan, Noah; Fortune, Eric
Enhancement of Electrosensory Function Via Social Interaction @ Johns Hopkins University
How do brains control behavior? For most behaviors, animals use information gathered by sensory receptors (like vision, touch, and receptors in muscles) to control movement. When an animal or a person moves through its environment, its movement generates widespread stimulation of sensory receptors, such as the flow of visual images on the retina. These studies examine the relationships between movement, sensing, and brain mechanisms to identify fundamental principles for the control of behavior. The work includes a tightly integrated combination of behavioral experiments, mathematical modeling, and neurophysiological experiments that will be conducted in an ideally suited biological system, weakly electric fishes. These fishes have the ability to produce and detect electric fields, which permit experiments and analyses that are not possible in other animals. The behavioral experiments will measure the natural movements and social signals of these fish in their natural habitat in the Amazon basin and in the laboratory. The results will be mathematically modeled to determine the sensory images experienced by the fish in these settings. Reproductions of these sensory experiences will then be presented to the fish in neurophysiological experiments to examine the neural codes and mechanisms for sensory processing and locomotor control. The ultimate goal of the project is to describe basic principles for sensorimotor control that will be applicable to all animals, and can be translated for use in robotic systems. Interestingly, the behavioral studies in the Amazon basin will require the development and deployment of new electronic systems that will be used for environmental monitoring of critical aquatic habitats. Finally, these experiments will involve the training of promising students, both in Ecuador and at Johns Hopkins University, in integrative biology and mathematical modeling.
2009 — 2014
Cowan, Noah
CAREER: Sensory Guidance of Locomotion: From Neurons to Newton's Laws @ Johns Hopkins University
How do nervous systems transform sensory signals into motor commands to guide locomotion? To address this question, this research examines a class of sensory guided locomotion tasks uniquely amenable to computational and engineering analyses: sensorimotor stabilization tasks. In these tasks, animals automatically modulate muscle commands to drive sensory signals (e.g. electrosensory, tactile, visual) to desired equilibria or limit cycles.
Specifically, this research examines three remarkably divergent animal locomotor behaviors: multisensory control of swimming in weakly electric knifefish, high-speed antenna-based wall following in cockroaches, and visual yaw control in fruit flies. While performing sensory guided locomotion tasks, these animals are subjected to behaviorally naturalistic perturbations that lead to rich transient dynamics during recovery. Mechanical signals (e.g. positions, forces) and neural activity (e.g. action potentials) during recovery to these perturbations are used to validate (or refute) specific closed-loop sensorimotor control models. These models identify the roles of mechanics versus neural computation in the stability and performance of each behavior. The same approach to modeling, analysis, and experimentation is vetted in a biomorphic robot, establishing an experimental baseline in a highly controlled context, as well as enabling the translation of biological control strategies to a robotic platform.
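The model-validation step described above, recovering input-output relations from perturbation responses, can be sketched with a standard empirical frequency-response estimate. The first-order plant below is a generic stand-in, not one of the project's animal models:

```python
import numpy as np

def frequency_response(stimulus, response, dt, freqs):
    """Empirical frequency response: ratio of the response and stimulus
    Fourier components at the perturbed frequencies only."""
    f = np.fft.rfftfreq(len(stimulus), dt)
    S = np.fft.rfft(stimulus)
    R = np.fft.rfft(response)
    bins = [int(np.argmin(np.abs(f - fq))) for fq in freqs]  # nearest FFT bin
    return np.array([R[i] / S[i] for i in bins])

# Sanity check against a known first-order plant, dx/dt = 5 * (u - x)
dt, n = 0.01, 4000
t = np.arange(n) * dt
u = np.sin(2 * np.pi * 1.0 * t)            # 1 Hz perturbation
x = np.zeros(n)
for i in range(1, n):
    x[i] = x[i - 1] + dt * 5.0 * (u[i - 1] - x[i - 1])
H = frequency_response(u, x, dt, [1.0])    # complex gain at the probed frequency
```

Comparing such measured frequency responses against the responses predicted by candidate closed-loop models is one standard way to validate or refute them.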
This research, disseminated broadly through both the engineering and biological literatures, lays a scientific foundation on which to develop biomorphic robots for critical applications such as disaster recovery, space exploration, and security. Longer term, the unified approach to biological and robotic modeling developed in this project may lead to enhanced neural prostheses and brain-machine interfaces.
2009 — 2015
Patankar, Neelesh (co-PI); Maciver, Malcolm; Lauder, George (co-PI); Cowan, Noah; Fortune, Eric
CDI-Type II: Cyber-Enabled Discovery in Neuromechanical Systems @ Northwestern University
In the traditional view, the nervous system performs the computational "heavy lifting" in an organism. This view neglects, however, the critical role of biomaterials, passive mechanical physics, and other pre-neuronal or non-neuronal systems. Given that neurons consume forty times more energy per unit mass than structural materials such as bone, it is better, when possible, that biological systems employ relatively inexpensive structural materials rather than relying on more costly neuronal control. In this "bone-brain continuum" view, animal intelligence and behavioral control systems can only be understood using integrative modeling approaches that expose the computational roles of both neural and non-neural substrates and their close coupling in behavioral output. To this end, a group of researchers from Northwestern University, The Johns Hopkins University, and Harvard University proposes to create a unique high-fidelity neuromechanical model of a vertebrate. The effort is divided between the development of a general-purpose computational tool set for neuromechanics research and the application of these tools to an ideally suited model system, the weakly electric knifefish.
The research will lead to breakthroughs in fundamental problems of how nervous systems work together with biomechanics to generate adaptive behavior. The final goal of the research is to construct an integrated neuromechanical model of a unique biological system - weakly electric knifefish - that places biomechanics and neural control on equal footing. Prior such neuromechanical models have used highly simplified models of mechanics and highly abstracted neuronal control approaches. This research advances the state of the art by incorporating high-fidelity mechanics with neuronal mechanisms motivated by direct neurophysiological evidence. Ultimately, this computational approach will help elucidate how animals distribute computations between brain and bone.
2010
Cowan, Noah John
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Steering Flexible Needles in Soft Tissue @ Johns Hopkins University
DESCRIPTION (provided by applicant): Minimally invasive surgical techniques have been highly successful in improving patient care, reducing risk of infection, and decreasing recovery times. This project aims to further reduce invasiveness by developing a technique to insert thin, flexible needles into the human body and steer them from outside. This approach will potentially improve a wide range of procedures, from chemotherapy and radiotherapy to biopsy collection and tumor ablation, by enhancing physicians' abilities to accurately maneuver inside the human body without additional trauma to the patient. Building on emerging methods in robotics and highly encouraging results obtained under an R21 exploratory grant, we propose to design, prototype, and evaluate a working system that will steer flexible needles through deformable tissue and around internal obstacles to precisely reach specified 3D targets. This research program will significantly advance our understanding and practice of needle therapies through integrated needle design and modeling, preoperative visualization and needle motion planning, and image-guided intraoperative needle control. The scientific and engineering advances will culminate in a set of pre-clinical trials with imaging (fluoroscopy, ultrasound, and MRI) using phantom and natural ex vivo and in vivo models. The designs, analyses, and experiments of this study will determine the merits and weaknesses of flexible needle steering, with the goals of improving current clinical applications and leading to new ultra-minimally invasive surgical procedures. The results of this project could significantly improve public health by lowering patient recovery times, infection rates, and treatment costs. By increasing the dexterity and accuracy of minimally invasive procedures, anticipated results will not only improve outcomes of existing procedures, but also enable percutaneous procedures for many conditions that currently require open surgery.
2012 — 2016
Cowan, Noah
Collaborative Research: Understanding the Rules For Human Rhythmic Motor Coordination @ Johns Hopkins University
Walking for a healthy adult seems easy. However, underlying this apparent simplicity, our nervous system is performing a task of astounding complexity. Using sensory information about the body's movement, the nervous system coordinates the activation of dozens of muscles so that we stably and efficiently move through our environment. For example, if our nervous system senses that our foot will strike the ground too soon, it will adjust muscle activations so we do not stumble and fall. In this project, an interdisciplinary team of investigators aims to uncover the rules the nervous system uses to make such adjustments. Using a general computational and theoretical framework taken from engineering (used to understand, for example, the rhythmic control of the angle of attack of rotating helicopter blades), the method depends on gently perturbing a person's senses and body in various ways and observing how the nervous system adjusts muscle activations in response. The investigators will first test their methods on a simpler type of rhythmic movement, repetitive hitting of a virtual ball with a paddle, then extend the findings to coordination during walking.
By constructing a general approach to understanding the control of rhythmic movements, including swimming in fish, flying in insects and birds, and walking in people and robots, the investigators may provide a foundation for understanding how control breaks down for people with neurological conditions such as stroke and incomplete spinal cord injury. This has the potential to advance neuromuscular rehabilitation and the design of assistive devices.
[Co-funded by CISE and SBE]
2015 — 2021
Cowan, Noah John
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
R21 Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.)
A Control Theoretic Approach to Addressing Hippocampal Function @ Johns Hopkins University
DESCRIPTION (provided by applicant): The hippocampal formation is critically involved in learning and memory. Neurodegenerative disorders such as Alzheimer's Disease dramatically impact this area, leading to severe and progressive memory loss. The hippocampus appears to be the locus of an allocentric, cognitive map of the external world. This map is critical not only for spatial cognition, but also for the conscious recollection of past experience. The hippocampus is thought to bind the individual items and events of experience within a coherent spatiotemporal framework, allowing the experience to be stored and retrieved as a conscious memory. Decades of investigation of hippocampal place cells and the recent discovery of grid cells have revealed that this cognitive map arises from the interaction of external sensory inputs with endogenously generated neural dynamics (underlying the navigational strategy known as path integration). Classical neurophysiological studies with behaving animals have amply characterized the powerful influence of environmental landmarks on the firing locations of these spatial cells. Extending this approach to quantitatively investigate the internal processes of path integration has proven technically challenging. Virtual reality technology, in combination with systems theory, offers opportunities to solve these problems. We have designed and constructed a novel apparatus that allows us to manipulate the visual inputs (both landmarks and optic flow) available to a rat navigating a real circular track as a function of its movements while preserving normal ambulatory and vestibular experience. Place cells recorded in this apparatus replicate known standard phenomenology. In preliminary experiments, we induced a sustained, increasing conflict between landmark information and path integration that caused a graceful, coherent dissociation of the place fields from the landmarks.
We will test the capacity of the system to recalibrate the path integrator when challenged with this sustained conflict. Further, we will develop a novel approach for isolating the contribution of optic flow and other self-motion cues to the update of the neural representation of position, free of the competing influence of landmarks. Specifically, we will attempt to decode and control this cognitive representation during behavior through real-time manipulations of the optic flow. This approach will form the foundation of a novel research program aimed at a comprehensive analysis of the external vs. internal determinants of the cognitive map. Furthermore, this program may reveal important principles of neural computation relevant to general problems of how the brain integrates external sensory input with internal, cognitive representations, and may generate insights into the disordered thinking and hallucinations that are characteristic of schizophrenia and other mental disorders.
2016 — 2020
Cowan, Noah
Collaborative Research: Neural Mechanisms of Active Sensing @ Johns Hopkins University
Animals, including humans, routinely use movement to sense the world around them. For example, to sense the texture of an object, a person might move her hand over the surface, whereas to measure the object's weight, she might hold it in her palm and move it up and down. This use of different movements to sense features of the environment is called Active Sensing. Although active sensing is commonplace in human behavior, how the brain generates and controls these movements is poorly understood. The goal of this project is to reveal and describe (in mathematical equations) the brain's strategies for active sensing. This will be achieved by studying a specialized animal species, the weakly electric glass knifefish. This animal was chosen because it has a suite of properties that make it ideally suited for the experimental approach. The expected findings will have broad implications for active sensing in other animals (including humans) because active sensing behaviors are similar across species. This work will have broad societal impacts, including the possible transformation of robotic control systems and enhanced understanding of the brain that may ultimately improve our understanding of neurological disorders. Further, this work includes multidisciplinary training of promising students in critical STEM fields.
The central hypothesis for this research is that organisms adjust active movements in order to tune the resulting sensory feedback to match processing features of CNS circuits. This is a challenging problem because sensory inputs and motor outputs are linked by a closed loop. The experimental approach overcomes this challenge by (1) exploiting unique features of a well-suited model system, weakly electric fishes, (2) developing a closed-loop behavioral control system, and (3) performing chronic neurophysiological recordings in freely swimming fish. This integrated approach will enable the quantification of neuromechanical control strategies that organisms use to produce and modulate movements for active sensing, identification of cellular and synaptic mechanisms underlying neural responses to feedback from active movements, and discovery of how these changes in active movements affect sensorimotor integration in midbrain circuits.
2018 — 2021
Cowan, Noah; Bastian, Amy (co-PI)
Collaborative Research: Identifying Model-Based Motor Control Strategies to Enhance Human-Machine Interaction @ Johns Hopkins University
Humans are increasingly asked to cooperate with machines and robots in many occupational and recreational settings, including teleoperation, driving vehicles, surgery, rehabilitation, and object manipulation. Not only must humans learn to share control with a machine, the machines must in turn be better designed to enable human-machine interaction. The main objectives of this collaborative project are: to perform fundamental research that will identify multisensory (visual and haptic), human-in-the-loop, sensorimotor control models that capture predictive and reactive aspects of how people interact with their physical environment (e.g., machines); to advance understanding of how multisensory control can degrade in a patient population with degeneration of the cerebellum, which is thought to contribute importantly to tool use; and to advance understanding of how control systems for machines can be developed to exploit identified models of human sensorimotor control to enhance performance of human-machine interactions. The project is significant because it develops computable theories (computational models of human sensorimotor control) and the physical manifestation of those theories (robotic control algorithms) that will lead to enhanced human-machine interactions. This project directly serves the NSF mission by promoting fundamental science exploring modes of interaction between humans and intelligent robotic systems, which may contribute to advancing the national health. The project supports education through outreach activities aimed at recruiting and retaining students in STEM fields.
This research will contribute to a fundamental understanding of human motor behavior by developing a set of multidomain (haptic and visual) models that describe the application of model-based control strategies in the context of accommodating (or rejecting) influences from the environment. Aim 1 builds on the assumption that the computational problem solved by the human nervous system can be captured using model-based control strategies involving a combination of predictive (feedforward) and reactive (feedback) mechanisms. In a series of four sets of experiments, the team will use single-sine "predictable" and sum-of-sines "unpredictable" disturbances of visual and haptic feedback to interrogate sensorimotor control during reaching and object manipulation tasks. By identifying the structure and parameters of neuromotor control in these tasks, the PIs set the stage for later development of engineered control systems to improve human-machine interaction. Experiments supporting Aim 2 will mirror those serving Aim 1, identifying how sensorimotor control is impaired in a cohort of cerebellar ataxia patients. Expected results promise insight into the cerebellum's contributions to motor coordination and control, thereby advancing the national research priority of understanding brain function in health and disease. Aim 3 seeks to engineer intelligent machine controllers to "wrap around" a human's model-based control system to enhance cooperative performance of the overall human-machine system. Cohorts of neurologically intact participants and cerebellar patients will be tested. One set of experiments will examine the extent to which human participants can interpret haptic feedback to correctly perceive whether a coupled automaton works "for" or "against" their efforts.
A second set of experiments will exploit individualized models of sensorimotor control to examine the extent to which real-time visual feedback of hand position can be augmented to enhance performance of goal-directed reaching in patients with cerebellar ataxia. The project outcomes may have long-term impact by advancing understanding of how machine control can be designed to enhance performance of physically-coupled human-machine systems.
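The single-sine versus sum-of-sines distinction above can be made concrete. One common construction (assumed here for illustration; the project's actual stimulus parameters may differ) uses prime multiples of a base frequency with randomized phases, so that no component is a harmonic of another and the overall signal is hard to anticipate:

```python
import numpy as np

def sum_of_sines(t, freqs, seed=0):
    """Sum-of-sines disturbance with randomized phases: with several
    non-harmonically-related components, the waveform is hard to predict."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    return sum(np.sin(2 * np.pi * f * t + p) for f, p in zip(freqs, phases))

dt = 0.01
t = np.arange(0.0, 20.0, dt)
base = 0.05                                   # fundamental frequency in Hz
primes = [2, 3, 5, 7, 11, 13]                 # prime multiples avoid shared harmonics
d_unpredictable = sum_of_sines(t, [p * base for p in primes])
d_predictable = np.sin(2 * np.pi * 0.5 * t)   # single sine: easy to anticipate
```

Using prime multiples keeps each stimulated frequency away from the others' harmonics, so the response measured at each frequency can be attributed to its own input component.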
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
2018 — 2020
Cowan, Noah John; Hedrick, Kathryn; Knierim, James J; Zhang, Kechen (co-PI)
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Crcns: Dynamics of Gain Recalibration in the Hippocampal-Entorhinal Path Integration System @ Johns Hopkins University
The striking spatial correlates of hippocampal place cells and grid cells have provided unique insights into how the brain constructs internal, cognitive representations of the environment and uses these representations to guide behavior. These spatially selective cells are influenced both by self-motion signals and by external sensory landmarks. Self-motion signals provide the basis for a path integration computation, in which the hippocampal system tracks the animal's location by integrating its movement vector (speed and direction) over time to continuously update a position signal on an internal cognitive map. To prevent accumulation of error, it is crucial that this endogenous spatial representation be anchored by stable, external sensory cues, such as individual landmarks and environmental boundaries. Accurate path integration requires that an internal representation of position be updated in precise agreement with the animal's displacement in the world. What if the relation between position calculated from self-motion cues and position defined by landmark cues is altered, e.g. during development (slow time scale) or due to injury (fast time scale)? Does the animal recalibrate the internal gain between representations of its movement and the updating of the representation of its position in the brain? We hypothesize that this gain must be learned by reference to visual feedback. We constructed an augmented reality system that allows precise, closed-loop control of the visual environment as rats move through physical space, and we provide evidence that the path integration system can indeed be recalibrated. We propose a collaborative research program to investigate plasticity of the path integration gain at multiple neural levels using combined theoretical, engineering, and experimental approaches. 
We will combine mathematical analysis, biologically inspired attractor network theory, and principles derived from engineering to develop the first models of how the path integration system dynamically recalibrates itself in response to sensory feedback. We will perform recordings from the hippocampus and medial entorhinal cortex to provide data to constrain and test these models. The combined expertise of the PI and Co-Investigators in electrophysiological recordings of the hippocampal system, engineering, and mathematical neuroscience will propel the theory forward to explain the network dynamics and functional implications of this ethologically critical form of neural plasticity.
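The gain-recalibration idea above can be caricatured as a delta-rule model (purely illustrative; the variable names, noise level, and learning rule are assumptions for this sketch, not the project's actual model): the internal map advances by g * v * dt, landmarks signal the displacement imposed by the augmented-reality manipulation, and the mismatch slowly pulls g toward the visually implied gain.

```python
import numpy as np

def recalibrate_gain(ar_gain, g0=1.0, eta=0.05, n_steps=400, dt=0.02, seed=0):
    """Toy path-integration gain recalibration.

    The internal estimate advances by g * v * dt; landmarks (e.g. an
    augmented-reality dome that rescales optic flow) signal a noisy
    displacement of ar_gain * v * dt. A delta rule nudges g toward
    the gain implied by the visual feedback.
    """
    rng = np.random.default_rng(seed)
    g = g0
    history = []
    for _ in range(n_steps):
        v = rng.uniform(0.1, 1.0)                          # running speed
        dx_visual = ar_gain * v * dt + rng.normal(0.0, 0.01 * dt)
        g_implied = dx_visual / (v * dt)                   # gain implied by landmarks
        g += eta * (g_implied - g)                         # slow recalibration
        history.append(g)
    return np.array(history)

gains = recalibrate_gain(ar_gain=0.7)   # AR compresses optic flow to 70%
```

The point of the sketch is only the loop structure: self-motion drives the integrator, vision supplies the teaching signal, and the gain converges to whatever relation the closed-loop environment enforces.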
|
0.958 |
2018 — 2022 |
Cowan, Noah Gracias, David [⬀] Nguyen, Thao (Vicky) Schulman, Rebecca |
N/A Activity Code Description: No activity code was retrieved: click on the grant title for more information |
Efri C3 Soro: Programming Thermobiochemomechanical (Tbcm) Multiplex Robot Gels @ Johns Hopkins University
This project aims to integrate sensing, computation, and actuation directly into the body structure of a micrometer- to millimeter-scale soft robot, by creating a new class of soft hydrogels that can be thermally and biochemically programmed. This integration is critical for constructing autonomous robots that can function in complex, real-world environments. This project seeks to develop a comprehensive framework for the design, programming, validation, and fabrication of these gel robots. The robots will respond to temperature and biochemical cues from the environment, make decisions, and execute multistep motion programs that will allow them to move in a desired direction. The project will address fundamental scientific and engineering challenges in hydrogel mechanics, biochemical sensing, path planning, and controls for continuum soft robots, making it possible to efficiently design and manufacture the hardware for gel robots and to program them with biochemical software. These robots will have potential applications in marine, biological and medical environments, including in vivo targeting and diagnostics in medical settings. The project will also provide research opportunities to undergraduate and K-12 students, including those from underrepresented groups, via new and established outreach programs at Johns Hopkins University.
The goal of this project is to develop a framework for building a broad class of submillimeter-scale thermobiochemomechanical (TBCM) C3 robots in the form of architected hydrogels in which systems of chemical sensors, actuators, and computing devices replace the electronic sensors, actuators, and controllers in traditional robots. The research objectives of the project are to develop thermally and DNA-responsive hydrogels and utilize these materials with advanced patterning and printing techniques to assemble robots that can respond to TBCM cues of relevance to human environmental and physiological conditions. Hardware design will be guided by a biomolecular software framework and by mechanics-based computational and reduced-order control models to enable planning and sensory feedback control. The broader impacts of this work include K-12 outreach programs to foster participation in STEM, dissemination of the research in peer-reviewed journals, dissemination of code and data, and the development of new educational materials for undergraduate and graduate instruction.
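A "multistep motion program" driven by environmental cues amounts to a small state machine; the following sketch is entirely hypothetical (the states, cues, and transitions are invented for illustration and are not taken from the project), but it shows the decision structure that chemical sensing and actuation would implement in gel rather than in silicon.

```python
from enum import Enum, auto

class Cue(Enum):
    WARM = auto()       # temperature above an actuation threshold
    DNA_MATCH = auto()  # target DNA strand detected
    NONE = auto()       # no cue present

# Hypothetical motion program for a stimulus-responsive gel robot.
PROGRAM = {
    ("idle", Cue.WARM): "curled",          # thermal actuation folds the gel
    ("curled", Cue.DNA_MATCH): "crawling", # biochemical cue triggers a gait
    ("crawling", Cue.NONE): "idle",        # cue removed, gel relaxes
}

def step(state, cue):
    """Advance the motion program one step; unrecognized
    state/cue pairs leave the state unchanged."""
    return PROGRAM.get((state, cue), state)
```

In the physical system the "table lookup" would be realized by the coupled thermal and DNA-responsive swelling chemistry rather than by code; the sketch only makes the program logic explicit.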
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |
2020 — 2024 |
Katz, Joseph (co-PI) [⬀] Moss, Cynthia [⬀] Mittal, Rajat Cowan, Noah Sterbing-D'angelo, Susanne |
N/A Activity Code Description: No activity code was retrieved: click on the grant title for more information |
Crcns: Discovering How Touch Sensors in the Bat's "Hand-Wing" Enable Agile Flight Control @ Johns Hopkins University
Bats perform feats of aerial agility that are unique in the animal kingdom, and completely unparalleled by even the very best robotic flying machines. This project aims to discover the fundamental principles that underlie how these animals achieve such superior flight control performance. Bat flight is powered by a "hand-wing," i.e. the wing is actually an evolutionary adaptation of the mammalian forelimb. As such, the bat hand-wing shares the same basic anatomy as the human hand and is highly sensitive to physical forces. The bat hand-wing is highly deformable and controllable, and is unlike any artificial wing that has ever been successfully constructed. It is built from a very thin membrane that stretches across the fingers and is covered with small wind-sensitive hairs that enable the animal to "feel" the complex flow of air that envelops its wing. This unique set of flight and sensing adaptations presents a powerful model to investigate the mechanisms of sensing, brain computation, and movement control. The multidisciplinary research team will characterize and uncover the complex coupling relationships between aerodynamics, tactile sensing, and neural processing using a combination of engineering and biological techniques. This project will lead to deeper understanding of biological flight control, and will lend insights into ingredients that could one day be used in developing new robotic aerial vehicles capable of bat-like flight performance.
This project integrates state-of-the-art experimental measurements and computational flow modeling with behavioral and neurophysiological experimentation and dynamical control systems neural modeling. Using a multidisciplinary approach, the team will test the hypothesis that bat wing sensors carry information about complex airflow patterns and forces to the sensory cortex. The team will also elucidate sensorimotor mechanisms that guide wing adjustments to enhance lift and prevent stall. To achieve these goals, the research includes: 1) quantifying the mechanical stimulus inputs to receptors on the bat hand-wing using stereo particle image velocimetry, digital image correlation, and computational fluid dynamic modeling; 2) characterizing the encoding of mechanosensory signals from the wings via multichannel neural recordings from bat primary somatosensory cortex; 3) closed-loop modeling and real-time control based on decoded output of neural signals. This research will yield a deeper understanding of sensorimotor feedback in biological systems while also contributing novel computational and experimental tools in the arena of sensorimotor control, biophysics, and mechanics, with wide applications to many arenas of neuroscience. The project will leverage JHU's Women in Science and Engineering (WISE) program and Baltimore Polytechnic's Ingenuity Project to engage high school students from diverse backgrounds.
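The closed-loop idea in point 3 can be cartooned as a simple feedback law (the thresholds, gain, and toy plant below are invented for illustration and are not drawn from the project): a decoded airflow signal estimates angle of attack, and a proportional correction flattens the wing before stall is reached.

```python
def camber_correction(aoa_decoded, aoa_stall=12.0, margin=2.0, k_p=0.1):
    """Proportional correction: reduce camber as the decoded angle of
    attack (degrees) nears stall; no action when safely below the margin."""
    excess = aoa_decoded - (aoa_stall - margin)
    return -k_p * max(excess, 0.0)

# Toy plant: one unit of correction reduces angle of attack by 5 degrees.
aoa, trace = 14.0, []
for _ in range(50):
    aoa += 5.0 * camber_correction(aoa)
    trace.append(aoa)
# The loop settles at the safety margin below stall (10 degrees here).
```

In the proposed work the controller's input would come from decoded somatosensory-cortex activity rather than a direct angle-of-attack measurement; the sketch shows only the loop structure being modeled.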
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |