2016 — 2020
Clark, Damon Alistair |
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Algorithm and Neural Basis of a Fundamental Visual Motion Computation
DESCRIPTION (provided by applicant): Visual motion perception is critical to animals. It guides a wide range of behaviors, from navigation and predator avoidance to mating. This project proposes to dissect a newfound visual motion computation in the fruit fly Drosophila, a model organism with powerful genetic tools that can define the roles of individual neurons in neural computations. With these tools, this research will map out the neurons that compute the new motion signal and the way they compute it. This work is significant for two reasons. First, this new motion computation is a different algorithm from classical motion estimation. Classical estimation is well-described by the Hassenstein-Reichardt correlator (HRC), an influential model that has long guided research on visual motion detection in insects, and whose descendants guide motion perception research in animals from mice to primates. The motion-guided behavior studied here appears to be fundamental, but the motion signal is computed in a qualitatively different way from the HRC model's predictions. This new algorithm and its implementation will add to the suite of potential visual motion detectors. Because of the parallels in visual computations between flies and vertebrates, it is likely that vertebrate visual systems make use of a similar algorithm. Second, animals compute motion at several places in their visual systems, but it remains unclear how the different computations are used. Analysis in the compact fly brain will uncover principles for understanding the larger brains of mammals. In the fly, genetic tools and behavioral measurements can be combined to investigate three questions about its motion detection: How do its two motion systems compute different signals? How do they use overlapping circuitry? And how do they guide different behaviors? These questions lead to three main aims of this research. Aim 1 will characterize the algorithm that computes the new motion signal.
Behavioral measurements and targeted visual stimuli will constrain or rule out potential models and provide a mathematical analysis of how this motion estimate is extracted from visual inputs. Aim 2 will identify neurons required for the new motion computation. Genetic tools will be used to silence specific neurons in the visual system in order to identify which neurons are required for the new computation. Aim 3 will measure the functional response properties of visual neurons in the new motion circuit. Measurements of neuron responses to stimuli will show how these neurons combine signals to generate the observed motion signals. On completion, these studies will result in a detailed understanding of the algorithm and the neural mechanisms that implement the new motion computation. The new computation and its interactions with the classical motion detector will provide a template for understanding how multiple motion signals are generated and used by the brain.
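The classical HRC mentioned in this abstract detects motion by correlating the signal from one photoreceptor with a delayed copy of its neighbor's signal, then subtracting the mirror-symmetric half-detector. The sketch below is a generic textbook-style illustration of that delay-and-correlate scheme, not the project's actual model; the first-order low-pass filter used as the delay stage and the stimulus parameters are illustrative assumptions.

```python
import numpy as np

def lowpass(signal, tau, dt):
    """First-order low-pass filter, a common stand-in for the HRC delay stage."""
    out = np.zeros_like(signal, dtype=float)
    alpha = dt / (tau + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def hrc_output(left, right, tau=0.05, dt=0.001):
    """Opponent Hassenstein-Reichardt correlator: correlate each input with a
    delayed copy of its neighbor, then subtract the two half-detectors."""
    return lowpass(left, tau, dt) * right - left * lowpass(right, tau, dt)

# A rightward-moving edge: the right photoreceptor sees the same
# step as the left one, but shifted later in time.
t = np.arange(0, 1, 0.001)
left = (t > 0.40).astype(float)
right = (t > 0.45).astype(float)
resp = hrc_output(left, right)
# Net positive response signals rightward motion; swapping the
# inputs (leftward motion) flips the sign.
```

The opponent subtraction is what makes the output direction-selective: a stimulus with no net motion drives both half-detectors equally and cancels.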
2016 — 2019
Clark, Damon |
N/A
Understanding How Neural Nonlinearities Tune Motion Detection in the Fly Eye
To understand how our brains work, we must understand how groups of neurons work together to perform computations on environmental inputs to produce behavior. When neurons in a circuit compute, their activity and the output from the network can often be described by a series of mathematical operations, like addition, subtraction, and multiplication. A canonical example of neural computation is the eye's detection of visual motion, which employs mathematical operations that are present in many circuits in the brains of many animals. This research investigates the mathematical operations that describe the output of this neural computation, as well as how those operations are implemented by specific neurons in the circuit and how the selection of specific operations changes the performance of the circuit. The broader impact of this project focuses on high school outreach, where the investigators will teach a module on visual perception and illusions. The project will also focus on bringing high school and undergraduate students into the lab to perform research in an accessible experimental system.
This research focuses on the small circuit of motion detecting neurons in the eye of the fruit fly Drosophila. The fruit fly's eye is an ideal target for study because its circuit is small and increasingly identified; genetic tools allow us to manipulate and measure from individual, identified neuron types in order to test hypotheses; and a long-standing modeling framework can be used to interpret results. To understand the details of the circuit's computation, we will use behavioral psychophysics and in vivo calcium imaging to record circuit responses to specially designed visual stimuli that isolate specific neural computations. We will use responses to these stimuli and others to simulate circuit responses to visual stimuli. Genetic silencing will allow us to identify neurons that are involved in the motion computation. These experiments will lead us to an understanding of which nonlinear operations are implemented in this circuit, and how those operations affect the circuit's performance.
2018 — 2021
Turk-Browne, Nicholas; Clark, Damon; Lafferty, John (co-PI); Brock, Jeffrey (co-PI)
N/A
Tripods+X:Res: Investigations At the Interface of Data Science and Neuroscience
This project will build a transformative bridge between data science and neuroscience. These two young fields are driving cutting-edge progress in the technology, education, and healthcare sectors, but their shared foundations and deep synergies have yet to be exploited in an integrated way - a new discipline of "data neuroscience." This integration will benefit both fields: Neuroscience is producing massive amounts of data at all levels, from synapses and cells to networks and behavior. Data science is needed to make sense of these data, both in terms of developing sophisticated analysis techniques and devising formal, mathematically rigorous theories. At the same time, models in data science involving AI and machine learning can draw insights from neuroscience, as the brain is a prodigious learner and the ultimate benchmark for intelligent behavior. Beyond fundamental scientific gains in both fields, the project will produce additional outcomes, including: new collaborations between universities, accessible workshops, graduate training, integration of undergraduate curricula in data science and neuroscience, research opportunities for undergraduates that help prepare them for the STEM workforce, academic-industry partnerships, and enhanced high-performance computing infrastructure.
The overarching theme of this project is to develop a two-way channel between data science and neuroscience. In one direction, the project will investigate how computational principles from data science can be leveraged to advance theory and make sense of empirical findings at different levels of neuroscience, from cellular measurements in fruit flies to whole-brain functional imaging in humans. In the reverse direction, the project will view the processes and mechanisms of vision and cognition underlying these findings as a source for new statistical and mathematical frameworks for data analysis. Research will focus on four related objectives: (1) Distributed processing: reconciling work on communication constraints and parallelization in machine learning with the cellular neuroscience of motion perception to develop models of distributed estimation; (2) Data representation: examining how our understanding of the different ways that the brain stores information can inform statistically and computationally efficient learning algorithms in the framework of exponential family embeddings and variational inference; (3) Attentional filtering: incorporating the cognitive concept of selective attention into machine learning as a low-dimensional trace through a high-dimensional input space, with the resulting models used to reconstruct human subjective experience from brain imaging data; (4) Memory capacity: leveraging cognitive studies and natural memory architectures to inform approaches for reducing/sharing memory in artificial learning algorithms. The inherently cross-disciplinary nature of the project will provide novel theoretical and methodological perspectives on both data science and neuroscience, with the goal of enabling rapid, foundational discoveries that will accelerate future research in these fields.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
2021
Clark, Damon Alistair |
R01
Dissecting the Roles of Timing in a Canonical Neural Computation
Project Summary: Timing is critical to neural processing. Nowhere is that clearer than in visual motion detection. To detect motion, neurons transmit visual information with different latencies, or delays, allowing the circuit to compare visual scenes over time. When comparisons over time are combined with comparisons over space, the circuit can compute direction-selective signals, which are larger for motion in one direction than in the opposite direction. These signals in turn guide a wide range of behaviors, from navigation and predator avoidance to mating. This project proposes to investigate the origins and effects of timing differences in visual motion circuits in the fruit fly Drosophila, a model organism in which powerful genetic tools can identify the roles of individual neurons in computations. Using these tools, this research will identify mechanisms that underlie the different dynamical responses of visual neurons and map out how those responses control downstream computations. This work is significant for two reasons. First, motion detection is a canonical neural computation, since it requires circuits to integrate visual information over both time and space, and because it is necessarily nonlinear. Moreover, the anatomy, physiology, and computational structure of motion detection have strong parallels between flies and the mammalian retina and cortex. Therefore, it is likely that what we learn about the mechanisms that regulate timing in the fly eye and their effects on motion computation will be mirrored in other circuits that detect visual motion. Second, our proposed aims will test fundamental models of motion detection. All of these models rely on timing differences that permit comparisons to be made over time, but these assumptions have not been tested. Our research will distinguish between proposed models in the fly and test fundamental assumptions about how motion is computed by differences in the timing of neural signals.
In our complementary aims, we will uncover mechanisms that generate different timing in different neural responses. We will also measure the effects of timing differences on motion signals and on behavior. On completion, these studies will advance our understanding of how neural response timing is regulated and how that timing determines downstream neural computations. We expect that what we learn in this small neural circuit can serve as a scaffold for understanding the roles of timing in motion detection in the larger brains of mammals.
2021
Clark, Damon Alistair |
R01
Integrating Visual Counterevidence to Detect Self-Motion in a Small Visual Circuit
Animals perceive optic flow and use it to guide a range of navigational behaviors, such as stabilizing body trajectories and eye movements. In this project, we propose to dissect how different visual cues are combined to guide stabilizing behaviors, and how these cues serve as evidence for and against an animal's self-motion. We will investigate these computations in the fruit fly Drosophila, which allows us to use powerful genetic tools to define the roles of individual neurons in neural computations. With these tools, our research will (a) characterize the algorithms that combine types of visual evidence, (b) identify the circuits that encode and combine these cues, and (c) determine how neurons perform these computations. This work is significant for two reasons. First, our studies will examine how the fly integrates visual evidence both for and against its own self-motion. This form of counterevidence integration cannot be accounted for by current models of optic flow detection. Thus, this project will establish new approaches to studying how animals estimate their own self-motion. Second, the types of visual evidence for and against self-motion are constrained by the geometry of the visual world. Because of this common geometry and parallels in visual processing between flies and mammals, it is likely that mammalian visual systems make use of algorithms similar to those we will find in the fly. Thus, analysis in the compact fly brain will uncover principles for understanding these computations in the brains of larger animals. In our research, we will combine genetic tools and behavioral measurements to investigate three questions: How do different visual cues interact in stabilizing the fly's heading? What circuitry is required for the different types of cues? And how do the circuits process and integrate these cues to generate behavior? These questions lead to the three aims of our research.
Aim 1 characterizes the algorithm that integrates evidence for and against the fly's self-motion and guides its turning behavior. Behavioral measurements with targeted visual stimuli will constrain or rule out potential models. Aim 2 identifies neurons required to integrate visual counterevidence into turning behavior. We will use genetic tools to silence specific neurons in the visual system in order to identify neurons required for this computation. Aim 3 measures functional response properties of visual neurons in the circuits that encode and integrate these visual cues. We will use measurements of neuron responses to stimuli to test hypotheses for how these neurons combine signals to generate the observed motion signals and behaviors. On completion, these studies will result in a detailed understanding of the algorithms and neural mechanisms that integrate different visual cues to stabilize heading in flies. This will provide a template for understanding how visual evidence is combined to estimate self-motion in other animals as well.