2012 — 2017 |
Bensmaia, Sliman |
N/A |
CAREER: Vibration and Texture Perception
When we run our fingers across a finely textured surface, small, complex vibrations are elicited in the skin, and specialized receptors embedded in the skin allow us to sense these vibrations. Different surfaces elicit different vibrations, and it is on the basis of these differences that we are able to discriminate silk from, say, satin. The ability to extract meaningful information from skin vibrations implies neural mechanisms that have yet to be discovered. This project investigates the mechanisms by which information about surface texture is extracted from texture-elicited vibrations. To this end, the Principal Investigator records (using a laser Doppler vibrometer) the skin vibrations elicited by everyday textured surfaces and examines the information they convey. He then records, in the nerve of Rhesus macaques, the responses evoked by various textured surfaces, and has human participants perform discrimination experiments using these same textures. Exploiting the well-documented homology between the human and Rhesus sensory systems, he uses a variety of computational and machine learning techniques to reveal which aspects of the neural response account for our ability to distinguish between textures. Ultimately, the processing of skin oscillations constitutes a distinct and poorly understood mode of somatosensory processing. Indeed, while analogies have been drawn between visual and tactile processing of shape and motion, this second processing mode draws strong analogies with the auditory system. In both systems, stimulus information is extracted from spectrally complex oscillations (of the ear drum or the skin, for audition and touch respectively). In addition to their important scientific implications, results from this project will inform the development of sensorized upper-limb neuroprostheses and of virtual haptic displays.
Finally, the project comprises an important educational and outreach component, with a strong emphasis on undergraduate education, recruitment of under-represented groups, and outreach to the general public.
|
2013 — 2017 |
Bensmaia, Sliman |
R01. Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Hand Proprioception and Sensorimotor Interplay
DESCRIPTION (provided by applicant): The hand is one of the most complex, versatile, and critical human sensory organs. Sensory signals from the hand play a key role in guiding the dexterous manipulation of objects. Without these signals, we would need to constantly watch our hand to ensure that an object was properly grasped, to monitor our use of a tool, and to maintain its proper shape. Furthermore, simple activities of daily living, such as tying one's shoe or turning a door knob, would be slow and clumsy, and require great concentration. In contrast to the sense of touch, which has received a lot of experimental attention, the representation of hand position and hand movements in the brain is poorly understood. Indeed, unlike vision and audition, for which monitors and speakers are commercially available, proprioception research presents unique technological challenges in measuring and manipulating the sensory input. As increasingly sophisticated technologies become available, however, we are able to more fruitfully investigate this important sensory modality. Our objective is to apply rigorous and quantitative methods to investigate the cortical basis of proprioception in primates, with an emphasis on hand kinematics and posture. Furthermore, we seek to understand how proprioceptive representations interact with motor representations in the cortex to guide behavior. To these ends, monkeys will perform two tasks: a grasping task, with objects differing widely in shape and size, and a finger-press task, in which the animal is instructed to make individuated finger movements. While the monkeys perform these tasks, we will measure their hand movements using a camera-based motion tracking system, and record, using chronically implanted high-density multi-electrode arrays, the neuronal activity evoked in proprioceptive and motor areas of the cortex.
We will then use a variety of mathematical techniques to (1) understand how hand movements and position are represented in the activity of individual neurons and of neuronal populations and (2) characterize the interactions between proprioceptive and motor representations to understand the role of this sensorimotor interplay in motor control. Our labs are currently developing approaches to convey sensory feedback through electrical stimulation of cortex for use in upper-limb neuroprostheses. Proprioceptive information about the position and movements of the prosthetic limb is essential for a complete prosthetic limb for the same reasons it is important in the native limb. Results from the proposed project will thus inform how to incorporate proprioceptive feedback in sensorized neuroprostheses.
|
2015 — 2019 |
Bensmaia, Sliman |
N/A |
CHS: Large: Collaborative Research: TextureShop: Tools for the Composition and Display of Virtual Texture
When we interact with the physical world, touch is a vitally important sensory channel, but when we interact with the digital world, that is not yet the case. Historically, this situation may have been principally due to inadequate tactile displays, but that limitation is quickly disappearing. Increasingly, the principal limitation is the lack of tactile content. The goal of this collaborative research, which involves scientists at three institutions, is to empower the content creator by enabling people to perform the same sorts of operations with tactile textures that they routinely perform with photographs. Those operations include "capturing" a texture, building a mathematical representation of it, creating and displaying synthetic versions that feel very much like the original, enhancing it in various ways (e.g., making it rougher or more velvety), and ultimately "composing" novel textures that nonetheless feel realistic and credible. As a tangible step in this direction, an open source, open hardware project begun under prior NSF support will be continued and expanded. That project resulted in the distribution of surface haptic devices to about a dozen different labs, leading to a variety of research studies. In this project, a low-cost surface haptic display and a variety of applications and software tools will be distributed to about 50 early adopters in the research community. Those individuals will be engaged in this research (e.g., by helping to "tag" various textures), and will be empowered to carry out their own research. In addition, workshops will be organized at major human-computer interaction conferences to support the growing surface haptics community.
This work is timely and compelling for a number of reasons. One, scientific understanding of the physical and neuronal bases of texture perception has advanced considerably in recent years. For instance, the relationships between vibrations on the skin (produced when a finger slides across a surface), spike timing in afferent neurons, and high-level percepts such as recognition of a specific texture, have recently been elucidated. Two, "surface haptic" technologies for displaying texture to the bare fingertips have also advanced significantly and can now display complex stimuli across the full bandwidth of tactile acuity. Three, the prevalence of touch screen interfaces has created a plethora of applications such as children's e-books, interfaces for the blind, games, and automobile control panels, which would be well-served by high quality tactile content. The merit of this research is that it will provide a principled foundation for both the creation and manipulation of that content. Contributions will include: the development of a "tactile camera" that is able to capture the relevant frictional and vibratory data from which realistic textures can be recreated; a novel mathematical representation of the salient aspects of texture as well as algorithms for synthesizing artificial textures on the basis of that representation; a suite of techniques for enhancing aspects of texture by direct operation on the mathematical representation, interpolation between multiple textures, and interaction with audio cues; and finally a set of tools for composing novel textures including search, texture combination and scale transformation.
|
2015 — 2019 |
Bensmaia, Sliman; Miller, Lee E (co-PI) |
R01 |
Probing Somatosensory Representations in the Cuneate Nucleus of Awake Primates
DESCRIPTION (provided by applicant): The cuneate nucleus (CN) of the brainstem receives ascending input from somatosensory afferents that innervate the skin, joints, and muscles, as well as descending input from sensorimotor cortex. While the tactile and proprioceptive response properties of primary afferents and of neurons in primary somatosensory cortex (S1) have been extensively characterized, virtually nothing is known of the response properties of CN neurons. Multiple types of cutaneous afferents innervate the glabrous skin of the hand and play different albeit overlapping roles in tactile perception. Tactile afferents exhibit highly patterned, repeatable responses when their receptive fields are touched. Likewise, receptors in muscles and tendons provide signals related to muscle length, length change, and force. Within S1, the properties of neurons have been extensively characterized and are very different from their peripheral counterparts. First, S1 neurons receive convergent input from multiple somatosensory submodalities. Second, neurons in S1 convey high-level, processed information about stimulus features (edge orientation, surface texture, direction of limb movement). Third, S1 responses can be strongly modulated by the behavioral state of the animals. We propose to characterize, for the first time, the response properties of CN neurons in awake primates and assess (1) the degree to which signals from different somatosensory submodalities converge onto single CN neurons; (2) the extent to which the feature selectivity observed in S1 begins to emerge in CN; and (3) the degree to which the state-dependence of S1 responses stems from CN. We anticipate that the present study will have important implications for our understanding of somatosensory processing and may inform the development of subcortical brain-machine interfaces used to restore somatosensation in patients with spinal cord injury or who have lost a limb.
|
2016 — 2020 |
Bensmaia, Sliman; Grill, Warren M. (co-PI); Miller, Lee E |
R01 |
Biomimetic Somatosensory Feedback Through Intracortical Microstimulation @ Northwestern University at Chicago
Spinal cord injury causes both paralysis and loss of sensation from the limbs. The past 15 years have seen remarkable advances in "Brain-Machine Interfaces" (BMIs) that allow paralyzed persons to move anthropomorphic limbs using signals recorded directly from their brains. However, these movements remain slow, clumsy, and effortful, looking remarkably like those of individuals who have lost sensation from their arms due to peripheral neuropathy. Brain-controlled prosthetic limbs are unlikely to achieve high levels of performance in the absence of artificial sensory feedback. Early attempts at restoring somatosensation used intracortical microstimulation (ICMS) to activate somatosensory cortex (S1), requiring animals to learn largely arbitrary patterns of stimulation to represent two or three virtual objects or to navigate in two-dimensional space. While an important beginning, this approach seems unlikely to scale to the broad range of limb movements and interactions with objects that we experience in daily life. To move the field past this hurdle, we propose to replace both touch and proprioception by using multi-electrode ICMS to produce naturalistic patterns of neuronal activity in S1 of monkeys. In Aim 1, we will develop model-optimized mappings between limb state (pressure on the fingertip, or motion of the limb) and the patterns of ICMS required to evoke S1 activation that mimics that of natural inputs. These maps will account for both the dynamics of neural responses and the biophysics of ICMS. We anticipate that this biomimetic approach will evoke intuitive sensations that require little or no training to interpret. We will validate the maps by comparing natural and ICMS-evoked S1 activity using novel hardware that allows for concurrent ICMS and neural recording. In Aim 2, we will test the ability of monkeys to recognize objects using artificial touch.
Having learned to identify real objects by touch, animals will explore virtual objects with an avatar that shadows their own hand movements, receiving artificial touch sensations when the avatar contacts objects. We will test their initial performance on the virtual stereognosis task without learning, as well as their improvements in performance over time. Aim 3 will be similar, but will focus on proprioception. We will train monkeys to report the direction of brief force bumps applied to their hand. After training, we will replace the actual bumps with virtual bumps created by patterned ICMS, again asking the monkeys to report their perceived sense of the direction and magnitude of the perturbation. Finally, in Aim 4, we will temporarily paralyze the monkey's arm, thereby removing both touch and proprioception, mimicking the essential characteristics of a paralyzed patient. The avatar will be controlled based on recordings from motor cortex and guided by artificial somatosensation. The monkey will reach to a set of virtual objects, find one with a particular shape, grasp it, and move it to a new location. If we can demonstrate that this model-optimized, biomimetic feedback is informative and easy to learn, it should form the basis for robust, scalable, somatosensory feedback for BMIs.
|
2017 — 2020 |
Bensmaia, Sliman |
R01 |
Touch Spanning Spatial Scales: the Neural Basis of Texture Perception in Somatosensory Cortices
PROJECT SUMMARY Our sense of touch endows us with an exquisite sensitivity to surface texture. We can discern surfaces whose elements are tens of nanometers in size and hundreds of nanometers apart. The perception of texture not only allows us to make fine discriminations (like telling real silk from fake silk) but also guides object manipulation. Indeed, our perception of the surface properties of objects informs how much grip force we apply on them: more force is required for slippery objects. One of the remarkable aspects of tactile texture processing is that it operates over six orders of magnitude in element sizes, from the smallest discernible elements (on the order of tens of nanometers) to the largest elements that can fit on a fingertip, measured in tens of millimeters. We have shown that this wide range of scales is accommodated by distributing information across three types of nerve fibers, each sensitive to surface elements over different spatial scales. Importantly, these different afferents convey texture information differently. Coarse textural features, on the order of millimeters, are conveyed in the spatial pattern of activation in one afferent population, drawing analogies to visual texture representations on the retina. In contrast, fine textural features (with sizes in the tens of nanometers) are conveyed in temporal spiking patterns in two other afferent populations, driven by skin vibrations elicited when the textured surface moves across the skin, and drawing analogies to audition. How these two types of representations are integrated to achieve a unitary sensory experience of texture is a mystery. Furthermore, while afferent responses are highly dependent on exploratory parameters, such as contact force and scanning speed, the perception of texture is highly invariant with respect to these parameters. Thus, neural signals must be interpreted in the context of how they are acquired. Nothing is known about how this is achieved.
The goal of the present study is to investigate the neural basis of tactile texture perception in primary and secondary somatosensory cortices using a large and diverse set of artificial and natural textures across the range of behaviorally relevant exploratory conditions. First, we will seek to elucidate how texture information is encoded in cortex, with emphasis on how different neural codes (spatial and temporal) are integrated. Second, we will examine the relationship between neural responses at each stage of processing and the resulting sensory experience of texture. Third, we will investigate how the perceptual invariance of texture across scanning speeds and contact forces is achieved, a mechanism that may be similar to that underlying the pitch-invariance of auditory timbre. The resulting basic scientific insights will form a foundation for next-generation approaches to restore the sense of touch through electrical stimulation of somatosensory cortex for use in upper-limb neuroprostheses, a program of research in which the lab is actively engaged.
|
2019 — 2020 |
Bensmaia, Sliman; Lindau, Stacy Tessler |
R21. Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Bionic Breast Project: Towards Restoring Breast Sensory Function in Women With Mastectomy
PROJECT SUMMARY Of more than 3.5 million breast cancer survivors in the U.S. today, an estimated one-third have undergone mastectomy. Of these 1.2 million survivors, 40% have had or are considering breast reconstruction procedures. In 2018, an additional 250,000 adult women will receive a breast cancer diagnosis and an estimated 100,000 will undergo mastectomy. Advances in breast reconstruction techniques for women with mastectomy have contributed to improved patient and partner satisfaction by restoring aesthetic aspects or form of the breast. However, even the most advanced reconstructive techniques do not preserve breast function. Loss of breast sensation after mastectomy is a prevalent, well-established, and distressing outcome for women that is rarely addressed in the course of breast cancer care. Failure to restore breast function may explain high rates of sexual dysfunction (37-77%) in breast cancer survivors, even among women satisfied with the aesthetic outcome. Little is known about the neural basis of sensory function in the intact breast or about the interactions between sexual arousal and breast sensation. To fill this gap, we will develop and validate a comprehensive patient-reported measure of breast sensory function for use in women with and without breast cancer. We will administer this measure together with the PROMIS Sexual Function and Satisfaction measure to a sample of 500 women with and without breast cancer to examine the relationship between breast sensory function and sexual function following treatment for breast cancer. In parallel, we will apply state-of-the-art psychophysical sensory assays to determine the relative contributions of nerve fibers toward various aspects of breast sensation, as we and others have successfully done in the glabrous skin of the hand. In addition, we will assess the modulation of breast sensation by arousal as well as the sensory consequences of blocking individual nerves known to innervate the breast.
These experiments on healthy subjects will provide a detailed characterization of breast sensory function, its neural basis, and its modulation by arousal. The specific aims of this research are to: (1) Develop and validate a self-report measure of breast sensory function and examine the link between breast sensory function and sexual function in affected and unaffected women and (2) Characterize breast sensation, examine its modulation by arousal, and establish its neural substrates using state-of-the-art psychophysical techniques. Together, these aims will improve our understanding of breast sensory function and its relationship to sexual function, permit us to estimate the impact of sensory loss on sexual function and satisfaction, and provide a means to identify women at greatest risk of sexual dysfunction following mastectomy. This developmental work will lay the foundation for surgical innovation and/or the development of a bionic breast to preserve or restore breast function following mastectomy. This proposal addresses NCI's research priority to alleviate the adverse effects of cancer and its treatment.
|
2021 |
Bensmaia, Sliman |
R35. Activity Code Description: To provide long term support to an experienced investigator with an outstanding record of research productivity. This support is intended to encourage investigators to embark on long-term projects of unusual potential. |
Sensory Mechanisms of Manual Dexterity and Their Application to Neuroprosthetics
PROJECT SUMMARY Manual behavior requires sensory signals from the hand, both tactile and proprioceptive, as evidenced by the severe deficits that result from somatosensory deafferentation. Three aspects of hand sensory function are poorly understood. First, the neural basis of touch has been studied almost exclusively with stimuli delivered passively to the skin, precluding any understanding of how tactile signals are modulated by and interact with motor commands. Second, proprioceptive signals carry information not only about the time-varying conformation of the hand but also about manually applied forces, yet proprioceptive representations of force are poorly understood. Third, stereognosis (the sense of the three-dimensional shape of objects acquired from sensory signals arising from the hand) implies the integration of tactile and proprioceptive signals, a process about which little is known. The study of active touch, hand proprioception, and stereognosis has been hindered by technical obstacles. Indeed, characterizing self-generated contact with objects has been difficult or impossible, as has tracking hand movements with sufficient precision. To overcome these obstacles, my team has developed an apparatus that allows us to measure contact events (with a sensor sheet covering the object's surface) and track time-varying hand postures (using deep learning-based computer vision) with unprecedented precision as animals interact with objects. We then characterize the responses at every stage along the somatosensory neuraxis, from peripheral nerve through cortex. This novel experimental setup will allow us to study the neural basis of somatosensation, particularly as it relates to manual dexterity, under ecologically valid conditions. In a related line of inquiry, we leverage what we learn about sensory processing to restore the sense of touch to bionic hands.
In brief, we develop algorithms to convert the output of sensors on the bionic hand into patterns of electrical stimulation of the peripheral nerve (for amputees) or of somatosensory cortex (for people with tetraplegia) to evoke meaningful tactile percepts. I am one of the principal architects of the biomimetic approach to artificial touch, which posits that encoding algorithms that mimic natural neural signals will give rise to more intuitive tactile percepts, thereby endowing bionic hands with greater dexterity. Our work on artificial touch comprises three components: evaluation of the perceptual correlates of electrical stimulation, development of sensory encoding algorithms, and assessment of the benefits of artificial touch to manual behavior. The interplay of the basic scientific results and neural engineering efforts will result in more naturalistic artificial touch for brain-controlled bionic hands.
|