2002 — 2007 |
Schank, Jeffrey (co-PI); Joshi, Sanjay |
ITR: Research and Design of Robot Group Behavior Development @ University of California-Davis
This research aims to further integrate the study of behavior in animals and robots. It investigates how rules of animal behavior can be analyzed, quantified, coded, and instantiated in robots, focusing on individual and group sensorimotor behavior from a developmental perspective using Norway rats (Rattus norvegicus). Robotic rat pups will be built that instantiate sensorimotor rules discovered by careful observation and computational modeling of pups in isolation and in groups, with specific attention to how these rules change at different stages of development. For the study of behavior, the aim is to use robotic systems to validate rule-based computational models of behavior and to generate testable predictions. For robotics, the aim is to transfer design principles from organisms that begin life as simple sensorimotor systems and subsequently develop social behaviors alongside new sensorimotor, learning, and cognitive capabilities. By studying precisely how these capabilities develop, this research will provide additional insight into the emergence of group behaviors in robotics. In the longer run, it should facilitate the design of group robotic systems whose sensorimotor, learning, and cognitive capabilities grow over time.
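The rule-based modeling approach described above can be illustrated with a minimal, hypothetical sketch: agents that follow only a random-walk rule plus a weak attraction toward the nearest neighbor tend to aggregate, the kind of emergent group behavior the project studies. All rule names, parameter values, and the arena setup here are illustrative assumptions, not the project's actual models.

```python
import random

ARENA = 10.0  # side length of a square arena (assumed)

def step(pup, others):
    """Advance one pup by one time step using two simple local rules."""
    x, y = pup
    # Rule 1: undirected random walk.
    dx, dy = random.uniform(-1, 1), random.uniform(-1, 1)
    # Rule 2: weak attraction toward the nearest other pup (contact seeking).
    if others:
        nx, ny = min(others, key=lambda p: (p[0] - x)**2 + (p[1] - y)**2)
        dx += 0.5 * (nx - x)
        dy += 0.5 * (ny - y)
    # Clamp to the arena walls.
    return (min(max(x + 0.1 * dx, 0.0), ARENA),
            min(max(y + 0.1 * dy, 0.0), ARENA))

def simulate(n_pups=8, n_steps=500, seed=0):
    """Run all pups forward and return their final positions."""
    random.seed(seed)
    pups = [(random.uniform(0, ARENA), random.uniform(0, ARENA))
            for _ in range(n_pups)]
    for _ in range(n_steps):
        pups = [step(p, [q for q in pups if q is not p]) for p in pups]
    return pups
```

Comparing the simulated positions of such agents against video-tracked pups is one way a rule set can be validated or falsified, as the abstract proposes.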
|
2010 — 2015 |
Joshi, Sanjay |
Intent Seeking Algorithms For New Human-Machine Interface @ University of California-Davis
PI: Joshi, Sanjay S. Proposal Number: 0966963
Project Summary: We have created a novel human-machine interface (HMI) technology for paralyzed persons which uses the surface electromyography (sEMG) signal of a single facial muscle for simultaneous multidimensional control of external devices. Our new controller has the potential to significantly increase the quality of life for its users. Unlike many existing human-computer interfaces for this population, our interface is: unobtrusive and inconspicuous, non-interfering with eyes/mouth/tongue, continuously available when needed, multifunctional, easy to use in almost any head position, and portable. We have recently discovered that humans can learn how to simultaneously manipulate power levels in two separate frequency bands of a sEMG power spectrum (simply by contracting the muscle). Each frequency band becomes a separate control channel, which can simultaneously control different aspects of a device. Thus, we may exploit a single muscle's natural electrical signals in far more complex ways than previously known. Using this underlying discovery, we have developed a new user interface that relies on a single head muscle's surface EMG signal, which is easy to obtain and restricted to a small nondescript area near the ear. Our system is somewhat similar to some electroencephalographic (EEG) based brain-computer interfaces (BCI) in which a person learns to "guide" a cursor to certain positions on a computer screen. These positions on the screen could be virtual buttons that open computer applications (human-computer interfaces), turn lights on and off (environmental control units), or control wheelchairs (mobility applications). Two central challenges in all "cursor-guided" HMI systems are 1) intent (how does the computer know where the user intended to place the cursor?), and 2) the speed at which the cursor can reach the intended position. These two questions are intertwined, in that earlier knowledge of intent can lead to faster systems.
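The two-band idea above can be sketched as follows: power in two separate frequency bands of one muscle's sEMG spectrum is read out as two independent control values. The band edges and sampling rate below are assumptions for illustration, not the project's actual parameters.

```python
import numpy as np

FS = 1000  # sampling rate in Hz (assumed)

def band_power(signal, lo, hi, fs=FS):
    """Mean spectral power of `signal` between lo and hi Hz, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def two_channel_control(window):
    """Map one sEMG window to two control values (e.g. cursor x and y).

    The band edges are hypothetical; in practice they would be chosen
    from the user's own spectrum during calibration.
    """
    channel_1 = band_power(window, 80, 100)    # assumed band for channel 1
    channel_2 = band_power(window, 130, 150)   # assumed band for channel 2
    return channel_1, channel_2
```

In a real system each window would come from a streaming sEMG amplifier and the band powers would be smoothed and normalized before driving a cursor; this sketch shows only the band-to-channel mapping.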
We propose to develop new "intent-seeking" algorithms that could make our HMI much faster than the currently instantiated system. In addition, in order to conduct evaluation studies on subjects with disabilities who cannot leave home or hospital, we will develop a new, smaller mobile version of our hardware that is easy to transport, set up, and use anywhere. Intellectual Merit: The use of a single sEMG signal for simultaneous multidimensional control in human-computer interfaces is potentially transformative. The notion of predicting the future location of a target (in this case a computer cursor) arises in many different applications (e.g., aerospace engineering, robotics, brain-computer interfaces). These applications employ a combination of mathematical and computer-science techniques, including statistical decision making, optimal filtering, and artificial intelligence. We intend to draw from these fields to develop accurate, fast algorithms for our human-computer interface application. From a hardware perspective, entirely new classes of computing devices are appearing that can perform complex computations and run graphics-intensive applications from a hand-held (or smaller) footprint. Designing our interface around these platforms will advance the area of highly portable and easy-to-use assistive interfaces. Broader Impact: A 2009 study initiated by the Reeve Foundation estimates that more than 5.5 million people live with paralysis in the United States. Many of the most severely paralyzed use ventilators to breathe and are confined to certain head/body positions at different times during the day. Our goal is for severely paralyzed persons to regain some control of their surroundings and some basic independence. We are committed to including disabled persons in our research, not only as subjects but also as researchers themselves. As such, our work will create an additional broader impact in terms of research inclusiveness.
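A minimal, hypothetical version of an "intent-seeking" step is to infer which on-screen target the user is steering toward from the recent cursor trajectory, so the interface can assist motion toward that target early. The cosine-of-heading-error scoring rule below is one generic choice, not the proposal's algorithm.

```python
import math

def predict_target(trajectory, targets):
    """Return the target best aligned with the cursor's current heading.

    trajectory: list of (x, y) cursor positions, oldest first.
    targets:    list of (x, y) candidate target positions.
    """
    (x0, y0), (x1, y1) = trajectory[-2], trajectory[-1]
    hx, hy = x1 - x0, y1 - y0          # current heading vector
    speed = math.hypot(hx, hy)
    if speed == 0:                     # cursor not moving: no prediction
        return None
    best, best_score = None, -2.0
    for tx, ty in targets:
        dx, dy = tx - x1, ty - y1      # direction from cursor to target
        dist = math.hypot(dx, dy)
        if dist == 0:
            return (tx, ty)            # already on a target
        # Cosine similarity between heading and target direction.
        score = (hx * dx + hy * dy) / (speed * dist)
        if score > best_score:
            best, best_score = (tx, ty), score
    return best
```

A fielded system would accumulate evidence over many samples (e.g. with optimal filtering or a statistical decision rule, as the proposal mentions) rather than trusting a single heading estimate.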
In terms of intellectual broader impact, our new intent-seeking algorithms could have applications for many computer operating systems/programs for which disabled or non-disabled persons use various devices to guide cursors on a screen.
|
2012 — 2017 |
Joshi, Sanjay |
Nri-Small: Collaborative Research: Assistive Robotics For Grasping and Manipulation Using Novel Brain Computer Interfaces @ University of California-Davis
This is a collaborative proposal (with UC Davis) which is aimed at making concrete some of the major goals of Assistive Robotics. A team of experts has been brought together from the fields of signal processing and control, robotic grasping, and rehabilitative medicine to create a field-deployable assistive robotic system that will allow severely disabled patients to control a robot arm/hand system to perform complex grasping and manipulation tasks using novel Brain Muscle Computer Interfaces (BMCI). Further, the intent of this effort is not just technology-driven, but is also driven by clear and necessary clinical needs, and will be evaluated on how well it meets these clinical requirements. Validation will be performed at the Department of Regenerative and Rehabilitation Medicine at Columbia University on a diverse set of disabled users who will provide important feedback on the technology being developed, and this feedback will be used to iterate on the system design and implementation.
Intellectual Merit: The intellectual merit of this proposal includes:
o Novel research in Human Machine Interfaces that has the potential to be transformative in eliciting rich, multi-degree-of-freedom signal content from simple and non-invasive surface electromyographic (sEMG) sensors.
o Development of smart adaptive software that employs machine learning algorithms to continually monitor user performance, then automatically calibrate and tune system parameters based on system performance.
o Data-driven methods for real-time grasp planning algorithms that can be used with both known and unknown objects.
o Methods for finding pose-robust grasps that are tolerant of errors in sensing.
o Evaluation of an underactuated hand as a grasping device for certain application tasks.
o Integration of 3D vision with real-time grasp planning.
o Scientific evaluation at the clinical level of the impact of these new technologies on the disabled population.
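The pose-robust grasp idea listed above can be sketched generically: rather than scoring a grasp only at the nominal object pose, sample perturbed poses (modeling sensing error) and keep the grasp that succeeds most often. The grasp representation, noise model, and success threshold below are stand-ins for illustration, not GraspIt!'s actual interfaces.

```python
import random

def robustness(grasp_quality, pose, noise=0.01, samples=200, seed=0):
    """Fraction of perturbed poses at which the grasp still succeeds.

    grasp_quality: callable mapping a pose tuple to a quality score,
                   where > 0 counts as success (assumed convention).
    pose:          nominal object pose as a tuple of coordinates.
    noise:         std. dev. of Gaussian pose error from sensing (assumed).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        perturbed = tuple(p + rng.gauss(0.0, noise) for p in pose)
        if grasp_quality(perturbed) > 0.0:
            hits += 1
    return hits / samples

def pick_robust_grasp(grasps, pose):
    """Choose the candidate grasp most tolerant of pose error."""
    return max(grasps, key=lambda g: robustness(g, pose))
```

The same Monte Carlo idea extends to perturbing object shape or friction; the point is that the selected grasp is the one whose success region is widest around the sensed pose, not the one with the best nominal score.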
Broader Impacts: The broader impacts of this proposal include:
o Development of a complete system to aid the severely disabled population with tetraplegia.
o Extensions of this technology to others lacking motor function due to conditions including multiple sclerosis, stroke, amyotrophic lateral sclerosis (ALS, or Lou Gehrig's disease), cerebral palsy, and muscular dystrophy.
o New technology that can extend the reach and impact of the field of Assistive Robotics.
o Major extensions to the open-source GraspIt! software system that will allow many other researchers to leverage the results of this project.
o Educational thrusts that will bring together engineering students, clinicians, and the disabled population to extend the reach and scope of Assistive Robotics.
o New directions in Human Machine Interfaces that can extend beyond the disabled population into a variety of other applications.
|