
Allison Okamura - US grants
Affiliations: Stanford University, Palo Alto, CA
Area: Haptic systems
Website: http://charm.stanford.edu/Main/AllisonOkamura/

We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the NIH Research Portfolio Online Reporting Tools and the NSF Award Database. The grant data on this page is limited to grants awarded in the United States and is thus partial. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please sign in and mark grants as correct or incorrect matches.
High-probability grants
According to our matching algorithm, Allison Okamura is the likely recipient of the following grants.
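The matching scores in the table's last column range from 0 to 1. The site does not document its algorithm, so purely as an illustration of how such a score could be produced, the sketch below blends fuzzy name similarity with affiliation similarity using the Python standard library; the function names, weights, and normalization are invented for this example.

```python
# Purely illustrative: the site does not publish its matching algorithm.
# This sketch scores how well a grant record might match a researcher profile
# using fuzzy string similarity from the Python standard library.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Convert 'Last, First Middle' to lowercase 'first middle last'."""
    if "," in name:
        last, rest = name.split(",", 1)
        name = f"{rest} {last}"
    return " ".join(name.lower().split())

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def match_score(profile_name, profile_affiliations, grant_pi, grant_institution):
    """Weighted blend of name and best-affiliation similarity (weights assumed)."""
    name_sim = similarity(profile_name, grant_pi)
    inst_sim = max(similarity(a, grant_institution) for a in profile_affiliations)
    return round(0.8 * name_sim + 0.2 * inst_sim, 3)

print(match_score("Allison Okamura",
                  ["Stanford University", "Johns Hopkins University"],
                  "Okamura, Allison Mariko", "Stanford University"))
```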
Years | Recipients | Code | Title / Keywords | Matching score |
---|---|---|---|---|
2001 — 2005 | Okamura, Allison; Hager, Gregory; Taylor, Russell |
N/A (Activity Code Description: No activity code was retrieved) |
Scale-Invariant Skill Augmentation For Cooperative Human-Machine Micromanipulation Systems @ Johns Hopkins University The proposed work addresses development of novel methods for enhancing the ability of humans to perform complex and delicate manipulation tasks at a microscopic level. The proposed approach is based on a combination of off-line programming and on-line adaptation of task-specific human-computer manipulation idioms. The work builds upon this group's experience in the development of steady-hand robotic augmentation. Specific tasks from microsurgery and micro-assembly will be used for testing the ideas and evaluating the results obtained. |
0.954 |
2002 — 2008 | Hannaford, Blake; Hager, Gregory; Taylor, Russell; Okamura, Allison |
N/A (Activity Code Description: No activity code was retrieved) |
ITR: Modeling, Synthesis, and Analysis of Human-Machine Collaborative Systems @ Johns Hopkins University This project models and builds systems that augment human physical capabilities for performing skilled tasks. Areas of importance for human-machine physical collaboration include micro-manipulation such as microsurgery or microassembly, and remote operation (e.g., in outer space or under the ocean). It is also important to study such techniques in areas where great dexterity is required, such as medical palpation. |
0.954 |
2003 — 2006 | Okamura, Allison Mariko | R01 (Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.) |
Haptic Feedback For Robot-Assisted Surgical Systems @ Johns Hopkins University DESCRIPTION (provided by applicant): Cardiac surgery is traditionally performed through a median sternotomy, providing optimal access to all cardiac structures and great vessels, but generating significant disfigurement and pain for the patient. The advent of robot-assisted minimally invasive surgery (MIS) holds great promise for improving the accuracy and dexterity of a surgeon while minimizing trauma to the patient. However, clinical success with robot-assisted cardiac MIS has been marginal; improved patient outcomes and mitigated costs have not been proven. We hypothesize that this is due in large part to the lack of haptic feedback presented to the surgeon. Without the sense of touch naturally used in surgical tasks such as fine suture manipulation, surgeon performance is jeopardized. The general objective of the proposed research is to acquire and use haptic information during robot-assisted minimally invasive surgery, with a focus on manipulation of fine sutures. It is anticipated that this approach will offer two main benefits over current systems: (1) forces will be fed back to the user in real time, directly or through sensory substitution, improving task performance, and (2) haptic information will be used to create automatic virtual fixtures that assist the surgeon, e.g., by prevention of excessive applied suture forces and maintenance of constant retraction forces. The specific aims are: (1) to understand the sensing requirements for minimally invasive surgical tasks, (2) to test different modes of haptic feedback in a robot-assisted minimally invasive surgical environment, (3) to create virtual fixtures that augment and improve the execution of surgical tasks, and (4) to apply feedback of haptic information during phantom experiments, creating a direct path to clinical applications. These specific aims contribute to a long-term research plan for using haptic information in robot-assisted minimally invasive surgery. The proposed work represents the development of operative technology that can provide significant improvements in patient outcomes, as well as lay the groundwork necessary to address several other exciting research issues such as beating-heart (off-pump) procedures, procedure and tissue models, and virtual environments for training. (A generic sketch of a force-limiting virtual fixture follows this entry.) |
0.911 |
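The "virtual fixtures" described above are commonly implemented as software constraints on the teleoperated tool. The following is a minimal, generic sketch of one such fixture, a force limit that attenuates motion which would further load a suture; the threshold, gains, and structure are illustrative assumptions, not the system built under this grant.

```python
# Minimal, generic sketch of a force-limiting virtual fixture (illustrative
# values; not the implementation developed under this grant). The fixture
# scales down the component of the commanded tool velocity that would further
# increase suture tension as the measured tension approaches a safety limit.
import numpy as np

F_MAX = 2.0  # assumed maximum allowable suture tension [N]

def virtual_fixture(v_cmd: np.ndarray, pull_dir: np.ndarray,
                    tension: float) -> np.ndarray:
    """Return a modified tool velocity. `pull_dir` is the unit direction in
    which motion increases suture tension; `tension` is the measured value."""
    pull_dir = pull_dir / np.linalg.norm(pull_dir)
    along = float(np.dot(v_cmd, pull_dir))
    if along <= 0.0:                        # motion that slackens the suture passes through
        return v_cmd
    gain = max(0.0, 1.0 - tension / F_MAX)  # 1 when slack, 0 at the force limit
    across = v_cmd - along * pull_dir
    return across + gain * along * pull_dir

# Operator commands 5 mm/s straight along the pull direction at 1.8 N tension:
print(virtual_fixture(np.array([0.005, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), 1.8))
# -> roughly [0.0005, 0, 0]: motion is throttled to 10% near the 2 N limit
```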
2003 — 2006 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
@ Johns Hopkins University Information Technology is making great progress in the operating room. In the practice of minimally invasive surgery, three-dimensional modeling techniques are enabling preoperative planning, new image acquisition and display techniques, and superior dexterity through teleoperated robotic systems. However, one sensory channel is presently ignored in these technological improvements: touch. In manual surgeries, haptic feedback is crucial for palpation, suture manipulation, and detection of puncture events. Sensation of forces is particularly important for invasive tool-tissue interaction tasks such as grasping, cutting, dissection, and percutaneous (GCDP) therapy. Lack of haptic information overloads the visual sensing required of the surgeon, requiring a demanding level of attention. For information technology to truly enhance the practice of surgery, multiple sensory channels must be utilized. |
0.954 |
2004 — 2005 | Okamura, Allison Mariko | R21 (Activity Code Description: To encourage the development of new research activities in categorical program areas. Support generally is restricted in level of support and in time.) |
Biomechanical Modeling For Steerable Needles @ Johns Hopkins University DESCRIPTION (provided by applicant): The long-term goal of this work is to harness the effect of bevel tip needle bending to provide accurate, dexterous targeting for percutaneous therapies. New results in needle and tissue modeling, combined with robot motion modeling, will facilitate new models, hardware, control, and planning techniques for steering flexible needles inside soft tissue. Using path planning, control, and kinematics results from the field of robotics, we plan to demonstrate that an appropriately designed needle can be steered through tissue to reach a specified 3-D target. Even more compelling is that the methods we propose will allow needles to be steered to previously inaccessible locations in the body, enabling new minimally invasive procedures. The first step in needle control is to obtain an accurate model. Thus, the specific goals of this exploratory/developmental application are to understand and model the biomechanics of needle motion. This is a high-risk activity because there has been no previous work to successfully model the 3-D path of flexible needles through soft tissue. The work is high impact because accurate 3-D needle modeling can be used to (1) improve targeting, thus enhancing the performance of many percutaneous therapies and diagnostic methods, (2) plan needle paths that steer around obstacles, and (3) develop realistic simulators for physician training and patient-specific planning. Working closely with physician collaborators, the investigators will study needle insertion scenarios that are relevant to improving the quality of health care. The specific aims are as follows: (1) Develop and validate a deterministic biomechanical model of needle insertion through undeformed tissue. The bending of the needle tip will be a function of tissue properties, needle properties, and input parameters. Refine and test the model on phantom tissues. Select needle design parameters that facilitate steering. (2) Further refine the model of Aim 1 to include the effects of constraint and input uncertainties, generating a stochastic biomechanical model of needle insertion through undeformed tissue. Compute the space of possible needle tip positions for a given set of inputs and determine the probability of acquiring specific targets. (3) Further refine the models of Aims 1 and 2 to include the effects of both tissue deformation and inhomogeneous tissue properties. A particular challenge is the development of new modeling techniques that simultaneously handle needle and tissue deformation. (A sketch of the widely used bevel-tip kinematic abstraction from the needle-steering literature follows this entry.) |
0.911 |
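For background, the needle-steering literature often abstracts a bevel-tip needle as a nonholonomic "unicycle": the tip advances along its axis while bending with an approximately constant curvature, and spinning the shaft reorients the plane of bending. The sketch below integrates that standard kinematic abstraction with arbitrary parameter values; it is not the biomechanical model this grant proposes to develop.

```python
# Standard kinematic abstraction for a bevel-tip needle (a nonholonomic
# "unicycle" that advances along its axis while bending with curvature kappa);
# offered as background, with arbitrary parameters.
import numpy as np

def skew(w):
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def simulate_needle(kappa, insert_speed, spin_speed, duration, dt=1e-3):
    """Integrate the tip pose; returns the 3-D tip path as an (N, 3) array."""
    R = np.eye(3)          # tip orientation in the world frame
    p = np.zeros(3)        # tip position [m]
    path = [p.copy()]
    for _ in range(int(duration / dt)):
        v_body = np.array([0.0, 0.0, insert_speed])                 # advance along tip z-axis
        w_body = np.array([kappa * insert_speed, 0.0, spin_speed])  # bending + shaft roll
        p = p + R @ v_body * dt
        R = R @ (np.eye(3) + skew(w_body) * dt)                     # first-order rotation update
        u, _, vt = np.linalg.svd(R)                                 # re-orthonormalize to limit drift
        R = u @ vt
        path.append(p.copy())
    return np.array(path)

path = simulate_needle(kappa=5.0, insert_speed=0.005, spin_speed=0.0, duration=20.0)
print(path[-1])  # with no shaft spin the tip traces a planar circular arc
```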
2004 — 2010 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
CAREER: Haptic Exploration and Modeling of Unknown Environments @ Johns Hopkins University Using force and tactile (haptic) feedback, humans are able to accomplish complex exploration and manipulation tasks in unstructured environments, even when other sensory modes are limited. However, we cannot reach environments that are remote in space and/or scale, such as outer space, under bodies of water, and inside the human body. The proposed research activity will harness the extraordinary ability of human exploration through novel telemanipulation systems. The intellectual merit is that controls, human factors, and modeling techniques will be used to develop versatile, practical telemanipulators that provide appropriate haptic feedback to the operator and automatically generate detailed haptic models of explored environments. A set of crucial telemanipulation issues will be addressed: (1) sensor/actuator asymmetry, which is necessary for practical implementation in many applications, (2) the roles of level and degrees of freedom of haptic feedback during teleoperated haptic exploration, and (3) the design and control of virtual fixtures that assist in teleoperated exploration, while not interfering with the operator's perception of the remote environment. The broader impacts of this work are safety for humans working in hazardous environments and improved health care through robot-assisted surgery. Specific education goals are to create haptic laboratories and public displays that use the sense of touch to improve scientific understanding and interest in technology. |
0.954 |
2007 — 2010 | Okamura, Allison (co-PI); Hager, Gregory; Cowan, Noah (co-PI) |
N/A (Activity Code Description: No activity code was retrieved) |
Manipulating and Perceiving Simultaneously (MAPS) For Haptic Object Recognition @ Johns Hopkins University This SGER project is a feasibility study of algorithms for manipulation and recognition using haptic information. The Manipulating-and-Perceiving-Simultaneously (MAPS) paradigm borrows key ideas from recent advances in simultaneous localization and mapping (SLAM) and visual object recognition, with modifications necessary to adapt them to haptic exploration. (A toy sketch of recognizing an object from a sequence of haptic measurements follows this entry.) |
0.954 |
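As a toy companion to the MAPS idea above: once exploration yields a stream of haptic measurements, recognition over a small set of object hypotheses can be posed as a recursive Bayesian update. The object classes, measurement types, and likelihood values below are all invented for illustration; this is not the MAPS algorithm itself.

```python
# Toy illustration of recursive Bayesian object recognition from a sequence of
# haptic measurements; object classes and likelihoods are assumed values,
# not the MAPS algorithm.
import numpy as np

objects = ["cube", "cylinder", "sphere"]
belief = np.full(len(objects), 1.0 / len(objects))    # uniform prior

def likelihood(measurement: str) -> np.ndarray:
    """P(measurement | object), assumed values for a contact-type sensor."""
    table = {
        "flat_patch":   np.array([0.70, 0.25, 0.05]),
        "curved_patch": np.array([0.10, 0.45, 0.45]),
        "edge":         np.array([0.60, 0.30, 0.10]),
    }
    return table[measurement]

for z in ["flat_patch", "edge", "flat_patch"]:        # simulated contact sequence
    belief = belief * likelihood(z)
    belief /= belief.sum()                            # normalize after each update

print(dict(zip(objects, np.round(belief, 3))))        # belief concentrates on "cube"
```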
2007 — 2009 | Okamura, Allison Mariko | R01 (Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.) |
Steering Flexible Needles in Soft Tissue @ Johns Hopkins University DESCRIPTION (provided by applicant): Minimally invasive surgical techniques have been highly successful in improving patient care, reducing risk of infection, and decreasing recovery times. This project aims to further reduce invasiveness by developing a technique to insert thin, flexible needles into the human body and steer them from outside. This approach will potentially improve a wide range of procedures, from chemotherapy and radiotherapy to biopsy collection and tumor ablation, by enhancing physicians' abilities to accurately maneuver inside the human body without additional trauma to the patient. Building on emerging methods in robotics and highly encouraging results obtained under an R21 exploratory grant, we propose to design, prototype, and evaluate a working system that will steer flexible needles through deformable tissue and around internal obstacles to precisely reach specified 3D targets. This research program will significantly advance our understanding and practice of needle therapies through integrated needle design and modeling, preoperative visualization and needle motion planning, and image-guided intraoperative needle control. The scientific and engineering advances will culminate in a set of pre-clinical trials with imaging (fluoroscopy, ultrasound, and MRI) using phantom and natural ex vivo and in vivo models. The designs, analyses, and experiments of this study will determine the merits and weaknesses of flexible needle steering, with the goals of improving current clinical applications and leading to new ultra-minimally invasive surgical procedures. The results of this project could significantly improve public health by lowering patient recovery times, infection rates, and treatment costs. By increasing the dexterity and accuracy of minimally invasive procedures, anticipated results will not only improve outcomes of existing procedures, but also enable percutaneous procedures for many conditions that currently require open surgery. |
0.911 |
2007 — 2012 | Okamura, Allison; Hager, Gregory; Taylor, Russell (co-PI); Cowan, Noah (co-PI); Kazanzides, Peter (co-PI) |
N/A (Activity Code Description: No activity code was retrieved) |
@ Johns Hopkins University Proposal #: CNS 07-22943 |
0.954 |
2007 — 2011 | Okamura, Allison (co-PI); Cowan, Noah; Webster, Robert |
N/A (Activity Code Description: No activity code was retrieved) |
Active Cannulas For Bio-Sensing and Surgery @ Johns Hopkins University 0651803 |
0.954 |
2008 — 2009 | Okamura, Allison Mariko | R21 (Activity Code Description: To encourage the development of new research activities in categorical program areas. Support generally is restricted in level of support and in time.) |
Models and Devices For Coordination Rehabilitation @ Johns Hopkins University DESCRIPTION (provided by applicant): Damage to the nervous system often results in altered movement control, leading to loss of function. The NIH estimates that movement-related neurological disorders affect millions of Americans each year. For many neurological disorders, rehabilitation therapy is a main treatment. In order to optimize rehabilitation techniques, a quantitative approach that pinpoints specific deficiencies and targets them with long-term intervention is needed. Thus, we will apply biomechanical models, robotic devices and novel control strategies for optimizing both compensation and learning approaches for improved movement control. This project focuses on poor motor coordination, which is a ubiquitous finding in people with damage to the nervous system. Incoordination leads to poor trajectory and targeting control, and is most distinctly related to cerebellar dysfunction. The fundamental mechanism of cerebellar incoordination will be investigated by comparing human performance to computational models, and then by developing robotic control strategies that either compensate for motor impairments or optimize practice-dependent learning. Both approaches are needed since short-term learning mechanisms can be inefficient or absent in individuals with cerebellar damage, making compensation the best option for some people. A robotic exoskeleton device, the KinArm, will be used to acquire behavioral data during reaching tasks performed by control and cerebellar subjects, and dynamic models of the human arm will be used to determine the source of the differences between control and cerebellar data. Specifically, the dynamic models of subjects will be used to simulate the effects of misestimation of limb dynamics and timing delays. The parameters that best explain behavior will be used to inform a rational control strategy for robot-assisted rehabilitation for ataxic populations. Adaptation and compensation methods will be designed that provide assistive and/or resistive forces to help subjects achieve normal movement patterns. A pilot study in which cerebellar patients use these methods will provide design guidelines for future rehabilitation robotics development. The long-term goal of this work is to design and produce take-home devices that are customized to either compensate for an individual's deficit or to facilitate the learning of a new motor pattern. We plan to extend this methodology to a broad range of patient populations. This project lays the foundation for novel home therapies by identifying strategies a robot could use to normalize movement control of people with cerebellar damage. PUBLIC HEALTH RELEVANCE: Movement disorders commonly occur following neurological damage, affecting the activities of daily living for millions of Americans each year. Therapies and assistance methods using rehabilitation robots are promising techniques for improving the short- and long-term health care of these patients, by lowering the cost of treatment, enabling more effective methods for practice-based rehabilitation, and providing "smart" orthoses for recovery of normal movement function in populations for whom adaptation is not possible. (A minimal simulation sketch of the delayed-feedback idea follows this entry.) |
0.911 |
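To make the "timing delays" idea above concrete, here is a minimal, hypothetical simulation: a one-dimensional point-mass "arm" driven by proportional-derivative feedback toward a target. All parameters are invented for illustration; with delayed sensory feedback the reach overshoots and rings, a crude analogue of dysmetria, whereas the undelayed reach settles cleanly.

```python
# Hypothetical illustration of simulating timing delays in reaching: a 1-D
# point-mass "arm" with PD feedback toward a 10 cm target. Parameters are
# invented; delayed feedback produces overshoot and ringing, while the
# undelayed reach is well damped.
import numpy as np

def reach(delay_s: float, mass=1.0, kp=80.0, kd=18.0, target=0.10,
          dt=1e-3, duration=2.0) -> np.ndarray:
    n_delay = int(delay_s / dt)
    x, v = 0.0, 0.0
    history = [(0.0, 0.0)]                  # past (position, velocity) samples
    trajectory = []
    for _ in range(int(duration / dt)):
        x_fb, v_fb = history[max(0, len(history) - 1 - n_delay)]  # delayed sensing
        force = kp * (target - x_fb) - kd * v_fb
        v += (force / mass) * dt
        x += v * dt
        history.append((x, v))
        trajectory.append(x)
    return np.array(trajectory)

print("peak position, no delay   :", round(reach(0.00).max(), 3))  # settles near 0.10 m
print("peak position, 50 ms delay:", round(reach(0.05).max(), 3))  # overshoots past 0.10 m
```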
2012 — 2016 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
HCC: Small: Haptic Realism Versus Haptic Utility @ Stanford University While numerous studies have focused on the design of novel controllers and devices to enhance the user experience of teleoperated and virtual environments, our general understanding of what makes a haptic interface acceptable to human operators is limited. The addition of haptic feedback to a computer-mediated task, such as training to perform a surgical procedure in a virtual environment or teleoperating a robot to dispose of explosive ordnance, has been hypothesized to improve the speed, accuracy, and precision of task performance. However, the benefits of haptic feedback have been insufficient to motivate inclusion of haptics in many mission-critical systems. Recent publications in the haptics literature have shown confusing results regarding the role of haptic feedback, such as haptic teleoperators that have high user acceptance and realism ratings but do not have any effect on user performance, as well as haptic teleoperators that are disliked by users yet lead to significant improvements in performance. The PI's goal in this project is to identify the salient parameters of haptic systems leading to haptic realism and haptic utility, their trade-offs, and, hopefully, their complementary features that lead to haptic systems that are realistic and useful, and therefore acceptable to human operators in high-risk scenarios. She plans to approach this problem by examining manipulation and exploration with bilateral teleoperators, where haptic feedback consists of vibrotactile and kinesthetic feedback mediated by a tool. System parameters of interest include feedback gain and frequency content, device mechanical properties, and time delay. For each parameter, the PI will examine its effect on realism and utility, specifically: how users subjectively rate realism and their own performance, and objective measures and theoretical predictions of user perception and performance. In addition, she will survey user acceptance and measure affect, and compare these to measures of realism and utility. Based on these results, she will design and test haptic feedback systems that should maximize user acceptance. This framework will allow her to answer relevant questions such as: What is the spectrum of cost-benefit ratios in bilateral teleoperation, considering the role of device performance in defining the limits of haptic realism and utility? How does the level of risk in a task affect user acceptance of haptic interfaces? Can we design "haptic cartoons" analogous to medical illustrations, which are not completely realistic but display the most important haptic features in order to maximize task performance? And is there a haptic "uncanny valley" in which slightly unrealistic haptic feedback is particularly disturbing to operators? (A generic bilateral-teleoperation sketch exposing the feedback gain and time delay parameters follows this entry.) |
1 |
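As background for the parameters named above, a position-forward, force-feedback bilateral teleoperator can be written compactly with the force-feedback gain and communication delay made explicit. This is a generic, textbook-style sketch with invented constants, not one of the systems studied in the project.

```python
# Generic position-forward / force-feedback teleoperator sketch with invented
# constants; the force-feedback gain and round-trip delay are the knobs
# highlighted in the abstract. Not one of the systems studied in this project.
import numpy as np

def teleoperate(master_pos, dt=1e-3, delay_s=0.05, force_gain=0.5,
                m=0.1, b=5.0, k=200.0, k_env=1000.0, wall=0.02):
    """Slave mass m tracks the delayed master command via a PD law (k, b).
    Contact with a wall at `wall` [m] produces an environment force that is
    delayed again and scaled by `force_gain` before the operator feels it."""
    n_d = int(delay_s / dt)
    x, v = 0.0, 0.0
    f_env_hist = [0.0]
    felt = np.zeros(len(master_pos))
    for i in range(len(master_pos)):
        x_cmd = master_pos[max(0, i - n_d)]                    # delayed command channel
        f_env = -k_env * (x - wall) if x > wall else 0.0       # wall reaction force on slave
        v += ((k * (x_cmd - x) - b * v + f_env) / m) * dt
        x += v * dt
        f_env_hist.append(f_env)
        felt[i] = force_gain * f_env_hist[max(0, i - n_d)]     # force reflected to operator
    return felt

t = np.arange(0.0, 2.0, 1e-3)
master = 0.03 * np.minimum(t, 1.0)                  # operator pushes 30 mm, into the wall region
print(round(np.abs(teleoperate(master)).max(), 2))  # peak reflected force magnitude [N]
```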
2012 — 2016 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
NRI-Large: Collaborative Research: Multilateral Manipulation by Human-Robot Collaborative Systems @ Stanford University This project addresses a large space of manipulation problems that are repetitive, injury-causing, or dangerous for humans to perform, yet are currently impossible to reliably achieve with purely autonomous robots. These problems generally require dexterity, complex perception, and complex physical interaction. Yet, many such problems can be reliably addressed with human/robot collaborative (HRC) systems, where one or more humans provide needed perception and adaptability, working with one or more robot systems that provide speed, precision, accuracy, and dexterity at an appropriate scale, combining these complementary capabilities. |
1 |
2014 — 2017 | Maclean, Karon; Okamura, Allison; Blikstein, Paulo (co-PI) |
N/A (Activity Code Description: No activity code was retrieved) |
EXP: Hands-On Haptics Laboratories For Classroom and Online Learning @ Stanford University The Cyberlearning and Future Learning Technologies Program funds efforts that will help envision the next generation of learning technologies and advance what we know about how people learn in technology-rich environments. Cyberlearning Exploration (EXP) Projects explore the viability of new kinds of learning technologies by designing and building new kinds of learning technologies and studying their possibilities for fostering learning and challenges to using them effectively. This project's technology innovation is a haptic (force-feedback) toolkit to be used in learning science and engineering content that involves forces. The toolkit the team is developing will allow learners (K-12 and beyond) to manipulate simulations and models through touch-sensitive devices and to directly feel resulting forces. Learning and using the science of forces is quite complex, as the ways forces actually combine and have effects are not always consistent with what people experience. Understanding forces, however, is essential for pursuits as varied as engineering, rehabilitative medicine, and construction and for understanding concepts and phenomena in biology, chemistry, physics, and engineering fields. The kind of direct engagement with forces at the micro-level that the proposed toolkit will allow has potential to foster better scientific understanding across this whole set of fields. Research in the context of this technological innovation focuses on the extent to which and the conditions under which such embodied experience with phenomena fosters deeper understanding. (A generic sketch of a haptic force-rendering loop follows this entry.) |
1 |
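The force-feedback interaction described above ultimately reduces to a high-rate servo loop on the device. The sketch below shows the generic pattern (a virtual spring plus a stiff virtual wall rendered at roughly 1 kHz); the device I/O functions and constants are placeholders, not the toolkit's actual API.

```python
# Generic sketch of the ~1 kHz haptic rendering loop behind force-feedback
# learning devices: read the handle position, compute a penalty force from the
# simulated physics (here a Hooke's-law spring plus a stiff one-sided wall),
# and command it back to the motors. Device I/O functions are placeholders.
K_WALL = 500.0     # virtual wall stiffness [N/m]
K_SPRING = 80.0    # virtual spring stiffness [N/m]
WALL_POS = 0.05    # wall location along the device axis [m]

def render_force(x: float) -> float:
    """Force on the handle at position x [m]: spring toward the origin,
    plus a penalty force if the handle penetrates the wall."""
    force = -K_SPRING * x                    # Hooke's law: F = -k x
    if x > WALL_POS:                         # penetrating the wall
        force += -K_WALL * (x - WALL_POS)    # penalty force pushes back out
    return force

def haptic_loop(read_position, write_force, steps=1000):
    """One-dimensional servo loop; read_position/write_force wrap the device API."""
    for _ in range(steps):                   # typically run at ~1 kHz
        write_force(render_force(read_position()))

# Example with a fake device held just inside the wall:
haptic_loop(lambda: 0.051, print, steps=1)   # prints roughly -4.58 N
```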
2014 — 2017 | Okamura, Allison Mariko | R01 (Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.) |
Ultrasound-Guided Robotic Needle Steering For Ablation of Liver Cancer @ Stanford University DESCRIPTION (provided by applicant): An estimated 33,190 new cases of primary liver cancer and an even larger number of new cases of metastatic cancer to the liver will be diagnosed in the United States in 2014. Eradicating those tumors improves patient survival. Percutaneous radiofrequency ablation (RFA), which kills tumors via thermal damage, is a minimally invasive treatment appropriate for the many patients who are not eligible for surgical resection, particularly those with certain tumor type, location, or size; poor liver function; or other comorbidities. However, technical limitations have precluded broader adoption of RFA as a means of eliminating liver tumors. Tumors in some portions of the liver cannot be accessed because the straight paths currently taken by RFA electrodes would traverse vasculature, lung, or other sensitive structures. Large tumors require multiple electrode insertions; the increased bleeding risk with each puncture through the liver capsule can be too great in patients with advanced liver disease or severe comorbidities. The long-term goal of this work is to significantly improve access, accuracy, and precision during percutaneous liver RFA procedures using robot-assisted needle steering. The following specific aims will enable the development of this technology and demonstrate feasibility, as a path to clinical adoption: Specific Aim 1: Optimize ultrasound-based localization of highly curved needles and closed-loop control of robotic needle steering. Specific Aim 2: Develop an interactive user control interface and enable teleoperated needle steering intervention. Specific Aim 3: Design a clinical robot-assisted needle steering system for percutaneous liver RFA, validated in an ex vivo liver model. Specific Aim 4: Validate the ultrasound-guided needle steering system in vivo in porcine liver. Needle steering enables the insertion of highly flexible needles or guidewires along controlled, curved, 3D paths through tissue. Thus, the needle can be actively steered to acquire tumors not accessible by a straight-line path, and unexpected deviations from the desired path can be corrected as the needle advances toward the tumor. A functional needle steering system appropriate for in vivo experimentation must be compatible with widely available medical imaging (here, ultrasound imaging), have the ability to deliver treatment (here, RFA through the use of a steerable needle electrode), and have an appropriate user interface including visualization and human-in-the-loop teleoperation. This technology could extend the survival advantages of RFA to patients who may otherwise not have good treatment options, and in the long term, be used for accurate placement of diagnostic devices and treatment of diseases in other organs. |
0.958 |
2016 — 2019 | Fan, Jonathan (co-PI); Follmer, Sean; Okamura, Allison |
N/A (Activity Code Description: No activity code was retrieved) |
NRI: Vine Robots: Achieving Locomotion and Construction by Growth @ Stanford University In contrast to legged robots inspired by locomotion in animals, this project explores robotic locomotion inspired by plant growth. Specifically, the project creates the foundation for classes of robotic systems that grow in a manner similar to vines. Within its accessible region, a vine robot provides not only sensing, but also a physical conduit, such as a water hose that grows to a fire, or an oxygen tube that grows to a trapped disaster victim. The project will demonstrate vine-like robots able to configure or weave themselves into three-dimensional objects and structures such as ladders, antennae for communication, and shelters. These novel co-robots aim to improve human safety, health, and well-being at a lower cost than conventional robots achieving similar outcomes. Because of their low cost, vine robots offer exceptional educational opportunities; the project will include creation and testing of inexpensive educational modules for K-12 students. |
1 |
2018 — 2021 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
NRI: FND: COLLAB: Intuitive, Wearable Haptic Devices For Communication With Ubiquitous Robots @ Stanford University This National Robotics Initiative (NRI) project will promote the progress of science and beneficially impact human health and quality of life by developing wearable soft robotic devices with distributed tactile stimulation that enable new forms of communication. Human-robot interactions will be commonplace in the near future. In applications such as self-driving cars and physically assistive devices, interaction will require effective and intuitive bidirectional communication. Transferring information through vision and sound can be slow and inappropriate in many circumstances. This project focuses on haptic (touch-based) robotics to enable communication in a salient but private manner that alleviates demands on other sensory channels. This project serves the national interest by advancing knowledge in the fields of human perception, psychology and neuroscience, while developing novel, convergent technology that integrates concepts across the fields of robotics, haptics, and control engineering. Project results will be disseminated through tactile haptic devices for education and publicly available software and data. The project aims to broaden participation of underrepresented groups in engineering through outreach programs, public lab tours, and the mentoring of female and minority graduate students, undergraduates, and high school students. |
1 |
2018 — 2021 | Okamura, Allison | N/A (Activity Code Description: No activity code was retrieved) |
@ Stanford University In the real world, people rely heavily on haptic (force and tactile) feedback to manipulate and explore objects. However, in virtual and augmented reality environments, haptic sensations must be artificially generated. Advances in both sensing and feedback are needed to create more realistic virtual scenarios for human interaction, education, and training. This research will implement wearable fingertip haptic feedback systems that enhance human perception and task performance in virtual and augmented reality, and will provide the field of computer-mediated haptics with important new design and rendering insights for a promising form of wearable haptic device. Successful development of fingertip haptic devices and improved tracking methods will lower the cost of integrating haptic technology into virtual reality, and yield more realistic and compelling virtual experiences. Effective haptic feedback systems for virtual reality utilizing the developed technologies will have broad impact by improving human health and well-being through a myriad of applications such as training for critical tasks like surgery and defusing of explosive ordnance, tactile communication to enable design and e-commerce, and immersion in virtual worlds for enhanced education and training. Project outcomes will be disseminated through demonstrations of educational applications, expansion of an online course on haptics, and publicly available software and data. Outreach programs, public lab tours, and mentoring of female and minority graduate students, undergraduates, and high school students will broaden participation of underrepresented groups in engineering. |
1 |
2020 — 2023 | Okamura, Allison; Liu, C. Karen |
N/A (Activity Code Description: No activity code was retrieved) |
NRI: FND: Computational and Interactive Design of Soft Growing Robot Manipulators @ Stanford University This project will contribute to the development of new soft robot manipulators that can assist people with limited physical abilities in their activities of daily living in the home. Soft robots are made of flexible and stretchable materials configured to increase human safety, lower cost, and mechanically adapt to their environments, which are significant advantages compared to traditional, rigid robots. However, today's soft robots are far from capable of performing household manipulation tasks; they have limited dexterity and cannot handle the weight of many objects encountered in daily life. This award supports research to develop novel mechanisms and design tools in order to create soft robots that seamlessly integrate into homes and provide needed assistance. This project aims to improve human health and quality of life by (1) assisting people with limited physical abilities in their homes, (2) lowering the cost of robot manipulators, toward making robots accessible to a wide spectrum of users, and (3) promoting human safety through controllable mechanical compliance and robustness. These features are important to improve equity in access to technology, especially in scenarios where physical distancing between humans is required. This interdisciplinary project integrates mechanical engineering and computer science, and the project will educate a diverse group of students in classes and research in order to broaden participation in STEM. |
1 |