Yoky Matsuoka - US grants
Affiliations: | Computer Science & Engineering | University of Washington, Seattle | Seattle, WA |
Area: Neurobotics

We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the NIH Research Portfolio Online Reporting Tools and the NSF Award Database. The grant data on this page is limited to grants awarded in the United States and is therefore partial. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please sign in and mark grants as correct or incorrect matches.
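For readers curious how a grant-to-scientist link and the "Matching score" column in the table below might be computed, here is a minimal sketch. The token-overlap measure and example names are assumptions for illustration only; the site's actual matching algorithm is not described on this page.

```python
# Minimal, purely illustrative sketch of a name-based matching score.
# The token-overlap measure and the example names are assumptions; the
# site's actual matching algorithm is not described on this page.

def name_tokens(name: str) -> set:
    """Lowercased word tokens of a 'Last, First' style investigator name."""
    return {t.strip(".,").lower() for t in name.split() if t.strip(".,")}

def match_score(profile_name: str, grant_pi: str) -> float:
    """Jaccard overlap between token sets, standing in for a match probability."""
    a, b = name_tokens(profile_name), name_tokens(grant_pi)
    return len(a & b) / len(a | b) if (a | b) else 0.0

print(match_score("Yoky Matsuoka", "Matsuoka, Yoky"))  # 1.0 (exact token match)
print(match_score("Yoky Matsuoka", "Matsuoka, Y."))    # ~0.33 (partial match)
```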
High-probability grants
According to our matching algorithm, Yoky Matsuoka is the likely recipient of the following grants.

Years | Recipients | Code | Title / Keywords | Matching score |
---|---|---|---|---|
2001 — 2007 | Hudson, Scott; Yang, Jie; Forlizzi, Jodi (co-PI); Matsuoka, Yoky |
N/A. Activity Code Description: No activity code was retrieved. |
Itr/Sy: Situationally Appropriate Interaction @ Carnegie-Mellon University We are poised at the threshold of an information rich world with devices and services able to deliver that information to nearly anyone, at any place, and at any time. Humans have evolved social mechanisms for smoothly and flexibly managing interpersonal communications; however, current computational and communications devices are, almost without exception, utterly unaware of the social and attentional state of the user. They know little or nothing of the personal, social, and task situations in which they are used, and they do little or nothing to account for, and minimize, the human costs they induce. In this project, the PI and his team will explore situationally appropriate interfaces that retrieve, generate, and deliver information in a manner that is sensitive to the situation of the user. These interfaces will allow for communication and information systems that maneuver, rather than blunder, through the social world. To accomplish this ambitious goal, the team will pursue a three-part research plan. First, they will use behavioral theory and research to model social mechanisms for managing interpersonal communications. The comparatively unexploited research we will draw on examines the affordances of situations and consistent patterns of human nonverbal social behavior within situations. Second, they will extract key situational and user behavior data from these models via input from new sensing technologies, using noninvasive (e.g., vision-based) sensing technology to provide information about situations and users. Third, leveraging knowledge from sensory, perceptual, and cognitive psychology, as well as from the fields of visual and interaction design, the team will create displays and interaction designs that are far more situationally appropriate than today's interfaces. To address the substantial challenges that this breadth of work presents, the PI has assembled a strong multidisciplinary team that brings expertise from computer science, social, sensory, perceptual, and cognitive psychology, and the field of design. |
0.934 |
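The three-part plan above culminates in interfaces that decide, from sensed context, whether delivering information is situationally appropriate. Below is a minimal sketch of such a delivery decision; the sensed features, cost weights, and threshold are hypothetical illustrations, not the project's sensing pipeline or models.

```python
# Hypothetical sketch of a situationally appropriate delivery decision.
# Features, weights, and threshold are illustrative assumptions, not the
# project's actual sensing technology or behavioral models.
from dataclasses import dataclass

@dataclass
class SensedContext:
    speech_detected: bool     # e.g. from a microphone: someone is talking
    face_to_face: bool        # e.g. from vision: user is facing another person
    keyboard_activity: float  # recent keystrokes per second

def interruption_cost(ctx: SensedContext) -> float:
    """Estimate the social/attentional cost of interrupting right now."""
    cost = 0.0
    if ctx.speech_detected:
        cost += 0.5                                  # conversations are costly to interrupt
    if ctx.face_to_face:
        cost += 0.3
    cost += min(ctx.keyboard_activity / 5.0, 0.2)    # busy typing: prefer to defer
    return cost

def deliver(message_urgency: float, ctx: SensedContext) -> str:
    """Deliver now only if urgency outweighs the estimated interruption cost."""
    if message_urgency >= interruption_cost(ctx):
        return "deliver now"
    return "defer until the situation changes"

print(deliver(0.4, SensedContext(True, True, 2.0)))  # -> defer
```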
2002 — 2003 | Matsuoka, Yoky | R21. Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Robotic Rehabilitation of Stroke With Animal Models @ Carnegie-Mellon University DESCRIPTION (provided by applicant): In this proposed research, we intend to develop two new techniques that will be used to identify a highly effective robotic rehabilitation strategy. Animal models will be used to address issues that cannot be addressed using human patients. Currently, several robotic stroke rehabilitation techniques are being evaluated to determine their effect on human patients' short- and long-term recovery performance. Robots, for example, are being attached to patients' limbs to apply force and move them as physical therapists routinely do. Such robotic techniques, however, are simple extensions of what physical therapists are already doing, and the only outcome measurements available are patients' behavioral changes. Robots are currently being used on a limited basis in stroke rehabilitation research because it is not ethical to test a variety of robot force fields or techniques on humans if these fields or techniques have not been proven to have a positive effect on them. Therefore, we believe that evaluating robotic rehabilitation techniques on animal models is crucial. To our knowledge, animal models have never been used to evaluate robotic rehabilitation of stroke. To use animal models, we must develop two new techniques that have not yet been explored. First, we will develop a technique to produce a precise lesion in an animal that simulates a stroke without compromising the animal's survival. To do this, we will use a non-invasive photochemical technique. Second, we will design, construct, and test a new robot controller technology for animals. We will rehabilitate animals using this new robotic controller, which will later be applicable to human rehabilitation techniques. We will combine these techniques to establish the superiority of robot-assisted intervention over non-assisted rehabilitation, explore optimal training schedules, and identify gene products that are selectively modulated following robotic rehabilitation. The results generated in this project will be used as preliminary results to apply for an R01 grant in which effective robotic force assistance will be investigated to identify the optimal therapeutic solution for robotic rehabilitation. We have no doubt that the experimental results we produce with these techniques will significantly affect the field of rehabilitation. |
0.934 |
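The abstract above refers to robots applying force to a patient's limb and to evaluating different robot force fields. One common class of such controllers is a spring-damper field that pulls the limb toward a reference trajectory; the sketch below illustrates only that generic idea, with gains and positions that are assumptions rather than values from the project.

```python
import numpy as np

def assistive_force(x, x_ref, k=50.0, b=5.0, v=None, v_ref=None):
    """Spring-damper assistive field: pull the limb toward the reference
    trajectory point x_ref. Gains k (N/m) and b (N*s/m) are illustrative."""
    f = k * (np.asarray(x_ref) - np.asarray(x))
    if v is not None and v_ref is not None:
        f += b * (np.asarray(v_ref) - np.asarray(v))
    return f

# Example: limb at (0.10, 0.00) m, reference point at (0.12, 0.05) m.
print(assistive_force([0.10, 0.00], [0.12, 0.05]))  # -> [1.  2.5] (newtons)
```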
2003 — 2009 | Matsuoka, Yoky | N/A. Activity Code Description: No activity code was retrieved. |
Pecase: Virtual Robotic Environments For Rehabilitation and Human Augmentation @ University of Washington |
1 |
2004 — 2007 | Pollard, Nancy; Matsuoka, Yoky |
N/A. Activity Code Description: No activity code was retrieved. |
RR:Collaborative Research Resources: Learning From Human Hands to Control Dexterous Robot Hands @ Carnegie-Mellon University This project investigates human grasping and manipulation strategies to gain insight into the synergies employed by human subjects, and explores how best to use these findings to create better control algorithms for robot hands. Of particular importance are techniques to make autonomous robot behavior robust to uncertainties and to make algorithms with a human in the loop, such as teleoperation and control of prosthetic devices, more intuitive and effective for fine manipulation tasks. Although a high degree of freedom (DOF) device may be required to manipulate a wide variety of objects and perform a wide variety of tasks, in any given task situation only a small number of independently controlled DOF may be necessary. To make significant progress towards dexterous grasping and manipulation, we must: |
0.934 |
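The observation above, that a high-DOF hand is often controlled through a small number of synergies in any given task, is commonly made concrete with dimensionality reduction over recorded joint angles. The sketch below applies PCA to synthetic joint-angle data; the data, dimensions, and component count are illustrative assumptions, not the project's method.

```python
import numpy as np

# Hypothetical recording: 1000 time samples of 20 hand joint angles (radians),
# generated from 3 underlying "synergies" plus a little noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 3))   # low-dimensional synergy activations
mixing = rng.normal(size=(3, 20))     # how synergies map onto joints
angles = latent @ mixing + 0.01 * rng.normal(size=(1000, 20))

# PCA via SVD of the mean-centered data.
centered = angles - angles.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
var_explained = (s ** 2) / np.sum(s ** 2)

# A few components capture nearly all the variance: the "synergy" picture.
print(np.round(np.cumsum(var_explained[:5]), 3))
```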
2005 — 2006 | Matsuoka, Yoky | R21. Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Prosthetic Finger For Muscle Control Investigation @ University of Washington DESCRIPTION (provided by applicant): We propose to complete the construction and validation of the anatomical robotic finger within the R21 funding structure as a way to continue this research using the R01 funding structure to investigate the human hand neural strategies and to help design FES and prosthetic hand controllers. We will focus on completing and validating the robotic index finger. The specific aims of this proposal are to: 1) Incorporate the robotic skin that will be used as the feedback device during object exploration for the index finger. This will complete the mechanical construction of the robotic finger. 2) Identify the system component (the relationship between external force perturbation and joint displacement before cortical feedback) for both the anatomical robotic index finger and the human index finger. We will compare these components and iterate on the robot's mechanical design. 3) Identify the active system component (the relationship between muscle tension (motor current) and joint displacement) for the robotic index finger given no external perturbation. Using this active system identification database, we will determine combinations of muscle tensions that would achieve a specified movement trajectory. Furthermore, we will select several cost functions that could be used to achieve good engineering solutions to control this robotic finger. When we accomplish the above, we will have the first mechanical model of the index finger that can be controlled using "engineering" solutions. Once this system is constructed, we will be able to compare between the biological and the engineering solutions used to achieve the same finger movement. This comparison will allow us to infer the muscle synergy and optimization criteria used in the nervous system. In addition, we will be positioned to investigate the neural control of the movement with the robotic index finger during object manipulation with cutaneous feedback. For example, we will be able to investigate the neural control strategies used when pushing objects of different weights and sizes from one place to another. We will also explore how we take advantage of the finger's passive mechanisms to manipulate objects. Finally, we will be able to expand our system to include the thumb to analyze pinching, other fingers to study the relationship between different finger interactions during manipulation, and the palm to understand the role of the palm during manipulation. We believe that this hand will not only be a tool to investigate the neural control of movement, but it will also be used as a foundation to construct a future prosthetic device that will seamlessly communicate with the nervous system. |
1 |
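Aim 3 above describes identifying the relationship between muscle tension (motor current) and joint displacement. One minimal way to illustrate such system identification is a least-squares fit of a linear map from currents to displacements; the synthetic data and linear model below are assumptions for illustration, not the identification procedure used in the project.

```python
import numpy as np

# Hypothetical paired measurements: motor currents for 4 tendon actuators (A)
# and resulting displacements of 3 finger joints (rad), over 200 trials.
rng = np.random.default_rng(1)
true_map = rng.normal(size=(4, 3))
currents = rng.uniform(0.0, 1.0, size=(200, 4))
displacements = currents @ true_map + 0.005 * rng.normal(size=(200, 3))

# Least-squares estimate of the current -> displacement map.
est_map, *_ = np.linalg.lstsq(currents, displacements, rcond=None)
print(np.round(np.abs(est_map - true_map).max(), 3))  # small residual error
```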
2006 — 2007 | Matsuoka, Yoky | R21. Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Robotic Stroke Rehabilitation Using Perceptual Feedback @ University of Washington DESCRIPTION (provided by applicant): Due to advances in modern medicine, the elderly population is growing worldwide, and, along with this growth, there is a growing need for physical rehabilitation after strokes, whose average age of onset is over 65. Given the magnitude of this problem and its societal ramifications, the time is ripe to explore the extent to which robotic devices and virtual environments can be used as a means of rehabilitation to improve the quality of life for both the elderly and the physically disabled. According to recent studies in robotics, robot-assisted stroke rehabilitation enhances arm movement recovery. Moreover, robot-assisted rehabilitation improves patients' mobility and strength to the point where it is equal to, or greater than, that which is achieved by human-assisted therapy. However, none of the currently available systems addresses patients' perceptual or cognitive deficits, and many patients do not reach their full mobility potential using these systems. To remedy these problems, a virtual robotic environment that explores the full potential needs and abilities of patients must be developed. We call this strategy "rehabilitation by distortion." To develop an environment that rehabilitates by distortion, however, there are two fundamental issues to address. First, it is necessary to quantify the perceptual gap that can be created between virtual and actual movements without being perceptible to patients. To do so, we will first quantify the lowest sensory resolution, also known as the just noticeable difference (JND), of force and position. As we explain below, the JND will act as the lower bound of the perceptual gap. In addition, we will quantify the size of the perceptual gap in the presence of visual feedback distortion. Second, having identified a perceptual gap that is not noticeable, we must prove that the mobility and strength of stroke patients can be extended by undetected distortion. For this proposed work, we will work with a single finger. Fingers are among the parts of the body most commonly affected by strokes; thus, testing the basic concepts about the perceptual gap between virtual and actual movements using fingers is appropriate. After we prove that rehabilitation by distortion is therapeutic for a finger, we can expand our work to other limbs. In addition, to show that this strategy is effective for patients who have already received traditional therapy, we will work with patients who have completed their traditional therapy and are at least one year past the onset of stroke. If we prove that we can allow patients to move beyond what they thought was possible after traditional therapeutic techniques, the results will be groundbreaking and will lead to an R01 grant to develop a new robotic virtual therapeutic strategy, "rehabilitation by distortion," that extends force production and range of motion for motor-impaired patients recovering from stroke. |
1 |
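The "rehabilitation by distortion" strategy above depends on keeping the gap between actual and displayed movement below the just noticeable difference (JND). The sketch below shows the core arithmetic of one such display gain; the JND fraction and gain value are hypothetical, not measurements from the study.

```python
def displayed_position(actual_mm: float, gain: float, jnd_fraction: float = 0.08) -> float:
    """Scale the visually displayed movement by `gain`, but clamp the
    distortion so it stays within an assumed JND (here 8% of the movement),
    keeping the manipulation below the patient's perceptual threshold."""
    max_offset = jnd_fraction * abs(actual_mm)
    offset = (gain - 1.0) * actual_mm
    offset = max(-max_offset, min(max_offset, offset))
    return actual_mm + offset

# A 0.95 display gain understates a 20 mm movement by 1 mm (within the assumed
# JND), encouraging the patient to move slightly farther than they perceive.
print(displayed_position(20.0, 0.95))  # -> 19.0
```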
2008 — 2009 | Matsuoka, Yoky | N/A. Activity Code Description: No activity code was retrieved. |
Pacific Northwest Workshop On Neural Engineering @ University of Washington The field of neural engineering aims to develop technologies that discover, rehabilitate or alter brain function while at the same time deriving inspiration from neurobiology to build artificial systems with adaptive capabilities similar to biological systems. The ultimate goal is to enhance quality of life both for disabled and healthy individuals. Neural engineering solutions will reduce the economic burden on families and communities and bestow a greater degree of freedom and confidence. |
1 |
2008 — 2010 | Matsuoka, Yoky | N/A. Activity Code Description: No activity code was retrieved. |
U.S. - Japan Workshop On Robotics For Safety, Security, and Society @ University of Washington With the growth of the elderly population, finding a way to enhance quality of life and safety with minimum human assistance will become critical in the next few decades. Japanese robotics researchers have been focusing on research and applications in these areas in light of the aging Japanese population. U.S. robotics researchers have also been working in the same areas, but from different viewpoints. Even though both Japan and the U.S. are conducting research in the same areas, there has been virtually no collaboration between U.S. and Japanese researchers in these areas, due to the lack of appropriate funding structures and mechanisms. |
1 |
2008 — 2012 | Valero-Cuevas, Francisco; Liu, Chang; Matsuoka, Yoky; Todorov, Emanuel (co-PI) |
N/A. Activity Code Description: No activity code was retrieved. |
Efri-Copn: Reverse-Engineering the Human Brain's Ability to Control the Hand @ University of Southern California This project aims to reverse-engineer the human brain's ability to control the hand. |
0.943 |
2009 — 2012 | Rao, Rajesh; Ojemann, Jeffrey (co-PI); Matsuoka, Yoky |
N/A. Activity Code Description: No activity code was retrieved. |
Electrocorticographic Brain-Machine Interfaces For Communication and Prosthetic Control @ University of Washington (Award 0930908) |
1 |
2009 — 2012 | Todorov, Emanuel; Matsuoka, Yoky |
N/A. Activity Code Description: No activity code was retrieved. |
Rapd: Development of Domestic Virtual Robotic Environment @ University of Washington This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5). |
1 |
2011 — 2019 | Moon, Kee; Kassegne, Sam; Voldman, Joel; Moritz, Chet (co-PI); Daniel, Thomas (co-PI); Rao, Rajesh; Matsuoka, Yoky |
N/A. Activity Code Description: No activity code was retrieved. |
Nsf Engineering Research Center For Sensorimotor Neural Engineering @ University of Washington Over the last decade, the field of neural engineering has demonstrated to the world that a computer cursor, a wheelchair, or a simple prosthetic limb can be controlled using direct brain-machine and brain-computer neural signals. However, technologies that allow such accomplishments do not yet enable versatile and highly complex interactions with sophisticated environments. Today's intelligent systems and robots can neither sense nor move like biological systems, and devices implanted in or interfaced with neural systems cannot process neural data robustly, safely, and in a functionally meaningful way. Doing so requires a critical missing ingredient: a novel, neural-inspired approach based on a deep understanding of how biological systems acquire and process information. This is the focus of this proposal. |
1 |