2003 — 2007
Wang, Ranxiao
The Mechanisms of Spatial Updating @ University of Illinois At Urbana-Champaign
Spatial knowledge plays an important role in daily life. We must learn the layout of streets and buildings to navigate between home and work. We must remember where a book was put down, where the dinner plates are stored, and other ordinary instances of spatial location. What is the nature of this spatial knowledge? Is it allocentric, in that we learn objects' spatial relationships to each other, or egocentric, in that we learn things relative to our own position in space? If the latter, how do we acquire and maintain accurate knowledge when it changes with every step we take and every turn we make? And do we have a different spatial system than our ancestors because of daily exposure to complex, nested urban environments and new technologies such as maps and virtual reality? With NSF support, Dr. Ranxiao Wang will seek to answer these questions. Her studies include new methods to distinguish between different types of spatial knowledge, to examine whether spatial knowledge differs across types of environments, and to determine what forms of spatial knowledge operate in virtual environments. The broader impacts of her research include the potential to better understand various spatial deficits and navigation difficulties. Moreover, her research compares spatial processing in real and virtual worlds, which will provide insights into how to develop user-friendly virtual reality systems.
2015 — 2017
Goudeseune, Camille (co-PI); Wang, Ranxiao; Hovakimyan, Naira
EAGER: Human Centered Robotic System Design @ University of Illinois At Urbana-Champaign
1548409 (Hovakimyan)
Today's technological advances have the potential to dramatically transform everyone's life within the next 20-30 years. The society of the near future assumes the co-existence of humans, pilotless planes, driverless cars, and robots, in which machines and human beings share the airspace and the roads and populate indoor environments. This vision poses significant challenges spanning engineering, psychology, social science, politics, ethics, and economics. Inspired by this vision, the project proposes to lay the foundations for a future world in which cars, drones, and robotic machines autonomously co-habit and work with humans. It is therefore important to build a scientific community in which psychologists and engineers work together to develop socially trusted autonomous mobile robots. More specifically, the project proposes to rigorously formulate the relevant scientific and human-centered psychological concerns (namely actual safety, humans' perceived safety, and humans' comfort) that arise in such systems, where humans and robots are required to interact safely in a shared space. The overarching objective is to provide a general design framework and control architecture for human interaction with autonomous robots that is flexible and capable of safe operation.
This proposal focuses on the problem of human-centered robotic design and control. The goal of the project is to provide the foundations for addressing human-related concerns that arise in human-robot systems, where robots have to perform tasks in the presence of (and in cooperation with) humans. In particular, the proposal targets a fundamental understanding of two issues that are crucial for integrating robotic systems into real-life, human-populated environments: first, how humans perceive autonomous mobile robots as a function of the robots' appearance and behavior; and second, how to design and control mobile robots so as to improve the comfort and perceived safety of the people present in the environment. The research is a collaborative study involving the Mechanical Engineering, Psychology, and Computer Science Departments at the University of Illinois at Urbana-Champaign, in which human perception of robotic appearance and behavior defines what it means for a robot to be socially accepted and to generate empathy between humans and robotic machines. Social etiquette in human-robot systems will be sought through the study of human behavior. At the initial stage of this ambitious research project, the virtual reality cave at the Beckman Institute and the motion capture suite at the CSL Intelligent Robotic Lab will serve as testing and validation environments for deriving robot design features and behavioral protocols. The outcome of this first step will provide the basis for further research in the field and will contribute to the sociotechnical society of the future.
2015 — 2018
Kwiat, Paul; Wang, Ranxiao
INSPIRE: Exploring Living System Responses to Quantum States of Light @ University of Illinois At Urbana-Champaign
This project is jointly funded by the following programs in the Divisions of Physics (PHY) and Chemistry (CHE) of the Directorate for Mathematical and Physical Sciences (MPS), and the Division of Behavioral and Cognitive Sciences (BCS) of the Directorate for Social, Behavioral and Economic Sciences (SBE), with co-funding from the Office of Integrative Activities: MPS/PHY/Atomic, Molecular, and Optical Physics--Experiment, MPS/PHY/Physics of Living Systems, MPS/CHE/Chemistry of Life Processes, SBE/BCS/Program on Perception, Action, and Cognition, and the SBE Office of Multidisciplinary Activities. This project will test the limits of how living things detect light and study how the laws of quantum mechanics apply to biological systems. Single photons (individual particles of light) will be used to study both the human visual system and light-sensitive bacteria. In the human visual system, this project will determine whether humans can see a single photon (a longstanding question in psychology) by using a method of creating one photon at a time to test the vision of human observers. This project will also investigate what human observers see when they detect a photon in a superposition - a state of being in two places at the same time - which is allowed by quantum mechanics. Light-sensitive bacteria will also be studied using single photons, to understand the limits of light detection and the possible effects of quantum laws in a different biological system. In addition to advancing our knowledge of how living things detect and use light, this research might help us understand why particles such as photons show strange quantum behavior such as superposition, while the familiar world around us behaves differently. Making quantum effects available to human perception would also have broad popular appeal, and could be a gateway for students and the public to learn about quantum physics.
To produce single photons, this project will use a well-known heralded single-photon source design based on spontaneous parametric downconversion. Single photons at 505 nm and 440 nm will be used for experiments with humans and E. coli, respectively. To determine whether humans can see single photons, observers will complete a series of trials in which they must choose whether the stimulus photon appeared on the left or the right side of their visual field. If observers are able to choose left or right with accuracy statistically greater than 50%, this would provide strong evidence that they can see single photons. An EEG-contingent stimulus delivery method will also be developed to improve the likelihood of detection. If single-photon vision is confirmed, a subsequent experiment will attempt a test of quantum nonlocality with a human observer replacing one single-photon detector. Finally, to test perception of superposition states, observers will be presented with both a classical mixture of photons at the left and right spots, and photons in a superposition of left and right. The observer's frequency of choosing left and right will be compared between the two cases, with any unexpected difference suggesting a deviation from standard quantum mechanics. To study phototactic bacteria (which move in response to light), individual E. coli cells will be immobilized in an IR optical trap and stimulated with single photons. In an established technique, the response of the cells can be measured by tracking their swimming and tumbling behavior with the trapping beam. Other species will be studied in later experiments, including photosynthetic Rhodobacter sphaeroides.
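As an illustration of the statistical criterion described above, the following sketch (Python, with purely hypothetical trial counts that are not data from this project) computes an exact one-sided binomial p-value testing whether an observer's left/right accuracy in the forced-choice task exceeds the 50% chance level.

```python
from math import comb

def binomial_p_value(correct: int, trials: int, chance: float = 0.5) -> float:
    """Exact one-sided binomial test: probability of getting at least
    `correct` right answers in `trials` trials if the observer is guessing."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(correct, trials + 1)
    )

# Hypothetical illustration only: 60 correct choices out of 100 trials.
correct, trials = 60, 100
p = binomial_p_value(correct, trials)
print(f"accuracy = {correct / trials:.2f}, one-sided p = {p:.4f}")
# p is about 0.028 here; a value below a preset threshold (e.g., 0.05) would
# indicate accuracy reliably above the 50% guessing level, i.e., evidence
# consistent with single-photon detection.
```

In practice many more trials would be needed, since losses before a heralded photon reaches the retina keep the achievable accuracy only slightly above chance; the numbers above are chosen purely for readability.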
2015 — 2018
Kirlik, Alex (co-PI); Wang, Ranxiao; Hovakimyan, Naira; Stipanovic, Dusan (co-PI); Laviers, Amy
NRI: Collaborative Research: ASPIRE: Automation Supporting Prolonged Independent Residence for the Elderly @ University of Illinois At Urbana-Champaign
Because of the graying of the population, there is a growing need for new assistive technologies to aid the elderly in their daily living. Based on figures provided by the U.S. Census Bureau, and due largely to the aging of the "baby boomer" generation, the population of U.S. adults who are 65 and older is projected to be twice as large in 2030 as it was in 2000, increasing from 35 million to 71.5 million and representing nearly 20 percent of the total U.S. population. This trend is placing enormous burdens on health care costs and causing disruptive changes to how individuals and families manage key late-in-life decisions, including residence. A recent (2015) report by the U.S. Department of Housing and Urban Development concluded that most seniors would prefer to age in place, and a 2010 survey found that 88% of respondents over 65 preferred to remain in their homes as long as possible. It is important to note that these residential preferences must typically be viewed in light of alternatives that are likely to be much more expensive and/or more socially taxing, such as older adults living instead in hospitals, assisted living facilities, or with family members. While estimates of the costs associated with these trends vary greatly, it is almost certain that extending the portions of older adults' life spans in which they can live safely and independently through technological means could have enormous positive societal impact. In order to assist in successful aging in place, assistive robots have been developed in the past few decades. However, very few of them have become commercially available, and their use in domestic environments remains highly limited. The major cause of the problem is that assistive robots typically take the form of full-size humanoid devices or something equally cumbersome, expensive, and limited in movement and function. We propose a novel assistive robotic system that provides: (i) flexibility, allowing the designed system to be personalized based on users' needs without requiring any home modification upon installation; (ii) safety, ensuring that the system development process accounts for perceived safety by the user, and that the underlying theoretical framework guarantees collision avoidance; (iii) usability, consisting of a minimal and intuitive user interface to provide acceptable controls; and (iv) reduced cost relative to currently available solutions on the market. The idea behind this research project is the development of a general framework that enables a team of unmanned ground vehicles and small multirotor unmanned aerial vehicles to safely cooperate with the elderly in a home environment. Equipped with appropriate human-machine interfaces, the co-robots will be able to accomplish a number of tasks requested by the users.
The project addresses fundamental problems in the domain of multi-agent cooperative systems comprised of humans and co-robots interacting in shared, highly constrained spaces. In order to assist humans, the co-robots have to be trusted by humans, implying that their behaviors must be predictable and consistent with principles of human spatial perception, and that their appearance must foster a high level of comfort without creating high cognitive demands on the user. Inspired by these challenges, this proposal focuses on the design and control of co-robots that can adapt to unstructured and rapidly changing environments in a manner consistent with human perception and cognition, thus enhancing safety and robustness. The key focus areas are the design and acceptance of mobile ground and aerial robots that coexist in environments inhabited by humans, and the development of a multi-objective control framework that allows intuitive user control over an ensemble of co-robots, including the design of both low-level controllers (LLCs) and a supervisory, high-level controller (HLC). To demonstrate the benefits of the framework and to engage student groups from diverse populations, the following scenario will be considered as a test case: multirotor unmanned aerial vehicles and ground robots acting as domestic assistive devices for healthy older adults in a research laboratory. These co-robots will safely navigate the shared space and accomplish domestic tasks requested by humans while displaying behaviors and appearances that are perceived as safe and trusted. Humans will use an intuitively designed interface on a tablet or smartphone for both controlling and monitoring the co-robots. A motion capture system and the virtual reality Cube at the Beckman Institute will provide the context for data collection, iterative testing, and validation.
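To make the LLC/HLC split concrete, here is a minimal, hypothetical sketch (Python; the task list, waypoints, gains, and `safe_radius` comfort buffer are all invented for illustration and are not the project's controllers): a supervisory high-level controller picks which user-requested task to serve next, and a low-level controller converts the current waypoint into a velocity command, slowing down near people as a crude stand-in for perceived-safety constraints.

```python
import math

def hlc_select_task(tasks, robot_pos):
    """Supervisory high-level controller: serve the nearest pending task first."""
    return min(tasks, key=lambda t: math.dist(robot_pos, t["waypoint"]))

def llc_velocity(robot_pos, waypoint, people,
                 v_max=1.0, gain=0.8, safe_radius=2.0):
    """Low-level controller: proportional velocity command toward the waypoint,
    scaled down when a person is inside the comfort radius."""
    dx, dy = waypoint[0] - robot_pos[0], waypoint[1] - robot_pos[1]
    dist = math.hypot(dx, dy) or 1e-9        # avoid division by zero at the goal
    speed = min(v_max, gain * dist)
    nearest = min((math.dist(robot_pos, p) for p in people), default=float("inf"))
    if nearest < safe_radius:
        speed *= nearest / safe_radius       # slow down near humans
    return (speed * dx / dist, speed * dy / dist)

# Hypothetical tasks and positions, for illustration only.
tasks = [{"name": "fetch medication", "waypoint": (4.0, 1.0)},
         {"name": "check front door", "waypoint": (0.5, 6.0)}]
task = hlc_select_task(tasks, robot_pos=(0.0, 0.0))
vx, vy = llc_velocity((0.0, 0.0), task["waypoint"], people=[(2.0, 0.5)])
print(task["name"], (round(vx, 2), round(vy, 2)))
```

A full multi-objective framework would blend several such objectives (task progress, actual and perceived safety, user commands) and provide formal guarantees, which is what the proposed HLC/LLC design targets.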
2018 — 2021
Wang, Ranxiao; Hovakimyan, Naira; Salapaka, Srinivasa; Marla, Lavanya
NRI: INT: COLLAB: Synergetic Drone Delivery Network in Metropolis @ University of Illinois At Urbana-Champaign
The rapid growth of e-commerce has put additional strain on dense urban communities, resulting in increased delivery-truck traffic and slower delivery operations. With recent quick-purchase innovations like the Amazon Dash button, e-commerce has drastically modified consumer behavior toward buying smaller products separately and more frequently, adding further load to delivery operations. Another growing trend is the offering of fast delivery services such as same-day and instant delivery. Instacart, Uber Eats, and Amazon Now are examples of services that can fulfill a delivery order in under two hours. These services rely heavily on the infrastructure of ride-sharing vehicles, such as Uber or Lyft drivers. This solution offers great flexibility to the consumer, but a single person can only deliver one purchase order to a customer at a time, so it is not scalable or cost-effective. There is an unquestionable need to redesign the current method of distributing packages in urban environments. This project envisions a framework that synergizes manipulatable distribution networks of autonomous flying robots (drones) with existing transport networks, toward enhanced autonomy and economy in logistics. Imagine a ride-sharing vehicle, outfitted with a docking device for packages on its roof, traveling past a distribution center toward downtown. A drone can place a package on the vehicle's roof as it drives by the distribution center, and another drone can recover the package once the vehicle is driving through another distribution center in proximity to its destination. The framework assumes an operator that owns several base stations, each employing a network of drones to pick up packages from that station and drop them on the ground vehicle assigned to each package. The ground vehicles can be public transport vehicles (PTVs), ride-sharing vehicles (RSVs), or operator-owned vehicles (OOVs), which carry the package for most of the distance.
The approach relies on three main thrusts: i) socially aware robotics, ii) safe and robust motion planning and execution, and iii) cooperative network logistics. Motion planning for the robots will be developed taking into account people's perception of safety, privacy, and comfort. Socially aware motion planning methods that generate trajectories with safety guarantees in the presence of obstacles and humans will be developed. Psychological experiments will be designed to study humans' subtle behavioral responses to the presence of multiple drones in a virtual reality test environment. Local control algorithms will be developed for each drone to follow a feasible, collision-free path. Robust local communication protocols will be investigated so that flying robots can perform collaborative tasks under busy air/ground traffic conditions and over unreliable communication networks. Another objective is to achieve robust and safe rendezvous with fast-moving vehicles under communication, schedule, and other modeling uncertainties. Algorithms will be developed that generate (possibly multi-hop) routes for each package, consisting of vehicle route segments, with the objective of minimizing cumulative delivery time. The series of vehicle segments on which each package travels, and the associated schedule, are required as input for the drones. This in turn necessitates solving the underlying network design problem for the centralized entity, to determine the locations of distribution centers (bases) and the number of OOVs required for feasible and reliable delivery of all packages, while explicitly estimating uncertainty from traffic trends and the overall frequency of travel of RSVs between various points in the network. Game-theoretic mechanisms that incentivize cooperation among multiple independent operators of PTVs and RSVs will be developed. These mechanisms have to be specifically designed to ensure truthful bidding, because the objectives of the operator, the RSVs, and the PTVs are not naturally aligned.
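As a concrete baseline for the routing problem described above, the sketch below (Python; the segments, times, and five-minute `transfer` buffer for drone hand-offs are all hypothetical, and this is not the project's algorithm) computes the earliest-arrival multi-hop route for a single package over a set of timetabled vehicle segments using a Dijkstra-style search. The project's algorithms would additionally optimize cumulative delivery time across all packages and account for uncertainty.

```python
import heapq
from collections import defaultdict

# A segment is (origin, destination, depart_time, arrive_time, vehicle_id);
# all values below are hypothetical, chosen only to illustrate the idea.

def earliest_arrival(segments, source, target, release_time=0.0, transfer=5.0):
    """Earliest-arrival route for one package over timetabled vehicle segments.
    The package may board a segment only if it reaches the segment's origin at
    least `transfer` minutes before departure (time for a drone hand-off).
    Returns (arrival_time, list_of_segments_used)."""
    by_origin = defaultdict(list)
    for seg in segments:
        by_origin[seg[0]].append(seg)

    best = {source: release_time}
    heap = [(release_time, source, [])]            # (arrival_time, node, path)
    while heap:
        t, node, path = heapq.heappop(heap)
        if node == target:
            return t, path
        if t > best.get(node, float("inf")):       # stale heap entry
            continue
        for seg in by_origin[node]:
            _, nxt, dep, arr, _ = seg
            if dep >= t + transfer and arr < best.get(nxt, float("inf")):
                best[nxt] = arr
                heapq.heappush(heap, (arr, nxt, path + [seg]))
    return float("inf"), []                        # no feasible route

# Hypothetical network: route a package from base "A" to neighborhood "D".
segments = [
    ("A", "B", 10, 25, "RSV-1"),
    ("A", "C", 12, 40, "PTV-7"),
    ("B", "D", 35, 55, "OOV-2"),
    ("C", "D", 50, 65, "RSV-3"),
]
arrival, route = earliest_arrival(segments, "A", "D")
print(arrival, [s[4] for s in route])   # prints 55 and the ids ['RSV-1', 'OOV-2']
```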
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.