Brian Scassellati - US grants
Affiliations: | Computer Science | Yale University, New Haven, CT |
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the NIH Research Portfolio Online Reporting Tools and the NSF Award Database. The grant data on this page is limited to grants awarded in the United States and is therefore partial. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please sign in and mark grants as correct or incorrect matches.
High-probability grants
According to our matching algorithm, Brian Scassellati is the likely recipient of the following grants.
Years | Recipients | Code | Title / Keywords | Matching score |
---|---|---|---|---|
2002 — 2004 | Krishnamurthy, Arvind [⬀] Nilsson, Henrik Scassellati, Brian |
N/A (no activity code was retrieved) |
Composing Data-Rich Embedded Systems the Easy Way @ Yale University |
1 |
2002 — 2006 | Hudak, Paul [⬀] Trifonov, Valery Scassellati, Brian Taha, Walid (co-PI) [⬀] |
N/A |
Itr: a Framework For Rapid Development of Reliable Robotics Software @ Yale University |
1 |
2003 — 2006 | Hudak, Paul [⬀] Peterson, John Nilsson, Henrik Scassellati, Brian |
N/A |
Itr: Dance, a Programming Language For the Control of Humanoid Robots @ Yale University Robots are becoming increasingly common in, and important to, many commercial, industrial, and military applications. This project focuses on humanoid robots, which are becoming increasingly useful as they advance in sophistication, because they can perform in environments engineered specifically for humans, and because they make it easier for humans to interact with automation. Specifically, it addresses how to program humanoid robots, i.e., how to program their movements and interactions as easily and as effectively as possible. The focus is not on developing new algorithms for robot movement or sensing. Rather, once an algorithm is in hand, how does one program a robot to walk, wave its arms, clap its hands, or pick up an object? How does one do so in a high-level way that is devoid of unnecessary detail, yet is expressive enough to capture all desirable movements and interactions? |
1 |
2003 — 2008 | Scassellati, Brian | N/A |
Career: Social Robots and Human Social Development @ Yale University This project focuses on the development of anthropomorphic robots that interact with people using natural social cues. Socially-competent robots would have great practical impact; users could interact with these robots in a more natural and effortless way, could command them through social instruction, and could integrate them into daily life. This project seeks to address both the technical challenges involved in constructing these robots and the ways in which they can be used as tools to study human social development. |
1 |
2005 — 2009 | Scassellati, Brian Volkmar, Fred (co-PI) [⬀] Klin, Ami (co-PI) [⬀] |
N/A |
Quantitative Measures of Social Response For Autism Diagnosis @ Yale University Autism is a pervasive developmental disorder that is characterized by a severe set of social and communicative deficits. Autism is diagnosed behaviorally; there is no known blood test, genetic test, or functional imaging method that can diagnose autism. Existing diagnostic methods provide primarily qualitative descriptions of dysfunctional social skills. Given the need to capitalize on early brain plasticity and thus maximize the beneficial impacts of intervention, there is a great need for novel, sensitive, and quantified performance-based measurements of social vulnerabilities in young children with autism. |
1 |
2008 — 2012 | Scassellati, Brian Chawarska, Katarzyna (co-PI) [⬀] |
N/A |
@ Yale University Eye tracking has become a widespread tool throughout the cognitive sciences and has attracted particular attention as a behavioral measurement tool for children with developmental disabilities. However, there are no standardized quantitative tools for assessing broadly defined attention skills in young children, and there is a lack of analysis techniques that would allow gaze patterns to be compared across individuals, across populations, or for a single individual across time. This study will develop methods of quantitatively measuring attentional capacities by (1) designing a Visual Attention Assessment Suite (VAAS) which examines the interaction and impact of particular features of scenes on visual attention; (2) constructing novel computational analysis techniques for comparing gaze patterns across individuals, populations, and time; (3) validating these techniques both against standard behavioral assessment protocols and through an embodied modeling approach to ensure that our models capture the behaviorally important aspects of gaze. |
1 |
2010 — 2011 | Scholl, Brian (co-PI) [⬀] Scassellati, Brian |
N/A |
@ Yale University People recognize dramatic situations and attribute roles and intentions to perceived characters, even when presented with extremely simple cues. As any cartoon viewer can attest, two animated shapes are sufficient to describe a scene involving tender lovers, brutal bullies, tense confrontations, and hair-raising escapes. These basic notions of agency and intentionality are foundational to our social perception of the world. They provide the first discriminations between agents and objects, delineate which elements of the world can move with goal-directed purpose, and provide the primitive structure for describing cause and effect. Extensive laboratory experiments have described many of the basic properties that produce these perceptions on controlled stimuli. However, there have been only limited attempts to quantify these processes and no attempts to see if these same properties hold on real-world activity patterns. |
1 |
2011 — 2014 | Scassellati, Brian | N/A |
Hcc: Small: Manipulating Perceptions of Robot Agency @ Yale University Robots are increasingly becoming a part of daily human interactions: They vacuum floors, deliver medicine in hospitals, and provide company for elderly and disabled individuals. This project examines one aspect of people's interactions with these robots: how intentional and self-reflective the robot seems to be. Because the perceived agency of a robot affects many dimensions of people's interactions with that robot, it is important to understand how features of robot design, such as its behavior and cognitive abilities, affect perceptions of agency. This question is addressed through a series of laboratory experiments that manipulate behavior and cognitive abilities and measure the degree of agency attributed to socially interactive robots. |
1 |
2012 — 2017 | Scassellati, Brian Volkmar, Fred (co-PI) [⬀] Morrell, John Shic, Frederick (co-PI) [⬀] Dollar, Aaron Paul, Rhea (co-PI) [⬀] |
N/A |
Collaborative Research: Socially Assistive Robots @ Yale University |
1 |
2017 — 2018 | Scassellati, Brian | N/A |
@ Yale University This is funding to support a Pioneers Workshop (doctoral consortium) of approximately 24 graduate students (12 of whom are from the United States and therefore eligible for funding), along with distinguished research faculty. The event will take place as part of the first day of activities at the 12th International Conference on Human Robot Interaction (HRI 2017), to be held March 6-9 in Vienna, Austria, and which is jointly sponsored by ACM and IEEE. HRI is the premier conference for showcasing the very best interdisciplinary and multidisciplinary research on human-robot interaction, with roots in diverse fields including robotics, artificial intelligence, social psychology, cognitive science, human-computer interaction, human factors, engineering, and many more. It is a single-track, highly selective annual international conference that invites broad participation. Building on the "Smart City Wien" initiative, the theme of HRI 2017 is "Smart Interaction." The conference seeks contributions from a broad set of perspectives, including technical, design, methodological, behavioral, and theoretical, that advance fundamental and applied knowledge and methods in human-robot interaction, with the goal of enabling human-robot interaction through new technical advances, novel robot designs, new guidelines for design, and advanced methods for understanding and evaluating interaction. More information about the conference is available online at http://humanrobotinteraction.org/2017. The Pioneers Workshop will afford a unique opportunity for the best of the next generation of researchers in human-robot interaction to be exposed to and discuss current and relevant topics as they are being studied in several different research communities. 
This is important for the field, because it has been recognized that transformative advances in research in this fledgling area can only come through the melding of cross-disciplinary knowledge and multinational perspectives. Participants will be encouraged to create a social network both among themselves and with senior researchers at a critical stage in their professional development, to form collaborative relationships, and to generate new research questions to be addressed during the coming years. Participants will also gain leadership and service experience, as the workshop is largely student organized and student led. The PI has expressed his strong commitment to recruiting women and members from under-represented groups. To further ensure diversity, the event organizers will consider an applicant's potential to offer a fresh perspective and point of view with respect to HRI, will recruit students who are just beginning their graduate degree programs in addition to students who are further along in their degrees, and will strive to limit the number of participants accepted from a particular institution to at most two. As a new feature this year, the organizers will also invite 3 undergraduate students (all eligible for funding) to help increase diversity in the pipeline of students entering this field. |
1 |
2018 — 2021 | Scassellati, Brian | N/A |
Chs: Small: Watch One, Do One, Teach One: An Integrated Robot Architecture For Skill Transfer @ Yale University In the last several years, robotics research has transitioned from being concerned exclusively with building fully autonomous and capable robots to include building partially-capable robots that collaborate with human partners, allowing the robot to do what robots do best and the human to do what humans do best. This transition has been fueled by a renaissance of safe, interactive systems designed to enhance the efforts of small- and medium-scale manufacturing, and has been accompanied by a change in the way we think robots should be trained. Learning mechanisms in which the robot operates in isolation, learning from passive observation of people performing tasks, are being replaced by mechanisms where the robot learns through collaboration with a human partner as they accomplish tasks together. This project will seek to develop a robot architecture that allows for new skills to be taught to a robot by an expert human instructor, for the robot to then become a skilled collaborator that operates side-by-side with a human partner, and finally for the robot to teach that learned skill to a novice human student. To achieve this goal, popular but opaque learning mechanisms will need to be abandoned in favor of novel representations that allow for rapid learning while remaining transparent to explanation during collaboration and teaching, in conjunction with a serious consideration of the mental state (the knowledge, goals, and intentions) of the human partner. A fundamental outcome of this work will be a unified representation linking the existing literature in learning from demonstration to collaborative scenarios and scenarios involving the robot as an instructor. 
Thus, project outcomes will have broad impact in application domains such as collaborative manufacturing, while also enhancing our substantial investment in education and training (especially research offerings for graduate and undergraduate investigators), and will furthermore enrich the efforts to broaden participation in computing. |
1 |
2019 — 2020 | Sarkar, Nilanjan [⬀] Rehg, James Scassellati, Brian Bruyere, Susanne Warren, Zachary |
N/A |
@ Vanderbilt University The NSF Convergence Accelerator supports team-based, multidisciplinary efforts that address challenges of national importance and show potential for deliverables in the near future. |
0.97 |
2019 — 2023 | Chertow, Marian (co-PI) [⬀] Scassellati, Brian Dollar, Aaron Reck, Barbara |
N/A |
@ Yale University This Future of Work at the Human-Technology Frontier (FW-HTF) project investigates a novel human-robot collaboration architecture to improve efficiency and profitability in the recycling industry, while re-creating recycling jobs to be safer, cleaner, and more meaningful. The specific goal is to improve the waste sorting process, that is, the separation of mixed waste into plastics, paper, metal, glass, and non-recyclables. The US scrap recycling industry -- which represents $117 billion in annual economic activity and more than 530,000 US jobs -- is struggling to meet increasingly challenging standards in domestic and international markets. A major problem for the industry is poor sorting of waste, resulting in materials impurity and a significant decrease in the quality and value of the recycled product. Human perception and judgement are essential to handle the object variety, clutter level and changing characteristics of the waste stream. Yet waste-sorting workers currently face health risks and discomfort arising from sharp and heavy objects, toxic materials, noise, vibration, dust, noisome odors, and poor heating, ventilation, and air conditioning. The innovative robotics component of this project, especially in object detection, manipulation, and human-robot interaction, will allow new sorting facility architectures, creating new, safer roles for human workers. The project complements these technological advances with economic analyses to determine the facility configurations that best remove processing bottlenecks, target materials of high value, and boost the end-to-end efficiency of the recycling process. Division of labor between humans and robots will be investigated to improve job desirability and worker motivation, incorporating consideration of the workers' well-being. In particular, the project will explore ways to utilize robots to amplify worker expertise and value. 
A holistic and interconnected research approach will be taken for all these aspects, i.e., developing robotics technology, designing the human-machine interfaces, investigating workers' roles in the new sorting plant architectures, and understanding and incorporating workers' needs and well-being into the design process. |
1 |
2020 — 2022 | Sarkar, Nilanjan [⬀] Rehg, James Scassellati, Brian Bruyere, Susanne Warren, Zachary |
N/A |
B1: Inclusion Ai For Neurodiverse Employment @ Vanderbilt University The NSF Convergence Accelerator supports use-inspired, team-based, multidisciplinary efforts that address challenges of national importance and will produce deliverables of value to society in the near future. |
0.97 |
2021 — 2025 | Scassellati, Brian Jara-Ettinger, Julian Vazquez, Marynel |
N/A |
Hcc: Medium: Proactive Physical Assistance For Collaborative Human-Robot Teams @ Yale University It is essential that robots working side-by-side with people be able to offer physical assistance. However, robots do not yet take the initiative to help. Instead, they typically follow preset plans for collaboration, or respond only when users explicitly ask for help. While useful in many situations, this reactive approach is poorly suited for many collaborative applications. This project focuses on enabling robots to proactively offer physical assistance that is timely, task-appropriate, and wanted by their human partners. The goal is to construct robots that can answer three key questions: (1) Does a teammate need help? (2) Can I (or someone else) help? and (3) Should I help? By enabling these capabilities, this project will make collaborative human-robot teams function more fluently and efficiently. |
1 |