Bilge Mutlu, Ph.D. - US grants
Affiliations: Carnegie Mellon University, Pittsburgh, PA (2009)
Area: Computer Science, Artificial Intelligence, Robotics

We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the NIH Research Portfolio Online Reporting Tools and the NSF Award Database. The grant data on this page is limited to grants awarded in the United States and is therefore partial. It can nonetheless be used to understand how funding patterns influence mentorship networks and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please sign in and mark grants as correct or incorrect matches.
High-probability grants
According to our matching algorithm, Bilge Mutlu is the likely recipient of the following grants.

Years | Recipients | Code | Title / Keywords | Matching score |
---|---|---|---|---|
2010 — 2014 | Gleicher, Michael (co-PI); Mutlu, Bilge |
N/A (no activity code was retrieved) |
HCC: Small: Designing Effective Gaze Mechanisms For Cross-Modal Embodied Agents @ University of Wisconsin-Madison The goal of this project is to design and build gaze mechanisms for embodied agents that can achieve high-level social and communicative goals that people achieve using such mechanisms and that can be applied across a wide range of agent presentations and task domains. |
0.934 |
2011 — 2017 | Takayama, Leila; Mutlu, Bilge |
N/A |
HCC: Small: Embodied Mediated Communication in Collaborative Work @ University of Wisconsin-Madison The goal of this project is to gain a deeper understanding of mobile remote presence systems (MRPs) and to create guidelines for their effective design, development, and adaptation into organizational use. MRP systems enable embodied mediated communication in which individuals at a remote location connect to a local robot that is used to physically navigate in the local environment and to interact with local users via audio and video. MRP systems allow remote users to visit individuals in an organization and attend group meetings, seminars, and social gatherings in the local environment. MRPs enable new forms of interactions and offer remote users an improved sense of presence compared with stationary video-conferencing systems. |
0.934 |
2012 — 2017 | Mutlu, Bilge | N/A |
CAREER: Designing Socially Adept Robots @ University of Wisconsin-Madison Robots hold the promise of delivering significant social contributions in such key domains as education, rehabilitation, and collaboration, but to achieve this promise they must be able to communicate effectively with people. The goal of this project is to design effective social behaviors for robots by modeling human verbal, vocal, and nonverbal behavior using computational tools, regenerating them in robots, and evaluating their effectiveness in human-robot studies in social scenarios such as storytelling, interview, conversation, instruction, and collaborative work. The research will follow an interdisciplinary approach, combining computational and social-scientific methods toward advancing the state of the art in robotic technology. |
0.934 |
2012 — 2017 | Montague, Enid (co-PI); Mutlu, Bilge; Ferris, Michael (co-PI); Squire, Kurt; Shapiro, Benjamin (co-PI) |
N/A |
@ University of Wisconsin-Madison This project explores the hypothesis that compelling learning games based on contemporary science that offer opportunities to contribute to scientific inquiry will lead to increased interest in science, increased career choice of science, increased conceptual understanding of science content, and better scientific literacy around what scientists do. The idea is to capitalize on crowd sourcing both to shed light on the answers to open scientific questions and to engage the public in authentic and needed scientific inquiry in meaningful ways. The PIs will extend four games that are already designed and built or that are under construction and develop a platform for supporting a broad range of participatory science games that offer the public opportunities to contribute to scientific inquiry. The chosen games all encourage sustained and deep participation, include apprenticeship opportunities and opportunities for practicing authentic science, promote reflection in and on action, and are designed to be emotionally compelling. Games come from four game genres: role-playing, strategy, action, and puzzle, as different people are drawn to different types of experiences. All are in the areas of bioscience and biotechnology, and each addresses some open question in bioscience or biotechnology that participants might shed light on. The broad range of games serves several purposes -- offering a substantial enough range of experiences that a broad range of participants can be expected to join in, offering enough diversity to know that the infrastructure tying the games together has all of the functionality required to support a broad range of such games, and offering enough diversity to answer targeted research questions. 
Research focuses on identifying the challenges in creating a broad and diverse public gaming community that interacts with more formal and established scientific and educational cultures, how learning occurs in such an environment and how to promote sustained engagement and deep learning, identifying core features and mechanisms of games that promote sustained engagement and science learning, and understanding the design features in the particular games being studied that contribute to sustained engagement and learning. |
0.934 |
2012 — 2017 | Ferrier, Nicola (co-PI); Mutlu, Bilge; Gleicher, Michael |
N/A |
NRI-Small: Perceptually Inspired Dynamics For Robot Arm Motion @ University of Wisconsin-Madison In order for robots to collaborate efficiently and effectively with humans, the human perception of their movement must be considered in motion creation. Because a human collaborator will interpret the movements of a robot (even subconsciously), robot motion synthesis algorithms that do not consider the human observer may create motions that are perceived incorrectly, interpreted negatively (e.g. as being angry or threatening), or at least miss out on the opportunity to use this subtle communication channel effectively. The key idea of this project is to develop an understanding of human perception of movement that can be applied to the development of robot trajectory planning and control algorithms. The team will use human subjects experiments to understand and evaluate the interpretation of movements and apply these findings in robotics and motion synthesis. The research plan interleaves empirical studies of how people interpret motions, algorithm development to create methods that generate robot motions in a controllable manner, and contextualized deployments that allow the PIs to evaluate the success of the methods. The success of the project will provide a deeper understanding of how people interpret movements, new algorithms for synthesizing robot movements, and demonstrations of the potential applications of collaborative robots. |
0.934 |
2014 — 2017 | Mutlu, Bilge | N/A |
@ University of Wisconsin-Madison Robots for application in collaborative manufacturing must perform manual work side-by-side with people. Such robots offer the flexibility to work on many different tasks and promise to transform manufacturing by improving the quality and efficiency of manual processes in small shops and in facilities that manufacture highly customized products. However, in order to meet this promise, robots must be effectively integrated into existing manufacturing teams and practices. To enable this integration, this National Robotics Initiative (NRI) award supports fundamental research on the methods and instruments that manufacturing engineers will need to form effective human-robot teams based on task requirements and worker skills. These methods will also enable robots to adapt to changes in workflow to maximize safety and efficiency. The effective integration of collaborative robots into manufacturing promises improvements in many industries that have not yet benefited from robotic technology. Therefore, results from this research will contribute to the competitiveness of U.S. manufacturing and benefit the U.S. economy and society. The research will involve contributions from multiple disciplines, including robotics, human factors, computer science, and manufacturing, and from academic and industry collaborators. These collaborations will help the dissemination of research results into manufacturing organizations and the integration of research into undergraduate and graduate curricula in engineering. |
0.934 |
2016 — 2018 | Mutlu, Bilge; Albarghouthi, Aws; Sauppe, Allison |
N/A |
EAGER: Representations and Methods For Verifiable Human-Robot Interactions @ University of Wisconsin-Madison Robotics promises to transform how people work, learn, communicate, and engage in other activities of daily living. To do so, robots must interact with users following conventions and norms that people expect, so that people readily adopt them into their environments. These conventions and norms, however, are highly complex and nuanced, requiring that designers consider an impractical number of scenarios, constraints, and requirements in the design process. This project will build new computational methods to facilitate this process, enabling designers of future robot systems to more easily explore the design space for human-robot interactions and to verify that their solutions satisfy design goals. These methods will help achieve the promise of robotics by improving their safety, effectiveness, and reliability. |
0.934 |
2018 — 2021 | Gleicher, Michael; Zinn, Michael (co-PI); Mutlu, Bilge |
N/A |
NRI: FND: Communicating Physical Interactions @ University of Wisconsin-Madison For robots to become ubiquitous collaborators, they must interact physically with objects in unstructured human environments. Robots must adeptly grasp, push, squeeze, snap, balance, stabilize and hand-off objects, and must effectively communicate about these actions with their human collaborators. People must not only be able to specify to a robot what to do and how to do it, they must also be able to interpret what a robot intends to do (and how). Effective communication about physical interactions will be essential if robots are going to perform such tasks as delivering care to older adults, responsively helping technicians in performing repairs, or being trained by non-experts to perform repetitive assembly tasks. This project will enable a new generation of robot applications, and advance the vision of ubiquitous, collaborative robots by developing better methods for robots to more effectively communicate with people. |
0.934 |
2018 — 2021 | Mutlu, Bilge; Ruis, Andrew (co-PI); Shaffer, David (co-PI) |
N/A |
@ University of Wisconsin-Madison Collaborative robots are emerging as a new family of advanced technologies that are designed to work side-by-side with people in industrial settings. These robots can improve productivity and flexibility in manufacturing and logistics industries, while also assisting workers in repetitive, unhealthy, or unsafe work conditions. Unlike traditional factory robots that are programmed and configured by engineers to function independently for long periods of time, collaborative robots require human workers to frequently interact with the robot, including training the robot, making the necessary changes in the environment for the robot to function, and supervising the robot's work to ensure its successful operation. The integration of collaborative robots into, and operation within, existing industrial settings requires expert skills that many workers lack and that current educational programs do not cover, highlighting a mismatch between worker skills and the demands of an increasingly automated future of work. This project will develop an understanding of what skills workers will need to perform these tasks effectively and design a hybrid digital-physical educational technology that will support worker education and training in these skills in classrooms and industrial-training facilities. The research team will evaluate the effects of the technology on skill development among trainees and make recommendations for future job-training programs. |
0.934 |
2019 — 2021 | Mutlu, Bilge | N/A |
Designing and Testing Companion Robots to Support Informal, In-Home STEM Learning @ University of Wisconsin-Madison Future educational robots are emerging as social companions supporting learning. By socially interacting with such a robot, learners can potentially reason and talk about the things they are learning and receive help in seeing the relevance of STEM in their daily lives. However, little is known about how to design educational robots to work with youth at home over a long period of time. This project will develop an informal science learning program, called STEMMates, in collaboration with a local community center, for youth with little interest in science. The program will partner learners with an in-home learning companion robot, designed to read books with youth and provide science activities for them at the community center, where youth will engage in exciting and personally relevant science learning. As the learner reads books, the robot will make comments about what is happening in the book to help connect the reading to the science activities at the community center. The overarching goals of STEMMates are to: (a) positively support youth's individual interest in science and future science learning, (b) connect in-home learning experiences with out-of-school community-based learning, (c) bridge the gap between formal and informal engagement and learning in science, and (d) encourage the participation of youth who are underrepresented and who have low interest in STEM learning. This project is funded by the Advancing Informal STEM Learning program, which seeks to advance new approaches to and evidence-based understanding of the design and development of STEM learning opportunities for the public in informal environments. |
0.934 |
2019 — 2020 | Radwin, Robert; Li, Jingshan (co-PI); Mutlu, Bilge |
N/A |
FW-HTF-P: Human-Robot Collaboration For Enhancing Work Capabilities @ University of Wisconsin-Madison The research objective of this Future of Work at the Human-Technology Frontier (FW-HTF) project is to create a matchmaking tool to identify opportunities in manufacturing workplaces where robot capabilities can augment or enhance human workers to improve productivity and safety. This planning project will investigate the possibility of integrating human workers into work processes by modeling tasks from an existing database of critical tasks required to perform each of more than 1000 occupations within the US economy. The project team will also engage with industry to understand differences between the results of task modeling and real work. Successful project outcomes will include the advancement of knowledge around future collaborative technologies, as well as the advancement of fundamental knowledge around the relationship and co-existence between American workers and intelligent robotic workplace assistants. A series of intensive workshops will foster new research collaborations between experts in process optimization, labor economics, and public health to build a multidisciplinary research team for a long-term FW-HTF research initiative. |
0.934 |
2021 — 2023 | Smeeding, Timothy; Radwin, Robert; Li, Jingshan (co-PI); Mutlu, Bilge; Jacobs, Lindsay |
N/A |
FW-HTF-RM: Human-Robot Collaboration For Manual Work @ University of Wisconsin-Madison This Future of Work at the Human-Technology Frontier (FW-HTF) project advances the vision of robots that work alongside workers to augment human activities, create better working conditions, provide more job satisfaction, open new employment prospects, and result in greater productivity than manual labor or automation alone. Existing principles and methods for work design and automation often use robots to displace human workers and are inherently limited in their economic and social benefits. This research advances a model of robots as physical assistants, working in partnership with human workers to redefine human work. The objective is to create a "matchmaking" process to identify human-robot pairings that can significantly improve productivity, health, and safety, while maximizing human labor resources. A multidisciplinary approach is applied to existing manual tasks in the manufacturing sector that show high potential to benefit from the integration of collaborative robots, in order to gain a better understanding of the impact of teaming humans and intelligent robotic assistants at the individual, operations, and socioeconomic levels. Finally, the insight gained from investigating current job characteristics, worker skills, and robot capabilities is applied to enhance the benefits of the next generation of manufacturing robots for the strength of the economy and the quality of life of the workforce. |
0.934 |
2022 — 2025 | Mutlu, Bilge | N/A |
@ University of Wisconsin-Madison Researchers in the learning sciences describe learning as a social activity where social interactions with others are conducive to developing deep and lasting knowledge. Teachers often help students using social interactions in science classrooms by providing guidance to students, including giving small cues to what students might notice about their learning, restating key points in what others are explaining, or asking students questions to explain their reasoning. However, during homework, students usually complete lessons by themselves without the aid of the rich social interaction that comes from teacher guidance. To help students during homework, educational technologies can provide personalized support for lessons, but most systems lack the ability to provide the type of socially rich guidance that teachers use in classrooms. One solution is to use social robots, or learning companion robots, as an educational technology capable of providing a social learning experience and personalized support for students as they complete their homework. A major challenge in providing homework lessons designed for social robotic support is a lack of tools for teachers to create and customize these lessons. To address that challenge, in this project, researchers will partner with middle school science teachers and students to design PATHWiSE, Personalized Augmentation Tools for HomeWork in Science Education, a homework studio for robot-augmented learning. The PATHWiSE homework studio is an online system where teachers can design lessons and add robot interactions to science homework and students can log on with a robot in their home to engage in these robot-assisted homework activities. PATHWiSE empowers teachers to add guidance for students during homework by including guiding comments, highlighting material, and asking questions that relate to their classrooms and students' lived experiences. 
With help from the PATHWiSE tool, teachers can transform homework into a socially interactive experience that can vastly improve the learning experience for students of all abilities. This work makes significant societal contributions through an innovative technology that can transform at-home learning experiences; new knowledge and guidance that demonstrate how to develop teacher authoring tools for complex technologies; and new recommendations that inform real-world practice of integrating social robots into science curricula.

Students usually complete homework lessons in isolation rather than through rich social interaction. Educational technologies can support student homework via step-by-step feedback or proficiency tracking, but most systems lack mechanisms to provide guidance that reflects the socio-cultural experiences students have in and out of the classroom. To address that challenge, researchers in this project will conduct design-based research in partnership with middle school science teachers and students to co-design PATHWiSE, Personalized Augmentation Tools for HomeWork in Science Education, a homework studio for robot-augmented learning. The PATHWiSE homework studio is an open-source online system that includes a teacher authoring environment for adding teacher guidance to science homework activities and a student homework environment where students engage in robot-assisted homework activities. PATHWiSE uses a modified trigger-action programming environment that visually overlays trigger-action pairs directly onto multimedia homework activities in the teacher interface. Teachers mark trigger points, indicating where to provide guidance, and associated actions that describe the robot response at each trigger point. As students work at home with the PATHWiSE interface, the robot will retrieve trigger-action pairs for an assignment from a database. 
When a trigger point is reached, the associated actions will be translated into robot behaviors (verbal, non-verbal, and movement actions). Finally, researchers will conduct a pilot study to examine teacher use and student experiences with the PATHWiSE system, where teachers will create augmented homework assignments, embedded over four weeks in their actual curriculum, and students will complete these assignments at home with either the robot or a similar computer-based system.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria. |
0.934 |
2022 — 2027 | Cai, Zhiqiang (co-PI); Mutlu, Bilge; Shaffer, David |
N/A |
@ University of Wisconsin-Madison In modern classrooms, students learn science, technology, engineering, and mathematics (STEM) through interactions not only with teachers and textbooks but also with computer games and simulations, automated tutors, and online resources. Education researchers thus have access to large amounts of data about students’ STEM learning processes, from classroom or online conversations to detailed records of student activity in educational apps. Despite the potential of such rich data for curriculum development and personalized assessment, there are significant technical and conceptual challenges to analyzing data that come from different sources or modalities. To address these challenges, this project will develop and test trans-modal analysis (TMA). TMA is a statistical technique and software package that will help researchers better and more easily integrate multiple types of data into analyses of STEM learning. This will enable more accurate understanding of students’ STEM learning processes, and in turn help identify potential inequities in assessment of student learning, informing education policy and practice for diverse learners. This project will include post-doctoral scholars, graduate student researchers, and undergraduate research interns, who will develop skills and experience in data science, learning analytics, software development, and scientific communication, providing training and mentoring for the next generation of education researchers.

Although most analyses of learning processes are based on a single type or modality of data, STEM learning typically takes place in a multimodal setting. Models of STEM learning processes thus need to account for multiple sources and types of data to capture complex interactions between learners and the setting(s) in which they learn. 
For example, there are different types of events (questions from a teacher, chats with a peer, views of a resource) and different properties of events (gender of a person gesturing, linguistic fluency of a speaker, reading level of a person reading a document) that may influence future events with more or less impact over time. In addition, the structure of a learning environment creates a horizon of observation for each student, making some events (e.g., a conversation in another group of students) more or less visible. Finally, different characteristics of students (age, cultural or ethnic background, gender identification, whether instruction is in their native language or a non-native language) may lead them to respond to events in different ways. Extant learning analytic techniques account for the influence of prior events by lagging: for example, using some fixed number of prior events to predict future events. TMA will enable those same techniques to operate not on properties of the events themselves but on underlying functions that represent claims or hypotheses about the interaction between different learning modalities, the structure of the learning environment, and the ways in which students might systematically differ as STEM learners. The project team hypothesizes that TMA models will provide a more nuanced, more accurate, and more equitable view of STEM learning processes for diverse learners. This approach will expand the understanding of effective multi-modal STEM learning processes and allow researchers to account for diversity and address questions of equity in multi-modal STEM learning. TMA will be developed and tested first as a set of algorithms for conducting trans-modal analyses with three widely used learning analytic tools: process mining, epistemic network analysis, and dynamic Bayesian networks. 
The investigators aim to use simulation studies and the analysis of actual STEM learning datasets to address two fundamental research questions regarding the science of learning: (1) Under what conditions (if any) are trans-modal models of STEM learning processes more informative than uni-modal models? And (2) Can TMA model meaningful differences in trans-modal learning processes for minoritized groups of STEM learners?

This project is supported by NSF's EHR Core Research (ECR) program. The ECR program emphasizes fundamental STEM education research that generates foundational knowledge in the field. Investments are made in critical areas that are essential, broad and enduring: STEM learning and STEM learning environments, broadening participation in STEM, and STEM workforce development. The program supports the accumulation of robust evidence to inform efforts to understand, build theory to explain, and suggest intervention and innovations to address persistent challenges in education.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria. |
0.934 |
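The trigger-action pairing described in the PATHWiSE abstract above (teachers mark trigger points in a homework activity; the student-side robot looks up and performs the associated actions) can be sketched as a small data structure. This is an illustrative sketch only: the class names, trigger identifiers, and action strings are invented for this example and are not taken from the actual PATHWiSE system.

```python
# Hypothetical sketch of trigger-action pairs for robot-augmented homework.
# A teacher authors pairs; the student-side runtime looks them up when a
# trigger fires. All identifiers here are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class TriggerActionPair:
    trigger: str        # where in the activity to respond, e.g. "page_3_opened"
    actions: list[str]  # robot behaviors: verbal, non-verbal, or movement


@dataclass
class Assignment:
    title: str
    pairs: list[TriggerActionPair] = field(default_factory=list)

    def actions_for(self, trigger: str) -> list[str]:
        """Return all robot behaviors associated with a trigger point."""
        return [a for p in self.pairs if p.trigger == trigger for a in p.actions]


# A teacher authors guidance for a science reading:
hw = Assignment("Ecosystems reading", [
    TriggerActionPair("page_3_opened",
                      ["say:Notice how energy flows here"]),
    TriggerActionPair("quiz_answered_wrong",
                      ["gesture:tilt_head", "say:Want to re-read the diagram?"]),
])

# The student-side runtime retrieves actions when a trigger fires:
print(hw.actions_for("page_3_opened"))  # ['say:Notice how energy flows here']
```

In this sketch, translating an action string like `say:...` or `gesture:...` into an actual verbal, non-verbal, or movement behavior would be the job of the robot runtime, which the abstract describes but whose interface is not specified here.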