The broader impact/commercial potential of this I-Corps project is the development of learning programs and tools that foster empathetic and inclusive behaviors. The proposed technology may offer an alternative to current solutions, such as coaching (typically accessible only to top executives because of its high cost) and didactic training (e.g., mandatory online tutorials or workshops, mostly disconnected from employees’ day-to-day needs). The goal is to provide a tool that is used during day-to-day work, rather than only in training or coaching sessions. The proposed technology may have broad applications, ranging from early therapeutic support of children with Autism Spectrum Disorders and their families, to enhancing medical staff’s ability to empathize with patients in telemedicine, supporting managers’ inclusive leadership behaviors, strengthening military leaders’ ability to empathize and resolve conflicts, and even assisting couples and families with their personal relationships. The proposed technology may be an effective, user-driven, ethical, and accessible way to generate insights into the emotional dynamics of people’s interactions, while producing useful metrics to track and steer progress.<br/><br/>This I-Corps project is based on the development of an artificial intelligence (AI)-based bio-signal analysis system that aims to help people enhance their ability to empathize. The proposed technology analyzes a set of bio-signals from people engaging in video calls (e.g., tone of voice, facial expressions, use of language, breathing rate) with the help of machine learning and a recently developed ultrasound signal processing algorithm. The system then creates a continuous feedback loop that makes each interlocutor aware of their own and their conversational partner’s evolving emotions.
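The multi-modal feedback loop described above could, in principle, be sketched as a weighted late fusion of per-modality emotion scores feeding a simple feedback rule. The toy Python below is purely illustrative: the modality names, weights, scores, and threshold are assumptions for the sketch, not parameters of the project's actual system.

```python
# Illustrative sketch of multi-modal late fusion with a feedback rule.
# All names, weights, and thresholds here are hypothetical examples.

from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str      # e.g. "voice_tone", "facial_expression", "breathing_rate"
    score: float   # normalized emotion-arousal score in [0, 1]
    weight: float  # relative confidence assigned to this modality

def fuse(readings):
    """Weighted average (late fusion) of per-modality scores."""
    total_weight = sum(r.weight for r in readings)
    if total_weight == 0:
        return 0.0
    return sum(r.score * r.weight for r in readings) / total_weight

def feedback(estimate, threshold=0.7):
    """Map the fused estimate to a simple prompt for the interlocutor."""
    if estimate >= threshold:
        return "Your partner seems tense; consider pausing and inquiring."
    return "Conversation appears calm."

# One step of the loop: fuse the current readings, then emit feedback.
readings = [
    ModalityReading("voice_tone", 0.8, 0.5),
    ModalityReading("facial_expression", 0.6, 0.3),
    ModalityReading("breathing_rate", 0.9, 0.2),
]
print(feedback(fuse(readings)))
```

In a continuous system, this fuse-then-feedback step would run repeatedly on a stream of readings for each participant; the single pass above only shows the shape of one iteration.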
The laboratory experiments performed with early prototypes have provided an initial proof of concept: participants exhibited the emergence of empathy-related and inclusive behaviors, including a higher awareness of their own and others’ emotions (pausing, reflecting, inquiring, and using pro-social language), as well as a positive learner experience. These insights form the basis of actionable feedback for users to develop more inclusive behaviors. The high accuracy of emotion evaluation based on multi-modal bio-signal fusion and distributed AI, together with the system’s ability to provide rich real-time feedback via custom-made bio-signal processing algorithms, provides opportunities for further development and commercial application.<br/><br/>This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.