Videos are a popular medium for online learning, and captions are essential for making videos accessible to students for effective learning. Dr. Qi Wang, who teaches in the Department of Business, together with professors from the University of Illinois at Urbana-Champaign and the University of Notre Dame, was recently awarded a $133,918 grant from the National Science Foundation (NSF) for the project "Collaborative Research: Advancing STEM Online Learning by Augmenting Accessibility with Explanatory Captions and AI." Dr. Wang is the first professor in the history of the Business department to secure an NSF grant.

The project investigates the effectiveness of explanatory captions as an educational tool for deaf and hard of hearing learners as well as non-native English learners. There are two types of video captions: typical closed captions and explanatory captions. Closed captions transcribe the spoken portion of a video, while explanatory captions explain its visual, textual, and audio content. Existing technologies have focused on automatically generating closed captions or improving their quality; this project focuses on explanatory captions, which can play a new role in enhancing students' comprehension in STEM learning.

The project will devise effective question-and-answer mechanisms and interaction designs that allow students and instructors to generate explanatory captions for STEM videos collaboratively. The proposed technologies will augment accessibility and learning experiences for underserved populations and the deaf and hard of hearing community, while also improving comprehension for non-native English speakers, deaf or hearing.

Evaluation sites include Gallaudet University and the University of Illinois at Urbana-Champaign, which has the largest international student population among U.S. public institutions and supports students with disabilities in inclusive learning environments. The University of Notre Dame contributes expertise in computer science and artificial intelligence. This interdisciplinary research draws on computer science, learning science, and accessibility practices.

The project aims to discover new knowledge and practical methods for how accessibility-enabled videos (with explanatory and closed captions) can broaden the participation of underserved populations in STEM learning. The effort will also investigate effective mechanisms for combining human contributions and machine learning algorithms to create explanatory captions for STEM videos at different learning stages (e.g., preparing, tracking, troubleshooting, and reflecting), as well as how novel chatbots and AI agents affect student and instructor practices. The work will be developed and analyzed through the interactive, constructive, active, and passive (ICAP) theoretical framework.

Congratulations to Dr. Wang on this accomplishment. We look forward to the findings and applications of this exciting research! This material is based upon work supported by the National Science Foundation under NSF Award No. 2118824. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.