What does the future hold for sign language and artificial intelligence? No one knows for sure, but at the inaugural SLxAI Summit in Boston this week, a group of researchers, innovators, and entrepreneurs is trying to ensure that emerging technologies benefit Deaf and hard of hearing individuals around the world.

“SLxAI is bringing many different types of stakeholders together and giving Deaf communities space to lead the conversation in the field of sign language and AI. It has already gotten a lot of traction, starting with a global virtual meeting, attracting 60+ organizations across 25+ countries,” says Dr. Abraham Glasser, Assistant Professor in the Accessible Human-Centered Computing and Policy program (AHCP), who is part of a large cohort of Gallaudet faculty and alumni taking part in the sold-out event.
One of SLxAI’s major goals is to formally adopt bylaws that lay out the group’s collective vision. These include establishing standards that ensure accessibility, advocating for inclusive technology policies, bridging the gap between technology developers and the Deaf community, and promoting technology education and training for Deaf individuals.
Doing the right thing
Glasser is a key figure in this movement thanks to his work for the Coalition for Sign Language Equity in Technology (CoSET), which recently released a toolkit with the SAFE AI Task Force to help decision makers navigate the use of AI for interpreting solutions. He is also a member of the World Federation of the Deaf Ad Hoc Expert Group on Artificial Intelligence (along with Nina Tran, G-’24).
One of the summit’s sessions is dedicated to understanding CoSET’s toolkit and other plans. “This presentation will outline our goals for creating a reliable, real-world testing environment, aiming to identify and manage potential risks in communication technologies intended to be used during everyday human conversations,” Glasser says.

In another panel, Glasser will weigh in on the ethical implications of deploying sign language AI systems. The official description explains that the “discussion will focus on power, consent, accountability, and what guardrails should be expected across research, product development, and procurement. Topics will include data ownership, consent, compensation, cultural sovereignty, model deployment risks, governance, and the role of Deaf leadership in shaping the future of sign language AI.”
All of these issues are critical to AHCP research at Gallaudet, says Glasser, who notes that engaging with the Deaf community and industry players will provide important insights that can guide their strategic planning. These insights will also help shape the direction of Gallaudet’s new Universal-AI NSF Research Traineeship Program — funded by a $4.77 million award from the National Science Foundation — which is designed to prepare the next generation of AI scientists. “The Universal-AI NSF Traineeship Program aims to create a well-prepared workforce that pushes the frontiers of AI while advocating for universality, accountability, and accessibility,” says Principal Investigator Dr. Raja Kushalnagar, who is leading a team that includes Co-Principal Investigator Dr. Christian Vogler as well as Glasser.
Glasser notes, “Similarly, this event will touch on many topics that reach many other folks at Gallaudet beyond those working in technology, such as linguistics, philosophy, deaf studies, interpreting, sign language education, policy, etc.”
Leading the charge
Gallaudet’s Leadership Team recognizes the significance of this summit, explains Provost Dr. Khadijat K. Rashid, ’90. AI has long been a focus for President Roberta “Bobbi” Cordano, which is why Heather Harker, Cordano’s Chief of Staff, will be in attendance, meeting people and learning about new developments in the field.
“Sign Language is one of those areas that AI is trying to assimilate,” Rashid says. “And we believe that Gallaudet should be at the forefront of that work since we have a vast trove of ASL assets that’s very valuable to the AI companies and to Deaf people’s future use of AI using direct ASL rather than typing queries in English.”
Rashid is pleased to see so many Gallaudet names among the list of presenters. “That GU is so well represented there just speaks to how important this is for us, and that many of our faculty researchers are so interested in the field,” she adds.
Indeed, last year’s Provost Research Excellence Award went to the team of Associate Professor Dr. Geoffrey Whitebread, Director of the Master of Public Administration program, Interpretation and Translation Associate Professor Dr. Pamela Collins, ’07, G-’11 & PhD ’20, and English Professor Dr. Kathleen Wood for their project, “Generative AI Tools for Language Equity for Deaf and Non-Native English Students and Professionals.”
“We found that Deaf people were using AI in the workplace a lot to ensure language access and equity. We found they didn’t have formal training on how to use AI best for language equity — either in prompt development or ethical use of AI,” says Whitebread, who sees a big opportunity in offering education modules tailored specifically to Deaf people to help improve their career prospects. “For example, ChatGPT accurately simplifies complex documents and can translate documents to ASL gloss.” He is also interested in exploring how to leverage AI platforms using video and animation, as well as ASL AI tools.
Learning about AI
When it comes to educational applications of AI technology, there are some intriguing possibilities, says Dr. Lorna Quandt, Associate Professor and Director of the Action & Brain Lab. Along with postdoctoral researcher Dr. Lee Kezar, Dr. Athena Willis, ’18 & PhD-’23, and Educational Neuroscience PhD student Laurel Aichler, G-’24, Quandt is presenting a panel at the summit focused on how to make AI-assisted learning more accessible to students who use sign language.

They will be sharing the latest information on the Building Real-time Intelligent Grounding in Deaf Education (BRIDGE) project, which is designed to support Deaf and hard of hearing students’ communication in STEM courses. Co-Principal Investigators Quandt and Associate Professor of Biology Dr. Alicia Wooten are harnessing the power of AI to tackle the lack of standard signs in ASL for many scientific concepts.
For instance, there are multiple ways to sign “photosynthesis.” “One person might fingerspell, another might sign ‘SUN+EXCHANGE’ or use another sign,” Quandt explains. “STEM has a higher variability of signs and that can pose challenges in Deaf education.” That is why BRIDGE is creating augmented reality goggles that will recognize when a scientific term is used and then provide more information through real-time captioning or a signing avatar.
“Our vision is the classroom of the future for Deaf students,” says Quandt, who is excited to see barriers to learning reduced. But she is worried about some of the other projects that are happening in the AI space — especially those led by people who are attracted to sign language but do not actually know it. “Entering into this space with no knowledge of a signed language is not appropriate, and importantly, the work itself won’t be good without authentic knowledge regarding how Deaf people can benefit,” she says.
Thinking globally
Making sure that AI serves Deaf people worldwide is something that Deaf Studies Professor Dr. Joseph Murray confronts regularly as president of the World Federation of the Deaf (WFD). Murray, an expert on Deaf Gain, has presented on AI and Deaf communities at the United Nations, in high-level dialogues with Arab governments, and in regional engagements in Southeast Asia. He also sits on the Equitable AI Alliance Advisory Board, which brings together disability thinkers and industry leaders to shape accessibility standards in the development of AI.

At the summit, Murray is presenting a global policy session on how sign language AI is showing up in international forums, standards discussions, and advocacy work.
“Who gets to shape sign language AI, on what terms, and with what accountability? Decision-makers are often making commitments about AI and accessibility without meaningful deaf community input. A significant part of the work right now is getting community perspectives into frameworks before they harden,” says Murray, who will be drawing on findings from the WFD World Conference in Nairobi, where structured workshops on AI engaged national-level deaf decision-makers from 50 countries. “That consultation showed that deaf communities globally have clear expectations about consent, decision-making power, and what genuine partnership looks like versus tokenism.”
There is a lot of work that needs to happen over the next few years, but Murray says the top priority is “developing formal positions that give deaf communities real leverage in AI governance conversations.”
And that is where he hopes this summit can make a difference. “The technical community and the deaf community have largely been developing their thinking about sign language AI in parallel, without enough genuine exchange,” Murray says. Creating concrete relationships between researchers, developers, and community representatives can help create a brighter future for sign language AI.
Gallaudet’s School of Science, Technology, Accessibility, Mathematics, and Public Health (STAMP) offers many ways for students to get involved in artificial intelligence research.