Professor presents at AAAS panel on...
Products are increasingly being designed to talk to consumers, explains Dr. Abraham Glasser, assistant professor in the School of Science, Technology, Accessibility, Mathematics, and Public Health. Cars invite drivers to ask for directions, and devices such as Alexa and Google Assistant are meant to chat about the weather, recipes, and other topics. “It is adding more and more barriers for people who don’t use speech,” says Glasser, who is deaf and uses American Sign Language (ASL). “There is no way for me to input sign into that.”
Glasser offered his perspective as part of the panel, “I’m Sorry, I Don’t Understand: How Voice AI Poses Barriers — and Some Solutions,” during the annual meeting of the American Association for the Advancement of Science (AAAS), February 15-17, in Denver, Colorado. Moderated by Dr. Shelley B. Brundage of George Washington University, who organized the session with Dr. Nan Bernstein Ratner of the University of Maryland, it also included researchers looking at the problems this technology poses for people who stutter, mumble, or have other kinds of speech impediments.
“The moderator was good at connecting all of our work into one cohesive story,” says Glasser, whose presentation was titled, “Voice AI, Deaf Speech, and the Search for Signed Language AI Interfaces.” “We want technology that works for all people, and all of us had common issues.” A major flaw with the technology is that it cannot recognize a range of different voices. “Some deaf people speak for themselves, but it doesn’t understand deaf accents and speech,” Glasser adds.
Glasser is looking into other methods of input and collecting data on what deaf people want from this technology. “They need to find ways for me to sign everything,” he says. He believes that AI capable of understanding ASL (and other sign languages) is possible, and that it is critical to think now about how to make that happen.
“Looking at the audience, I saw a lot of light bulbs coming on,” Glasser says. “Often companies don’t consider people with disabilities at all.”
Glasser also appeared March 16 at the IEEE VR 2024 conference in Orlando, Florida. His presentation, “XR Research Avenues for Deaf Users,” was part of a workshop on Inclusion, Diversity, Equity, Accessibility, Transparency, and Ethics in XR.