The Impact of AI on Accessibility

EDUCAUSE Exchange | Episode 5

Students with disabilities are a vulnerable population in higher education, and the emergency move to remote instruction in the wake of the COVID-19 pandemic has thrown that vulnerability into stark relief. One technology being explored to improve existing tools, and create new ones, for a more accessible learning environment is artificial intelligence.


Gerry Bayne: Welcome to EDUCAUSE Exchange, where we focus on a single question from the higher ed IT community and hear advice, anecdotes, best practices, and more. Students with disabilities are a vulnerable population in higher education. Nineteen percent of undergraduate students in the United States reported having a disability in 2015–16, yet the real percentage is likely higher, given that many choose not to disclose their disability to their institutions. Students with disabilities experience barriers to education that many other students do not, and they can have both visible and invisible needs. Their dropout rates are substantially higher, and their graduation rates significantly lower, than those of non-disabled students.

And the emergency move to remote instruction in the wake of the COVID-19 pandemic has thrown the vulnerability of this population into stark relief. One of the technologies being explored to help improve and create tools for more accessible learning environments is artificial intelligence. And the question for this episode is: how will artificial intelligence impact accessibility?

Mark Hakkinen: Artificial intelligence, or AI (a term sometimes used alongside machine learning), offers really promising capabilities and technologies for accessibility and inclusion. One of the key things we see is the potential for making students more independent.

Gerry Bayne: That's Markku Hakkinen, Director of Digital Accessibility for Educational Testing Service, or ETS, a nonprofit organization whose mission is to help advance quality and equity in education. When talking about AI's ability to help students become more independent, he focused on three areas.

Mark Hakkinen: One is improving the user experience for students who rely on assistive technologies. One of the more promising areas that we're actually applying today at ETS is the use of advanced speech synthesis technologies, which are based on machine learning models. The quality of synthetic speech is improving rapidly and becoming more natural. In recent developments, we've been able to replace some human-recorded audio that was used to supplement test content for students with disabilities with synthesized speech, using technologies from Amazon. So we're able, in fact, to improve the user experience, improve turnaround time for producing alternate-format materials, and provide a much more natural and clear text-to-speech voice for the students.
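
To make this concrete, here is a minimal sketch of generating alternate-format audio with a machine-learning text-to-speech service. Amazon Polly is used as a plausible stand-in for the unnamed Amazon technologies Hakkinen mentions; the voice, engine, and file names are illustrative, not ETS's actual configuration.

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    polly = boto3.client("polly", region_name="us-east-1")

    def synthesize_to_mp3(text: str, out_path: str) -> None:
        """Render test content as speech and save it as an MP3 file."""
        response = polly.synthesize_speech(
            Text=text,
            OutputFormat="mp3",
            VoiceId="Joanna",   # illustrative choice of neural voice
            Engine="neural",    # ML-based engine; closer to human recordings
        )
        with open(out_path, "wb") as f:
            f.write(response["AudioStream"].read())

    synthesize_to_mp3("Question 1: Read the passage below.", "question1.mp3")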

The other area we see is improving our own processes: where can AI help us produce accessible content more rapidly? Turnaround time is always significant when you have a lot of human involvement in producing things such as text descriptions or orientation information about how a student who is visually impaired might approach a complex set of test questions. So we're looking right now at how artificial intelligence techniques can be used to automatically describe images. We're nowhere near making that operational, but the potential is that at some point we can have AI-based systems do a first pass at describing content, and then subject matter experts can spend their time tweaking it, refining it, or saying, "Hey, this is no good. We'll have to write that from scratch."
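
The first-pass-then-review workflow could look something like the sketch below. The captioning model (BLIP, via the Hugging Face transformers pipeline) is an illustrative stand-in rather than the system ETS is evaluating, and the review-queue fields and file names are hypothetical.

    from transformers import pipeline

    # Illustrative captioning model; not the system ETS is evaluating.
    captioner = pipeline("image-to-text",
                         model="Salesforce/blip-image-captioning-base")

    def draft_description(image_path: str) -> dict:
        """Produce a machine first draft for a subject matter expert to review."""
        caption = captioner(image_path)[0]["generated_text"]
        return {
            "image": image_path,
            "draft": caption,
            "status": "needs_review",  # expert tweaks, refines, or rewrites
        }

    review_queue = [draft_description(p) for p in ["figure1.png", "figure2.png"]]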

The final area is really using AI and machine learning technologies to provide new ways of interacting with content in an accessible manner. The thing that I've been most excited about, as have the visually impaired members of my team at ETS, is that things like the Amazon Echo provide really new ways of interacting with content through a spoken dialogue model, again all driven behind the scenes by AI and machine learning technologies that handle the natural language interactions.

And so we're looking at things like the Echo devices changing the whole paradigm of what assistive technologies are. You can basically look at a screen reader as a fairly mechanical process of interacting element by element with a webpage. Whereas AI could allow a student who is visually impaired to simply ask, "Well, tell me what's important on this page." Or, "How many questions on this page will I need to answer? How long is the reading passage? Read the passage to me." These are capabilities that are much more natural in terms of interaction, and perhaps more efficient from the perspective of a student who previously had to navigate sequentially through a web content page to determine what they had to do.
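
A toy sketch of this query-style interaction follows: rather than stepping element by element, the student asks about the page and gets a direct answer. The CSS class names (.question, .passage) are hypothetical markup, and a real system would use natural language understanding rather than exact string matches.

    from bs4 import BeautifulSoup

    def answer(html: str, query: str) -> str:
        """Answer simple structural questions about a test page."""
        soup = BeautifulSoup(html, "html.parser")
        if query == "how many questions":
            n = len(soup.select(".question"))  # hypothetical markup
            return f"There are {n} questions on this page."
        if query == "how long is the passage":
            words = len(soup.select_one(".passage").get_text().split())
            return f"The reading passage is about {words} words long."
        if query == "read the passage":
            return soup.select_one(".passage").get_text(" ", strip=True)
        return "Sorry, I can't answer that yet."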

Gerry Bayne: Carly Gerard, web accessibility developer at Western Washington University, shares Hakkinen's excitement about consumer tools like the Echo, Siri, and Alexa but would like to see their AI features develop a great deal more.

Carly Gerard: I would really love to see AI take some common accessibility apps, such as Seeing AI, where you can point at something and it tells you aloud what it is. So I can point at a dollar bill and ask, "Hey, what bill is this?" and it answers, "$15." I would love it if that were just built into smart assistants like Alexa or Siri, so I could simply say, "Hey Siri, tell me what I'm looking at." I think that would also decrease the stigma, because it's just part of the system that everybody uses. You don't need to go and pay for an app; it just comes on board with those devices.

And I would also like to see those smart assistants be able to navigate a webpage. Currently there are screen readers available, which is great because they can easily determine the structure of a page. But say you're doing something in the kitchen: you're not close to a device and you're trying to get a sense of a webpage, or maybe you'd rather just hear it and not look at something because it's too visually stimulating. Someone could say, "Alexa, tell me all the headings on the page," and that could give them a sense of how the page is structured and let them figure out where to go, or even skip content.

"Alexa, take me to the navigation. Take me to the footer of the page. I need contact info." So I'm not sure. I don't think I've seen smart assistants really do that. There might be some read aloud features, but just to make that more robust for ease of use.

Judy Brewer: I think artificial intelligence will eventually impact most of accessibility, and I expect that the transition will be incremental. I think of its likely impact as a combination of promise and peril. So if we start with the peril side of things, AI is already impacting people with disabilities, at times in a negative way.

Gerry Bayne: That's Judy Brewer, Director of the Web Accessibility Initiative at W3C or The World Wide Web Consortium. She sees some real challenges that artificial intelligence presents for confronting problems of inequity, but also sees the promise of better tools and better opportunities for those with disabilities.

Judy Brewer: From a user experience design perspective, people with disabilities essentially represent the outliers in human functioning. We're not the "average user." Our user requirements are sometimes on the edge of the usual experience, which, by the way, is the zone that drives a lot of really interesting innovation. So that's a neat zone to be in if you're trying to develop cutting-edge technologies.

But AI-based design and development is often driven by average needs and behaviors, as reflected in the training sets used for data modeling. As an automatic speech recognition system evolves, it usually gets optimized around common speech patterns, not around the speech patterns of people with speech disabilities. So if somebody works really hard in their education and then goes out to try to get a job, the ASR, the automatic speech recognition, is not going to be optimized for them. They'll face barriers there, or even in a classroom if they're using speech recognition. They could be disadvantaged by the current designs.
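
Brewer's point can be checked empirically by comparing an ASR system's word error rate (WER) across speaker groups, as in this minimal sketch using the jiwer package; the transcript pairs here are placeholder data, not real recordings.

    from jiwer import wer

    # Placeholder reference/hypothesis transcript pairs for two speaker groups.
    samples = {
        "typical_speech": [("turn to page ten", "turn to page ten")],
        "speech_disability": [("turn to page ten", "turn to cage tin")],
    }

    for group, pairs in samples.items():
        refs, hyps = zip(*pairs)
        print(f"{group}: WER = {wer(list(refs), list(hyps)):.2f}")
    # A large gap between groups is the optimization bias Brewer describes.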

On the other hand, there's also great promise. If we get this right, AI can greatly enhance future accessibility. Images on websites that some people can't see could be automatically described in context. We're not quite there yet, but that's where we want to be. Automatic speech recognition may at some point be good enough to qualify as captions for media in a way that is a true accommodation, rather than an approximation with critical errors in it. People with disabilities will be able to fully control their environments, maybe not only at home but in the workplace and in the classroom. We're already seeing artificial intelligence used a lot in the assessment of conformance to accessibility guidelines, and I see that as a positive step. We want the evaluation of accessibility to be as fully automatable as realistically possible.

That's going to be an incremental process, but over time it will lead to more scalable conformance assessment. Ideally, someone with an online learning site, for instance, would be able to run a crawling tool across it and rapidly find out either that it's great for accessibility or that there are a lot of issues to address to make sure you're not losing some of your learners, along with a priority ranking of what to address first. That would be great. Currently we can't fully automate that, because you still need some manual evaluation of usable accessibility. But the whole process is coming along. And there are lots of other ways beyond evaluation in which AI could help accessibility, and we look forward to all of those as well.
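
A toy version of the crawl-and-rank scan Brewer envisions is sketched below. Production tools (for example, the axe-core engine) run far more checks, and, as she notes, manual review of usable accessibility is still required; the checks, priorities, and URL here are illustrative only.

    import requests
    from bs4 import BeautifulSoup

    CHECKS = [  # (priority, description, test) -- illustrative, not WCAG-complete
        (1, "image missing alt text",
         lambda s: [i for i in s.find_all("img") if not i.get("alt")]),
        (2, "form input without an associated label",
         lambda s: [i for i in s.find_all("input")
                    if not (i.get("id") and s.find("label", attrs={"for": i["id"]}))]),
        (3, "page missing a top-level heading",
         lambda s: [] if s.find("h1") else [s]),
    ]

    def scan(url: str) -> list[tuple[int, str, int]]:
        """Run the checks against one page; return a priority-ranked issue list."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        return sorted((prio, desc, len(hits))
                      for prio, desc, test in CHECKS if (hits := test(soup)))

    for prio, desc, count in scan("https://example.edu/course"):  # placeholder URL
        print(f"P{prio}: {desc} ({count} instance(s))")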

Gerry Bayne: If you'd like to find out more about the technology needs of students with disabilities, you can read the summary of a recent study from EDUCAUSE about the topic. Find the article at er.educause.edu/equitableaccess. I'm Gerry Bayne for EDUCAUSE. Thanks for listening.

This episode features:

Judy Brewer
Director of the Web Accessibility Initiative (WAI)
at the World Wide Web Consortium (W3C)

Carly Gerard
Web Accessibility Developer
Western Washington University

Mark Hakkinen
Director of Digital Accessibility
Educational Testing Service

Recommended Resource

Dana C. Gierdowski and Joseph D. Galanek, “ECAR Study of the Technology Needs of Students with Disabilities, 2020,” EDUCAUSE Review, June 1, 2020.