Learning technologies, especially those relying on data analytics, are exciting but also present ethical challenges that deserve our attention and action.
Gerry Bayne: Welcome to EDUCAUSE Exchange, where we pose a question to the higher ed IT community and hear advice, anecdotes, best practices, and more.
Gerry Bayne: In a recent EDUCAUSE Review article titled “Digital Ethics in Higher Education: 2020,” EDUCAUSE President and CEO John O'Brien defined digital ethics as "doing the right thing at the intersection of technology innovation and accepted social values." New technologies, especially those that rely on artificial intelligence or data analytics, are exciting but also present ethical challenges that deserve our attention and our action.
Gerry Bayne: So for this episode of EDUCAUSE Exchange, we asked several IT leaders: how do you address the ethical issues around learning analytics?
Loralyn Taylor: It is difficult to remind people what predictive analytics is and what it can do.
Gerry Bayne: That's Loralyn Taylor, Interim Associate Provost for Institutional Effectiveness and Analytics at Ohio University. Her concern is how predictive analytics factors into access.
Loralyn Taylor: So many times people, especially people who don't work with the data every day, think that data are objective, and they're not. They're a complete subjective reflection of our own reality, and our own reality has student success gaps with our minoritized students. And so, all of our predictive analytics are going to reflect that, and so you have to be really careful not to get to "predicted to fail" and have people start to say, "Well, we shouldn't be providing access to these students." The easiest way for us to increase our completion and graduation rates is to become more selective, but that's not where we need higher education to be as a national goal.
Gerry Bayne: Her institution is addressing this by being proactive with providing access.
Loralyn Taylor: I'm really happy that Ohio University has a very active access mission. We have about a third first-generation students and close to that in Pell students as well. Being located in Appalachian Ohio, we really do try to have a local access mission.
Robert Carpenter: I think you need to think about using analytics to open doors or hold doors open and not to close doors.
Gerry Bayne: Robert Carpenter is Associate Provost for Analytics and Institutional Assessment, and Deputy CIO for Business Intelligence and Student Success Technologies at the University of Maryland, Baltimore County.
Robert Carpenter: I mean, the whole point of analytics, the way that I see it, is how do you provide support, and how do you coordinate that support in the students' interests or in their pursuit of their educational goals. [inaudible 00:02:53] what we're really using analytics for is identifying students who may need some additional support and then giving them a message that's typically phrased in an empathetic way, not, "Our predictive model suggests that you may need to go to tutoring," more along the lines of, "Just checking in with you at the midpoint of the semester," which can be really stressful for a lot of people. "If you feel that you need some support, here are some places that you can go."
Gerry Bayne: Carpenter says there's a limit to what analytics can track and predict, and that it's important to remember the blind spots of predictive learning analytics.
Robert Carpenter: You can't use analytics to foreclose opportunities. I mean, the tools are just not there, and furthermore, analytics is good for understanding sort of the behavior of groups more than it's useful for understanding the behavior of an individual within a group. It's an inference about a population parameter, it's not an inference about a person. Some students who should succeed, sort of based on the information that's available to us, don't, and some students who you look at and you say, "This student is going to have a very difficult time with this pathway," do exceptionally well. And the reason I think that the models don't predict accurately, is they don't capture things like student effort, they don't capture a student's engagement, they don't capture a student's determination. At UMBC, we talk about grit, right? They don't capture students' grit and determination.
Gerry Bayne: He says that when an institution decides to use analytics to capture extra information about a student's behavior on campus, they must prioritize that student's interest.
Robert Carpenter: How engaged is a student on campus? Are they going to extracurricular events? Are they going to the Career Center? Are they engaged in the life of the campus? So all of those things track, in some cases, behavioral movements on campus, and so, what's the appropriate boundary for us to kind of capture that data and use it? And I think we need to make sure that we're always the agent of the student because we have some sort of duty to the institution, but when you think about using analytics to support students, you have to, I think, shift your perspective so that you view yourself as the agent of the student.
Gerry Bayne: David Kowalski is Associate Vice President for Institutional Effectiveness and Strategic Innovation at Montgomery County Community College. He says that when looking at predictive analytics, having someone at the table that understands the data and the data structures is essential to ethical implementation.
David Kowalski: Because a lot of times we're looking at information, and if we don't have that person at the table, we don't know where that came from exactly or how that was derived, and that can have big implications for what we actually do based on that. If we have someone sitting there and saying, "Well, that data wasn't grabbed in the way you may think it was grabbed," then it doesn't reflect what you think it might reflect, based on the labels or how it's being presented, but really, it comes from this. And I think it goes on the opposite end too, so not only on the data going in, but for people to be knowledgeable of the context.
David Kowalski: It's having people at the table saying, "Hey, this class that's coming up is a problematic course for students, and based on what we're seeing here, I have some insight into why that might be, because they're actually taking it out of order here," or "I'm the designer for that course, and I know that the online design is very, very difficult, so that may be one of the reasons," instead of jumping to other assumptions or conclusions based on the data.
David Kowalski: And I think finally, just having training, making sure we're not just rolling out analytics to everyone, just [inaudible 00:06:45] hitting the button and saying, "Now we have analytics," but making sure that people have some basic understanding, that they're connected to people, that they know that there's people they can ask questions of, and that we check in with them.
Gerry Bayne: He says that the really important piece of using analytics ethically is taking the time to sit down and have meaningful conversations about the data, about what they might do with the insights, and what they shouldn't be doing with them.
David Kowalski: So for example, I'm working with a few faculty right now on a kind of predictive analytics action research project at the college. They're getting predictive analytics on their students' persistence at the beginning of the term and incorporating that with some interventions and kind of an action research approach where you check in. And one of the things we're talking about is how do we use data like that in a way that doesn't stereotype students, but can be used to help them? And again, I think it's something where it's not always clear cut, "This is what you do to make sure you don't stereotype your students," or you don't perceive them negatively and unconscious bias starts playing a role. It's more about sitting down and having a meaningful discourse about it. I'm afraid that if I look at this information before I see my students, suddenly I'm going to be perceiving them one way, but what's happened is something totally different. We found out that faculty, oftentimes, use it in the beginning, but they seem to forget it after a certain point because they've learned to know the student and that takes over. So, for us in this particular project, it's giving them a little insight at the beginning, and then their own intuition and knowledge takes over.
John O'Brien: There's also ethical issues on both sides. It is absolutely a moral and ethical issue when we have so many students at our colleges and universities who start a degree and don't finish. They start based on a hope and a promise that this is going to change their life, and they end up leaving early with no degree and student debt.
Gerry Bayne: That's EDUCAUSE President and CEO, John O'Brien. O'Brien has written an article for EDUCAUSE Review, entitled “Digital Ethics in Higher Education: 2020”.
John O'Brien: So there's ethics on both sides. There's ethics that say, "We need all the tools we could possibly have to make a difference." But, as I've also said in the article that you mentioned, Gerry, there's an awful lot of ethical risk if we just blindly go into this. We need to be extremely thoughtful. It's about a balance, and it tips one way or another, and the key is to be attentive and aware that it's a balance and to make sure that's the case. In the end, I believe higher education needs to lead the way. We need to lead the way in the appropriate and effective use of data and analytics, and we need to lead the way in attention to digital ethics. After all, we are in the process of influencing and training the entire next generation of folks who are going to build AI applications and work with algorithms and analytics. And so we have a really important obligation and responsibility to make sure they do this work with a balanced sense of ethics involved.
Gerry Bayne: Again, that article by John O'Brien is called “Digital Ethics in Higher Education: 2020”. You can find it at er.educause.edu/digitalethics.
Gerry Bayne: I'm Gerry Bayne for EDUCAUSE. Thanks for listening.
This episode features:

Robert Carpenter
Associate Provost for Analytics and Institutional Assessment
Deputy CIO for Business Intelligence and Student Success Technologies
The University of Maryland, Baltimore County

David Kowalski
Associate Vice President for Institutional Effectiveness and Strategic Innovation
Montgomery County Community College

John O'Brien
President and CEO
EDUCAUSE

Loralyn Taylor
Interim Associate Provost for Institutional Effectiveness and Analytics
Ohio University

John O'Brien, “Digital Ethics in Higher Education: 2020,” EDUCAUSE Review 55, no. 2 (2020)