We asked several leaders in the EDUCAUSE community where they currently stand on the use of predictive analytics for student success.
Loralyn Taylor: When we're talking about measuring student success, not only do our students have different definitions of success, but the different areas across campus have different definitions of student success as well. So when we partner with Student Affairs or with different departments within academics, they all might have different goals for their students, and their students also have different ideas of what they want to do. In order to make sure that higher education is the transformative experience that we want it to be, we have to be tracking more and more metrics that are specific to the outcomes those students are looking for.
John Campbell: You know, when I talk to other institutions about how to use predictive analytics and how to tackle this around student success, I encourage them first to look at something that they can take action on. You can immediately lose your focus by looking at the data and asking more questions, and asking more questions, and in the end you may not have something you can take action on. So I always encourage people: focus on the action. What do you want to do in the end, and do the questions that you're asking lead you to an action? Because if they don't, it just becomes curiosity rather than a benefit for the institution.
David Kowalski: We've tried to use it in different ways. We've done nudge campaigns with it, we've done triage with it, and we've used it for a randomized controlled trial of intensive advising. We incorporate it in a lot of different ways, and sometimes we use it in daily functioning: checking where we are in terms of how our students are doing, and changing the actual discussion about what it means for a student to be successful in a given course.
Robert Carpenter: I think the use of technology and analytics at colleges and universities is in its infancy, but there are some very exciting developments. I want to take a little bit of a step back, because I actually feel like in many cases we're tool rich: we haven't really figured out what to do with a lot of these tools, or how to combine them into an ecosystem for coordinated action. I also think that in a lot of cases you have to think about the incentives you're providing people to use the information that comes from the analytical tools. Implementing, and in many cases conducting, analytics on students and on student data is a technical challenge, to be sure, but it's one that's solvable. I mean, we know how to do this. We've got a lot of brain power at universities, and a lot of people at universities have very good technical and intuitive knowledge about econometrics, statistical modeling, and predictive modeling, so that's not the major challenge. The challenge that I talk with my Provost and my President about is, once we find this information out, what are we prepared to do with it? What incentives are we providing people to actually take action based on the analytics?
Loralyn Taylor: I think we're all familiar with the Gartner maturity curve for analytics. That's not only a great visual for helping you understand where your institution is and where your end users are; I also look at it as a framework for how we move our institutions forward. We have a lot of people in higher education, particularly some of the ed-tech vendors, who say they can come in and jump you from only looking at descriptive statistics on your students all the way to using predictive or prescriptive analytics in a matter of weeks or months, and we know that's just not true. If you are overwhelming your end users with metrics and information that they don't know how to use, one, they'll shut down, and that's the best-case scenario: they just won't use it. Or two, the worst-case scenario is that they'll misuse it. Unfortunately, some of the language we use around predictive analytics, the very nature of "predictive," makes people think that it's easy to go from "this student is predicted at risk" to "this student is predicted to fail." That is never the approach we want our end users, our faculty or staff, to take: "Oh, well, I shouldn't be putting effort into this student because they've told me this student is definitely going to fail." So it's an educational process, even more than it is a back-end question of whether we can analyze the data. Yes, we can analyze the data, but the more important piece is educating people on how to use the information and how to make it actionable to help our students.
John Campbell: I see a big shift in how WVU and other institutions are looking at student success. When we started out, a lot of student success was about these big, bulky measures, oftentimes tied to things like the learning management system or completion at the end. Now we're trying to understand all the various elements of student success. Do students find a fit within their major? Do students take opportunities to do things such as study abroad or internships? All these things build to a larger set of student success metrics, and that's really what has changed, particularly over the last three or four years.
Richard Sluder: It seems to me that one of the biggest untapped sources of assistance for students is students themselves. You know, students have to be involved in this too. They have to own their own success. So we see student-facing platforms on their phones and other attempts to integrate them into the process. It's about including them as part of the plan so that we can provide guidance as they proceed in this technology-rich environment.