To deploy an analytics strategy effectively, institutions must avoid missteps at all phases of the project. We asked several IT leaders what stumbling blocks to consider when using analytics.
Bayne: Welcome to EDUCAUSE Exchange, where we focus on a single question from the higher ed IT community and hear advice, anecdotes, best practices, and more.
Data should be managed and treated as a strategic institutional asset. Colleges and universities need to develop an analytic strategy based on institutional strategic priorities that can meet compliance and regulatory needs, monitor progress on short-term goals and long-term strategy, and inform institutional decision-making. But sometimes, along the way, colleges and universities can run into pitfalls around their analytics strategy, and that's the question we're asking on this episode. What are the possible pitfalls around using analytics?
Reinitz: What analytics really does is allow you to take data and turn it into information that can be used for decision-making. So a lot of it is taking a whole bunch of data, finding patterns within it, and then making decisions based on those patterns. With predictive analytics, the idea is that you can use past patterns to predict what's going to happen in the future.
Bayne: That's Betsy Reinitz, director of the Enterprise IT program at EDUCAUSE. She says one of the stumbling blocks in using data is when you allow the data to drive your decisions, rather than simply inform your decision-making.
Reinitz: Because the outcome is going to really depend on the data going into it. The patterns that you see are going to be based on the algorithms that have been set up, the way you're analyzing that data. So AI is a good example of how analytics can do things like make course or degree recommendations for students based on how other students with similar profiles have performed in the past. So students who aren't doing very well in something like biology or chemistry might be encouraged to not go into pre-med for example, and so this kind of student profiling could emerge out of analytics if you're not careful, and it could mean that some very capable students could be discouraged from pursuing their dreams and their interests.
Bayne: A danger that can lead to such results, according to Colleen Carmean, founder of the Ethical Analytics Group at the University of Washington Tacoma, is a lack of training.
Carmean: The institution buying the tool and not training people in its use, not coming up with best practices based on the early stories of how good intention went wrong. The registrar that sent out a notice saying, "I noticed you have lower than a 2.5 grade average for last term. Here's a form to drop out so that you can come back later without penalty." It led to a rise in dropouts, right? If she had been working on a team with people trained in positive messaging and in strength-based advising, that note wouldn't have gone out that way. So the leadership buying the tool and putting it in the hands of traditional practitioners of another field, it's such a struggle, because as a community, we haven't yet defined what the characteristics of a good practitioner in analytics are. So we pull them from our traditional departments and then we're surprised when something unfortunate happens.
Bayne: Once again, Betsy Reinitz.
Reinitz: Understanding how the data fits into the larger institutional mission and how the data can be used to advance that institutional mission is a really important bottom line as far as I'm concerned, getting to a sort of clarity of why you're using data, how it's going to advance institutional mission. And often that clarity, getting to that sense of clarity, requires a very inclusive approach and a very holistic conversation to bring the right people into that conversation.
Bartelson: So there's two ways to talk about this. It's the IT institutional view, and then there's the academics view while we're training students on how to enter this field, right?
Bayne: Jon Bartelson is the Assistant Vice President for Information Services and CIO at Rhode Island College. He talks about having the right people in the conversation and the right training for students interested in the field of analytics.
Bartelson: From the IT perspective, it's about not having the IT and the hardcore data analytics people control the conversation, right? We need to be part of that conversation, absolutely, and help guide it, but it's really the people on the business side that own the data, that know it best, and then we can help them with the technology, right? So that's how I look at it to make sure that we have the right people in the room having the conversations and looking at the data sets objectively. The flip side for us as a liberal arts institution is, so we do have a fledgling data analytics program that's kicking off, we're giving our computer science students the analytical skills, but at the same time, they're getting that liberal arts education behind them in addition to the computer science work that they're doing. So I think we're trying to approach it very thoughtfully on both sides.
Moreau: It is possible to look for too much data, and in the past we've used terms like analysis paralysis, terms like survey fatigue around our students, around our employees, around whomever the stakeholder group is.
Bayne: That's Joe Moreau, Vice Chancellor of Technology and Chief Technology Officer at the Foothill-DeAnza Community College District. He wonders whether reliance on large data sets inhibits an institution's agility with analytics.
Moreau: I think we have a tendency to look for large analytical data sets to support decision-making and planning, and maybe we should really be thinking more about microdata sets, just in time data sets, so as we make decisions and we want to test the validity of those decisions, how can we get small snippets of data back from whomever the stakeholder group is, students, faculty, staff, the public, et cetera, to be able to check ourselves, to be able to monitor the quality of our decision-making as we go along, without inhibiting the agility of that decision-making?
Bayne: Another leader who worries about large data sets is Jack Suess, Vice President and CIO at the University of Maryland, Baltimore County.
Suess: Often, more data is not better data, and so understanding your data, understanding any biases that you might have in that data, is really key. One of the examples I love to talk about when I'm working with data science students at UMBC is a study that Amazon had done looking at the characteristics of really top-notch programmers. One of the things that came out of that study was that it was heavily biased against women, because almost all of the data they had collected was from men. That was who was already in their population of programmers. So if you go and create something using data science, you've already implicitly loaded all that bias that's already been found in society, with men as programmers, and you're potentially eliminating women who may be great programmers because they didn't do some of the outside activities. That's one of the things that scares me and is the dark side: it's so easy to keep getting more and more data and think that adds more and more validity, and it's not always the case.
Bayne: If you'd like to read more about best practices for analytics, visit our analytics channel on EDUCAUSE Review by going to er.educause.edu/analyticschannel. There, you'll find a wealth of articles, videos, and more from leaders in the field around analytics and using data strategically. I'm Gerry Bayne for EDUCAUSE. Thanks for listening.
This episode features:

Jon Bartelson
Assistant Vice President for Information Services and CIO
Rhode Island College

Colleen Carmean
Founder, Ethical Analytics Group
University of Washington, Tacoma

Joe Moreau
Vice Chancellor of Technology and CTO
Foothill-DeAnza Community College District

Betsy Reinitz
Director of Enterprise IT Programs
EDUCAUSE

Jack Suess
Vice President and CIO
University of Maryland, Baltimore County