In his remarks at a ceremony commemorating the atomic bombings of Hiroshima and Nagasaki seventy-one years earlier, U.S. President Obama contextualized the moment by saying: "Technological progress without an equivalent progress in human institutions can doom us."1 Those of us working in higher education may not deal with the enormous global consequences of atomic weapons, but we do play powerful roles in helping our institutions appreciate the transformative effects, positive or negative, of the technologies we lead them in adopting.
Many of us in the EDUCAUSE community have careers centered on examining technologies, and on helping colleagues consider ways of adapting them, so that they can be used to the greatest effect in teaching, learning, and research. Resistance to technological adoption can be rooted in fear of change, fear of the technologies themselves, or both. To some extent, our work entails engaging with colleagues to address concerns, both real and imagined, and championing the adoption of new tools where it is justified. Our roles as institutional scouts in technological marketplaces also entail a deep understanding of our institutions' needs and a critical eye for separating the hype surrounding technological developments from their realistic uses. This cycle frequently repeats itself in our field.
Learning analytics tools represent a complicated iteration of this cycle. We will need to be at the top of our game, both in imagining future uses and in engaging in informed critique.
At present, colleges and universities struggle mightily to improve learning environments and, more importantly, student success. A common high-level measure of success is graduation within 150 percent of the "normal" time for completion, that is, within six years for four-year institutions and within three years for two-year institutions. In the United States, that rate currently stands at a mere 60 percent for four-year institutions and 31 percent for two-year institutions.2 Nationally, characteristics such as institutional selectivity in admissions, race and ethnicity, socioeconomic status, and gender are used to describe variations in these rates.
College and university leaders are increasingly trying to understand variations in graduation rates and the underlying causes. This is where learning analytics comes into the conversation. In what ways might the data relating to student characteristics and behaviors better inform the learning environments we design for our students? How might this data inform our students' choices? It is critical that we engage in systematic studies wherever possible. This is true not only for our students' sakes but also for the viability of our institutions. The stakes are high.
At best, analytic systems offer us the prospect of capturing data from student information systems, learning management systems, and other sources. With this data in hand, we have tools that promise to provide a means of identifying individual students who are struggling or institutional structures that do not serve their intended purposes. This idea motivates dedicated institutional leaders to adopt and invest in tools that fall under the category of learning analytics. So if we have data in hand and the means to analyze it, what is the problem?
Learning analytics discussions can be both fascinating and fraught, particularly to the extent that these tools do not clarify the algorithms or statistical models they employ. This lack of transparency may be attributed to business models or to machine learning techniques that employ computational power not only to analyze data but also to establish the means by which the analysis takes place. In other words, one might end up with a list of students "at risk" but with no clear understanding of how exactly those students were identified. It is hard to overstate how much of a departure this is from established research methods, particularly in educational research. In the social sciences, by contrast, it is far more common to specify statistical models based on findings in the literature or on locally developed hypotheses and then to identify the models with the greatest predictive power. Doing so addresses issues of correlation and confounding variables that might otherwise lead to flawed analyses.
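To make the contrast concrete, consider the following minimal sketch in Python. It uses entirely hypothetical data and variable names (for example, first_term_gpa and lms_logins_week) and is not drawn from any particular vendor's product: the first model is specified by the researcher and exposes its coefficients for scrutiny, while the second produces a ranked "at risk" list without a comparably transparent account of how it was derived.

```python
# Minimal sketch (hypothetical data and column names) contrasting a
# researcher-specified model with an opaque "at risk" ranking.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 500
# Hypothetical student records: predictors chosen from prior literature.
students = pd.DataFrame({
    "credits_attempted": rng.normal(14, 2, n),
    "first_term_gpa":    rng.normal(2.9, 0.6, n),
    "lms_logins_week":   rng.poisson(10, n),
})
# Synthetic outcome (1 = retained), generated here for illustration only.
log_odds = -4 + 1.2 * students["first_term_gpa"] + 0.05 * students["lms_logins_week"]
retained = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

# (1) Specified statistical model: coefficients and their uncertainty are
#     visible, so the hypothesized relationships can be examined and debated.
specified = sm.Logit(retained, sm.add_constant(students)).fit(disp=False)
print(specified.summary())

# (2) Black-box classifier: yields a ranked "at risk" list but offers no
#     comparably transparent account of how each student was identified.
black_box = GradientBoostingClassifier().fit(students, retained)
risk = 1 - black_box.predict_proba(students)[:, 1]
print("Ten highest-risk students:", np.argsort(risk)[::-1][:10])
```

The point is not that one approach is necessarily more accurate than the other but that only the first invites the kind of methodological scrutiny that social science research traditions expect.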
When it comes to learning analytics, now is not the time for premature clarity, whether that clarity takes the form of rejecting learning analytics tools out of methodological unfamiliarity or of accepting their output because it offers the comfort of actionable results, justified or not. Our role as information professionals is not to arbitrate these methodological debates but, rather, to make sure that significant institutional investments in learning analytics are warranted. This will entail marshaling the best thinking at our institutions. Unlike our colleagues in other sectors, those of us in higher education have the luxury of working with dedicated scholars, among them both proponents and skeptics of the methods that analytics systems employ. As we engage in these debates, we need to judge success not on whether analyses are "actionable" but on whether the actions they prompt actually improve student outcomes.
Fortunately, there is important emerging work that can inform our discussions. The NSF-funded Council for Big Data, Ethics, and Society, led by danah boyd, Geoffrey Bowker, Kate Crawford, and Helen Nissenbaum, is a group of academic and industry researchers. The Council's work is broader in scope than learning analytics but promises to be of benefit insofar as it addresses methodological and ethical issues. The group's white paper, "Perspectives on Big Data, Ethics, and Society," both enumerates policy gaps and identifies critical areas for further research.3
In addition, there are growing efforts to examine the outputs of analytics systems whose methodological approaches are undisclosed. Anupam Datta, Shayak Sen, and Yair Zick of Carnegie Mellon University have developed Quantitative Input Influence measures designed to identify the degree of influence that input variables have on resulting outputs.4 ProPublica made a notable contribution by critiquing a predictive analytics system used in judicial proceedings and sharing the methods of its analysis.5 Both are important models for examining unintended biases that may arise in algorithmically based decision making.
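The intuition behind such influence measures can be illustrated with a deliberately crude stand-in, sketched below in Python: randomize one input at a time and observe how often the model's decisions change. This is not the Quantitative Input Influence method itself, and the data, features, and model are hypothetical; it simply shows how one might probe a black-box system from the outside.

```python
# Crude input-influence check, loosely in the spirit of Quantitative Input
# Influence: permute one feature at a time and count how often the model's
# decision flips.  Hypothetical data and features; for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
n = 1000
X = pd.DataFrame({
    "first_term_gpa":  rng.normal(2.9, 0.6, n),
    "lms_logins_week": rng.poisson(10, n),
    "zip_code_income": rng.normal(55_000, 15_000, n),  # proxy attribute
})
# Synthetic outcome (1 = retained), generated for the sake of the example.
y = (X["first_term_gpa"] + 0.02 * X["lms_logins_week"]
     + rng.normal(0, 0.5, n) > 3.1).astype(int)

model = GradientBoostingClassifier().fit(X, y)
baseline = model.predict(X)

def influence(feature: str) -> float:
    """Share of decisions that change when `feature` is randomized."""
    X_perturbed = X.copy()
    X_perturbed[feature] = rng.permutation(X[feature].to_numpy())
    return float(np.mean(model.predict(X_perturbed) != baseline))

for col in X.columns:
    print(f"{col}: {influence(col):.3f}")
```

In this toy setting, a large influence score for a proxy attribute such as neighborhood income would be a signal to investigate whether the system is encoding unintended bias rather than measuring learning.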
Technology leaders need to understand the potential effects, positive or negative, of learning analytics. As we help our institutions navigate this terrain, we should be prepared to draw on the strengths of both the proponents and the skeptics in our communities to ensure that institutional mechanisms are in place to examine the overall efficacy of learning analytics systems, as well as any unintended bias or other deficiencies that may creep into analyses. As always, the ultimate measure of these systems should be the degree to which they improve the success of the students at our institutions. As President Obama recognized, power lies in pairing technological progress with institutional development.
Notes
- "Remarks by President Obama and Prime Minister Abe of Japan at Hiroshima Peace Memorial," May 27, 2016.
- Grace Kena et al., The Condition of Education 2016, National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, May 2016.
- Jacob Metcalf, Emily F. Keller, and danah boyd, "Perspectives on Big Data, Ethics, and Society," Council for Big Data, Ethics, and Society white paper, May 23, 2016.
- Anupam Datta, Shayak Sen, and Yair Zick, "Algorithmic Transparency via Quantitative Input Influence: Theory and Experiments with Learning Systems," Proceedings of 37th IEEE Symposium on Security and Privacy, May 2016.
- Julia Angwin et al., "Machine Bias," ProPublica, May 23, 2016.
Andrea Lisa Nixon ([email protected]) is Director of Educational Research at Carleton College.
© 2016 Andrea Lisa Nixon. The text of this article is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.