The Two Worlds of Learning Analytics


To begin, I'll make three assertions:

  1. Learning analytics on the course level is very different from analytics on the institutional level.
  2. The rise of institution-level analytics may be a symptom of an increasing disconnect between university administration and faculty.
  3. Combinations of course- and institution-level analytics are rare and may not be desirable.

I recently attended a gathering of researchers and administrators at a large university to discuss learning analytics, which are sexy, fashionable, highly fundable, and profitable. I had been looking forward to the meeting, hoping to contribute work my university had done on analyzing data gathered from course management systems. Nobody was interested. Everybody else was analyzing institutional core data: registrar's records, demographics, admissions, time-to-graduation, attrition… Why is course management data so different? Why the separation?

The answer is scale. In institutional core data, a course boils down to one data point per learner: the grade. In course management systems, a course easily produces five orders of magnitude more data points per learner. But more importantly, there are also scale differences in thinking: to what level are courses just data points, students just data sources, and faculty interchangeable commodities? Are we looking at individuals, or are we looking at large-scale systems of education products?
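
To make the scale claim concrete, here is a rough back-of-envelope sketch in Python; every per-activity count in it is an invented assumption for illustration, not a measurement from any particular system:

```python
# Back-of-envelope comparison of data points per learner in one course.
# Every per-activity count below is an illustrative assumption.

grade_records = 1  # institutional core data: one final grade per learner

weeks = 15                   # assumed semester length
sessions_per_week = 5        # assumed login sessions
pages_per_session = 10       # assumed page views per session
events_per_page = 3          # assumed clicks/submissions/timing events
problems = 12 * weeks        # assumed weekly problem sets of 12 problems
attempts_per_problem = 3     # assumed average attempts per problem
forum_actions_per_week = 10  # assumed posts and reads

cms_events = (weeks * sessions_per_week * pages_per_session * events_per_page
              + problems * attempts_per_problem
              + weeks * forum_actions_per_week)

print(f"core data:  {grade_records} data point per learner")
print(f"CMS data:  ~{cms_events} logged events per learner")
# Roughly 3,000 events even under these modest assumptions; with
# fine-grained logging (per-widget interactions, keystroke timing),
# counts approaching 100,000 -- five orders of magnitude above the
# single grade -- are plausible.
```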

The World of Learning Analytics in Course Management Systems

Course management systems (or any other web-based teaching platforms) collect large amounts of transactional and log data during courses. This World of Learning Analytics emphasizes formative assessment, i.e., assessment that accompanies and informs the ongoing learning process. The goal oftentimes has been the creation of adaptive, personalized, "intelligent" learning environments, facilitated either by automated processes (established by instructional designers) or better-informed instructors.

Today, course management systems commonly include dashboards as integrated or pluggable components.1 These dashboards can provide valuable information for faculty to personalize and adapt course instruction to the needs of learners, but with an important limitation: they detect symptoms of, not reasons for, poor performance. Lack of preparation, nervousness, poor study techniques, bad time management, external constraints like work or illness, phobias, relationship issues, or lack of study partners can all lead to "alarming" performance, but only personal interaction can elicit and address these underlying challenges. Learning analytics can enhance but not replace personal interaction in the classroom or in office hours.2 Faculty need to be accessible to students,3 and meaningful faculty-student interactions can promote student retention.4
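
As a minimal illustration of the kind of computation behind such a dashboard, consider the sketch below; the schema, names, and thresholds are hypothetical, not drawn from any actual system:

```python
from dataclasses import dataclass

@dataclass
class StudentActivity:
    """Aggregated CMS log data for one learner (hypothetical schema)."""
    student_id: str
    logins_last_week: int
    homework_completion: float  # fraction of due problems attempted
    first_try_score: float      # average score on first attempts

def flag_symptoms(s: StudentActivity) -> list[str]:
    """Report observable symptoms; the data cannot say *why* they occur."""
    symptoms = []
    if s.logins_last_week == 0:
        symptoms.append("no recent activity")
    if s.homework_completion < 0.5:
        symptoms.append("low homework completion")
    if s.first_try_score < 0.4:
        symptoms.append("low first-attempt scores")
    return symptoms  # finding the underlying reason takes a conversation

print(flag_symptoms(StudentActivity("a123", 0, 0.3, 0.7)))
# -> ['no recent activity', 'low homework completion']
```

The point of the sketch is the limitation: nothing in the log data distinguishes illness from poor time management or a job that conflicts with study hours; only a conversation can.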

The World of Learning Analytics in Institutional Core Data

Institutional core data — from student information systems, registrars, and financial systems — are in some respects more closely related to e-commerce than to e-learning. Institutions don't necessarily value courses according to how much students learn; instead, courses are a means to an end, namely the successful attainment of the product called "degree." Mining of institutional core data does not inform the instructional designers and instructors involved with teaching and learning, but instead advisors and administrators. In times of ever-rising tuition, educational institutions want to give students a quicker return on investment and stay competitive when it comes to retention, graduation rates, and time-to-degree.

In this World of Learning Analytics, course data is summative: whatever evaluators gather from a student's success or failure in a particular course does not help that particular student, except, possibly, when prior data can be used to advise the student on which course to attempt next.

Maybe it is no surprise that the rise of this World of Learning Analytics coincides with the general bloating of university administrations and the trend toward interchangeable non-tenure-track faculty.5 The cost of student services and institutional support has risen faster than the cost of instruction.6 Running universities like businesses rather than public services has introduced the business world's tools of the trade. Meanwhile, the gulf between administration and what actually goes on in the classrooms continues to widen, and industry stands ready to provide costly "solutions" to this problem.

Should There Be Two Worlds? These Two Worlds?

I frequently hear the argument that the two worlds of learning analytics should not be "worlds apart," a position sometimes portrayed as widening the view.7 A combined assessment of courses, majors, course sequencing, demographics, test scores, high school preparation, and performance inside a particular course could add up to much more reliable analytics than either single "world view." An example of such a combined system is Purdue's Course Signals, which boils all of this data down to a green-yellow-red "traffic light" as a measure of risk.8
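
To illustrate the kind of reduction such a system performs, consider this generic sketch; it is emphatically not Purdue's actual algorithm, and the weights and cutoffs are invented for the example:

```python
def traffic_light(course_pct: float, effort_pct: float,
                  prior_gpa: float, prep_score: float) -> str:
    """Collapse heterogeneous inputs into one green/yellow/red signal.

    Generic illustration only -- not the Course Signals algorithm.
    Weights and cutoffs are invented; all inputs except prior_gpa
    (0-4 scale) are fractions between 0 and 1.
    """
    risk = (0.4 * (1 - course_pct)       # performance inside the course
            + 0.3 * (1 - effort_pct)     # CMS activity relative to peers
            + 0.2 * (1 - prior_gpa / 4)  # academic history
            + 0.1 * (1 - prep_score))    # test scores / HS preparation
    if risk < 0.25:
        return "green"
    return "yellow" if risk < 0.5 else "red"

# A student doing middling work with low CMS activity lands on yellow:
print(traffic_light(course_pct=0.62, effort_pct=0.40,
                    prior_gpa=2.8, prep_score=0.55))  # -> yellow
```

Whatever the actual weights, the reduction itself is the point of such systems, and also the source of the access questions that follow: every input folded into the signal is data that someone now effectively sees.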

But is this really what we want? And who should get access to such combined data? For starters, course faculty should not have access to students' academic backgrounds and performance outside of their courses, as such information could easily bias grading decisions.9 On the other side of the coin, university administrators should not have access to individual course grading data, as it is not their business to interfere with how faculty conduct their courses.

Arguably, students and their academic advisors are the only groups who should have access to such a widened view, as it may help them better navigate toward degrees and "trigger" personal meetings with advisors and faculty. Is it worth all the effort and financial outlay recently put into academic analytics to provide this functionality?

Course-level learning analytics are cheap, at most involving licensing fees for some plug-ins added to just one system and data source: the course management system. Institution-level learning analytics cost more than just licensing fees. Systems integration of diverse and partly archaic university systems, staff training, privacy auditing, support, and maintenance all contribute to initial and continuing costs.

While the rise of institution-level learning analytics might reflect the trend toward larger administrations and an increasing disconnect from what is happening in courses, an alternative approach might prove a better way to spend the money. Instead of treating courses and faculty as business commodities, hire traditional tenure-track faculty with a permanent commitment to the institution's success, and let them design courses according to their expertise. Doing this could restore universities to their role as public-service assets. And hiring more academic advisors to build personal relationships with students could increase retention more than applying advanced business analytics to all the data collected about them. In the case of learning analytics, the personal touch — supported by the data collected about students — could make the real difference in students' success.

Notes

  1. Katrien Verbert, Erik Duval, Joris Klerkx, Sten Govaerts, and José Luis Santos, "Learning Analytics Dashboard Applications," American Behavioral Scientist, Vol. 57, No. 10 (October 2013): 1500–1509; DOI: 10.1177/0002764213479363.
  2. Ibid.; Carol A. Lundberg and Laurie A. Schreiner, "Quality and Frequency of Faculty-Student Interaction as Predictors of Learning: An Analysis by Student Race/Ethnicity," Journal of College Student Development, Vol. 45, No. 5 (September/October 2004): 549–565; and Robert C. Wilson, Lynn Woods, and Jerry G. Gaff, "Social-Psychological Accessibility and Faculty-Student Interaction beyond the Classroom," Sociology of Education, Vol. 47, No. 1 (Winter 1974): 74–92.
  3. Wilson, Woods, and Gaff, "Social-Psychological Accessibility."
  4. Vincent Tinto, Leaving College: Rethinking the Causes and Cures of Student Attrition (Chicago, IL: University of Chicago Press, 1987).
  5. Paul F. Campos, "The Real Reason College Tuition Costs So Much," New York Times, April 4, 2015; and American Association of University Professors, "Losing Focus: The Annual Report on the Economic Status of the Profession, 2013–2014."
  6. Donna M. Desrochers and Steven Hurlburt, "Trends in College Spending 2003–2013: Where Does the Money Come From? Where Does It Go? What Does It Buy?" American Institutes for Research, Delta Cost Project, January 2016.
  7. Veronica Diaz and Shelli B. Fowler, "Leadership and Learning Analytics," EDUCAUSE Learning Initiative Briefs, November 26, 2012.
  8. Kimberly E. Arnold and Matthew D. Pistilli, "Course Signals at Purdue: Using Learning Analytics to Increase Student Success," Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, ACM, 2012: 267–270.
  9. John M. Malouff, Ashley J. Emmerton, and Nicola S. Schutte, "The Risk of a Halo Bias as a Reason to Keep Students Anonymous During Grading," Teaching of Psychology, Vol. 40, No. 3 (July 1, 2013): 233–237; DOI: 10.1177/0098628313487425.

Gerd Kortemeyer is an associate professor of Physics Education at Michigan State University with a joint appointment between the Lyman Briggs College and the Department of Physics and Astronomy. He is also director of the LON-CAPA and CourseWeaver projects. He received his Diplom (Master's) in Physics from the University of Hannover, Germany, and his PhD in Physics from Michigan State University. His research interest is the effective use of technology in science education, with a particular focus on assessment.

© 2016 Gerd Kortemeyer. This EDUCAUSE Review article is licensed under Creative Commons BY-NC-ND 4.0 International.