Executive Summary: ELI White Paper

Veronica Diaz is Associate Director of the EDUCAUSE Learning Initiative.

On April 11 and 12, 2012, the ELI teaching and learning community gathered for an online focus session on learning analytics (LA). Analytics is an umbrella term for the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues. As a genre of analytics, LA uses these methods to achieve greater success in student learning. LA can be used in a variety of ways, some of which include alerting faculty, students, and advisors when intervention is needed; providing input for continuous improvement in course design and delivery; and enabling personalization of the learning environment.

The focus session examined several key LA issues:

  • Learning is an inherently social process and therefore is complex and multidimensional. Even the best evidence about learning requires interpretation, which entails analysis of the evidence with a full view of the learning context. By helping make sense of these factors, LA enables decision making. View George Siemens's full session on this subject.
  • Analytics can leverage data across an entire institution and can inform strategy, planning, and resource allocation. It also has the potential to benefit not just students but also instructors, serving as a tool that helps faculty make decisions about course tactics and design. Hear more on this topic from John Fritz and from Tom Cavanagh and Chuck Dziuban.
  • It could be argued that LA is tantamount to snooping—to invading the privacy of our students—or that the information revealed by an analytics system places a responsibility on the institution to act on that information, similar to a physician receiving a report about a patient. View John Campbell's full session on this topic.
  • Vernon Smith described the five-step technical process for beginning learning analytics work outlined in a 2007 paper by Campbell and Oblinger (capture, report, predict, act, and refine), to which he added a preliminary step: charter. Learn more about this approach in Smith's session.
  • Because it is a rich source of data about student activity, the LMS plays a key role for LA. ELI invited representatives from leading LMS companies and an expert consultant on LA to participate in an interview-format discussion of LA. View the full session with Don Norris, Al Essa, and Jim Chalex.
  • In the concluding session, Simon Buckingham Shum explored the best or most genuine role of LA in higher education and considered some future directions.

The focus session also included eight examples of institutions that had implemented some form of learning analytics in their instructional environments, grouped into sets, each on a different theme.

Course Design and Student Success Projects

Institutions collect data from a variety of tools; in the first project of this set, that tool was lecture capture, and the goals were to improve course design, support faculty members in using the tool effectively, and help students acquire the skills their subject matter requires and retain knowledge from their didactic courses.

The second project examined the Student Success Plan, a software system and process designed to increase the persistence, success, and graduation rates of targeted students at Sinclair Community College. Through various tools, the system collects data that are then used to:

  • Improve the retention and success of at-risk students
  • Increase their graduation rate
  • Implement a systematic, comprehensive counseling and intervention process and an integrated early-alert intervention process
  • Develop and maintain a comprehensive resource of community and college referral sources for addressing challenges to student success and retention
  • Develop a web-based counseling record (case) management system
  • Create self-help tools to connect students to resources that help them overcome challenges to their success

For more information, see the focus session presentations from Byron Roush and Russ Little.

Predictive Modeling Projects

In the predictive modeling project sessions, we heard about a large-scale, ongoing data-mining effort. In this program, researchers have created a database that captures 33 variables for the online coursework of 640,000 students, totaling 3 million course-level records. Six large institutions are participating in the study (American Public University System, Colorado Community College System, Rio Salado College, University of Hawaii System, University of Illinois–Springfield, and University of Phoenix), and the researchers have identified variables that help track student performance and retention across a broad range of demographic factors. The project investigates several research questions:

  • What factors influence student loss/retention and momentum/completion?
  • How do the factors affecting loss differ from indicators of completion?
  • What can we discover about the existence of unique demographic, pedagogical, or institutional factors affecting loss/retention and momentum/completion?
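
To make the predictive modeling concrete, the sketch below fits a simple completion model to synthetic course-level records. It is an illustration only, assuming hypothetical variable names and data; the white paper does not describe the project's actual variables, tools, or modeling techniques.

```python
# Illustrative sketch only: relate course-level variables to completion with a
# simple model. All column names and data are hypothetical, not the project's.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000  # pretend course-level records

records = pd.DataFrame({
    "credits_attempted": rng.integers(3, 16, n),
    "prior_gpa": rng.uniform(0.0, 4.0, n),
    "logins_first_two_weeks": rng.poisson(8, n),
    "is_first_generation": rng.integers(0, 2, n),
})
# Hypothetical outcome: 1 = completed the course, 0 = did not
completed = (rng.uniform(size=n) <
             0.3 + 0.1 * records["prior_gpa"] / 4 +
             0.04 * np.minimum(records["logins_first_two_weeks"], 10)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    records, completed, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", round(model.score(X_test, y_test), 3))
# Coefficients hint at which variables are associated with completion
print(dict(zip(records.columns, model.coef_[0].round(3))))
```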

The other session in this set focused on the use of LMS tracking data for predicting student achievement. Leah Macfadyen pointed out that traditional summative assessments typically occur too late and offer limited insight into student learning practices, student study strategies, the development (or not) of effective learning communities, the degree of student engagement with peers and course materials, and what absent or disengaged students are doing (or not doing). She noted that although hundreds of variables are available, only 13 showed any correlation with students' final grades at her institution. For more information, see focus session presentations from Sebastian Diaz and Leah Macfadyen.
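
A screening of that kind can be sketched as follows: compute the correlation of each tracked activity variable with the final grade and keep only those that clear a threshold. The variable names, data, and threshold here are hypothetical, and this is not the presenter's actual method.

```python
# Illustrative sketch only: screen LMS tracking variables against final grades.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500  # pretend enrolled students

activity = pd.DataFrame({
    "discussion_posts": rng.poisson(12, n),
    "content_pages_viewed": rng.poisson(80, n),
    "assignments_submitted": rng.integers(0, 11, n),
    "mail_messages_sent": rng.poisson(3, n),
})
final_grade = (
    40 + 1.5 * activity["discussion_posts"]
    + 2.0 * activity["assignments_submitted"]
    + rng.normal(0, 10, n)
).clip(0, 100)

# Pearson correlation of each tracked variable with the final grade
correlations = activity.corrwith(final_grade).sort_values(ascending=False)
print(correlations.round(2))

# Keep only variables whose correlation clears an (arbitrary) screening threshold
print(correlations[correlations.abs() >= 0.3].index.tolist())
```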

Improving Student Success and Retention Projects

The first session in this set reviewed the Open Academic Analytics Initiative, whose goal is to employ analytical software to find patterns in "big data sets" as a means to predict student success. Using data from both the student information system and the LMS, the project seeks to create an open-source early-alert system that will (1) identify at-risk students in the first two to three weeks of a course and (2) enable the deployment of interventions to help those students succeed. The other dimension of Marist College's project is its Online Academic Support Environment. Marist's intervention framework makes a variety of resources available to students to help them achieve academic success.
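
The early-alert idea can be illustrated with a small sketch that joins a student information system extract with LMS activity from the first weeks of a course and applies a simple scoring rule. Field names, data, and thresholds are hypothetical; this is not the OAAI's actual software or model.

```python
# Illustrative sketch only: flag potentially at-risk students early in a course
# by joining SIS and LMS extracts and applying a naive scoring rule.
import pandas as pd

sis = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "cumulative_gpa": [3.4, 2.1, 2.8, 1.9],
    "credits_enrolled": [15, 9, 12, 6],
})
lms = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "logins_weeks_1_3": [14, 2, 9, 1],
    "assignments_submitted": [3, 0, 2, 0],
})

roster = sis.merge(lms, on="student_id")

# Low GPA, few early logins, and no submissions each add a point of risk
roster["risk_score"] = (
    (roster["cumulative_gpa"] < 2.5).astype(int)
    + (roster["logins_weeks_1_3"] < 5).astype(int)
    + (roster["assignments_submitted"] == 0).astype(int)
)

# Students scoring 2 or more would be candidates for an intervention
alerts = roster[roster["risk_score"] >= 2]
print(alerts[["student_id", "risk_score"]])
```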

The second session was about Central Piedmont Community College's Online Student Profile Learning System, which addresses various challenges facing new students and promotes retention via three major components:

  • An improved student services model, with new and expanded services (such as student success centers across campus), self-assessment resources (such as learning and personality inventories), and a comprehensive orientation course
  • Resources to enhance faculty skills, with a faculty training series
  • An online student tracking system, with a profile for each student and an early-warning system

For more information, see focus session presentations from Josh Baron and Clint McElroy.

Course-Based Learning Analytics Projects

The first session in this set of projects focused on the role of the instructional designer when real-time analytics is available to students during an active course. Chris Brooks proposed a rapid, agile instructional design process that responds to immediate findings or deficiencies that course design can remedy. In other words, he proposed "micro interventions," in which an instructional designer, working from a descriptive course dashboard and assigned to a certain number of students in a given course, supports those students just in time.
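
A descriptive dashboard of this kind could be as simple as per-student counts of recent course activity. The sketch below builds such a summary from a hypothetical event log; the fields and events are illustrative, not those of Brooks's system.

```python
# Illustrative sketch only: a descriptive (not predictive) per-student summary,
# the kind of view a designer might scan for just-in-time "micro interventions."
import pandas as pd

events = pd.DataFrame({
    "student": ["ana", "ana", "ben", "ben", "ben", "chloe"],
    "event":   ["login", "submit", "login", "login", "post", "login"],
    "week":    [1, 2, 1, 2, 2, 1],
})

# Count each kind of event per student
dashboard = events.pivot_table(index="student", columns="event",
                               values="week", aggfunc="count", fill_value=0)

# Flag students with recorded activity in the most recent week
dashboard["active_last_week"] = events.groupby("student")["week"].max() >= 2
print(dashboard)
```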

The second project was a case study on the application of LA to the collaborative construction of knowledge and writing. This project posed two questions: Does learning analytics related to collaborative writing foster greater metacognition and thus greater learning among students? And does such analytics data promote both instructor and peer opportunities for real-time interventions as formative assessment? In conjunction with the use of Google Docs, this project developed a system that generates real-time visualizations of edit-history metrics for collaboratively written documents. The hope was that making this information available to the collaborators would lead to improved metacognition. For more information, see focus session presentations from Chris Brooks and Brian McNely.
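
As a rough illustration of the kind of metric such a system might visualize, the sketch below aggregates a hypothetical revision log into per-author editing activity over time. It is an assumption-laden stand-in: the revision records are invented, and retrieving real edit histories from Google Docs is not shown.

```python
# Illustrative sketch only: summarize a hypothetical revision log from a
# collaboratively written document as per-author edit activity per day.
from collections import defaultdict
from datetime import date

revisions = [
    {"author": "alice", "timestamp": date(2012, 3, 1), "chars_changed": 420},
    {"author": "bob",   "timestamp": date(2012, 3, 1), "chars_changed": 150},
    {"author": "alice", "timestamp": date(2012, 3, 2), "chars_changed": 90},
    {"author": "carol", "timestamp": date(2012, 3, 3), "chars_changed": 310},
]

# Total characters changed per (day, author)
activity = defaultdict(int)
for rev in revisions:
    activity[(rev["timestamp"], rev["author"])] += rev["chars_changed"]

# Print a crude text "visualization" of who edited how much, and when
for (day, author), chars in sorted(activity.items()):
    print(f"{day} {author:>6}: {'#' * (chars // 50)} ({chars} chars)")
```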