Diving Deep into Data is issue #4 in the 2024 EDUCAUSE Top 10.
"Technology should be a tool for faculty to understand how students are learning and be able to see and subtract information and decide if any changes are needed to the teaching modality, and also to the content."
—Zulma Toro, President, Central Connecticut State University
The higher education landscape is changing dramatically. Instructors are adopting more tools and gaining comfort with learning technologies. They are blending modalities and learning spaces throughout courses. Learners are also using a variety of tools, modalities, and learning spaces. Faculty and students deserve to know what works when, for whom, and why. Campus planners need to understand which kinds of learning spaces to invest in or divest of.
Student success extends beyond learning success and involves advising and career services, tutoring, financial aid, behavioral health services, and extracurricular activities. In addition, students may need a variety of services to address life challenges that can disrupt their education, such as transportation, housing, childcare, healthcare, and internet and technology access. Advisors, students, and their families need to understand what combination of support and services can best help each student define and attain academic success.
Although instinct, personal preference, and assumptions still often hold sway in campus decision-making, decision-makers have more access than ever before to data and analytics related to learning and student success. The information in the LMS is no longer sufficient. Analytics and data professionals need to integrate data from the many resources gathering students' data to give the students, faculty, and advisors insights they can act on. Decision-makers are asking questions about how and where to focus resources now and in the coming years and about how to evaluate pilots and initiatives. Analytics professionals need to give them flexible models and projections. Technology professionals should be able to understand the data and analytics requirements throughout the institution and provide cost-effective, contemporary data services and infrastructure. And technology and data leaders need to shape the strategy that will enable the institution to leverage analytics for actionable insights.
Ultimately, the ability to use and manage quantifiable and actionable data will help stakeholders track progress (or the lack of it), allocate resources for all the issues on the Top 10 list, and respond to changes in an informed way.
When it comes to student learning and success, final grades and course credits are definitive. However, they occur too late in a term to be actionable, either by students or by faculty and staff who might intervene on the students' behalf. Students' use of digital learning environments may be a reasonable proxy for their engagement and progress. Social sciences have used survey or behavioral data as proxies for concepts that are difficult to measure (e.g., well-being, socioeconomic status), and the learning analytics field has done so for over a decade.1 Leveraging students' "digital footprints" of time and attention can improve educators' ability to deliver the right message, to the right student, at the right time. In addition, data can help cultivate students' agency about their own learning by raising their awareness of their habits and outcomes, early and often, so that they are motivated to seek help that the institution is all too willing to provide.
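The "digital footprint" idea can be made concrete with a small sketch. Assuming hypothetical weekly LMS activity data (the student IDs, field layout, and minutes-active figures below are invented for illustration), one simple proxy flags students whose activity falls well below the course norm, so that an advisor can reach out before grades arrive:

```python
# Illustrative sketch only: a minimal engagement-proxy early alert.
# The data shape (student_id, minutes_active per week) is hypothetical.
import statistics

def flag_low_engagement(activity, threshold=-1.0):
    """activity: list of (student_id, minutes_active) for one course-week.
    Returns student IDs whose activity z-score falls below threshold."""
    minutes = [m for _, m in activity]
    mean = statistics.mean(minutes)
    stdev = statistics.pstdev(minutes) or 1.0  # guard against zero spread
    return [sid for sid, m in activity if (m - mean) / stdev < threshold]

# One week of (hypothetical) activity for a five-student course
week3 = [("s1", 120), ("s2", 95), ("s3", 10), ("s4", 110), ("s5", 105)]
print(flag_low_engagement(week3))  # prints ['s3']
```

Real deployments layer far more signal (assignment submissions, grades-to-date, attendance) and validate thresholds against outcomes; the point here is only that an actionable nudge requires in-term behavioral data, not end-of-term grades.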
As leaders continue to recruit and retain student populations beyond those the institution has traditionally served, it is particularly important to review data to understand students' needs, engagement, and outcomes and to adjust—both for individual students and systemically—what happens inside and beyond the classroom.
Build a culture of data and analytics. Stakeholders and decision-makers need to become used to looking at a problem and saying, "We have data on that" or "We know how to get data on that." They need to know how to use data, query it, and interpret it to address the problems at hand.
Don't fall down the data rabbit hole. Avoid obsessing over perfecting predictions at the expense of a good intervention. Those who are willing to try, iterate, refine, and repeat their attempts to help students—and rigorously collect and assess the data on change in outcome—will be more successful at implementing learning and student success analytics.
Invest. Unfortunately, data analytics tools are often expensive, and even free and low-cost solutions require knowledge, staffing, and resources to support them.
Be guided by leadership with a strong vision and concrete priorities. This kind of leadership will make it easy to use data to advance priorities and will help coalesce disparate administrative and faculty voices to agree on common data sources and shared dashboards.
The Key to Progress
Institutional leaders and their constituents who have defined clear priorities and clear problems to solve—those who are framing learning and student success from the individual learner and instructor all the way up to the institution—will be best positioned to benefit.
From Strategy to Practice
What You're Saying
"Nothing revolutionary here. We are an institution that serves nontraditional backgrounds, first-in-family learners, Māori and Pasifika students. We continue to evolve and refine our learning analytics data to focus on student success. This journey will never be finished."
"We are actively growing out our data lake and data warehouse in support of deeper data analytics."
"We are diving deeply into using in-course data to assess student performance, issue alerts, institute interventions, etc."
"We have to start where we are, and that is establishing sound data governance. Once that's in place, the hope is that people better understand the data and are more comfortable using it."
"Minnesota State Colleges and Universities has an Equity Scorecard and a new center for data access and analytics."
Jacquelyn Malcolm Bailey
"At its best, student success is about passing not only one course but also the next one. But how and why this happens—or doesn't—for all students is why human learning is essentially a wonderful mystery. To truly become self-regulated learners, students must be willing and able to honestly and accurately assess what they currently know, understand, and do. To help, UMBC has been trying to better understand how faculty course design and institutional learning analytics can help nudge students' responsibility for learning by delivering the right message, to the right student, at the right time."
"The University of Cape Town (UCT) in South Africa has a Data Analytics for Student Success (DASS) project, run in collaboration with the Institutional Planning Department, central ICT, the Centre for Innovation in Learning and Teaching, and college-based educational support units. DASS aims to identify early indicators for students at risk and match interventions to these. The DASS team recently hosted the South African Association for Institutional Research (SAAIR) Learner Analytics Bootcamp. The event brought together more than forty delegates from universities across Southern Africa."
Richard van Huyssteen
"One of Thomas More University College's projects involved learning analytics. The aim of this project is to make new (digital) forms of education sustainable in terms of quality. Digital education carries the risk that students from vulnerable groups will lose their connection with the higher education institution and, as a result, commit less to the program or not at all. In order to deploy digital education, it is useful to keep a finger on the pulse through learning analytics and detect early which students are at risk. In addition, we can use learning analytics to strengthen competencies around new forms of education. After all, it allows us to analyze our teaching and optimize it in an evidence-based way."
Mia De Wilde
What You're Working On
Comments provided by Top 10 survey respondents who rated this issue as important
- Using Ellucian Advise to document any and all interactions with students and tying that to their early grades, attendance, Wi-Fi connection, meal plan consumption, and dorm room access.
- Leveraging our education data warehouse for insights into career advising. This is a medical school, and a lot is changing in what gets included in students' applications to residency programs after graduation. Step 1 of the U.S. Medical Licensing Examination (USMLE) is now pass/fail, and there's a growing trend for core clerkships to be pass/fail. As a result, some of the traditional metrics are no longer available.
- We are using an analytics model of engagement to identify students who might benefit from being contacted by our Student Experience Center and nudged gently toward an advisor. We have also undertaken analysis that has identified course units that present "blockers" to some students being able to progress beyond their first year of study, and we are recognizing that these blockers have an inequitable effect on Māori and Pasifika students. So we are now designing interventions to clear those blockers and foster success for all learners.
- We have been working closely with our math faculty and using predictive models to identify at-risk students before the first exam. Faculty have redesigned their courses and give more lower-stakes assessments, which we use in our model to identify students who need more support.
- We have a Workday dashboard that provides early-warning signs if a student is struggling. It brings in data from other systems (e.g., the LMS) and marries that data with the ability of faculty to note issues, financial issues, etc.
- Leveraging our contracted data analytics software and institutional data to provide partners with insights and provide real-time dynamic dashboards for our enrollment and student progress.
- We have formed a Student Engagement Dashboard strategic plan working group to develop a dashboard that collates data points from our LMS, e-textbook, interactive study guides, etc. The intent is to ascertain where our students' digital footprints lie during their course of study.
- We use a dashboard to analyze our retention for first-year, second-year, third-year (etc.) students. We are evaluating the data and trying to respond to retention issues.
- Working on integrating data from multiple domains such as the student information system, LMS, financial information, campus activities, and health and wellness (to name a few) to create a single dashboard for each student. This will help us personalize the support we need to provide to each student. Ultimately, this should improve student retention and graduation rates.
- Completed a compare-and-contrast proof of concept project with Microsoft utilizing Azure Lighthouse and ML.
- Create a data sea with open-source data science tools for data analytics.
- We are building a data lake for aggregating data from multiple sources to provide a central point for queries and analyses.
- We are experimenting with combining data sources on an ad hoc basis prior to institutionalizing the methodology.
- We are seeking grants to obtain a unified data analytics solution.
- Working on a data analytics project to establish a Microsoft Azure data mart that will bring data from disparate institutional systems (e.g., Colleague, Canvas) together for use in dashboards and reports built with PowerBI.
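Several of the efforts above converge on the same pattern: pulling per-student records out of disparate systems (SIS, LMS, financial, wellness) and merging them into one row per student for a dashboard. A minimal sketch of that merge step, with entirely hypothetical source names and fields, might look like this:

```python
# Illustrative sketch only: merging per-student records from several
# systems into one dashboard row. Source names and fields are hypothetical.
def build_dashboard(*sources):
    """Each source maps student_id -> dict of fields.
    Later sources win on field-name collisions."""
    merged = {}
    for source in sources:
        for sid, fields in source.items():
            merged.setdefault(sid, {}).update(fields)
    return merged

sis = {"s1": {"gpa": 3.2}, "s2": {"gpa": 2.1}}          # student info system
lms = {"s1": {"logins_last_week": 14}, "s2": {"logins_last_week": 2}}
rows = build_dashboard(sis, lms)
print(rows["s2"])  # prints {'gpa': 2.1, 'logins_last_week': 2}
```

In practice this merge happens in a warehouse, lake, or data mart with identity resolution, governance, and access controls around it; the sketch shows only the core join-on-student-ID idea the respondents describe.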
Staffing and re-orgs
- We have consolidated enterprise application services and included two new departments: Data Engineering and Data Management (DEDM) and Academic Business and Performance Analytics (ABPA).
- We recently created a new vice president position for data to sit at the apex of administration and academics.
- Creating positions to bridge the gap between leaders who have questions and decisions to make and the technologists and operations staff who have data.
- We are in the process of drafting a position for a new role on our LMS admin team: a learning analytics specialist. This person will help ensure that we are getting the most robust data that we can from our LMS to inform our work and strategy.
- We have established a data analytics engineering team to develop core datasets that are easy to access by our data scientists and faculty.
- A dedicated learning analytics team focused on analyzing data from learning technology systems and providing the data to the learning outcomes units and functions for decision-making.
- Building a data mesh strategy to gather information about students' development and performance.
- Creating a student success platform that tracks students' progress and uses data aggregated from multiple sources.
- As part of our Classrooms Reimagined initiative (our ten-year strategic plan for classrooms and informal learning spaces), we have a dedicated senior data analyst focused on turning classroom and scheduling data into insight. We use this insight for strategic and tactical learning space planning, projects, and research.
- Many initiatives placing the whole student and the value of data at the center of operations, decision-making, and innovation (transformation).
- New School of Data Science, which collaborates with the community.
- We are in the process of migrating to a new ERP system. We are leveraging that system's new analytics tools to gain more insight into the platform, help us improve the learning experience for students, and ensure that they are successful.
- Predictive capability at a fine-grain level to help students.
- We are partnered with a vendor on using machine learning models to predict student retention.
1. John Patrick Campbell, "Utilizing Student Data within the Course Management System to Determine Undergraduate Student Academic Success: An Exploratory Study" (PhD diss., Purdue University, 2007); John Campbell, Peter DeBlois, and Diana Oblinger, "Academic Analytics: A New Tool for a New Era," EDUCAUSE Review 42, no. 4 (July/August 2007); Leah P. Macfadyen and Shane Dawson, "Mining LMS Data to Develop an 'Early Warning System' for Educators: A Proof of Concept," Computers & Education 54, no. 2 (February 2010); Matthew D. Pistilli and John Campbell, "Building and Scaling Analytic Capacity," grant kickoff meeting, Purdue University, January 30, 2012; John Fritz and John Whitmer, "Learning Analytics Research for LMS Course Design: Two Studies," EDUCAUSE Review, February 27, 2017; John Fritz, Thomas Penniston, Mike Sharkey, and John Whitmer, "Scaling Course Design as a Learning Analytics Variable," in Anthony G. Picciano et al., eds., Blended Learning: Research Perspectives, vol. 3 (New York: Routledge, 2021).
John Fritz is Associate Vice President, Instructional Technology, University of Maryland, Baltimore County.
Mike Richichi is AVP for IT and Deputy CIO, Baruch Computing and Technology Center, Baruch College, City University of New York.
Catherine Zabriskie is Senior Director, Digital Learning & Design, Brown University.
© 2023 Susan Grajek and the 2023–2024 EDUCAUSE Top 10 Panel. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.