Why Data Matters for Student Success in a Post-Pandemic World

Data analytics provides a path for examining the institutional barriers that lead to students leaving or stopping their higher education journey—and sheds light on what institutional supports are effective in moving student success practices forward.

Higher education researchers have long studied and shared what works to improve the retention of students. Engagement, belonging, inclusion, and students' connection to their academic major have always been critical to their success. We've heard this from leading analysts, learning engineers, and teacher-scholars for almost fifty years. Regardless of the demographics, "the stronger the individual's level of social and academic integration, the greater his or her subsequent commitment to the institution and to the goal of college graduation."Footnote1

But what happens when a global pandemic (e.g., COVID-19) hits, campuses shut down, and on-campus social and academic integration is interrupted? Can we still look to past research to determine student success in an increasingly digital future?

To answer these questions, we conducted a meta-analysis that combined data from seventeen de-identified, diverse US campuses (five 2-year and twelve 4-year institutions).Footnote2 Impact-analysis software (i.e., software that performs virtual experiments on retrospective data) allowed us to estimate the combined effects of 44 student success initiatives, linked to each institution's standard demographic, academic, enrollment, and engagement data. The software's prediction-based propensity score matching (PPSM) algorithm runs a virtual experiment and matches pilot and control students based on their scores of likelihood to persist and likelihood to participate in an intervention program.Footnote3 All interventions analyzed here ran during the pandemic, and many were launched in response to sudden campus closures. About half, however, had been running before the pandemic and had to change their operations when campuses closed, presenting an opportunity to compare each initiative's efficacy before and during the pandemic.
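The matching step can be illustrated with a minimal sketch. This is not Civitas Learning's implementation; it simply assumes each student record already carries the two precomputed model scores the text describes and greedily pairs each pilot student with the nearest unused control student:

```python
# Hypothetical sketch of prediction-based propensity score matching (PPSM).
# Each student dict is assumed to carry two precomputed model scores:
# "persist" (likelihood to persist) and "participate" (likelihood to
# participate in the intervention), plus the observed outcome "persisted".

def ppsm_match(pilot, control, caliper=0.05):
    """Greedily pair each pilot student with the nearest unused control
    student in (persist, participate) score space, within a caliper."""
    pairs, available = [], list(control)
    for p in pilot:
        best, best_d = None, caliper
        for c in available:
            d = max(abs(p["persist"] - c["persist"]),
                    abs(p["participate"] - c["participate"]))
            if d <= best_d:
                best, best_d = c, d
        if best is not None:
            available.remove(best)          # each control is used at most once
            pairs.append((p, best))
    return pairs

def estimated_lift(pairs):
    """Difference in persistence rates between the matched groups."""
    if not pairs:
        return 0.0
    pilot_rate = sum(p["persisted"] for p, _ in pairs) / len(pairs)
    control_rate = sum(c["persisted"] for _, c in pairs) / len(pairs)
    return pilot_rate - control_rate
```

Because unmatched students are excluded, the resulting lift estimate compares like with like rather than the raw participant and non-participant populations.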

The innovation of this study lies in including a diverse set of student variables (LMS engagement, academic performance, enrollment behavior, pathway progress, financial aid, and demographics) in a rigorous matching process when estimating the persistence increases attributable to these intervention programs. This approach allowed campus analytics researchers to run their own impact analyses on their unique student success initiatives. The metadata also allowed us to better understand the effects of the unprecedented, rapid institutional responses that resulted as the COVID-19 pandemic forced institutions to close their campuses suddenly and move support services online. We then extrapolated the findings to consider wider institutional lessons for how colleges and universities might incorporate these transformative shifts into student success efforts in post-pandemic years.

Much of what has already been revealed regarding students' persistence remains true, but much must also be reconsidered as the demographics, experiences, and needs of those now entering higher education—the "new traditionals"—change. As Edward J. Glantz and his coauthors suggest, new practices and tools now being used in higher education will take hold as changes to institutional procedures are incorporated into the "next normal" ahead.Footnote4 The efficacy of these practices and tools will need to be measured.

A Growing Culture of Evidence

Awareness that one size no longer fits all is critical to understanding the circumstances affecting student success within the diverse pool of students now attending colleges and universities. We focus here on a few core questions related to how institutional leaders can explore an informed and increasingly digital path forward, and how a campus can identify, using an analytics lens, demographic patterns in actions that worked (or failed). Those of us in higher education are rapidly learning, through failure and crisis, that change is needed for diverse populations and shrinking budgets. Student success depends on understanding and improving program effectiveness through evidence-based change and intervention insights across the institution.

After a year of "emergency remote teaching,"Footnote5 the National Student Clearinghouse Research Center Stay Informed series reported on Spring 2021 enrollment data based on 12.6 million students at 76% of US degree-granting institutions.Footnote6 To the surprise of many involved in student success research, community colleges repeated the approximately 10% enrollment decline that had been reported in Fall 2020. With serious societal implications, far fewer undergraduate men enrolled in Spring 2021, for a total loss of 8.9% in the academic year 2020–2021. The continued loss at community colleges totaled a 15% drop among students ages 18–20, with the highest declines for men across all age, race, and ethnicity groups.

Community college leaders discovered how crucial it will be for higher education to abandon preconceptions about students and dig deeply into data that can help make sense of student demographics. Analytics provides a path for examining the institutional barriers that lead to students leaving or stopping—and sheds light on what institutional leaders can do to support student success moving forward.

From Evidence to Action

Expanding beyond the quantitative analytics lens, we wondered whether the higher education community could generate new knowledge by studying successful actions taken in rapid response to closed campuses. How does a campus move from gathering data insights to acting on those insights to support student success? Can institutional responses be collated with the remote interventions and services that worked? Much of the research on what works for which students details the impact of interventions and services before COVID-19; the literature on the pandemic's effect on students (those who dropped or stopped, failed to persist, or saw grade changes) is scarce. In a time of pandemic, campus leaders' energies understandably shifted to emergency measures, but the struggles and losses in some student populations need urgent, collective focus to ensure that higher education responds as society adapts to a post-pandemic environment.

With closures due to COVID-19, traditional place-based education, services, and interventions were no longer available. The impact on students was immeasurable. Lives were rapidly thrown into chaos with economic trauma, health concerns, isolation from family and friends, and the dramatic shift from predominantly onsite education to emergency remote teaching models. Whereas intensive early-warning systems, just-in-time advising and mentoring, and strong individualized education planning had been shown to improve retention for at-risk students in the past (based largely on face-to-face practices), suddenly there was a significant and immediate halt. As campus leaders struggled to determine effective practices on physically closed campuses, students dropped/stopped in large numbers. Equally unclear is what will happen when these students return to campus. As institutions work to recoup financially, they will be challenged to evaluate pandemic innovations and adaptations that are being demanded in the "next normal."

Intervention Design via Data

Campuses that collected response data during the pandemic have learned lessons that will inform lasting change. Reflecting on this unprecedented move, Beth McMurtrie notes that some institutions reported faculty attendance at teaching and learning workshops doubling or tripling and instructors adopting new tools for office hours, lectures, discussions, and effective pedagogies. She wonders what this will mean for future teaching practices: "Who needs to spend time and money trudging to and from campus when you can just open up your laptop and hold office hours, host a guest speaker, or run a tutoring session?"Footnote7

For student success, pandemic lessons learned include reminders that it is crucial for faculty to know their students: what students need, which ones are struggling, why they're struggling, which students are leaving and when, and which interventions mattered. All of this data is knowable. Access to new data analytic models and tools, designed to examine current practices, has allowed campuses to meet students where they are—whether they are struggling to find contacts and connections, are unsure how to get needed services, or don't know where to turn in a crisis. Timely data matters now more than ever as student bodies become more diverse and teaching modalities more varied. Data analytics can reveal students' just-in-time needs, struggles, and connections and then quickly evaluate solutions from an impact perspective.Footnote8

Support for Newly Remote Students

We asked a number of our study participants to share how they pivoted to fully online operations. One institution, Austin Community College (ACC), is a large, multi-district community college with eleven campuses. It already had a substantial online presence and a strong service unit for its growing online population. When the pandemic hit, ACC quickly ordered iPads and hotspots, knowing that many learners would need hardware and software at home. Student services staff called and emailed students to connect and let them know about applying for immediate financial and technology assistance. The staff also looked at demographic data and made a special effort to ensure communication with Black and Hispanic students throughout the pandemic. Staff contacted students who were enrolled in the previous, pre-pandemic term to encourage them to enroll for the next term, reassuring the students that online support would be available. Overall, the following variables were identified as supporting the highest improvements in persistence between Fall 2019 and Fall 2020: students working with advisors, with a 14.9% lift; childcare, with a 7.3% lift; coaching services, with a 5.6% lift; and online adaptive learning courseware, with a 5.1% lift.

Utah Valley University, one of the largest universities in Utah with 40,000-plus students, reported increases in persistence due to a variety of new advising modalities in response to the on-campus shutdown. Although the majority of advising was done by phone and email, some in-person advising was started in the Fall 2020 term, with a 6.3% lift over the previous fall term. Meanwhile, video academic advising showed a 10.7% lift, and phone and email outreach showed an 8.9% lift. The higher the frequency of messages, the greater the persistence lift: five or more email correspondences produced an 11.9% lift, compared with 7.5% for a single email message, and three or more phone messages produced a 10.1% lift, compared with 8.35% for a single phone message. Overall, persistence rates increased at a statistically significant level as the number of advising visits increased, regardless of modality.

For many of our study participants, moving to emergency remote services included campus laptop checkout programs. On one campus, there was a 3.1% persistence lift for all students checking out laptops and hotspots, with higher lifts for first-generation students (8.3%) and Pell recipients (7.1%). The checkout program had a higher benefit for those students who did not have access to computers or hotspots at home before the pandemic, opening up questions for further study of equity and student success. Other financial stability factors that continued to be important during the pandemic included funding support from the Coronavirus Aid, Relief, and Economic Security (CARES) Act, as well as internal campus financial support for students with emergency needs.

Flexible and no-detriment grading was another policy lever implemented at many campuses. This emergency course grade option is defined as a choice provided to each student: a letter grade, pass-fail, withdrawal, or an agreement made with the faculty member. No-detriment grading resulted in 5.0% to 6.7% persistence lifts at two institutions.

Summary from Meta-Analysis Results

What have we learned from the seventeen institutions that analyzed their student success programs in relation to student demographics, persistence, and rapid response pandemic initiatives? Listed below are the major themes found through the application of the PPSM algorithm.Footnote9

  1. Material efficacy drop during the pandemic: For programs that had relied on in-person engagement or group-based activities (e.g., lab- or group-based learning), efficacy dropped during the pandemic, possibly due to the lack of personal interactions or to operational challenges in substituting engaging online learning experiences for previously taken-for-granted in-person engagements.
  2. Student advising: The shift to asynchronous, digital communication modalities for advising (e.g., email, video chats, phone) mattered little to program efficacy as long as there was personal outreach and engagement. This was particularly true when advisor-advisee relationships already existed.
  3. Interventions designed to be empathetic to student situations: The most effective programs revealed a theme of helping vulnerable students in times of crisis, resulting in significant increase in persistence impact results. Examples include mobile food pantries, the pandemic CARES Act, free Wi-Fi access, digital counseling, free laptop loaners, and flexible grading. As expected, there was an element of intersectionality: many students with less-than-average prediction scores benefitted far more from these interventions.
  4. Social support interventions: Programs that connect students to peers or mentors (even virtually), that promote remote collaborative study groups, and that encourage social cohesion (e.g., fraternity and sorority living) were found to be highly effective both before and during the pandemic. In contrast, an adaptive remote learning lab initiative exhibited somewhat smaller improvements in persistence during the pandemic than labs held before it, possibly because of how the initiative was implemented or because of the isolation and lack of social contact and collaboration experienced in this new modality.
  5. Advising and other student support services: Comparing efficacy trends against baseline matching (term-to-term or pre/post-COVID-19) offers insights for improving program efficacy continuously. Institutions that embarked on COVID-19 response evaluations discovered causal links between program operations and program efficacy. Successful institutions leverage efficacy results over time, linking operational improvements to changes in measured efficacy; this closed feedback loop, in which human intelligence acts on causal inference insights, drives continuous process improvement.Footnote10
  6. Intersectionality to reduce equity gaps: Students facing greater adversities and challenges benefit more from interventions. Institutions that leveraged prediction scores as an amalgamation of academic, behavioral, social-psychological, and demographic factors saw clear evidence of value in interventions contributing to student success. In general, there is a strong negative correlation between prediction scores and intervention efficacy results—meaning that the lower the prediction scores, the greater the benefits from such interventions.
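The negative correlation in point 6 can be illustrated with a hypothetical sketch: given matched pilot-control pairs tagged with the pilot student's prediction score, lift can be computed within each score quartile to check whether lower-scoring students benefit more. The data format here is invented for illustration, not the study's actual format:

```python
def lift_by_quartile(pairs):
    """Persistence lift within each prediction-score band.
    `pairs` holds (prediction_score, pilot_persisted, control_persisted)
    tuples for matched pilot-control pairs."""
    ranked = sorted(pairs, key=lambda t: t[0])   # low scores first
    q = max(1, len(ranked) // 4)                 # approximate quartile size
    lifts = []
    for i in range(0, len(ranked), q):
        chunk = ranked[i:i + q]
        pilot_rate = sum(p for _, p, _ in chunk) / len(chunk)
        control_rate = sum(c for _, _, c in chunk) / len(chunk)
        lifts.append(pilot_rate - control_rate)
    return lifts                                 # lowest-score band first
```

A strongly decreasing sequence of lifts would be evidence of the pattern the study reports: the intervention helps most where predicted persistence is lowest.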

Table 1 shows the most effective of the 44 student success programs analyzed in our study, ranked by overall persistence improvement (the "Lift" column). The "COVID-19 Only?" column indicates whether a program ran only during pandemic-affected terms or over a longer period that included both pre-pandemic and pandemic-affected terms. A p value ≤ 0.05 indicates that a program's estimated persistence improvement is statistically significant; for the many programs with a p value < 0.01, the improvement is very unlikely to be due to random chance. The "N" column gives the number of matched pilot-control pairs.

Table 1. Examples of Effective Student Success Programs during the Pandemic

| Program | COVID-19 Only? | Lift | p Value | N |
| --- | --- | --- | --- | --- |
| CARES Act funding | Yes | 17.20% | < 0.01 | 1,869 |
| Advising (slight decline to 12.5% in Spring 2020) | No | 14.90% | < 0.01 | 12,303 |
| Education Opportunity Program | Yes | 11.50% | < 0.01 | 194 |
| Video academic advising | Yes | 10.70% | < 0.01 | 2,416 |
| Fall 2019–Fall 2020 Phone 3 or more visits | No | 10.10% | < 0.01 | 316 |
| African American student scholarship (effective during COVID-19) | No | 9.80% | 0.04 | 181 |
| Phone or email academic advising | Yes | 8.90% | < 0.01 | 8,411 |
| Fall 2019–Fall 2020 Email 4 Visits | No | 8.76% | < 0.01 | 891 |
| Fraternity and sorority living (FSL) experience | No | 8.60% | < 0.01 | 2,122 |
| Fall 2019–Fall 2020 Phone 1 Visit | No | 8.35% | < 0.01 | 8,057 |
| CARES Act funding (summer less effective: 11% vs. 2%) | Yes | 8.30% | < 0.01 | 1,119 |
| First-generation laptop and Wi-Fi | Yes | 8.30% | 0.03 | 182 |
| Fraternity and sorority living (FSL) experience | No | 8.20% | < 0.01 | 2,825 |
| Fall 2019–Fall 2020 Email 3 Visits | No | 7.98% | < 0.01 | 2,236 |
| Fall 2019–Fall 2020 Phone 2 Visits | No | 7.90% | < 0.01 | 1,221 |
| Foodlink nudge campaign program | Yes | 7.80% | < 0.01 | 2,524 |
| Fall 2019–Fall 2020 Email 1 Visit | No | 7.50% | < 0.01 | 17,001 |
| Fall 2019–Fall 2020 Email 2 Visits | No | 7.40% | < 0.01 | 5,896 |
| Childcare (declining efficacy post-pandemic; no statistical significance in Spring 2020) | No | 7.30% | < 0.01 | 878 |
| Pell recipient laptop and Wi-Fi | Yes | 7.10% | < 0.01 | 269 |
| No-detriment grade policy | Yes | 6.70% | < 0.01 | 4,955 |
| Writing center (>7% to < 5% drop in the COVID-19 term) | No | 6.50% | < 0.01 | 170 |
| In-person academic advising | Yes | 6.30% | 0.01 | 4,640 |
| Writing center (increasing efficacy from Fall 2019 to Spring 2020 and Summer 2020) | No | 6.30% | < 0.01 | 411 |
| Kudo (positive message) impact (-4% in Spring 2020; sharp reversal, but p value = 0.08, not statistically significant) | No | 5.90% | < 0.01 | 1,867 |
| Academic coaching (declined to statistically significant 4.5% in Spring 2020) | No | 5.90% | < 0.01 | 7,876 |
| Celebration award for top performers in assessment (comparable performance during the pandemic) | No | 5.90% | 0.01 | 27,510 |
| Academic coaching (decline to not statistically significant 4.6% in Spring 2020) | No | 5.60% | 0.01 | 2,191 |
| Fraternity and sorority living (FSL) experience (similar improvement in persistence in Spring 2020 in comparison to before the pandemic) | No | 5.20% | 0.01 | 10,635 |
| Adaptive learning lab courseware (decline to not statistically significant 2.3% in Spring 2020) | No | 5.10% | 0.01 | 10,482 |
| CircleIn (remote collaborative study) | Yes | 4.80% | 0.07 | 632 |
| Laptop gift | Yes | 4.30% | 0.22 | 334 |
| CARES Act funding for Pell-eligible students | Yes | 4.20% | < 0.01 | 3,678 |
| CARES Act funding for all students | Yes | 4.10% | < 0.01 | 5,704 |
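The lift and p-value columns above can be read as the output of a matched-pairs comparison. As a hedged illustration (the study's exact statistical machinery is not specified), a standard two-proportion z-test over n matched pairs might look like this:

```python
import math

def lift_and_p_value(pilot_persisted, control_persisted, n):
    """Persistence lift across n matched pilot-control pairs, with a
    two-sided p value from a two-proportion z-test (an assumption for
    illustration; the study's exact test is not specified)."""
    p1, p2 = pilot_persisted / n, control_persisted / n
    lift = p1 - p2
    pooled = (pilot_persisted + control_persisted) / (2 * n)
    se = math.sqrt(2 * pooled * (1 - pooled) / n)
    if se == 0:                        # degenerate case: all or none persisted
        return lift, 1.0
    z = lift / se
    return lift, math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail
```

For example, 850 persisting pilot students versus 700 persisting controls across 1,000 matched pairs yields a 15-point lift with a p value well below 0.01; equal counts yield no lift and no significance.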


Each student and campus is unique, but trends emerge. Providing direct solutions for financial need—by offering childcare, food, CARES Act emergency funding (direct aid to minority populations negatively impacted by the pandemic), and technology loans and gifts (take-home laptops and Wi-Fi hotspots)—had a significant effect on persistence. Other successful interventions were "low-tech" moves as simple as advising by video, phone, and email, along with "high-tech" software accelerators (e.g., self-paced, online learning courseware). Although institutions differed in the actions they took, initiatives that were successful in supporting students' persistence can be categorized into three programmatic themes (see table 2).

Table 2. Categories of Effective Programs during the Pandemic with Impact Results and Reasons

| Program Category with Examples | Impact Results | Possible Reason |
| --- | --- | --- |
| Offering direct help: laptop loans, Wi-Fi access, CARES Act funding, flexible grades, childcare, food | 3-17%: more effective for the bottom 25% in persistence predictions personalized to each campus | Kindness and institutional awareness when students are in tough situations foster a sense of goodwill, gratitude, fortitude, and belonging/commitment to the institution—all of which translate into persistence. |
| Creating connection: connecting students remotely to peers and/or mentors, setting up collaborative e-study groups, supporting social living (e.g., sororities and fraternities) | 4-8%: more effective when supporting group belonging and intersectionality in prediction scores | Social support and connectedness form an integral part of the successful college student experience. |
| Adding co-curricular supports: adding new modalities, as well as increasing advising, mentoring, and tutoring options | 4-14%: more effective with increased contact | Moving support services to remote (e-video, phone, email) strengthens "belonging" through the increase in anytime/anywhere contacts. |

Instead of focusing only on non-malleable factors (e.g., race and ethnicity) in relation to equity gaps, predictive models accommodate a large number of factors, both malleable (e.g., engagement, enrollment behavior, academic performance, pathway progress) and non-malleable, during the machine learning process. By factoring intersectionality across all of these student success variables, prediction scores offer a single metric useful for framing and discovering how to lower equity gaps through interventions that target, and are personalized to, the malleable factors.
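As a hypothetical illustration of amalgamating mixed factors into a single metric, a persistence prediction score can be a weighted combination squashed into the (0, 1) range. The feature names and weights below are invented for the sketch; a production model would learn its weights from campus data during training:

```python
import math

# Invented feature names and weights, for illustration only.
WEIGHTS = {
    "lms_logins_per_week": 0.15,   # malleable
    "credits_attempted":   0.10,   # malleable
    "gpa":                 0.60,   # malleable
    "first_generation":   -0.25,   # non-malleable
}

def prediction_score(student, bias=-1.5):
    """Squash a weighted sum of mixed factors into a (0, 1) score."""
    z = bias + sum(w * student.get(k, 0.0) for k, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))   # logistic function
```

Because malleable factors carry weight, interventions that move them (e.g., raising LMS engagement) move the score, which is what makes the metric actionable for closing equity gaps.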

Moving Forward

With the emergency shutdown of physical campuses due to the COVID-19 pandemic, supporting students remotely became a challenge. Data analytics allowed prepared campus leaders to see and support students in real time and to be able to analyze not only students' needs but also new digital modalities, supports, and services.

Each student is unique. The homogeneity we once assumed of "traditional" students will not be seen again. Likewise, each higher education institution is unique. To be effective, support for today's diverse students must accommodate heterogeneity in both populations and campus environments. We can no longer rely on student success practices from even five years ago.

What do we, as a community of practice, now know? What did we learn that should guide us going forward? Engagement, belonging, inclusion, and students' connection to their academic major continue to matter in students' persistence, but these factors must be understood differently as "new traditional students" (including older, working, financially challenged, living-at-home students) spend less time on campus. During the pandemic, across the higher education landscape, services that pivoted to technology as a way to support online learners improved student retention. When students feel seen, heard, and supported for who and where they are, they are more apt to stay until graduation. Initiatives shared by the seventeen institutions in our study highlight the importance of identifying and helping students in times of crisis. Data from multiple campuses showed significant lift in retention where faculty and student services (e.g., advising, enrollment, financial aid) used technology to stay connected to students.

A number of campuses put student connection teams together and immediately started calling or texting students about enrollment opportunities. Advisors and faculty worked to stay in touch, sending increased communications to students. Personalized systems allowed for more targeted messaging (nudges) to keep students engaged and on-task during chaotic times. Technology services that had never existed before now kicked into gear, putting laptops and Wi-Fi hot spots in the hands of students who didn't have equipment at home. A move to digital tools, whether by providing hardware to connect from anywhere or by moving meetings to remote meeting tools (e-meetings, chat, phone), resulted not only in a rise in participation but also in a rise in persistence. In one example, moving student workers online during the pandemic for mentoring, tutoring, and peer support provided significant lift in persistence both for the student workers and for the students served.

In another example, fraternity and sorority life (isolated together on campus) resulted in 5-8% persistence improvements in comparison with other students matched through PPSM on three campuses. Innovative campuses will look toward diverse means of providing that same sense of belonging to the "new majority" of students—including young, digital natives who find their belonging online.

Whether campus leaders learn from the pandemic and leverage data insights culled from emergency remote teaching practices remains to be seen. The COVID-19 pandemic has given higher education institutions a jump-start in responding to an evolving student population and in offering teaching and services for a digital age. Whether campuses continue to strategically evolve, or whether they'll return to a culture that long ago ceased to be effective or fair, will be the choice of institutional leadership. Analytics tools now exist to provide insights via discovery, interpretation, and communication of meaningful patterns in data—especially for today's more time-challenged, intersectional, and diverse groups of students and the campus services that best support their needs.

An ongoing examination of persistence trends allows data-driven leaders a better understanding of access, engagement, and affordability and promotes strategic planning. Necessity is the mother of invention, they say, and higher education experienced much invention and innovation in its response to COVID-19. Colleges and universities went fully online while leaders and staff reimagined lectures, labs, discussions, collaborations, and assessments/grading. Support services also went fully online while staff used phones, emails, and texts to reach students. And higher education acknowledged inequities and provided food and emergency funding as well as technology (computers, Wi-Fi, online materials and tools) to students at home.

Campuses prepared with data analytics, tools, and skills were able to reach out and rapidly take action with the right resources to help data-identified students through their unprecedented struggles. The challenge will continue to be how best to move data to insights and then to action that supports persistence and mitigates barriers.

Data analytics allows campus leaders and staff to know their students via evidence, in real time, and helps them be prepared to act on this knowledge. Acting on that evidence fosters belonging, which leads to persistence and student success. The use of analytics in moving campus data to insights, and insights to effective action, will be key to an institution's support and practice of student success.

Notes

  1. Ernest T. Pascarella, John C. Smart, and Corinna A. Ethington, "Long-Term Persistence of Two-Year College Students," Research in Higher Education 24 (1986).
  2. These seventeen institutions are all Civitas Learning clients who used the Civitas Learning software program Impact to evaluate campus initiatives before and during the pandemic.
  3. David Kil, "Synergistic Data Science and Causal Inference," SXSW EDU, Austin, TX, March 6, 2018.
  4. Edward J. Glantz, Chris Gamrat, Lisa Lenze, and Jeffrey Bardzell, "Improved Student Engagement in Higher Education's Next Normal," EDUCAUSE Review, March 16, 2021.
  5. See Charles Hodges, Stephanie Moore, Barb Lockee, Torrey Trust, and Aaron Bond, "The Difference Between Emergency Remote Teaching and Online Learning," EDUCAUSE Review, March 27, 2020.
  6. "COVID-19: Stay Informed with the Latest Enrollment Information," National Student Clearinghouse Research Center's Regular Updates on Higher Education Enrollment, April 29, 2021.
  7. Beth McMurtrie, "Teaching: After the Pandemic, What Innovations Are Worth Keeping?" Chronicle of Higher Education, April 1, 2021.
  8. Society for College and University Planning, "Planning For: Effective Data Analytics," February 25, 2020.
  9. David Kil, Angela Baldasare, and Mark Milliron, "Catalyzing a Culture of Care and Innovation through Prescriptive and Impact Analytics to Create Full-Cycle Learning," Current Issues in Education 22, no. 1 (January 7, 2021).
  10. Linda Baer, Amanda Hagman, and David Kil, "Preventing a Winter of Disillusionment: Artificial Intelligence and Human Intelligence in Student Success," EDUCAUSE Review 55, no. 1 (2020).

Colleen Carmean is founder of the Ethical Analytics Group. She teaches critical thinking and applied computing at the University of Washington Tacoma.

David Kil is CEO at Healthmantic and serves as an advisor to startup companies in the artificial intelligence and machine learning fields.

Linda Baer is a senior consultant with Linda L. Baer Consultants, having served in numerous executive-level higher education positions for more than thirty years.

© 2021 Colleen Carmean, David Kil, and Linda Baer. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.