Blending Human Intelligence and Analytics for Student Success

Grinnell College is combining learning analytics with human-intelligence networks to increase student retention and completion. Various social and psychological factors play a role in student success, and these can be linked to learning data to paint a fuller picture of each student's likelihood of success. The college is working to understand the "science" of interventions and to provide faculty and staff with information on the effectiveness of those interventions.

[This article is a reproduction of an EDUCAUSE Learning Initiative brief published in August 2016. —The Editors]

Undergraduate retention and completion rates are the subject of national interest, and questions of cost, value, and quality remain the focus of public debate. At Grinnell College, we believe we can achieve a deeper understanding of the factors that contribute to persistence and completion on our campus and at other institutions by examining the intersection of campus culture, the results of mixed-methods research, and our work with other colleges and universities regarding the art and science of interventions.

Liberal arts colleges such as Grinnell provide students an opportunity to discover intellectual and personal interests and acquire vital skills in an intimate, residential setting shaped by close interactions with faculty. Classes in the liberal arts tradition are often small and inquiry driven; students typically have access to excellent research opportunities, libraries, laboratories, and infrastructure. Yet the liberal arts model also faces significant challenges in terms of finances, access, sustainability, technology, and public scrutiny. To succeed in this environment, liberal arts colleges need to make compelling arguments regarding cost, value, and quality. They also need to devote renewed attention to questions of student retention and success, demonstrating an ability to deliver an outstanding education that enables students to learn, thrive, complete their degrees at high rates, and find meaningful work.

Colleges and universities have long relied on human-intelligence networks made up of faculty, professional advisors, other administrators, and students themselves to find the best balance of challenge and support for individualized learning and to monitor student progress. Because of the favorable ratios of staff to students at small, residential campuses, such networks continue to be a primary strength for those institutions.

Meanwhile, analytics offers new opportunities to improve student retention and success. Learning analytics has been defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."1 With the advent of analytics techniques including data mining and machine learning, liberal arts colleges are in a position to join with other institutions that are developing or enhancing early-alert systems and predictive models based on these techniques.2 Grinnell is working to integrate learning analytics with existing human-intelligence networks so that alerts, predictive models, and outreach to students might be improved. We see this integration, or "blending" work, as an example of "augmentation" as defined in a recent book by Thomas Davenport and Julia Kirby:3

Augmentation means starting with what minds and machines do individually today and figuring out how that work could be deepened rather than diminished by a collaboration between the two. The intent is never to have less work for those expensive, high-maintenance humans. It is always to allow them to do more valuable work.

Special Challenges for Small Schools

Based on several years of work with predictive modeling for persistence and completion at Grinnell, we have identified three special challenges that we are currently addressing. First, we have had little comprehensive, high-frequency data of the kind a robust, campus-wide learning management system implementation could provide. Although such a system is available, it is not widely used—or used to its full potential—by the majority of faculty on our campus. Second, because Grinnell is a selective college, our persistence and completion rates are relatively high; as a result, we continually encounter the "small n" problem and a lack of statistical significance in our analyses of those who do not persist. Third, the majority of our attrition occurs among students who are not in academic trouble—that is, they have B or better GPAs. As a result, we believe social-psychological factors play a significant role in persistence and completion on our campus and at peer institutions. Having identified these challenges, we are focusing our efforts on enhancing our human-intelligence networks, our use of analytical tools, and the synergies at the intersection of the two.

Two Useful Frameworks: Attrition as a Complex Syndrome and a Model for Thriving

At many U.S. colleges and universities, challenges to retention are often primarily associated with two factors: preparedness and financial resources. Many students fail to complete degrees because they are unable to handle the academic demands they face. They lack time-management and organizational skills, they arrive from underfunded secondary school systems that leave them without the writing and quantitative training they need, and they find themselves overwhelmed in the classroom. In other cases, students and families borrow to their limits and, faced with escalating tuition costs and competing demands, discover that they are unable to manage the financial load. Such forces can affect liberal arts colleges as well, but challenges to retention at these institutions often illustrate a series of different factors that are not so easily identified or confronted. With this challenge in mind, a holistic approach to the analysis of the student experience can be particularly valuable.

In the absence of single-variable explanations, our ongoing research is guided by the exploration of attrition as a syndrome shaped by multiple, connected, and correlated factors. In addition to preparedness for the academic demands of college and financial needs, we are exploring the impact of social and psychological factors, mental health, substance use and abuse, and the way our college seeks to sustain a sense of purpose by linking the curriculum to future careers and postgraduate life. We recognize that such factors are also at play for many larger institutions, including major public universities with much larger student populations, but liberal arts colleges can provide an ideal laboratory for research centered on the concept of thriving, as described, for example, by Laurie Schreiner and her research colleagues at Azusa Pacific University.4 This model suggests the following questions: What factors enable students to become deeply invested in their own education, develop a sense of the link between their studies and professional ambitions, remain resilient and optimistic in the face of academic stress and personal challenges, and build healthy relationships with peers? How can colleges and universities create environments in which students from a wide range of racial, cultural, and socioeconomic groups are most likely to thrive, and what tools are needed to do so?

Such a framework is both challenging and potentially transformational. Because it emphasizes a broad, holistic approach, tight causal relationships are difficult to identify. The student populations and cultural settings vary widely at different institutions, making it difficult to generalize from one case to the next. Yet an emphasis on the concept of thriving enables us to move beyond retention to consider other metrics as well, including the rates of student participation in research, internships, extracurricular activities, and other key indicators of engagement. It also puts a premium on qualitative approaches and evidence drawn from data that students can provide themselves about their lived experiences and the ways they encounter and respond to a campus climate.

From Data to Action

Under the leadership of President Raynard Kington, the Grinnell community has increased its focus on the use of data and analysis to assist partners across campus in better identifying patterns that may indicate risk for students. This work emanates from a framework rooted in health-services research and public health: it looks not at disease prevalence but at student outcomes, replaces co-morbidities with co-occurring concerns identified by our own practitioners, and focuses on intervention and demographic factors that together affect the probability of persistence (term-over-term engagement) and completion (four-, five-, and six-year graduation). This work has resulted in new discussions, many of which bring anecdotal information and existing data into the same frame, providing a rich, data-informed narrative on which practitioners may choose to base their processes and interventions.

Human-Intelligence Networks and Related Initiatives

Grinnell's Advising Partners model for student success is rooted not only in academic advising but also in a wide range of additional campus resources.5 In this model, every student is assigned a faculty member as their academic advisor; a Residence Life Coordinator advisor; and a Careers, Life, and Service advisor. Beyond this triad, students have access to two more "layers" of advising and support services, based on the frequency of contact. In the first layer—where repeated personal interaction between student and staff/faculty might occur—are course instructors, coaches, chaplains, international student affairs specialists, and others. In the second layer are the staff members in administrative functions such as the academic support labs, the financial aid office, the registrar, the student health and counseling center, and more.

Within this human-intelligence network are four noteworthy ongoing initiatives. First are the Academic Performance Reports, which faculty are encouraged to submit to the academic advising team any time there is a concern about a student's academic performance. Second are Student Conduct reports, for which student affairs staff use a commercial tool to communicate with and advise students who have personal concerns (often related to family, interpersonal, or health issues). Third are Midterm Assessments, a system currently based on in-house software. These assessments are intended to be estimates of students' academic performance within a given course in three categories: satisfactory, marginal, and at risk. Rather than assigning midsemester grades—thought to be too granular and perhaps discouraging for students—these assessments and the associated feedback loop have been conceived as interventions that will help both the students and the various advising teams respond when students are in need of additional support. Although these reports do not provide time-series data or help in identifying attrition risk among high-performing students, this new system is proving to be quite helpful not only in ensuring that no student "falls through the cracks" but also in raising campus awareness of the need for the holistic approach described earlier.

The fourth initiative is an ongoing search for "student success software" to support the work of the Advising Partners network and to replace in-house systems. This search has proved to be difficult in a rapidly developing market. High-level requirements for such a system include:

  • Ease of data entry to encourage and enhance data capture
  • Supportive workflow such that data can be shared with appropriate security
  • Near-real-time access to the data for analytical purposes including alerts and predictive models

We are striving to couple the word "science" with interventions at Grinnell because we recognize the need for the scientific method as we work together across campus and with other institutions to learn more about the effectiveness of our actions in response to predictive learning analytics. We have a particular interest in behavioral economics theory in this context, though in our environment, rather than nudging students directly through some sort of electronic messaging system, we will more likely be successful in nudging the "nudgers"—the Advising Partners described above. We also recognize the need for more, and more widespread, training and development opportunities for faculty and professional advisors. Faculty understandably seek evidence of the effectiveness of interventions as well as guidance regarding their role in any intervention. In an effort to encourage broader engagement in these issues on campus and beyond, Grinnell has hosted two conferences on student success and thriving that have brought together faculty and staff from a number of liberal arts colleges and universities.6

Analytics and Related Initiatives

Analytics work at Grinnell is done primarily but not exclusively by the Office of Analytic Support and Institutional Research. As suggested in a recent Association for Institutional Research publication,7 we are advocating and developing a distributed (or federated) model for analytics work at our college. In addition to mining historical data for patterns related to persistence and completion among our student population, our analytics work focuses heavily on near-real-time alerts and predictive models.

One example of this shift to data-informed practice and decision making in student success is the use of what we now call "grade dynamics" after each term. We have found that students who leave the college—whether by their own choice to withdraw or through suspension or dismissal—show higher variability in term-over-term GPA. We also found that a decrease in GPA from any level of academic performance at our college is a potential alert. This can be visualized simply with a scatterplot comparing cumulative GPA and recent-term performance. This work is not complicated in and of itself, but we have found that visualizing and using existing data in new ways have added a great deal to our understanding of how to identify at-risk students as early as possible. Integrated with information such as disability status, number of academic alerts from faculty, or other indicators, this type of graphic can help identify students who may not previously have been identified as at risk, even at a "high touch" institution.
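
As a rough illustration, the following is a minimal sketch of how such a grade-dynamics check and plot might look, assuming a hypothetical table with one row per student per term. The column names (student_id, term, term_gpa, cum_gpa) and the drop threshold are illustrative assumptions, not Grinnell's actual schema or alert rules.

```python
# A minimal sketch of the "grade dynamics" idea, under assumed column names.
import pandas as pd
import matplotlib.pyplot as plt

def grade_dynamics(records: pd.DataFrame, drop_threshold: float = 0.3) -> pd.DataFrame:
    """Flag students whose most recent term GPA fell noticeably below their cumulative GPA."""
    records = records.sort_values(["student_id", "term"])
    latest = records.groupby("student_id").tail(1).copy()  # most recent term per student
    # A decrease from any level of performance is treated as a potential alert.
    latest["gpa_drop"] = latest["cum_gpa"] - latest["term_gpa"]
    latest["flagged"] = latest["gpa_drop"] >= drop_threshold
    return latest

def plot_grade_dynamics(latest: pd.DataFrame) -> None:
    """Scatterplot of cumulative GPA vs. most recent term GPA, highlighting flagged students."""
    colors = latest["flagged"].map({True: "red", False: "gray"})
    plt.scatter(latest["cum_gpa"], latest["term_gpa"], c=colors)
    # Points below the dashed diagonal are term GPAs lower than the cumulative GPA.
    plt.plot([0, 4], [0, 4], linestyle="--")
    plt.xlabel("Cumulative GPA")
    plt.ylabel("Most recent term GPA")
    plt.title("Grade dynamics: term-over-term change")
    plt.show()
```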

Along with ongoing surveillance of student performance and outcome data such as course grades, we are currently investigating alternative predictors of student success. These alternative measures focus on students' responses to questions regarding past behaviors, interests, college expectations, peer interactions, values, goals, future behaviors, and plans. This information comes primarily from the Cooperative Institutional Research Program (CIRP) Freshman Survey.8 To this we have added the 12-item Duckworth Grit Scale.9 Although the CIRP and Grit analyses are ongoing, we have seen some promising results in using a subset of the CIRP constructs (academic self-concept and social self-concept) to identify students who may be at a marginally higher risk for adverse outcomes. This type of information, although it adds only a small amount of predictive ability to current models, helps us address one of our most challenging attrition problems noted earlier: that of high-performing students who leave for nonacademic reasons. To address this problem comprehensively, we have engaged with Civitas Learning, a commercial provider of predictive modeling services. The firm applies time- and resource-intensive data cleaning, data structuring, and machine learning to our existing and evolving data sources. With the help of this team, we continue to push for integrated, high-quality, alternative data sources from which to derive valuable insights.
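
To make the notion of "additive predictive ability" concrete, here is a hedged sketch of comparing a GPA-only persistence model with one augmented by survey constructs. The feature names (academic_self_concept, social_self_concept, grit_score) and the logistic-regression setup are assumptions for illustration, not the models actually used at Grinnell or by Civitas Learning.

```python
# Sketch: does adding survey-based constructs improve a baseline persistence model?
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_models(students: pd.DataFrame) -> dict:
    """Compare cross-validated AUC of a GPA-only model vs. one with survey constructs."""
    y = students["persisted"]  # 1 = returned the next term, 0 = left (assumed column)
    baseline_cols = ["first_term_gpa"]
    survey_cols = ["academic_self_concept", "social_self_concept", "grit_score"]

    scores = {}
    for label, cols in {"baseline": baseline_cols,
                        "baseline+survey": baseline_cols + survey_cols}.items():
        model = LogisticRegression(max_iter=1000)
        auc = cross_val_score(model, students[cols], y, cv=5, scoring="roc_auc")
        scores[label] = auc.mean()
    return scores  # e.g., {"baseline": 0.68, "baseline+survey": 0.70} would be a small lift
```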

A recent and data-informed initiative at Grinnell is our Finish Line project. This project had its origin in three simple scatterplots that used first-semester GPA on the X axis (the predictor) and last-reported cumulative GPA on the Y axis (the outcome):

  • A plot for those who had graduated in four years
  • A plot for those who had withdrawn from the college
  • A plot for all students not included in the first two groups

The third plot, which was labeled with student names, stimulated a new and aggressive outreach to these students to determine whether there was anything the college might do to help them cross the "finish line." Because the plot is shared with those who have direct contact with students, labeling the data points with names puts a "face to the number." It also allows those who work with students to pull in other information: details they may not have recorded anywhere else, may know only in passing, or may hear from others during the discussion, thus activating the human-intelligence network.
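
A minimal sketch of how the three Finish Line scatterplots might be produced is shown below, assuming a hypothetical student table with name, first_sem_gpa, cum_gpa, and a status field distinguishing four-year graduates, withdrawals, and everyone else. Only the third panel is annotated with names, mirroring the plot described above; the column and status values are illustrative assumptions.

```python
# Sketch of the three Finish Line panels: first-semester GPA (predictor) vs.
# last-reported cumulative GPA (outcome), split by student status.
import pandas as pd
import matplotlib.pyplot as plt

def finish_line_plots(students: pd.DataFrame) -> None:
    groups = [("graduated_4yr", "Graduated in four years", False),
              ("withdrew", "Withdrew from the college", False),
              ("other", "Neither graduated nor withdrew", True)]  # third panel labeled with names
    fig, axes = plt.subplots(1, 3, figsize=(15, 5), sharex=True, sharey=True)
    for ax, (status, title, label_points) in zip(axes, groups):
        sub = students[students["status"] == status]
        ax.scatter(sub["first_sem_gpa"], sub["cum_gpa"])
        if label_points:
            for _, row in sub.iterrows():
                ax.annotate(row["name"], (row["first_sem_gpa"], row["cum_gpa"]), fontsize=7)
        ax.set_title(title)
        ax.set_xlabel("First-semester GPA (predictor)")
    axes[0].set_ylabel("Last-reported cumulative GPA (outcome)")
    plt.tight_layout()
    plt.show()
```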

Last, we need to emphasize the value we are finding in mixed-methods research to address the complexity in our persistence and completion work. Quantitative measures have indeed helped us identify patterns and understand the "what" of these issues but not necessarily the "why." Such qualitative measures as interviews, focus groups, and open-ended survey questions are important because they capture the perspective of the people being studied, particularly in terms of how these people make sense of the situations they are in as well as what motivates them to make the choices they do. Since 2012, we have completed two extensive, mixed-methods studies of student attrition and have also made major changes in our approach for exit interviews.

Looking Ahead: Moving Beyond "Autopsy Analytics"

The majority of the work we have done at Grinnell to date with regard to persistence and completion has focused on students who are either struggling or leaving. While developing a deeper understanding of these cases will continue to be important, we intend to turn more of our attention to success stories. Using well-developed techniques such as the grade dynamics analysis described earlier, we are able to identify not only the students who are struggling but also those who are demonstrating significant improvement and signs of thriving behavior. With the help of our qualitative research specialist, we intend to develop and share those stories widely within our community. We also intend to look closely at the "false positive" cases in mid-semester assessments (i.e., students identified as "marginal" or "at risk" who earn a B or better final grade), assuming that in some cases this particular "intervention"—sharing with students their performance thus far during a term—is altering student behavior and that there are success stories to be shared here as well.
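
As a small illustration of the "false positive" review described above, the sketch below filters course-level records for students flagged as marginal or at risk at midterm who nonetheless finished with a B or better. The column names (midterm_flag, final_grade_points) and the 3.0 cutoff are assumptions for illustration, not the fields in our in-house system.

```python
# Sketch: find midterm flags that turned into B-or-better final grades.
import pandas as pd

def midterm_false_positives(assessments: pd.DataFrame,
                            b_cutoff: float = 3.0) -> pd.DataFrame:
    """Return course-level records flagged at midterm that ended with a B (3.0) or better."""
    flagged = assessments["midterm_flag"].isin(["marginal", "at_risk"])
    recovered = assessments["final_grade_points"] >= b_cutoff
    return assessments[flagged & recovered]
```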

We continue to see the value of and need for time-series data, loosely defined as data collected at high frequency for every student that, alone or in combination with other data, can alert us to students at risk of not persisting for academic or nonacademic reasons. For institutions that have achieved a high adoption rate with a learning management system or online learning, such data may be readily available, but for institutions with limited use of electronic tracking, they are not. The information collected in the aforementioned data capture and sharing tool by our student affairs and academic affairs staff, faculty, coaches, and other mentors will likely be a strong candidate for investigation in the future. Other sources of time-series data in our ongoing work and planning include passive data collection, for example from various card-swipe systems on campus, and the results of regular microsurveys.

Finally, we have concluded that we cannot separate student success research from growing concerns nationwide about the mental health of our college-going population. In the past decade, the number of students entering college with mental health concerns has climbed dramatically.10 Students with mental illness, specifically depression and anxiety, are more likely to drop out of school and have lower GPAs than their peers.11 In order to further promote the mental wellness of our students, we are engaged in ongoing discussions with a team of researchers from RAND Corporation who share our interest in student success. With this team, we are developing a two-part research agenda to (1) better understand the context and need for mental health services and (2) introduce the concept of collaborative care to the provision of mental health services.12 Collaborative care is a team-based quality-improvement intervention that was originally introduced to address the shortage of mental health service providers. It integrates mental health care into primary care settings, using a team-based approach to support task-shifting mental health care to non-mental-health specialists.13

Conclusions

Based on recent, focused efforts regarding persistence and completion at Grinnell College, as well as the shared learning from two conferences hosted on our campus, we are pursuing a holistic, integrated approach to improve our understanding of both student attrition and thriving. Human-intelligence networks made up of faculty, professional advisors, other administrators, and students themselves will continue to be a critical aspect of student success work on college and university campuses of all sizes. In an age of increasingly complex predictions and use of ever-expanding data sources, we are actively developing and testing both in-house and commercial software systems for student success; we anticipate increased levels of augmentation from these systems in the future. Because the majority of attrition at Grinnell is among high-performing students, we are increasing our emphasis on the influence of social-psychological factors in developing alerts, predictive models, and associated interventions. This area in particular is ripe for the engagement of both human-intelligence and data-informed intervention. We plan to use data to assist in identifying students who may be at risk, but this information must be viewed in light of the "on the ground" network of faculty and mentors who have considerable knowledge about individual students and their unique circumstances. Combining the information provided from our models with the knowledge that our human-intelligence networks hold, we hope to accurately identify students in need and the interventions that may be appropriate for a given student at a particular time. This approach is commonly seen in health care, where actions may be recommended through an electronic health system, but it is ultimately up to providers—with their nuanced view, expertise, and knowledge of the individual—to determine the appropriate course of action.

In the coming year we will increase our emphasis on the study and understanding of success stories to add to our analysis of attrition, thereby moving beyond what we commonly call "autopsy analytics." Our future analysis and current results can and should be informed by a holistic view of the student experience, not just the students who are struggling or have left. These success stories may be able to shed light not on what pressed students to leave but on what kept them here—a critical element of the retention puzzle. The "attrition syndrome," as we are considering it, has multiple co-occurring risk factors, but we may also consider certain elements of the college experience as protective factors, for example, a particularly strong sense of integration with the community or a specific pre-orientation program that may protect students from other factors urging them toward the door. A focus on these elements in addition to the traditional attrition markers will expand our efforts in a way that not only identifies at-risk students but also helps optimize the college experience for every student on campus.

Notes

  1. See Society for Learning Analytics Research.
  2. ECAR-ANALYTICS Working Group, The Predictive Learning Analytics Revolution: Leveraging Learning Data for Student Success, ECAR working group paper (Louisville, CO: ECAR, October 7, 2015).
  3. Thomas H. Davenport and Julia Kirby, Only Humans Need Apply: Winners and Losers in the Age of Smart Machines (New York: HarperCollins Publishers, 2016).
  4. Laurie A. Schreiner, "Thriving in College," New Directions for Student Services, no. 143: 41–52.
  5. Joyce Stern, "Advising Partners Model," from the Thriving at the Liberal Arts College conference, Grinnell College, April 8, 2016.
  6. Michael Latham, Randall Stiles, and Kaitlin Wilcox, "Key Findings from a Conference on Student Success at the Liberal Arts College," proceedings of the 11th National Symposium on Student Retention, Orlando, Florida, November 2–4, 2015.
  7. Randy L. Swing and Leah Ewing Ross, "A New Vision for Institutional Research," Change Magazine, March–April 2016.
  8. Higher Education Research Institute, "Cooperative Institutional Research Program (CIRP) Freshman Survey (TFS)."
  9. Angela L. Duckworth, Christopher Peterson, Michael D. Matthews, and Dennis R. Kelly, "Grit: Perseverance and Passion for Long-Term Goals," Journal of Personality and Social Psychology 92, no. 6 (June 2007): 1087–1101.
  10. Data indicate a 10–15% increase in students reporting ever being diagnosed with depression, as compared to the year 2000. See American College Health Association, American College Health Association—National College Health Assessment: Reference Group Data Report, Spring 2008 (Baltimore: American College Health Association, 2008).
  11. Daniel Eisenberg, Ezra Golberstein, and Justin B. Hunt, "Mental Health and Academic Success in College," The B.E. Journal of Economic Analysis & Policy 9, no. 1 (September 2009).
  12. Victoria K. Ngo, Bahr Weiss, Trung Lam, Thanh Dang, Tam Nguyen, and Mai Hien Nguyen, "The Vietnam Multicomponent Collaborative Care for Depression Program: Development of Depression Care for Low- and Middle-Income Nations," Journal of Cognitive Psychotherapy 28, no. 3 (2014): 156–67.
  13. Rafiq Dossani, Victoria Ngo, Randall Stiles, and Kaitlin Wilcox, "Mental Wellness in the College Setting," personal communication, May 2016.

Randall J. Stiles is associate vice president for Analytics and Institutional Research at Grinnell College.

Kaitlin Wilcox is associate director for Analytics Support at Grinnell College.

© 2016 EDUCAUSE. The text of this article is licensed under Creative Commons BY-NC-SA 4.0