Principles for the Responsible Design of Automated Student Support


Higher education's digital transformation is creating bright opportunities for automated student support, and it's time we consider how to design these automated systems in principled, responsible ways.

[Image: woman holding a mobile device. Credit: oatawa / Shutterstock.com © 2019]

For many routine tasks in higher education—like answering standard admissions questions, moderating discussion forums, and proctoring exams—automated solutions present an attractive possibility.1 Automated solutions can operate at massive scales, with minimal costs, in ways that would be impossible with manual effort. For example, no university would reasonably expect all of its faculty members to manually send daily personalized nudges to students who seem to be disengaging, but an automated tool could easily do it at scale in real time.
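
To make that idea concrete, the sketch below shows roughly what such a daily nudge job could look like. It is only an illustration, not Boost's implementation: the activity data, the disengagement threshold, and the send_push function are all hypothetical placeholders.

```python
# A minimal sketch (not Boost's actual implementation) of a daily nudge job
# for students who appear to be disengaging. All data and thresholds are invented.
from datetime import datetime, timedelta, timezone

DISENGAGEMENT_WINDOW = timedelta(days=5)  # assumed threshold for "disengaging"

def find_disengaging_students(activity_log):
    """Return student IDs whose last course activity is older than the window."""
    cutoff = datetime.now(timezone.utc) - DISENGAGEMENT_WINDOW
    return [sid for sid, last_seen in activity_log.items() if last_seen < cutoff]

def send_push(student_id, message):
    # Placeholder for a real push-notification service.
    print(f"nudge -> {student_id}: {message}")

if __name__ == "__main__":
    # Toy activity log: student ID -> timestamp of last course activity.
    activity_log = {
        "student_a": datetime.now(timezone.utc) - timedelta(days=1),
        "student_b": datetime.now(timezone.utc) - timedelta(days=9),
    }
    for sid in find_disengaging_students(activity_log):
        send_push(sid, "You haven't visited your course this week. "
                       "Check the latest announcements to get back on track.")
```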

Automation in Higher Education

Optimism for this kind of automated student support, where a system monitors risk and deploys nudges to students, faculty, or advisors, has been a driving force in higher education's digital transformation. Between 2013 and 2018, the EDUCAUSE Integrated Planning and Advising for Student Success (iPASS) initiative (formerly referred to as Integrated Planning and Advising for Students [IPAS]) made grants2 to improve student success and degree completion through, among other things, predictive analytics and automated interventions.3 Preliminary research generally supports the efficacy of these investments, and the optimism seems to be warranted. However, as suggested in a recent Inside Higher Ed article, efforts to develop standards for the responsible design of these systems lag behind our enthusiasm to pioneer and innovate in this space.4

When we imagine the risks of automation, our cultural baseline is anchored by catastrophic fictionalizations like Skynet and Colossus. In contrast, the possibility that college students might receive off-putting nudges about missing assignments seems negligible. Indeed, a 2014 ECAR survey of college and university administrators found low levels of concern about the practical risks of automated student support.5 However, the same survey revealed high levels of concern about stakeholder resistance, suggesting that overcoming sensitivities to automated tools is a principal barrier to institutional adoption. One CIO was quoted as saying, "The challenges have more to do with governance, with organization, with process…." To overcome these challenges, it's critical to provide assurances that the system has been designed in a responsible way.

Responsible Design of an Automated Student Support Tool

When our team at Indiana University6 designed Boost [https://boost.education/], a mobile application that deploys automated real-time student support,7 we knew that institutional buy-in would be critical to the initiative's success. To earn this support, we put ourselves through the most challenging series of stakeholder meetings we could muster. Across dozens of meetings, five core principles continually resurfaced as discussion themes, echoing broader concerns about the risks of automation, unintended consequences, and agency.

[Image: Student looking at phone app showing Enabled Courses and Available Courses. Credit: Photo by Emily Sterneman, Indiana University.]

1. Make It Opt-In

As a mobile app, Boost is necessarily an opt-in service—you can't require an entire population of students to install an app on their smartphones. But during our stakeholder meetings, it became clear that every stakeholder was more comfortable with the tool precisely because it was opt-in (rather than opt-out or obligatory). There were a variety of reasons for this:

  • When a support tool is opt-in, the service is more transparent and accountable. There should be no ambiguity about where the support originated.
  • In theory, nudges are more beneficial when the nudge's target is involved in structuring the intervention.8 When support services are opt-in, students are given a fishing pole instead of an occasional fish.
  • Support systems that administer interventions to all students (rather than just those who have opted in) could be perceived as nannyware. Some stakeholders were opposed to a "big brother" service hand-holding their students en masse, but they were happy to support the same service for students to use independently.

2. Make It Customizable

Though we'd like to believe that all courses, all instructors, and all assignments are equally important and equally challenging, this may not be the case. Students may have a better handle on some courses than others. The practical benefits of an automated support tool shouldn't be diluted by treating learning environments as one-size-fits-all.9 Instead, students should have the ability to activate nudges for some courses and mute them for others. Customization is doubly beneficial: the automated support becomes personalized for students' needs, while designers are spared the nearly impossible task of inferring the relative importance of different kinds of courses and assignments.
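
As a rough illustration of what this kind of per-course customization might look like in practice, here is a minimal sketch of a student preference model. The data structure and course IDs are invented for this example and are not Boost's actual schema.

```python
# A minimal sketch (assumed data model, not Boost's schema) of per-course
# nudge preferences: students enable reminders for some courses and mute others.
from dataclasses import dataclass, field

@dataclass
class NudgePreferences:
    """Per-student map of course ID -> whether nudges are enabled."""
    courses: dict = field(default_factory=dict)

    def enable(self, course_id: str) -> None:
        self.courses[course_id] = True

    def mute(self, course_id: str) -> None:
        self.courses[course_id] = False

    def is_enabled(self, course_id: str) -> bool:
        # Default to disabled so nudges are sent only where a student opted in.
        return self.courses.get(course_id, False)

prefs = NudgePreferences()
prefs.enable("CHEM-C101")   # the student wants reminders for this course
prefs.mute("MUS-Z110")      # the student already has this course under control

for course in ["CHEM-C101", "MUS-Z110"]:
    print(course, "nudges enabled:", prefs.is_enabled(course))
```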

3. Avoid Interventions that Diagnose or Label Students

Analytical and data mining applications in higher education are predominantly geared toward predicting risk.10 But alerting students to imminent risk is not a worthwhile goal in itself, and labeling a student as "at risk" may not convey the motivational encouragement intended. Interventions suggesting that a student is at the bottom (or even at the top) of a distribution could be demotivating.11 Support messages should instead be encouraging and growth-oriented (see principle 4, below).

4. Include a Call-to-Action

This is a corollary of the previous principle and reflects an age-old tenet of direct marketing and advertising. In the context of student support, when a student receives a flag, alert, or intervention, the student should know what to do with it. One of the key benefits of Boost is that all of its support messages are sent in the immediate context of a beneficial action (e.g., read an instructor's announcement, submit an assignment prior to the deadline, plan how you'll complete upcoming work), so that students have a clear idea of how to react at the moment a message arrives.
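
The sketch below illustrates the principle by pairing each nudge with an explicit action label and link. The message template, course name, and deep-link URL are hypothetical, not Boost's actual content.

```python
# A minimal sketch of attaching a call to action to every nudge.
# The template and the lms.example.edu deep link are illustrative placeholders.
from typing import NamedTuple

class Nudge(NamedTuple):
    message: str
    action_label: str
    action_link: str  # deep link into the relevant LMS page

def assignment_due_nudge(course: str, assignment: str, hours_left: int) -> Nudge:
    return Nudge(
        message=f"'{assignment}' in {course} is due in {hours_left} hours.",
        action_label="Open the assignment",
        action_link=f"https://lms.example.edu/{course}/assignments/{assignment}",
    )

nudge = assignment_due_nudge("CHEM-C101", "Lab Report 3", hours_left=12)
print(nudge.message, "->", nudge.action_label, nudge.action_link)
```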

5. Make It Transparent

If automated support is deployed according to course context—assignment submissions, attendance marks, instructor announcements, and upcoming deadlines—it's important to know where this context comes from: instructors. Instructors have privileged insight into their unique course contexts; they can provide feedback on whether a support tool aligns with their expectations for student engagement, and they can organize their courses so that nudges are more effective. For these reasons, automated tools should not operate behind a curtain. Instructors should know what kinds of assignments, events, activities, and other parameters contribute to an automated tool's behavior, and they should be able to see how frequently different kinds of alerts are deployed to their students. When nudges are triggered probabilistically, there should still be transparency in the process. For example, LoudCloud's learning analytics platform includes a "justification of alert" feature, which describes the variables that contributed the most weight to the deployment of any nudge.
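
A generic version of this kind of justification is easy to sketch. The example below is not LoudCloud's actual feature, and the variables and weights are invented; it simply shows one way to report which variables contributed most to a probabilistic alert.

```python
# A minimal, hypothetical "justification of alert": report the variables that
# contributed the most weight to a nudge. Variables and weights are invented.
def justify_alert(feature_weights, top_n=3):
    """Return the top-N contributing variables, largest absolute weight first."""
    ranked = sorted(feature_weights.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return ranked[:top_n]

weights = {
    "days_since_last_login": 0.42,
    "missed_assignments": 0.31,
    "forum_posts": -0.08,
    "quiz_average": -0.19,
}

for variable, weight in justify_alert(weights):
    print(f"{variable}: weight {weight:+.2f}")
```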

Higher education's digital transformation is creating bright opportunities for new technology solutions, including automated student support, and these solutions are having a real impact on student success. As the field matures, it will be increasingly important to reflect on how developers ought to design these solutions so that their benefits are not limited by institutional sensitivities about their implementation.

Notes

  1. Dawn Medley, "Using AI to Help Students Learn 'How to College,'" EDUCAUSE Review, May 20, 2019; Amit Chowdry, "Packback Is Building A.I. to Enhance University Learning," Forbes, November 20, 2017; Matt Jaeh and Christopher Brown, "Artificial Intelligence in Online Proctoring: Where We've Been, Where We Are, and Where We're Going," EDUCAUSE Review, September 10, 2018.
  2. Funded by the Bill and Melinda Gates Foundation and the Leona M. and Harry B. Helmsley Charitable Trust.
  3. The iPASS initiative also focused on degree planning, integrated advising, and new kinds of academic counseling.
  4. Frederick Singer, "Will Higher Ed Keep AI in Check?" Inside Higher Ed, January 16, 2019.
  5. Ronald Yanosky, Integrated Planning and Advising Services: A Benchmarking Study, research report (Louisville, CO: ECAR, March 2014).
  6. The design and development of Boost at Indiana University was a collaborative effort by Matthew Mallon, Matthew Gunkel (now at University of Missouri), and the author.
  7. Boost was featured as a 2019 EDUCAUSE Horizon Report exemplar project and received a 2019 Platinum IMS Global Learning Impact Award.
  8. Samuli Reijula and Ralph Hertwig, "Self-Nudging and the Citizen Choice Architect," SocArXiv, June 12, 2019.
  9. Dragan Gasevic, Shane Dawson, Tim Rogers, and Danijela Gasevic, "Learning Analytics Should Not Promote One Size Fits All: The Effects of Instructional Condition in Predicting Academic Success," The Internet and Higher Education 28 (January 2016): 68–84.
  10. Cristobal Romero and Sebastian Ventura, "Guest Editorial: Special Issue on Early Prediction and Supporting of Learning Performance," IEEE Transactions on Learning Technologies 12, no. 2 (2019): 145–147.
  11. Jeffrey Young, "When a Nudge Feels Like a Shove," EdSurge, March 8, 2018.

Ben Motz is a Research Scientist in the Department of Psychological and Brain Sciences and a Faculty Fellow for Academic Analytics in University Technology Services at Indiana University.

© 2019 Ben Motz. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.