When technology is implemented thoughtfully and empathetically, the impact can be profound.
Which factor plays the most important role in making technology work in higher education?
For those of us who spend our days managing email servers and learning management system (LMS) integrations, "empathy" may not be the first thing that comes to mind. But the past two years have reinforced just how crucial empathy and emotional intelligence can be for technology leaders seeking to serve students at their institutions.
Let me explain. Before the COVID-19 pandemic, I ran the shared services department at California State University (CSU) Channel Islands. My role included building the knowledge base for Ekhobot, our AI chatbot. After researching AI chatbots and learning that they can significantly reduce summer melt and boost retention, I thought my job would be simple: teach the bot what it needs to know, sit back, and let it rip.1
I hoped that this technology would essentially act as a "cheat code" that could, without much input from me or my team on specific content, take on the difficult work of helping students navigate and complete their education. I was ready to play my part in supporting the ambitious Graduation Initiative 2025, which aims to increase graduation rates and eliminate opportunity gaps for all students in the CSU system.
But a few months in, we weren't getting the response we had expected. Instead of gratitude and student persistence, students were opting out of connecting with the chatbot—sometimes with a bit of blue language thrown in. (Sure, the name of our institution includes the word "islands," but there's no need to talk like a sailor!)
In January 2020, my team and I found ourselves in a pickle: the transactional relationship between our chatbot and our students wasn't moving the needle on our engagement and retention goals. We didn't realize that the nature of the communication itself was our roadblock. We weren't confident enough to be picky about what the chatbot said to students. As a result, Ekhobot had no personality, and students were responding accordingly. But then the pandemic hit, and that's when things changed.
By March 2020, we still hadn't figured out the right way to communicate with students through the chatbot, but we knew we had to say something. We couldn't leave our students hanging. So, we had the chatbot send all CSU Channel Islands students a silly meme. Suddenly, students began engaging in ways they hadn't before. That led us to start experimenting with other ideas, such as knock-knock jokes and emoji smiley faces. We even built a Spotify playlist based on what students told the chatbot were their favorite songs at the time.
Underpinning all this work was a sense of empathy and compassion for the fact that our students were living through an incredibly challenging time. Sometimes that meant providing proactive information about the counseling center or other campus resources. Sometimes it meant sending an emoji or two. Regardless, our communication with students was never merely transactional; it was modeled on the genuine, face-to-face interactions that we know foster a sense of belonging and motivate students to persist and work hard.
Students' responses have provided the most powerful proof that an empathetic approach works. They treat Ekhobot almost like a pet or a friend. They thank the bot for giving them advice, and they're often comfortable acknowledging when they're feeling stressed or anxious (which we can then escalate to campus counselors for one-on-one support). That's remarkable when you think about it. When was the last time you thanked the disembodied bot embedded in your phone?
Perhaps most importantly, our empathetic approach is helping us to make systemic changes across the institution. We've used Ekhobot to survey students about what they like—and don't like—about the remote learning experience. We then sent that feedback to our Office of Teaching and Learning Innovations. More than one-third of students responded to the survey (a record number), and we had enough data to provide institutional leadership with a brief to help them double down on what was working and fix what wasn't.
Growing up as an avid video game player, I always loved cheat codes, especially those that let you leapfrog whole sections of the game to get to the end. I had hoped that technology could do the same thing for our students and propel them to where we wanted them to be, with little or no input from any of us humans. The pandemic jolted me out of that belief, and it also helped me to understand something far more consequential: when technology is implemented thoughtfully and empathetically, the impact can be profound.
As we navigate our way toward a new normal, one thing is certain: Ekhobot will keep sending students jokes, asking them for music recommendations, and helping them access the resources they need. The lesson we learned during the pandemic is that none of those things is more critical than the others. Without the silly stuff, we can't build the relationships that help students listen when it's time to talk about more serious topics.
The experience of the past two years and our work to be more mindful about how we use edtech have been nothing short of transformative, and I hope other institutions can learn from our experience. If we think of technology as a tool to extend and amplify meaningful human interactions, students' experiences will be all the better for it.
Note
1. Hunter Gehlbach and Lindsay C. Page, "Freezing 'Summer Melt' in Its Tracks: Increasing College Enrollment with AI," Brown Center Chalkboard (blog), Brookings, September 11, 2018; Tim Renick and Lindsay Page, "What Does It Take for Nudging to Impact College Students' Success?" Higher Ed Dive, September 4, 2020.
Tara Hughes is Interim Chief Information Officer at California State University Maritime Academy.
© 2022 Tara Hughes. The text of this work is licensed under a Creative Commons BY-NC 4.0 International License.