John O'Brien talks with three IT leaders working with data and analytics to develop a more fair and equitable journey for students.
John O'Brien: Welcome everybody to an EDUCAUSE Community Conversation. Today, we're exploring the intersection of two topics that I feel really strongly about and that are fundamentally changing higher education in important ways: analytics, and diversity, equity, and inclusion. And today I'm thrilled to be joined by three guests who offer a range of perspectives on this critical issue. At this time, I'll just ask each of them to introduce themselves, starting with you, Kim.
Kim Arnold: Hi, thanks for having me. My name is Kim Arnold and I am the Director of the Learning Analytics Center of Excellence at University of Wisconsin-Madison.
John O'Brien: And I'm a lifelong Minnesotan. So I feel like I'm back in the bread basket for a time so that's wonderful. Wendy, do you want to go next please?
Wendy Puquirre: Sure. Hi everyone, my name is Wendy Puquirre. I'm the Equity and Justice Research Analyst in the Office of Equity, Diversity and Inclusion at University of California-Merced. Our office is going to become a division soon, so now we're going to be edgy. We're the Equity and Justice and Inclusive Excellence Division.
John O'Brien: Jonathan?
Jonathan Gagliardi: Hi everybody. I'm Jonathan Gagliardi, I'm the Assistant Vice Chancellor for Academic Effectiveness and Innovation at the City University of New York.
John O'Brien: Wonderful. I'm just so, so grateful that you all took the time. Given the topics you're working on, you are the busiest people in the universe and it means a lot to me that you took the time to be here and support this conversation that's so important for our community. I know that our community is super interested in the different ways that we can use analytics both in learning but also to advance the important work of DEI, and we can all learn from some of the great work you're doing.
So less of me and more of you, and let's start the conversation. First of all, I'd love to just take a deep breath, pause, and explore the nuance between advancing this urgent work while at the same time doing no harm and keeping an ethical mindset front of mind. Could you talk a little bit about what it means to have an ethical approach to data and analytics in the work you do while, at the same time, actively addressing DEI issues and opportunities?
Kim Arnold: I can start off a little bit with that question, John. In the learning analytics space, which is really what I live and breathe every day, it often requires quite a bit of reframing, right? We tend to think of "do no harm" as kind of the ethical standard. We see this in a lot of codes of practice, and it's a really important thing to keep front and center in our minds.
But there's a difference between doing no harm and getting to the action, getting to that verb, right? We want to actively use the tools and methods at our disposal for social justice. And so we flip that paradigm: instead of thinking, "We're just trying to avoid something," we're saying, "We're actively trying to leverage these analytics techniques as a tool of social justice."
And I think often what we see, especially on the learning analytics side of things, is that tools or methods that have existed for a while, reports that we've run on campus for decades and decades, are often perceived to be in place to perpetuate long-standing bias and power dynamics rather than as levers to actually support, empower, and uplift learners. And so that's obviously an issue. What I think is really important is that we shift toward saying we're challenging some of what's existed for a long time.
We're challenging, for example, the privileged baseline, right? This myth of the "average student," letting the edge cases fall where they may. When we're building statistical models, we often see that that's what we're trying to predict: how close are you to the average student? And that obviously doesn't feel super good to the students we're trying to support if they're constantly falling into those edge cases. So we're thinking, how do we challenge these myths that exist in the analytics world, and how do we use some of these tools at our disposal to really challenge flawed assumptions around control, ownership, and agency that are deeply ingrained in the American educational system at large?
Jonathan Gagliardi: Kimberly, you brought up a really great point. And John, I think it's a wonderful question to open up with, as we all think about the ongoing analytics revolution and the implications of using data in ways that translate into meaningful actions that hopefully thread the needle between student success, equity, and institutional sustainability, because those things really can't be untangled from one another. I think there are a couple of really key things to keep in the back of our heads. And again, Kimberly, I just want to note, you alluded to higher education generally being regarded now as a series of entrenched structures that are as much about perpetuating the status quo as they are about introducing disruptions that transform lives and change minds for everyone, regardless of their background. For me, at least at the City University of New York, we live and breathe the idea of equity and social justice.
And so as we begin to think more about how we're going to be using data and analytics in pursuit of our mission, our vision, our values, and our identity, you're right to point out that doing no harm is a very passive orientation toward the use of data. And that's one of the things that really sticks out to me as a big peril of traditional approaches to how colleges and universities work. So, you're right, data are a tool that we can use to specific ends. It's also true that, at least historically speaking, colleges and universities have leveraged data to identify and mitigate risk rather than flip that paradigm and think a little bit more about how risk actually represents the maximum potential for impact you can have with any particular student or the people, communities, and economies you serve.
And so taking a passive approach rather than a proactive one really introduces a number of opportunities for both intentional and implicit bias to get baked into working models on a daily basis, which, believe it or not, often runs counter to our stated goals of broadening diversity, increasing equitable outcomes, and creating more inclusive college communities, whether that's done face to face, hybrid, or asynchronously for that matter. A lot of that has been put in full relief as a result of the issues we've seen arise from the pandemic, but those issues were already present, and data can either help us eliminate them as much as we can, or it can actually feed into them, which is something I think we all have to have an eye toward.
Wendy Puquirre: I actually really love this question. I'm really glad you asked it, because I'm an EDI analyst, and what that means is that I sit directly at the intersection of data analytics and EDI work. To me, doing no harm is important insofar as, when I think of data analytics, I think of myself being able to hold up a mirror to the university with data: this is what you look like; this is how the university is impacting these different groups, right? I hold up that mirror. And when it comes to DEI issues, I specifically focus and center the mirror on historically marginalized and excluded groups of people to speak to their experiences and to ensure that decision makers are able to act and improve the circumstances around historically marginalized groups, right?
So from that perspective, the biggest danger to me, the biggest threat of harming the people I'm centering, is that often they still are minorities. And in data terms, that means they're very small n's. And depending on how far down I drill, or anybody drills with a product that I created or provided, it means that they might be exposed in some way, right? And although it's not me actively causing harm, there's the failure to think about these things, particularly for executives, right? They have such a high-level, eagle-eyed view of the institution that they are often just not thinking about these things. And so it's up to me to ensure that whatever I do, whatever product I produce and provide to somebody, particularly as it involves small groups of people that are still very much important and whose numbers we're trying to increase in this context, I need to make sure that I protect their identities, that I do everything I can to educate the people I'm providing the tool to, to make sure that they understand how to wield it appropriately and ethically.
And I think that part is actually a lot more challenging than I expected. The reality is, you're interacting with folks who are going a thousand miles an hour because there are so many things going on. They're busy, they need an answer, they need to make a decision, they need to do... That's the cadence and rhythm of their day to day. The threat of harm is: you are going to misuse what I'm giving you, and I need to figure out how to create enough space to empower you to learn how to use it, use it effectively, and still maintain the cadence of whatever it is that you're doing.
John O'Brien: And you've captured the exact challenge, which is the combination of urgency and care. Oftentimes things are moving as quickly as they have lately, with the sense of urgency around these issues. And I don't think anybody's thinking of intentionally misusing data, but the opportunity to accidentally misuse data is not trivial. Once you gather the data, you've gathered the data, and it may be used in ways that weren't initially intended. In fact, it probably will be, because the models will change over time. Are any of you taking steps, I don't know, a checklist, or looping in a chief privacy officer, or establishing some codes around the data that you collect, to guard, to the extent possible, against that happening?
Wendy Puquirre: I can speak a little bit to that. Although I'm an EDI analyst, I'm an analyst. I'm in the Office of Equity, Diversity and Inclusion with a dotted line to the institutional research and decision support team on our campus. And they've done an excellent job of establishing guidelines and rules for a lot of the tools that they've produced to protect people. For this very reason: if you're doing EDI work, if you're doing breakouts by different marginalized identities, depending on how far down your drill-down goes, things are going to get real small. And so there are rules in place to make sure that if it's under a certain threshold, we don't see a count on anything that's publicly available. Right now, we're in that phase of trying to navigate and figure that out, because we're also a very small campus, right? We're a new university.
So we're already dealing with relatively small n's. And then depending on where you focus your lens, your n's get really small. And so in my position, I'm also having to develop some of these rules and policies within our office to ensure that even our office is aware; they're not data people, right? I'm the data person. And so I also have to educate my colleagues on the importance of these guidelines and strategies for protecting people's identities. And then it's also a constant back and forth with the executives to make sure that they understand: I'm not trying to hide anything from you. I just need to create some rules and strategies so that you are also protecting other people when you go talk to this dean or this chair.
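The small-cell suppression rule Wendy describes can be sketched roughly as follows. This is a minimal illustration, not UC Merced's actual policy: the threshold of 10 and the group labels are hypothetical.

```python
# Illustrative sketch of small-cell suppression: group counts below a
# minimum threshold are masked before a report is shared, so that
# individuals in small groups cannot be identified.
# The threshold of 10 and the group labels are hypothetical.

SUPPRESSION_THRESHOLD = 10

def suppress_small_cells(counts, threshold=SUPPRESSION_THRESHOLD):
    """Replace any group count below `threshold` with a masked marker."""
    return {
        group: (n if n >= threshold else "<%d" % threshold)
        for group, n in counts.items()
    }

enrollment_by_group = {"Group A": 412, "Group B": 37, "Group C": 4}
print(suppress_small_cells(enrollment_by_group))
# Group C's count of 4 is reported only as "<10"
```

The point is that the masking happens in the published product itself, so a reader drilling into a breakout never sees an identifying count, no matter how they slice it.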
And then the other piece of that is that data governance is so important, and I don't know that everyone really understands its value. Maybe it's just my context where I'm like, "We need a data governance board really quickly. We need everybody on board, and we all need to be on the same page so that we can collectively create these rules," because my position is new, our office is relatively new, so I'm kind of having to create things from scratch, but I also am not in the best position to create these rules.
John O'Brien: Thanks, Wendy. So in my welcome, I sort of threw out the phrase "using analytics to advance DEI goals." And that's a great phrase, but I'd love to hear you tell me: what exactly does that mean in your world? What do you see happening in this field right now that supports this work? We'd love to hear what that phrase means to you on the ground. Jonathan, you want to start out?
Jonathan Gagliardi: Yeah, I would love to, John, and thank you so much for the question. There's a wonderful resource that Taffye Benson Clayton pulled together as part of the American Council on Education's DEI learning community about a year and a half or two years ago, where she really set down some key terms. So just to ground us in that: we're thinking about diversity not just in terms of counts of people, of course, but, as importantly, the inclusion of a number of perspectives, and making sure that folks from different cultures, different races, different religions, different places and spaces in the world really have an opportunity to engage in a vibrant academic community in ways that wind up being beneficial to any number of folks, both directly and indirectly. And that goes right into the idea of inclusion for us at the City University of New York.
And so really, ultimately, how are we helping those historically marginalized and minoritized populations, who CUNY was born to serve, reap the benefits of one of the nation's preeminent engines of social and intellectual upward mobility, and equity for that matter? And then, last but certainly not least, it really does boil down to equity.
And so for us, whether we're thinking about this in terms of access to our services or about academic outcomes, and not just grades, although those are important, but institutional learning outcomes for that matter, it becomes really critical to take a multidimensional lens to the ideas of diversity, equity, and inclusion, and try to identify places and spaces where those things converge so that we're able to leverage scarce resources for maximum impact, and hopefully do so in ways that fulfill our obligation and our commitment to the students we serve: to create that really diverse, inclusive experience in ways that promote equity in terms of outcomes and satisfaction with your education.
And so really, where I've spent a lot of time over the better part of the last year or so is focusing on two spaces where our university system is uniquely poised to try and effect positive change in that regard. Those include everything from measures of academic momentum, which have really squarely landed on this broader idea of transfer. For us, any one of our senior colleges winds up having between 50 and 75 percent of its students coming from our community colleges, which we know are engines of democracy, certainly, but also entry points for historically marginalized, underrepresented, or low-income populations as they pursue a post-secondary education that is high quality, of value, and affordable. And that's no guarantee, given the headwinds we've been facing in higher education, but more broadly speaking, as a society itself.
And so we're beginning to take a deeper dive into how we assess institutional learning outcomes across the intersectionality of race, income, gender, and any number of other traits that we're able to collect in partnership with our students, to try to figure out: are there courses or sections or modalities or majors where some students seem to be running into bigger issues, what I would probably consider to be institutional roadblocks, not personal ones, that we need to try to remedy and change?
And so that really brings me back to two of the things that Kimberly and Wendy were saying: what are those structures? How do you flip that mirror? What is it that you're doing, whether from an academic enterprise perspective or in how we deploy technology or meet basic needs, in ways that really do thread the needle between those three constructs and make us a better university for all of our students as we look ahead around the corner and figure out how we're going to emerge from the pandemic in more successful ways? We can take a deeper dive into that, but I want to make sure there's time for other folks to chime in too based on their experiences.
Wendy Puquirre: Yeah, I think Jonathan said it very well. As I mentioned earlier, really from a quantitative perspective: if any of you have a quantitative social science research background, my training is in sociology of education, and the bulk of the research that I read and studied has often been based on, I think Kim said it earlier, the "average student," which is typically a white, middle-class male, especially if you go way back to the '60s, when a lot of this research was coming out of the sociology of education.
And what I get to do, which I know is very niche, through my experience, is that I actually get to center historically marginalized groups in a lot of the quantitative models and data that I present. And when I talked about the mirror, that's the mirror I get to hold: for however many decades, we've been thinking about higher ed as a place that serves a very specific group of people, whether we were aware of it or not, right? I don't think people ever really gave it another thought until fairly recently.
But I get to hold up that mirror and say, "Look, look over here, look at these groups that have been ignored for so long. Look at what their experience is like in this context that you take so much pride in. Can you make it better? Can you live up to this vision and mission that you have of the university, of yourself, of all of these things, and do everything in your power to ensure that this mission is fulfilled even for these people? Especially for these people, right? Especially for people that haven't had access? And what can you do to improve their experiences?" So UC Merced serves about 75% first-generation college students, and that alone opens up a can of worms for folks, right?
And it means that everybody has to be all hands on deck in making sure that every structure, every service that we're offering can speak to the experience of students who are the first to experience this. And are you doing it effectively? Are you actually serving them effectively? Or did you bring the model that you had at your previous job, at that other university? Do you understand how to shift? That's what our office is here to do: to help people think a little bit through that, because that's not intuitive for a lot of people, right?
I get the sense that for most folks, this is how it's always been, this is how they were trained, this is how it worked at the other university. Now they're here, now they serve 75% first-gen, 60% Cal Grant recipients, 60% URM students. It takes a lot of imagination, a lot of thinking outside of the box and reimagining what higher ed should look like to properly serve people who have historically not been served.
And I get to do that. I get to give them the data and help them think through it using the data, but really they're the implementers, right? They're the ones that actually have to go do the work. And there's only so much I can do, right? Especially at a high-level view of institutional numbers. But I can guide them and do my best to help and educate them, and I'm hoping that that's what we're able to do in our office.
Kim Arnold: Yeah. As I listen to my colleagues here, I get so excited having these conversations, because historically, at a lot of institutions, if you were talking about the power of data and analytics, you were often doing it with a relatively small group of people, people with very specialized training. And if you were talking about DEI, it was probably the same way. I think conversations with a lens toward social justice, inclusion, diversity, and equity are becoming much, much more common, but I'm just reiterating what I probably should have said at the beginning: I'm really excited to be here having this conversation. As I was listening to folks talk, one of the things that's been a challenge for me sometimes is to help frame this up for folks: "Tell me, so you say you want to use analytics as a vehicle for diversity, equity and inclusion. What do you mean by that? Help me understand it."
And I have fumbled through a lot of things over the past couple of years, and I've really been thinking about this, largely brought on by the pandemic and some specific challenges with analyzing data and trying to keep as much bias out as possible. And I've really been struggling: how do I articulate this? Not to oversimplify it, but for me, saying I want to use data and analytics as a lever for diversity, equity, and inclusion comes down to acknowledging and respecting the humanness of each and every learner, right? And so it's this human-centered component that, especially on the quantitative analysis side, sometimes gets a little bit lost in the shuffle. We've seen so much democratization of data over the past decade or so. And I think the really exciting thing is that there are now methods in analytics where we can highlight and lift up the uniqueness and the humanness of each of these learners.
We have ways to do that; we have tools that are developing in our software all over. None of it's perfect, of course, but that's what it really entails. When people ask me, "What are you really saying?" there's a whole list of things that I'm saying, but that's the thing that comes up time and time again. And this is where we come to really considering that a student is so much more than just the courses they're taking, right, or the program they're in. It's trying to find ways that we can capture that and then better support them. At the end of the day, my thing is: how can we better empower them with this huge deluge of data that's coming? And that often means shifting focus from institutional needs for data analytics and focusing more on the student as our unit of analysis, if you will.
John O'Brien: Thank you. I'm imagining people listening to this podcast and thinking, "I want to do this work. We're not where we want to be, but I want to be engaged in this sort of work." I'm just going to observe that I think I've heard the phrase "threading the needle" used two or three different times, and it seems like that is a capability your work requires, and I'm wondering how many of you actually knew that that was what was needed. I mean, you're navigating the sheer complexity of the issue itself, you're doing essentially shuttle diplomacy for issues that cross all these different divisions on campus, and you're trying to effect major change without triggering the immune system of the higher ed ecosystem. It's really quite inspiring, or intimidating. Were you prepared for this kind of work when you started, or did you learn it as you went?
Jonathan Gagliardi: Yeah, look, it's a balancing act, John. That's just the truth of the matter. Many people will ask me what I do, and I will often answer, "Well, I trade in soft influence and diplomacy," because, as I think many people have alluded to, this is really ultimately an exercise in shared governance, change management, and adoption. As Kimberly was mentioning before, there aren't enough resources to effectively scale analytics across campus in a standardized way unless people trust you, feel like they can come to the table and ask you questions, and be forthright about their own limitations and perhaps the shortcomings of their own lenses, which I have many of, by the way.
And so it's really a big tent conversation if we're going to think about how analytics are going to be adopted in a large enough way that we can really tackle, at a very personal level, as Wendy was mentioning, issues related to diversity, equity and inclusion. And not even issues, I shouldn't say that; more the opportunity that is present in making sure that we are a more student-focused university, able to change on a dime and transform rather than try to get people to conform. And I think that's a major issue that you've touched on that we're all running into at this point in time. And really it's because we're in the early stages of adoption at a broad scale.
Wendy Puquirre: I can very quickly say I was not prepared for how relational the work has to be. I'm learning that personally I'm very, "Yeah, social justice," very animated and pumped and angry and all of these things. But analytics is the least personal part of this work when you're doing it. Part of the reason I love my job so much is that I get to solve puzzles every day, right? That's part of doing data analytics, piecing things together, and I would argue that's not necessarily emotional work. But DEI work is emotional. It does trigger the immune system of higher ed, as you named it; I love that. It is very upsetting, and a lot of what it requires is reflection.
But how do you invite reflection when we just talked about how the cadence is urgency: go, go, go, we need to do this? And it's like, that's not how it works. You have to slow down and really think about it, be a human person right now, and try to imagine and think about the people you're trying to serve as real human people who need you to do well by them so that they can go on and do what they came here to do. And I was not ready for walking people through these very emotional experiences. It requires a lot of compassion; how many other analysts are inviting this type of reflection and walking people through these very emotional topics? It's not the norm, I would say.
John O'Brien: Yeah, that was my sense of this too. It has the word "analytics" in it, so you think you're going to be that person solving puzzles, but then somebody hands you the mic, and now suddenly you are the voice of this, and suddenly you're meeting with different groups and needing to understand their perspectives. I just really admire the work you're doing. And I do think one of the things our listeners will want to come away with is some promising practices. So it'd be great just to hear from each of you: what are a couple of things you would recommend that are ready to go, that people can try?
Kim Arnold: So part of the issue is that I don't think there's a magic answer here, but there are a lot of really promising practices. The two I would point people to, which I think are not super, super shiny but are absolutely critical to keeping DEI and data analytics at this close intersection, are these. One is data literacy. There is a lot of really promising work going on in data literacy, and it has been touched on multiple times during our conversation today. Historically this type of work has been done by people with very specialized skills. And when we're trying to hand insight off to folks who don't have a deep understanding of data structures, it's really hard for them to pick out any bias that may or may not come up in there. It's also about making sure that people aren't interpreting the reports or tools we hand them and taking action that could cause harm unintentionally, and that they're able to understand the disparate impact that, in my line of work, for example, a given model could have on different groups of people.
And so data literacy, understanding how to act, what you can tell and what you can't, is really a foundational need for any kind of organizational capacity. And it's not built in, in general. So there's a lot of really promising work, I think, on the data literacy side. The second practice I'd highlight as really promising is the work in different disciplines around ethical frameworks, codes of practice, these types of things that help you think through a lot of this. I think we're at the point right now where we quite often get stuck trying to make the research real, trying to make academic research practical and put it into place. Both of these things are very practical ways to start to build understanding, to build a foundation, so that ultimately insights that come from analytics can really be leveraged in meaningful ways to help support individual students and practices.
Jonathan Gagliardi: I would just add, really quickly, things you can do right off the bat, particularly if you're scaling out the use of analytics. One, lean into having conversations. Don't assume your analysis, as logical as your findings may be, reflects the reality of an institution or its students. Talk to students and understand what they're experiencing; that can help you square up whether or not you're on the right track. And we don't use enough of what we've already got. With this idea of disaggregation and the intersectionality of categories we can already disaggregate, there's still a lot of untapped potential even in doing that. I've always joked that if I can get my college to use descriptive statistics at the intersectionality of many of these categories on a daily basis, I'm going to make some moves, and we're really going to help change the educational trajectory of many of our students.
And so use what you have, don't assume you're right, talk with people, and make sure it's a community process of analysis, because when you go off into that rabbit hole and come up with something on your own, you might actually trigger that institutional immune system you were talking about earlier, because folks will say, "Whoa, that isn't right. Let's talk about why that might be." So those are really good points: use what you've got and don't assume you're right. Those are two things you can always do. And talk to people. Just talk to people.
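The disaggregation Jonathan describes can be sketched as nothing fancier than a grouped summary over categories an institution already collects. The field names and numbers below are hypothetical, purely illustrative:

```python
# A minimal sketch of disaggregated descriptive statistics: ordinary
# counts and means computed at the intersection of existing categories,
# instead of one overall "average student" figure.
# Field names and data are hypothetical.
import pandas as pd

students = pd.DataFrame({
    "race":    ["A", "A", "B", "B", "B", "C"],
    "pell":    [True, False, True, True, False, True],
    "credits": [12, 15, 9, 12, 15, 6],
})

# Count and mean credit load at each race x Pell-status intersection.
by_intersection = (
    students.groupby(["race", "pell"])["credits"].agg(["count", "mean"])
)
print(by_intersection)
```

Even this simple table surfaces patterns (and small cells that need suppression) that a single campus-wide average would hide.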
Wendy Puquirre: Fully on board with what they both said. I think the only thing I can add is: recognize that higher ed's forever conundrum is silos, and be curious. Go and talk to people everywhere. When Jonathan said this is a community process, that is what it is. You need to talk to everybody, you need to understand how people experience the university in their little silos, and be curious, think about it, work in community, work with other people.
John O'Brien: So the work is challenging. On some days maybe it feels impossible. The urgency is high and there are also some pretty formidable obstacles in the way. So what are the biggest obstacles for you that get in the way of sort of adoption at scale for some of these efforts?
Wendy Puquirre: I think the biggest challenge for me has been the silos and realizing what that means, especially with this kind of work, something that can be so triggering to the immune system of the university. That is part of my job. I have to go talk to people in different spaces and truly try to connect with and understand them and how they might use some of my products, even if they don't ask me to create something for them. I have to be thoughtful about how I approach this, because just connecting, that alone, I think is so much more powerful than people realize. In general in higher ed, part of the climate is that people are tired and worn out and feel unheard to some extent. And so part of my curiosity in talking to people is making sure that whatever I produce is going to be useful to as many people as possible. And the added bonus is that I'm more likely to get wider buy-in if I am connecting with people on a regular basis.
Kim Arnold: I would say, from my perspective, the biggest barriers outside of some of the other things we've talked about in this podcast are cultural in nature. And that comes down to this: it's really difficult as humans, and so obviously as institutions and organizations, to have the hard reflections and the critical conversations about changes that need to happen, about practices we've done in the past that maybe are not super supportive of diversity, equity, and inclusion. Acknowledging out loud that things that exist structurally and in our individual practices have not been the most beneficial in the past is always a really hard thing to do.
And so that ties in with change. But when there are institutional commitments to saying, "We can do better, we know we can do better, we're going to leverage analytics to help do that," then it becomes more of a collective effort. You're not singling out any individual or any individual practice. You're saying, "We're acknowledging we can do better." But that is really the hardest thing, from paradigm shifts in thinking all the way up to data governance and questions of data ownership, things like this. Anything I could name right now, it all boils down to change resistance and the really critical importance of the cultural elements.
Jonathan Gagliardi: I would just add to that. I think a major underlying and invisible component, which people don't always talk about, is that the incentives all run counter to one another. And so when we begin to think more and more about the manner in which data are being used, it usually suggests an altogether different approach that runs counter to the things that might be in it for me as a provost, or a president with a very short tenure for that matter, who may not necessarily want to try to adopt a new toolkit or playbook, even though we know it may be the right thing to do. And I've been lucky enough to know leaders who are willing to just clear the table and say, "Here we go."
That's one. Two, I think the tension that exists between research, big-R research of peer-review quality, versus the need for timely, accurate, relevant, and current data, can create a lot of mistrust between the academic and administrative sides of the shop. And really, for me, and I think folks may have heard this before: give me some directional data, I'll socialize it with the community, they'll tell me if it's wrong and they'll tell me how we can make it right. I'm not necessarily trying to pass muster on peer review every time we're taking a first pass through the data.
And then I think we leave a lot on the cutting room floor because we don't focus on the points where colleges can either scale out or splinter. And that often boils down to the schools, the department chairs, and the mid-level leaders who occupy different divisions and functions. And so when you go from five deans to 40 department chairs, if you don't get broad-scale adoption in a standardized way, you're going to create unevenness, you're going to create inequity, you're going to actually perpetuate the issues that people run into. And so I wish we would focus more on designing tools through the lenses of the folks who are going to use them most at those critical junctures of an institution, where it's either going to scale out or it's all going to wither on the vine.
John O'Brien: Yeah, Jonathan, you're reminding me... I mean, you talk about how people want to embrace data and analytics projects that are going to help them make decisions now, now, now, or are going to have results in the short term. And my goodness, my experience in my home state is that when we looked at the achievement and opportunity gap, those gaps started in third grade. So data and analytics isn't going to be any cavalry coming over the hill solving problems in the next year or two, right? And so that work becomes more challenging in all the ways you're talking about, politically, or getting buy-in and getting people to invest in things, when it has that kind of solution horizon. The EDUCAUSE community is a pretty big tent. It includes technology professionals as well. And I'm curious, how can they support this work that you're undertaking?
Wendy Puquirre: I'm actually part of a UC system-wide digital inclusion work group that came about after the pandemic, and it includes a lot of technology professionals. I think the best thing they could do is, like I said earlier, just talk to your EDI office or talk to people who you know are doing this work, and ask questions, ask how you can be of service. What is it that you, on an individual level, can do every day in your unit, in your division, to make sure that whatever product you're developing, whatever it is that you do in your day-to-day, is going to contribute to an overall EDI effort?
Kim Arnold: Much like Wendy said, find your community and make that personal commitment. Those are the two things. And there are lots of places to find community. EDUCAUSE is a great place to go, or within your institution, or the professional academic organizations that are out there. So find your community. That's the best way to start.
Jonathan Gagliardi: The invisible and valuable infrastructure of technology and data is increasingly vital to colleges and to whether or not they're able to live their mission, vision, and values, embody who they want to be, and differentiate themselves at a time when headwinds really do make it important for institutions to carve out their own trail, hopefully in a data-informed and equity-driven way. And so for the members of EDUCAUSE, who I know are incredibly diverse and who span a number of institutional roles, you get a front-row seat in helping to shape how this relatively early-stage toolkit is going to get used and scaled out across colleges and universities. So embrace it. It's a chance for you all to elevate your roles, to increase the impact of the good work you do, and to really honor how you in particular can help shape what is going to be a driving force in how colleges operate over the better part of the next few decades.
John O'Brien: You've described the amount of effort involved. Why is the time right? Why now?
Jonathan Gagliardi: Why now? To put it very succinctly, I think there's an evidence imperative, and it is multifaceted in nature. Colleges have to use data in a number of different ways, and one of the most important ways they can do that, in ways that actually support what they do, is to really double down on the use of data and analytics toward DEI work. And so I'm hopeful that the many opportunities and challenges that have put institutions in this position will also be leveraged to create renewed institutions with a sharper focus on serving students and not themselves. So really, there's an evidence imperative, and it can be used to great effect if we focus it in the right way.
This episode features:

Kim Arnold
Director, Learning Analytics Center of Excellence
University of Wisconsin-Madison

Jonathan Gagliardi
Assistant Vice Chancellor for Academic Effectiveness and Innovation
City University of New York

Wendy Puquirre
Equity and Justice Research Analyst
University of California Merced

John O'Brien
President and CEO
EDUCAUSE