When Digital Access, AI, and Data Privacy Collide


EDUCAUSE Shop Talk | Season 2, Episode 10

Higher education institutions are increasing digital access for students while also maintaining privacy and data protection. To support students, colleges and universities can promote technology democratization and efficiency, collaborate with AI for student success, support libraries for digital access, and streamline the student technology experience.


Takeaways from this episode:

  • Access to digital tools, including online and blended courses, increases flexibility for students with varied academic needs.
  • Artificial intelligence (AI), digital, and data literacy are crucial to helping students, faculty, and staff balance commercially available digital and AI solutions with privacy concerns.
  • Institutions should collaborate across internal stakeholder groups and with solution providers to ensure data privacy and security are considered when procuring digital tools.

Transcript

Sophie White: Hello everyone and welcome to EDUCAUSE Shop Talk. I am Sophie White. I am a Content Marketing and Program Manager for EDUCAUSE, and I'm one of the hosts for today's discussion.

Jenay Robert: And I'm Jenay Robert. I'm a Senior Researcher at EDUCAUSE and I will be your other host today.

Sophie White: Great. We are so excited. Today we're talking about balancing digital access and privacy in higher education, and we have Kim Arnold and Patsy Moskal with us as guests today. So I'll introduce them and then we'll jump into it. So thanks, Kim and Patsy, for being with us. Kim Arnold is the Director of the Teaching and Learning Program at EDUCAUSE. She has 20 years of experience in higher education with deep roots in learning analytics, data governance, ethics and privacy, student learning, assessment theory and design, and student digital ecosystems. And Patsy Moskal is Director of the Research Initiative for Teaching Effectiveness at the University of Central Florida, UCF. Since 1996, she's been evaluating the impact of online, blended, and digital learning on students and faculty. Her research interests include personalized and adaptive learning, learning and data analytics, and instructional models to improve student success. She also supports faculty scholarship of teaching and learning research involving digital learning. So thank you both; you bring a lot of experience over the years to this conversation. We're really excited to chat with you.

Patsy Moskal: Well, thank you for having us.

Sophie White: Absolutely. And I'll just give a little bit of background on how we landed on this topic. Balancing digital access and privacy came out of the 2025 EDUCAUSE Top 10, a report that identifies the top 10 technology issues in higher education. And it was an interesting look at expanding digital access to tools for students to support their learning while also keeping privacy needs in mind. So that's the balance that we're looking at in this discussion. At EDUCAUSE, we also have a showcase, a collection of resources related to this topic, that is releasing at the end of May 2025, so we'll have some more ways to dive into it. But I think it's a really interesting topic because it ties into teaching and learning, which you all obviously represent, and then there are also privacy implications. In the showcase, we're also looking at where the role of librarians fits in and where AI fits into this conversation. So it feels like there are a lot of pieces coming together here. I don't know if you all agree with that assessment or have any thoughts to start. Okay, Kim, you're nodding.

Kim Arnold: I was going to say, how long do we have? Four hours to talk about this? Yeah, the connections are really coming up. Yep.

Patsy Moskal: Yeah, absolutely. I think this is the world that we live in now, particularly with digital and online learning growing. And you're exactly right. It's not just us; it's librarians and others who are involved.

Sophie White: I'm curious, Patsy, to start, what do your discussions on campus look like? What are you talking about related to digital access at UCF?

Patsy Moskal: Well, as you noted in my bio, we've been doing this for quite a while. I keep quoting Indiana Jones: it's not the age, it's the mileage. But online learning for us was really a necessity, because we didn't have enough classroom space for students to be able to complete their degrees. So UCF was jokingly known as "U Can't Finish" back in the day, which is not the marketing spiel you really want when you're trying to recruit students.

So we started really early. 1996 was a lifetime ago, but online learning was just starting back then. We used it as a way to help provide access to students and grew to what we are today. And then we had the little bump a few years back that was known as the pandemic. That was challenging for those of us who've been involved, because a lot of what we focus on is really maintaining quality in offering online learning. For years the question was, is it as good as face-to-face, as though face-to-face were the gold standard? There's a wide range of quality within face-to-face, and there's a wide range of quality within online learning. So we've required faculty development, and we have quality reviews. We really have tried to lean into the quality since we started, and that's paid off for us.

And now it's every student at UCF. I think it's 91 or 92 percent of students who take at least one online or blended course if they come to the campus, and it's higher for undergraduates, like 94 percent. So if you come here, you're going to take an online course for sure. During the pandemic, the focus wasn't on the quality piece; it was on how do we survive this pandemic. And I think it was easier for those of us who had been in this space for a while, but we still have had to try to get away from that emergency remote instruction, which we made a point to differentiate from what we know as quality online learning. There was a thought that somehow people would get tired of that and run away from online learning, but we've seen just the opposite. Online learning was growing before the pandemic, and it's growing even faster now after the pandemic. I think a lot of that is because we have so many students who are working full-time or part-time while going to school. Many working professionals tell us they don't want to have student loan debt.

The only way to do that is to be able to offer flexibility in some of their educational opportunities. So online and blended courses really help students do that. And for the majority of our students, they take a mixture. So they take some face-to-face, some blended, some online, and they tell us that's really to balance it with what's going on in their personal lives. So for us, that's kind of been the message from the beginning. It's even more so now, I think.

Kim Arnold: Yeah, Patsy, I was going to say, I think UCF has been way ahead and has been doing so much online learning, and so many other institutions tried to catch up during the pandemic or had to transition things really quickly. I think that was really a lever in higher ed to start this march toward the democratization of technology, making it more broadly available regardless of modality, whether you're teaching hybrid, in person, or online. And that's the part of this conversation I was really interested in talking through: we've just seen so much, and quick on the heels of COVID we saw this burst of AI. How far into the conversation did I make it? We made it 10 minutes before AI came up. And there's that balance that we have. So I'd be interested, Patsy or others, in this balance between the ubiquitous nature of technology, AI being cutting edge, being the first to do something or being in that leading group, while some of the more foundational needs of implementation, privacy in particular, struggle to keep up. That's a super interesting challenge, trying to make that balance happen.

Patsy Moskal: Yeah. Well, I will tell you that I did very well for not mentioning AI, because 99 percent of meetings have some AI come up within the first five minutes.

Kim Arnold: Absolutely. I was proud of you.

Patsy Moskal: It comes up in a lot of our conversations, but we have that conversation too. And it's funny, we've been doing online learning for a long time, but AI is just a completely different thing now. Talking with colleagues from all over the country, we're all at different places. It's kind of like the wild west with what's going on with AI. We are a Microsoft school, so we have the locked-down version of Copilot, but in our shop we've actually been doing some ongoing surveys of students and faculty looking at their perceptions and uses of AI. And Copilot is not the favorite. So we have one locked-down version, and people are all over the place with what they're using. Everyone's using a myriad of products.

So it's really kind of interesting what's going on. You mentioned privacy, and with regard to AI, I think it really speaks to why we need AI literacy infused in every university, every college, because this is one technology that is moving faster than anything we've seen before. And so it's really hard. Higher education is like a giant freighter; it can't move very fast. We don't move quickly, with the exception, I think, of the pandemic, and that was necessity. So it's kind of hard to control that. But there are a lot of conversations and a lot of things going on in that space in terms of how do we manage this and how do we deal with privacy? How do we deal with equity? How do we deal with the ethics? What should we be doing? What shouldn't we be doing? And we're finding that students are very focused on the ethics and equity parts of it in particular. I don't know if somebody's put the fear of God into them about AI, but they're very in tune with that, and maybe afraid of getting in trouble.

And then many of the faculty, I think, need some serious guidance on what do we do, what do we teach? I think we have this sense that this is coming whether we want it here or not, and so we kind of have an obligation as educators: how do we make sure our students are prepared? Both that they have the literacy and that they understand how this is used in their individual disciplines. And we're frankly still trying to find that space. I think that's what we're doing right now, having a lot of conversations about that.

Kim Arnold: Yeah, yeah. I mean, it's here, right? It's not coming; it's here. And I think it's an interesting paradox, and this is another aspect of this conversation I'm excited to have. On the teaching and learning side, we always aspire to have quality learning experiences based on deep pedagogical practices. And digital environments are becoming more ubiquitous; they're just all around us. Again, there's that focus on being cutting edge and being the first to be there. So there's the paradox of how we maintain the quality, as you were saying earlier, Patsy, the quality of the education that we're offering, alongside the desire of students and instructors to jump into things without maybe some of our governance structures or privacy or ethical considerations being fully baked in at institutional levels. And I grapple with that. I have that conversation a lot. How do we get through this paradox? How do we balance that?

Jenay Robert: And I think that comes back to that AI literacy piece that's so important. It might be my new favorite topic, because as much as every conversation turns to AI, I think increasingly those conversations are revolving around AI literacy. And it's encouraging, because at one point over the last couple of years, as the proliferation of generative AI, at least, has taken place before our eyes, sometimes a message was getting a little lost in some of the work that I and others in this space were doing: people would think we were saying everyone has to teach with AI now. And I think people are coming to understand that that's not what we're saying. That's not what leaders in this space are saying. There are people who are saying everyone should teach with AI all the time.

I'm not going to say that doesn't exist. It does. But most of us aren't saying that. Most of us are saying, hey, that's context dependent. It depends on what you're teaching, who you're teaching. It depends on a lot of factors. But what we do say pretty confidently is that you have to integrate AI literacy in some way. All our students are going to need this skill, or they need it now. All of us need this skill now. So what I say very confidently is that we have an ethical responsibility to our students, and that trickles throughout the entire institution. At the end of the day, we have an ethical responsibility to our students to prepare them to live in this digital world, which is highly influenced by AI at this point in time and will be in the future.

Sophie White: That makes me think of a Shop Talk discussion similar to this one that we had a few weeks ago about the EDUCAUSE Students and Technology Report, which released earlier this spring. We were talking to Yvette Chan, a fourth-year student at Penn State, and she has already had job interviews in which potential employers specifically asked how she would use AI in the role and what her competency with it is. That was really interesting to me, seeing how we're applying what we're learning at institutions to the workforce afterward.

Jenay Robert: I just saw a colleague post, I think on LinkedIn, someone I know personally, not just someone out in the ether, saying that they've had to start using some AI tools to tweak their resume and cover letter, because their materials just aren't getting past an AI screening for the jobs they're applying to. So now they have to use AI to pass the AI screening to get to a human. And I'm sure everyone can imagine what my judgment is on that, but I'll reserve it for another conversation. The bottom line is that that person had to understand that there was an AI screening taking place; that's why they weren't making progress in the job market. They had to understand that other AI tools might be a way to get past that screening. There's a lot of literacy involved in that.

They have to understand what data they're putting into those materials and be okay, or not okay, with feeding that to AI technologies. So there's just so much that's practically impacting us, and not just in the job market, not just in learning, but in life. I read a report about this, and I don't want to misquote the statistic, but there's been a massive uptick in medical devices that are using AI technologies. As someone living in this society, you go to the doctor and you have to make a decision about medical care; you need AI literacy to do that. You're going to scroll on social media; you need to understand the AI there. It's all over the place. So we really need it throughout our lives.

Kim Arnold: Yeah,

Patsy Moskal: I think that's true.

Kim Arnold: Yeah, sorry. I was just going to say, as you were talking, Jenay, I think this goes back to the ethical lens you were talking about, right? So it's not just providing the technology and the environments, but really thinking about the ethical lens of what we're doing. A lot of times in teaching and learning, FERPA is a north star, right? FERPA tells us what's appropriate and what's not, what you can do. And I think there's this emerging idea that's been coming over probably the past decade but is really emergent right now: FERPA is interpreted institution by institution. So what's appropriate to do with data about students and about learning is interpreted differently from institution to institution. But there's a layer beyond that. In higher education, I don't even want to say we should keep this in the back of our minds; it's something we should be actively pursuing.

Just because our governance structure tells us that we can do something doesn't make it acceptable. There's this other ethical side of should we do it, right? And Patsy, I know over the years, you and I have had so many conversations about this push and pull: it's acceptable according to our governance policy or whatever, but are we keeping the student central to what we're doing? Is this something that's creating benefit for them? That's something I've thought about a lot in my work over the years, and something I think a lot about now in my role at EDUCAUSE: how, as an entire field, can we move this forward so all boats rise? And part of this for me is transparency. Jenay, you were talking a bit about this, about helping people understand what data is being captured.

It's always amazing to me when I talk to students, and Patsy, I'll be quiet in a moment, I know you have some information on this, but it's really amazing how much students, when you talk to them, don't actually understand how much data is being gathered about them, how it's being stored, how it's being used, who's accessing it. And I think that's incumbent on institutions. It falls back on institutions to say, we do need to make students aware; we do need to be transparent about what we're doing. These are parts of ethical codes, and I think many institutions haven't gotten beyond that FERPA "is it acceptable, check the box" stage to those ethical considerations. There are definitely some examples out there, but that's something I'm always really interested in. And I think AI is just a screaming example for all of us to really look at that framework and say, how do we do this better?

Patsy Moskal: Yeah, I totally agree with that, Kim. And I know this wasn't meant to be a conversation about AI, but I think it's a perfect example of some of the things we're talking about with regard to equity issues and privacy issues. It's almost like those issues on steroids. Particularly when you look at the different AI tools: we can all jump in and do something with a free tool, but the paid versions have so much more to offer than the free tools. So what does that say about equity when we're talking about our students? We try to equalize that by giving everyone access to Copilot, but that has limitations, and we know that folks are using their own tools anyway. And then in terms of the literacy piece, I think we need to help students understand what it means when you're putting things into an AI tool. What is that doing?

Where's that data going? Who else has access to that data? That's really something we need to focus on for all faculty too. I mean, we're kind of all on the same page with FERPA, right? We've mandated that we all have to do FERPA training on a regular basis, so we all do that. Those of us who do it year after year already know the quiz; we already know what we have to do. But we need similar kinds of structures so that this becomes automatic. One of the things students have expressed is that they really want to know. The fear is, if I have two instructors, one on one end of the AI continuum, outlawing it, I do not want you using it for anything, and another who is embracing it and really wants students to try to use it, you have to be very transparent about that.

I mean, we can't have students who don't know which way you're swinging and then make a mistake and somehow get in trouble for using AI when they shouldn't. So be crystal clear with what you have on your syllabus, make sure it's clear for assignments, make sure the extent to which you allow them to use AI is clear. I think this becomes very complicated. You mentioned not all classes should use it, and without a doubt there are such classes. How should AI be used to help you write, and to what extent? Can it help me with spellcheck? Can it help me with grammar? There's a wide continuum there. So I just think transparency, and ethics, will be things we're really going to have to lean into and figure out how they play out. And that's going to be difficult. We're always trying to look at how the digital tools we use impact the students who have the most difficulty succeeding, and those are often underserved students. So we want to make sure they're not being disproportionately impacted by new tools that come out, and how do we try to equalize that? I think that's just going to be a universal issue that we're going to have to address.

And frankly, I would say the same with faculty too. It's not just students, I think.

Yeah, in our division, our vice provost actually had the foresight to put in money for us to practice with some AI tools, to see how we could better use them to do our jobs. So my team has been looking at using AI specifically to help us with some of our analysis: what are the limitations, what can we do? And we've compared and contrasted across different platforms. For people who do research, it's fun to play, but legitimately it could be very helpful in the future if it helps minimize some of the work, particularly the qualitative work, keeping in mind that we also have to keep those privacy matters in view. I can't throw somebody's information in there and just have it training models on someone's materials.

Kim Arnold: Well, I think that's so much bigger than AI. We have devolved a bit into the AI space, but for anything in the digital realm, it's really important that instructors understand what the data ramifications of using certain tools are, and how they might inadvertently create some risk for the institution as well as for individual students in terms of privacy. I think a lot about that, and I've heard so many anecdotes. Last week I was at a national forum for open education, and a number of people there said the vendors of all of these tools practically sit outside my office. They'll just sit there, and then I'm like, this is cool, I'm going to sign up for it, I'm using it in my class, this is a really amazing tool; it has functionality that the core set of technology my teaching and learning department supports doesn't. And there's a concern there, that these tools are not always institutionally vetted.

And depending on what data is going into them, that can create some real risk for institutions, and AI is a perfect example. But there's a whole suite of tools out there designed to support the teaching and learning environment that are amazing. And when you hear that word free, it should always be the "let me click into that" moment. Free never means free. So what is the back end? Think very critically, and know that if you are using those tools as you teach a course, what does that mean for the privacy of your students? What information might that disclose unintentionally and inadvertently? So this idea that we've been circling around, this core critical data literacy, is just a super, super important nut that we have to try to crack, slowly.

Jenay Robert: You've touched on a lot of topics that we write about in this year's Teaching and Learning Horizon Report, specifically the digital literacy piece and all of the AI stuff, of course. But I really appreciate, Kim, that you said the AI conversation is really just helping us think about topics that have been important for a long time; now it's just those topics on steroids, I think was what you had said, right? And I totally agree with that. One of the things in this year's Horizon Report is evolving teaching practices, which is something we've always talked about. We want to have research-informed teaching practice, et cetera. It's always been something. But now, with the increasingly digital nature of our world, thinking about all these tools we have available to us, thinking about the tools students expect or need to know how to use, that is pushing us to be better practitioners too. We have to constantly update our practice, and with the pace of things changing the way they are, it's just so challenging.

Sophie White: I agree. And this is making me think more about the privacy and security side, and what you just said, Kim, about open educational resources and industry folks sitting outside of offices. When I joined EDUCAUSE, I learned the term shadow IT for the first time: this idea that people are going out and purchasing IT solutions without the proper vetting by privacy and security offices. And I think that's just so important, especially with the advent of AI tools that claim they can solve every problem we've ever thought about. There has to be communication between these various stakeholders at the institution to make sure that, even if it is slower and there are a lot of processes that will seem annoying at the time, there is a really important reason for it: the safety of the institution, managing risk, and also the students and their data as well.

Patsy Moskal: That's really a good point. That's really something that's important. We have an emerging technologies council within our division, and because you get hit with a lot of emails from various vendors who have products, they try to consolidate similar products and vet them. There's a committee of folks from the different units across campus that would be important: our IT folks, our institutional research and digital learning people, and students and faculty. So it really tries to get at that, making sure we're all on the same page, looking at things from a governance perspective, and being careful with privacy and all of those things. I guess I had not heard the shadow IT term, but that's a good term for it, because I definitely think that's happening with the pace of technology.

I actually think maybe that's pandemic impacted too, or maybe it's just a function of technology being so much more advanced than it was pre-pandemic. I don't know. But it seems like there are so many more tools that just come at you all the time now. And I confess, I'll tell you guys, so plug your ears: I'm one of those people who checks, or keeps unchecked, whichever box says please don't give my name and email to vendors. And speaking of data privacy, I really hate going to face-to-face conferences with a QR code. Don't scan my QR code; I don't want a hundred emails. And I still get them, the very personalized email where I'm like, who the heck are you? I don't even know what your tool is, so I don't know the company.

Kim Arnold: And I think we're hitting on the flip side of what we've been talking about. The solution providers that are out there are critical to us being able to provide these deep, pedagogically meaningful learning environments. So I want to be careful; with my statement earlier, I wasn't trying to villainize them in any way. I think there's a partnership between the software vendors and the institutions, and there are a lot of tools out there. The recent update of the HECVAT, or maybe it's upcoming, I might have my date a little bit wrong. Sorry.

Jenay Robert: You got it.

Kim Arnold: I think that's a perfect example of software solution providers trying to be more mindful of some of the cybersecurity and privacy elements. I don't think the HECVAT goes super deep into the privacy element, but that's a place where, as a community, we can leverage and work toward some certification and standards for privacy. Ultimately it's a very symbiotic relationship; none of our teaching and learning enterprises run without these software solutions. So I think it's just about being mindful, and I don't think there's anybody out there trying to be nefarious. But again, it's about bringing the literacy up across the board, whether that's providers, administrators, teachers, or students, and saying, what do we have to do to safeguard the privacy of our students and the intellectual property of our instructors? I can't believe we haven't touched on that yet.

That's another thing that comes up all of the time. So how do we build on this good work that's out there and build these partnerships in a way where we're getting the functionality? And this explosion of stuff post-pandemic, Patsy, you're right, there's been so much, but every single one of those things is coming out of a very real teaching and learning need. So how do we work together and ensure that those standards are being met, that we're being secure, that we're considering ethics and privacy? And of course, there are only so many hours in the day, we know, right? But that's what I really look forward to: we have some real opportunities right now to form those partnerships and really try to make a foundational kind of sweep, and to help new and emerging tech companies coming up with these great ideas for functionality that doesn't exist in other tools. How do we help them think more critically and bring in some of these elements? A lot of them are doing it already, but the HECVAT is a great step toward that. I personally would like to see more focus on the ethical and privacy side, but that's probably more on the institutional side. So yeah, it takes us all, right?

Jenay Robert: And I do want to just put in a little plug for anyone who's listening or watching: if this part of the conversation is exciting you and you're like, yes, I want to know more about partnering with tech companies and being more collaborative, what does that look like? That is an active area of conversation at EDUCAUSE right now. I'm not even directly involved in those conversations, because I'm a researcher rather than someone who works with corporate engagement or partnerships directly. But Leah Lang, who's our Senior Director of Partnerships and Corporate Engagement, has been doing incredible work bringing together people from institutions, people from our corporate membership, and people from the association space like us, and saying we're all part of the same community. In fact, today before this recording, I was in our first virtual partner summit. We had one onsite at the annual conference last year, and we'll have one again this year.

But we're doing amazing new things, and by "we," I'm just taking credit for what my colleagues here are doing in building those bridges and facilitating those conversations. So I would say, if this is exciting or interesting to you, keep an eye on what we're doing in that space, but also feel free to reach out to one of us to see how to get plugged into some of those efforts. It's so vital, and it's one of the things that excites me the most. No single piece of our community is going to be able to make progress on this alone. We all really have to work together.

Sophie White: I'll also add one more note for folks who are unfamiliar with HECVAT as an acronym, which might be new to people outside of higher ed or new to the EDUCAUSE landscape. The HECVAT is the Higher Education Community Vendor Assessment Toolkit. I did just Google this in the meantime to make sure I got that right. Essentially, it's a really great community-built survey covering privacy and security questions that solution providers fill out when they're interested in doing business with an institution. It's universal, so companies can fill out the survey once and then send it around to various institutions for review. Anything else you all want to add about the HECVAT? It's an open resource.

Kim Arnold: Check it out. If you don't know what it is, check it out. It's going to be important moving forward for sure.

Sophie White: Yeah, I think for me it's a really inspiring example of how solution providers and institutions can collaborate to streamline procurement and make sure that we're considering these privacy and security implications before contracting services. It's a great example of how folks in this space can work together to make everyone's lives easier.

Patsy Moskal: I would make a pitch too: having that is very beneficial because, as we talked about with equity, it also helps maintain equity across different institutions. For folks who may not have as much experience as those of us who've been doing this for 30 years or whatever, it's really helpful to have those guidelines and that checklist of things that are important to look for and things to know about a vendor going into it. And Kim, I think you're right that it's on the institutions. The institutions really have to be keeping track of this, be aware of it, and work toward looking at the security and privacy issues with respect to vendors. I think it's great that there are more vendors out there, but it can be overwhelming for institutions, and everything sounds great when you listen to the sales pitch. So you just need to make sure, whether it's free or not, to look at what's under the hood and what you're getting. Any kind of guidance and standards is helpful for institutions in that regard.

Sophie White: Patsy, I'm curious in terms of looking at digital access, from your perspective, how have you found it most effective to work with the privacy and security folks at your institution? What does that relationship look like or those conversations when you're thinking about adding a new digital tool or expanding something you're already working on?

Patsy Moskal: Well, in all honesty, I'm kind of where Jenay is, because I'm on the research side of it. But I can tell you we try to have those conversations. The emerging technologies council, for instance, brings together people from those different kinds of areas. We also have folks who are very focused on information security, and we have the courses we have to take: standard things that everyone on campus has to take, particularly if you're at a certain level and so forth. I think we're probably like many other folks; you learn about tools from going to a conference or getting an email or whatever. And then it becomes a conversation, and you're trying to make sure that when you're having the conversation, you're connecting with the folks on campus who make sure that you are meeting the standards for governance, security, privacy, and all of those things.

So I think that's kind of been the method for us. We formalized it within the division because there were so many tools coming that it's impossible for one person; you need a conversation with people across campus who are all involved. We haven't talked about the library, but librarians also were part of the conversation. So I don't know if that answered your question, but I think it's about bringing together the people, because there are pieces of the puzzle scattered across campus, making sure everybody's in the same conversation and on the same page, and doing your due diligence to check out the companies and make sure they meet whatever standards we have. And then that's a conversation that folks above my pay grade have with vendors in terms of making sure the standards are met if they want to do business with UCF. Fortunately or unfortunately, we are pushing 70,000 students, I think we're at 68,000 and change. That's a big get for most vendors, so they really are looking at how do we get involved with UCF. That helps us a lot.

Kim Arnold: And the context is so critical there. How things happen at UCF is different from other institutions.

It'd be nice to say you can just copy and paste from one institution to another, but the political and policy landscapes are just very, very different from institution to institution. I've often talked with folks over the years about having some type of governing body for this. On the research side, we are clearly governed by the IRB, so there's one clear place you go on campus. On the teaching and learning side, it's very different, and it would be nearly impossible to create a body like that to govern all teaching and learning practices, which is why you rely so heavily on your peers. But I think that's one of the really wicked parts of trying to think through this: the contextual differences, not only from institution to institution, but even department to department sometimes. So when you're trying to figure out which technologies are going to be most impactful and most supportive of students, or help instructors optimize their teaching and learning, there's so much context just floating around in the ether. It would be really nice if you could just copy and paste, right?

You just can't do that. But we can dream.

Jenay Robert: This is where I think really, go ahead, Patsy.

Patsy Moskal: Yeah, I was just going to say, for those of us who are on campus, as you all know, we're often siloed. So when you say from department to department, you may not talk to your colleagues in a different department even if they're in the same building. I think that's where we try to facilitate things in some way. For some of the digital tools, our innovation lab actually tries to get out ahead and pilot test them and involve different departments, so everyone is on the same page. Then we look at all of these issues, as well as how faculty and students view whatever the tool is: is this something that's going to be used, before you procure it and then realize, oh, there are all these other issues, and maybe they're not going to use this tool, or maybe this other tool is better. So we try to do that, but it is hard. It's like a fire hose; there's a lot out there now. And going back to the mileage, we're fortunate that we have some mileage with this stuff, but it's very different depending on the institution; how much bandwidth they have to devote to this can change a lot. So yeah.

Jenay Robert: I think this conversation really highlights the importance of seeing our procurement process and the people involved in the procurement process as our friends.

I know I have definitely been guilty at different times, when I worked at institutions, of seeing that process as not my friend, or as a hoop I had to jump through, or of not really understanding how important it is. But as the years tick by and as these technologies mature and grow and proliferate and all those words, there are so many things to think about when it comes to adopting a new technology. I am not an expert in all of those things. As a practitioner, as a researcher, as a teacher, I am not a deep expert in accessibility. I am not a deep expert in privacy or cybersecurity. And increasingly, part of the conversation with these big data tools is impact on the environment. I'm not a deep expert in how these tools impact our environment. But if we have some processes in place at our institutions that we can lean on, that can make all the difference.

So maybe an action item for people listening to this: if you're not super aware of the procurement processes at your institution, or who the people involved in that work are, reach out and get to know that process a little better and understand what's going on at your institution. To Kim's point, it's so dependent on what's happening in your local context. Places like EDUCAUSE provide lots of resources that can help, but at the end of the day, it's really got to be a local effort. So I hope that's one useful takeaway from this episode.

Sophie White: I agree. And it's very complex and nuanced. We started the conversation talking about how many different stakeholders, departments, considerations, and even types of literacy we need in order to have these conversations, and now we're ending it saying, make sure you collaborate with each other and that procurement is your friend. So I think this is a great place to wrap up. Like you said, Kim, we could talk about this all day, but I have a lot of food for thought related to AI and how it's an example of the larger foundational conversations we need to have in higher ed about balancing digital access and privacy.

This episode features:

Patsy Moskal
Director of the Research Initiative for Teaching Effectiveness
University of Central Florida

Kim Arnold
Director of the Teaching & Learning Program
EDUCAUSE

Jenay Robert
Senior Researcher
EDUCAUSE

Sophie White
Content Marketing and Program Manager
EDUCAUSE