In this episode, hosts and guests explore how higher education institutions are navigating the balance between protecting privacy and securing data, while simultaneously championing innovation, promoting transparency, and fostering collaboration.
Takeaways from this episode:
- Transparent data privacy practices can help higher education institutions build trust.
- Higher education data privacy and security strategies should prioritize sensitive data.
- Privacy is a dynamic and evolving profession in higher education. Collaboration between cybersecurity and privacy professionals is crucial.
- Institutions can leverage EDUCAUSE resources to advance their cybersecurity and privacy strategies and improve the efficiency of their teams.
Transcript
Sophie White: Hello everyone and welcome to EDUCAUSE Shop Talk. I'm Sophie White, I'm one of your hosts for today, and I am a content marketing and program manager for EDUCAUSE.
Jenay Robert: And I'm Jenay Robert. I'm a senior researcher at EDUCAUSE and I am your other host.
Sophie White: Great. So today we are really thrilled to have in our virtual studio two special guests. We have Isaac Galvan and Stephen Collette. I'll introduce both of them in just a minute. We were chatting beforehand and talked about this as kind of a water cooler conversation, and they actually used to work together and now no longer do. So I'm hoping this is kind of a reunion tour too. Isaac Galvan is the community program director of cybersecurity and privacy for EDUCAUSE. He started with us not too long ago, and we're really thrilled to have him here today. Isaac leads the strategic development of initiatives and programming that help higher education institutions address emerging cybersecurity and privacy challenges. He works with members to shape widely used resources, including the HECVAT (there'll be a quiz on what the heck that stands for later), the Cybersecurity and Privacy Guide, and the Cybersecurity and Privacy Professionals Conference. He holds the Certified Information Systems Security Professional (CISSP) certification and previously led the University of Illinois's cybersecurity training program, where he developed innovative approaches for creating security awareness across diverse campus populations. So thanks again, Isaac, for being with us today and bringing your great experience.
And we also have Stephen Collette with us. Stephen is a manager of data and privacy analysis and operations at the University of Illinois Urbana-Champaign. He manages the privacy operations team within the office of the CIO. The team manages privacy matters that involve both domestic and international privacy laws and regulations, as well as the ethical side of privacy. Prior to joining the University of Illinois, Stephen worked at a Fortune 100 company, where he led both its records and information management program and its multinational privacy program for its headquarters organization and the nine distinct institutions it owns. So you have private industry experience too, which I think will be really interesting as we're talking about this topic today. Great. So thank you for being with us today. Appreciate you both.
So to kick it off today, we'll be talking about a matter of trust, specifically as that relates to safeguarding privacy and securing institutional data in higher education. This issue came out of the 2025 EDUCAUSE Top 10, which is a signature report that EDUCAUSE puts out every year about the top 10 issues in higher education and technology. So we're really excited to explore it. We also have a showcase coming out in February with some more resources about this topic. So to kick us off, anyone feel free to jump in, but when you're thinking about a matter of trust, how institutions can build trust by protecting this data, I think the question that I keep seeing is, how do you do that while still allowing institutions to function and innovate, and allowing space for all of these core competencies that an institution has, but still keeping them safe? How do you think about reaching that balance?
Stephen Collette: Well, first off, thanks for having me here today. I appreciate the opportunity. For me, my first go-to answer is transparency. The easiest way to build that trust is being transparent with what you're doing with data. So if it's student data, notifying them at the time of collection: this is what we're going to do with it, this is how we're going to process it, this is who we're going to share the data with. So I think the very foundation of the trust is just being honest and open with whoever's data you're processing.
Isaac Galvan: Yeah, I like that. You mentioned the different kinds, or, well, we could talk about some of the different kinds of data, even to take you back a step. When you think about how the data's being used, it's just as important to think about what it is and to share with people that they are going to be participating in those activities. What are some of the different kinds of data that we're talking about, Stephen?
Stephen Collette: Speaking as a privacy person, the one that I think about the most is just personal information. It's a broad category. Depending on who you talk to, it has different definitions, but for the most part it's generally defined as any data that can be related back to a person, either on its own or paired with other data. So pretty much all of the data that's out there can be somehow tracked back to being personal information. So you have that kind of category or broad area. There's personally identifiable information, which has a very similar set of definitions from what I've seen. But then you jump into different categories, and generally it's broken down by risk. So for high-risk data, I think of Social Security numbers or driver's license numbers, things of that nature; maybe some demographic data falls into that as well. Then you kind of step down the risk chart from there to stuff that is not nearly as sensitive but still wants to be kept secure, or more secure than just general public data. What did I miss?
Isaac Galvan: Well, one thing that we've engaged in together is working with health data, people's health data. Sometimes institutions and universities are involved, through their student health centers or for other reasons, in collecting health-related information. And that comes with a whole other set of challenges as well.
Stephen Collette: Definitely. And another kind of challenge in that area, I think, for universities, especially those that have health centers or hospital networks, is where does FERPA end and HIPAA begin? Who has authority over that data? It's something that has to be navigated almost on an institution-by-institution basis, or sometimes department by department.
Sophie White: So you mentioned health data as an example of data that you might have to address differently. What do those different approaches look like? What do you decide to do when you have something really sensitive, like health data, at play, versus maybe a lower-risk element of data? How does your decision making change?
Stephen Collette: Well, with HIPAA, there are eighteen identifiers, different data elements that they have classified as identifiers, and so HIPAA can be very specific but also kind of broad in that respect. But when you're talking about super sensitive data, most of the approaches are almost exactly the same for the different kinds of data. Whether it's health data or Social Security numbers, a lot of the approach for privacy is very similar, because at the end of the day it's all very risky data. You don't want it out there, you don't want to expose individuals. It's just a different flavor of what that exposure might mean for somebody. A Social Security number being released out into the wild is one thing; that may impact credit scores or the ability to claim benefits. But for your health data to get out there could expose some very private or sensitive things about the individual themselves. So you do want to be careful with all those different areas, but a lot of the approaches are very similar.
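For readers who want to see what that shared approach can look like in code, here is a minimal sketch of Safe Harbor-style de-identification: stripping a record's direct identifiers before the data is shared. The field names are hypothetical, and the identifier list is only a few examples, not HIPAA's full set of eighteen categories.

```python
# Minimal sketch of Safe Harbor-style de-identification: strip direct
# identifiers from a record before sharing. The field names below are
# hypothetical; HIPAA's actual Safe Harbor method covers eighteen identifier
# categories and has additional rules (e.g., generalizing dates and ZIP codes).

# A few illustrative identifier fields (not the full HIPAA list)
DIRECT_IDENTIFIERS = {
    "name", "ssn", "email", "phone", "address",
    "medical_record_number", "drivers_license",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "diagnosis_code": "E11.9",
    "visit_year": 2024,
}
print(deidentify(patient))  # {'diagnosis_code': 'E11.9', 'visit_year': 2024}
```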
Jenay Robert: I feel like every time we have a Shop Talk episode about privacy and security, I'm revealing something embarrassing about myself, but it might be a good thing. I like to take one for the team and help connect these ideas to audience members who might be in the same position. But I have to say that when I was a researcher doing research at an institution, and my graduate degree is in education, so I worked with human subjects all the time, at some point along the way someone in IT says to me, okay, but what level is this data? I'm like, what do you mean, level? And by then I had been a researcher for I don't know how many years, and I just simply did not know that I should be working with my privacy and security folks to understand these levels and identify, when I am doing research, what levels my data are at and what different privacy and security measures I needed to be taking depending on that. I thought, I went through IRB review, I'm good. And so I wonder if you see that same disconnect for certain stakeholders at institutions, and maybe it's not just us poor researchers that have that issue, maybe it's others. How do you bridge that gap?
Isaac Galvan: Yeah, I love that you mentioned the different levels of data classification. That's something that, as someone who worked a lot in training and awareness, we tried to set as one of the keystones for working with our staff across the board, not just people who are working with very high-risk data: to remember that when you come to work and you sit down to access a computer system, or you're sharing information with others, that data has some value and has sensitivity assigned to it. So the way that you handle, say, a lunch order for your coming team meeting will have to be very different from the way you handle a full staff list of your department with their home addresses and home phone numbers on it. That information has to be handled very differently. And so, in the course of your day, as you move from task to task, that should be something you think about: what level of sensitivity do I need to think about as I work with this? And even if you're not 100% sure of all of the requirements that your institution might have for working with very highly sensitive data, you can know for sure when you're doing something that they wouldn't like. Sharing that list of employee home addresses? I should not share that with a third party because of the nature of the data. I might not know specifically what my university's policy on that is, but I should know that that data needs to be protected, because it's at that high level of data sensitivity.
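As a rough sketch of what those classification levels can look like in practice, here is a minimal example of tier-driven handling rules. The tier names, example data types, and rules are all hypothetical; real institutional policies define their own levels.

```python
from enum import IntEnum

# Hypothetical sensitivity tiers; real institutions define their own levels.
class Sensitivity(IntEnum):
    PUBLIC = 0      # e.g., a lunch order for a team meeting
    INTERNAL = 1    # e.g., routine business records
    HIGH = 2        # e.g., a staff list with home addresses and phone numbers
    RESTRICTED = 3  # e.g., SSNs, health records

# Illustrative handling rules keyed by tier (not any institution's real policy)
HANDLING_RULES = {
    Sensitivity.PUBLIC: "No special handling required.",
    Sensitivity.INTERNAL: "Keep on institution-managed systems.",
    Sensitivity.HIGH: "Encrypt at rest and in transit; no third-party sharing.",
    Sensitivity.RESTRICTED: "Approved systems only; access logged and reviewed.",
}

def may_share_externally(level: Sensitivity) -> bool:
    """The rule of thumb from the conversation: when in doubt, don't share."""
    return level <= Sensitivity.PUBLIC

print(HANDLING_RULES[Sensitivity.HIGH])        # Encrypt at rest and in transit; ...
print(may_share_externally(Sensitivity.HIGH))  # False
```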
Stephen Collette: Yeah, I think that example you gave is pretty common. A lot of people don't even really think about it in any solid terms. It's one of those things where, as a privacy professional, if I go to somebody to talk to them, hey, let's talk about the privacy of, or how you're handling, the data that you're using, people can almost get a little defensive: I am not going to do anything wrong with the data. I know what I'm doing here. I'm not going to expose this or anything. It's not really about that. It's more about trying to make sure that you have the proper security and privacy controls in place, that the access controls are in place, just to make sure that the data is being handled in a very careful and planned-out manner.
We don't want to fly by the seat of our pants with this. We want to make sure that we have a repeatable process. And a lot of it comes down to education, which Isaac's team, when he was at the University of Illinois, was really helpful with getting the message out. But one of the biggest challenges, I think, in privacy is just how nuanced it can be. So Isaac brought up an example about lunch orders, that maybe you don't really need to keep that too secure. But I had a conversation last year about a tool that would collect dietary preferences, and the options included kosher and halal meals, which could reveal somebody's religious background. So that could be an area where you need to shore up your security protocols just because that kind of data could be exposed. But again, it's situationally dependent, so it's never clear-cut, this is how you do this in every single situation. It's always, what is going on with the data? You need to really dig in before you start making privacy plans.
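To make that nuance concrete, here is a minimal sketch of context-dependent classification, where a field that looks low-risk is escalated because its values could support a sensitive inference. The field name and inference mapping are hypothetical examples.

```python
# Minimal sketch of context-dependent classification: a field that looks
# low-risk gets escalated when its values can reveal something sensitive.
# The field names and inference mappings below are hypothetical examples.

# Values that could let someone infer special-category data (e.g., religion)
SENSITIVE_INFERENCES = {
    "dietary_preference": {"kosher", "halal"},  # may reveal religious background
}

def classify_field(name: str, values: set[str]) -> str:
    """Return 'high' if any collected value could support a sensitive inference."""
    risky_values = SENSITIVE_INFERENCES.get(name, set())
    return "high" if values & risky_values else "internal"

# A lunch-order tool that only offers generic options stays lower risk;
# one that collects "kosher"/"halal" should be secured like sensitive data.
print(classify_field("dietary_preference", {"vegetarian", "halal"}))  # high
print(classify_field("dietary_preference", {"vegetarian", "vegan"}))  # internal
```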
Isaac Galvan: And that's why I love talking to Stephen and people in his field, because they have these really deep insights into the ways that you really should consider privacy from the start if you're going to be implementing an IT solution, because you just don't know. There are always implications that, I'd say the average person, but even someone who's primarily an IT person, might not think about first.
Sophie White: Yeah, I've thought a lot about this. We recorded a Shop Talk episode in San Antonio with David Seidel that was about cybersecurity and AI, and he had a great example of how, even as an everyday user, an app that you might not think of as a data privacy issue, like Grammarly, for example, can be one. It's an app where you're feeding words into something and it gives you grammar recommendations. But what if you're feeding a resume in there, or a job acceptance offer, that then is telling someone online who gets access to that data something internal about your organization's HR or hiring policies, or whatever that is? And that was kind of an eye-opening moment for me: it's not necessarily that I'm sending a third party a list of addresses, but it might be this tool that I think is making my life easier. And there are these kinds of repercussions that have to be considered too.
Stephen Collette: One of the joys and curses of being in privacy is that you see into these things a little bit deeper than the average person, and you start thinking, well, do I really need to download this? Do I really want to expose my data to this just so I can have this new app that does this one little thing that would make my life just a little bit easier? Or do I just avoid it altogether? And like I said, it's a joy and a curse, because I miss out on a couple of things just because I don't want to deal with it.
Sophie White: Sure. I'm impressed to use a computer at all.
Jenay Robert: I think what really got me thinking about this topic, being a consumer of apps, for example, was starting to realize that there are company acquisitions that happen purely for the sake of data sharing, just to obtain bodies of data: we don't care about your company or what it does, but we know you have all of this data, and so we're going to buy the company so that we can own the data. So that kind of woke me up to this idea of, okay, well, I trust this vendor or I trust this company, which, I don't think I do trust any companies, but for the sake of argument, do I trust whoever they might be sold to someday, or ten years from now, or fifteen years from now? And when I talk to people in my day-to-day life who find out what I do and are curious enough to ask, researcher? What kind of research? Then we get into these types of conversations.
And I think another piece that people don't necessarily think about is that ten-year or fifteen-year time horizon, or longer. So yes, the way the data landscape is right now, there are certain things you don't mind being out there, but what's it going to look like ten years from now, when the social climate might be different, the political climate might be different? There are all sorts of things that can change in the future, for the rest of your life, and that data is out there forever. So are you confident that the world is going to be just as safe or safer in the future as it is now? I sure am not. And that's something that I really love about the Horizon Report work, for example: it gives us that perspective.
Sophie White: Yeah. I'm curious, Jenay, if you want to dive into any of that. In this showcase, we discuss the 2024 Horizon Report on cybersecurity and privacy. I think my big takeaway from it was that there are all these changing trends; in the Horizon Report, you talk about social, technological, economic, environmental, and political forces, all of these major change agents in the world, and how higher education needs to set the foundation so that, despite those major changes that we're seeing, not just inside higher ed but outside, we can make sure data is as secure as possible. Are there any highlights or takeaways from that Horizon Report that you want to share with the audience today?
Jenay Robert: Yeah, a couple things. It is probably one of the most exciting projects I get to work on because it's outside of my content expertise field. So when we do a teaching and learning Horizon Report or something along those lines, I'm like, oh yeah, this is my world. But doing this kind of work, I'm always learning, kind of as a stakeholder in this process. So I love that about it. A couple of things pushed the boundaries of my thinking. There's discussion in this Horizon Report about what the perimeter of an institution is, and how that's not such an easy question to answer. I think I always used to have this quite naive view of, there's some sort of a firewall or something, I don't know; once I put my password in, then everything I do is safe after that, not really thinking about the fact that it's a very squishy environment on the edges.
So if you're on campus, maybe that's one thing. If you're on your cell phone, maybe that's another thing. If you're using certain apps, those are different. So that was probably the biggest one that made me go, oh, okay, yeah, that really challenges my fundamental beliefs about cybersecurity. And then the other one: supporting agency, trust, transparency, and involvement. That was one of the key practices in the report, because, as I've been on this journey of learning about cybersecurity and privacy, I've come to understand the value of each individual's actions in this process. So I've been more and more interested in how we engage people on an individual level, how we involve them, how we help educate them. And I think that's why I'm so happy to share all the stories that make me look so silly. I just feel like it's common for people to have those misconceptions like I did, and we really want to reach those people.
Isaac Galvan: Thank you for sharing. I'm so glad you share those, Jenay, because cybersecurity and privacy, I think, have the deepest impact when people are able to make a personal connection to them and see how they apply in their life. One thing in promoting a positive security culture: you want people to think about how it extends to all aspects of their life and day, and to protecting themselves and their family as well. So I'm so glad you bring that up.
Stephen Collette: Yeah, one of the examples,
Jenay Robert: Oh, sorry.
Stephen Collette: I was just going to say, one of the examples I have for what Isaac just said: I was working with a group on campus to create a privacy notice, or privacy policy, that was more based in GDPR protections, and working with compliance and the counsel's office to get that all written up. A couple of weeks into it, one of the members of the team talked to me after one of our meetings: this has been really eye-opening to me, because now I'm looking at all these privacy notices for all the things that I've signed up for, and I'm seeing either inconsistencies or confusing language. All the things that we had been working on in our own work, they were seeing out in the wild, and they were starting to change how they were acting and what they were signing up for, being just more aware of what they were getting themselves into.
Sophie White: Yeah, I know I was reviewing the Horizon Report before this, and one of the recommended practices, I think, was embedding cybersecurity and privacy into students' education too. Not just computer science majors, but every student, as someone who will be using digital technology probably for the rest of their life; that awareness is so valuable. So I think it's a great example of not only staff but also students: how we can be good stewards of their information, but also teach them skills to use going forward.
Jenay Robert: Yeah, I love in the scenarios when we go through that sort of, what would it look like if we had this future where, from kindergarten, we were teaching students about privacy and security? And when I first heard this idea, I was like, I don't know, is that really something that could ever happen? But then the more I thought about it, the more we live in a digital world, the more fundamental this is for educating people. And I think to some extent we are teaching kids at home, at least some families are. If you're not teaching them good security and privacy practices explicitly, you're probably implicitly passing along not-so-good practices, just clicking "I agree" to everything. But I think we can have those conversations with our kids: oh, you want to download an app? Okay, let's look at the little safety warning and let's read through this. What does this mean? Obviously at age-appropriate levels. I'm not saying read contracts with your five-year-old. I'm just saying you can make it part of the conversation.
Sophie White: Yeah, that's a great point; that's not necessarily a fun exercise, but a really valuable one. I mean, I guess I'm always overwhelmed by the length of those contracts too. So I am curious, do you all have tips, or how do you look at so many contracts at your institutions to make sure you're not missing a privacy flag? What does your procurement process look like to make sure that you're keeping all of those concerns in mind?
Stephen Collette: We have teams of people who are looking at the contracts. So we have the privacy review, we do a security review, we have the contracts group, we have the counsel's office look at it. We have all sorts of people involved with it. But there are certain sections of the contract that I'm more concerned about as far as privacy goes; usually it's the data handling or the security section. There are also some amendments that we put on our contracts, a FERPA amendment or a data handling agreement for different things. So I really just break it down to the sections I need to look at, because otherwise I'd be looking at it all day.
Sophie White: I know that, oh, go ahead, Isaac.
Isaac Galvan: I'd say sometimes you want to look at some of those factors before you're in the contract review phase. You don't want to have to put the brakes on something because a cybersecurity or privacy concern wasn't addressed at the right time. And contracts are pretty inked, right? They're pretty well set. So how do you make sure, and I can give some feedback on this too, but Stephen, how do you make sure that privacy is included early in the conversation, and not when there's already a contract signed by one party or the other?
Stephen Collette: There's always room for improvement, and our team is still relatively young. We've been around at the university as a team function for a couple of years now, so we're still getting introduced to different areas of the university, and we're not having as many of those conversations from the start. Usually we're brought in after somebody has decided, we're going to go with this vendor to do this thing, can you do a review? It's like, okay, well, let's back up. Let's talk about what we're going to do with this. And we provide recommendations on how to handle the data, or on adding consent language or notifications into things. So we try to catch it early if we can, or at least early in the contract phase, so that we can get them thinking about it as the contract moves through. There haven't been a ton of things where we would want to stop a contract. I don't think we've ever stopped a contract, to be honest, but for certain things we'll say, hey, let's talk about this a little bit more in detail. Some of the reviews that we've done for our researchers have been big enough that we've had the researcher go back and redo their proposal because of certain things that they just flat out missed, and there's nothing wrong with that. That's why we're here: to help refine those proposals to make sure that everything's private and secure.
Sophie White: I'm sure that takes a lot of deliberate relationship building too, to make sure that you have those relationships at the institution, that people can be honest with you and hopefully start involving you earlier in their projects. I am excited too. I know that we have HECVAT 4, the next edition of the HECVAT, coming out, and it includes privacy-specific questions. Isaac, I don't know if you want to dive into that at all, maybe give an overview of what the HECVAT is for folks who aren't familiar with it, and talk about why it can be a powerful tool for higher ed.
Isaac Galvan: Yeah, I would love to. So the HECVAT is a survey tool that's been developed over many years by a really great group of people in higher education, from all different institutions. It's a group of volunteers who over the years have developed this survey, and the HECVAT is meant for service providers who want to move into the higher education space, so they can quickly answer a lot of questions that a higher education institution would want to know before engaging in a procurement process. So we're really excited about this new version. It's already been built up over the years to include lots of questions about your product: what kind of data your product will be working with, what kind of infrastructure your service runs on, things related to different compliance and regulatory needs. This year we're expanding it with some more focus areas and growing it.
Accessibility, of course: the accessibility of your product is a big part of managing your risk and compliance when it comes to services that you're purchasing from a third party, or any service you're developing in-house, and this gives the vendors a good chance to demonstrate their commitment to accessibility. We've also added some questions, at the guidance of our development team, the volunteers, about AI and the use of AI in third-party online services. Of course, institutions are going to need to know, as they engage with these IT-enabled services, what are they doing with that data? Will the data that's sent up be used for training? And there are many questions about privacy and the way that these service providers interact with privacy, and whether they have that as part of their core set of principles. So we're really excited about it.
The new version should be coming in the next couple of weeks, so that'll be here soon. I do think it's a tool that you should look into, or encourage your service providers to complete if they don't already have a HECVAT available. A lot of service providers do have that form available already, so they should be willing to share it and expedite that process. The other great thing about it, as Stephen mentioned, is that sometimes these processes bring a lot of stakeholders to the table. The HECVAT is a nice centralized form that teams across a lot of different concerns can use to have their conversation on equal ground. So it's very good for building consensus.
Sophie White: Thank you. Yeah, I think it speaks so much to the maturity of privacy as a profession too, the fact that we have these privacy questions that the community recommended adding. I think once I really figured out what the HECVAT was, it became one of my favorite projects, just the way that so many volunteers and groups across higher ed collaborate on it. While we're talking about workforce challenges in privacy and cybersecurity in higher ed, all of these resource challenges, the community came together to think about, how can we create efficiencies? How can we collaborate with industry on something that works for both higher education and solution providers? So it's just a really inspiring project, I think, and I'm excited to see us releasing the next version. Can anyone name all the letters of the acronym HECVAT? Does anyone know what it stands for?
Stephen Collette: Well, I was not prepared for that question.
Jenay Robert: What if we Google it? Is that okay? I mean, the HE I think is higher education. The A, I'm pretty sure, is assessment. V might be tool... or is the V vendor? I got some of 'em, just out of order.
Sophie White: What am I missing? It gets Community Vendor Assessment Toolkit?
Stephen Collette: Yes. Okay.
Sophie White: Okay, good. I've heard it referred to as only an acronym a mother could love, but it is a beautiful product once you get beyond the mouthful, that is the acronym. So thanks to all the folks who've worked on it over many years.
Stephen Collette: Yeah, I'm really excited about the privacy section. In my past job in the private enterprise world, our security team used the HECVAT, or HECVAT Lite back in the day, and it was all security questions. And I'd come in with privacy, like, hey, can you answer these twelve questions that I have that aren't on the HECVAT, because they're all privacy based? So I'm really excited to see privacy get added in there, because it was always a challenge to get the vendor to go back and answer more questions when they'd already submitted the HECVAT.
Sophie White: Yeah, that's really great. I think it speaks to, in this showcase we have a video from EDUCAUSE Review also, about privacy being at the forefront of data risk management, and a lot of it has to do with privacy maturing as a profession. So I guess, Stephen, I'm curious, based on your background at a Fortune 100 company and now in higher ed, how have you seen privacy as a profession change over the years? How does it look now compared to when you started, and where do you think it's going?
Stephen Collette: Oh, I love this question because it gives me a chance to tell you how I got into privacy. So I was just a little old records manager on an ethics and compliance team, and one of the sister teams that reported to the same manager was the manager of privacy. That was the whole team. That was the privacy function. I started in privacy on April 30th, 2018. I know the exact date because I had been out sick for a couple of days, and when I got back, our manager had quit and was having a final wrap-up meeting. They sat us all down, and I was told, congratulations, you are now in charge of the cookie banner and consent language, or consent management, for our entire organization. And we need to have that in place before GDPR goes into effect in twenty-five days.
Sophie White: No pressure.
Stephen Collette: No pressure. And I think my first question was, what is consent management? What does that mean? What is a cookie banner? All these things that were not in my world. And I think a lot of people got into privacy in very similar ways before GDPR, the European privacy law, went into effect. It was, we need somebody to pay attention to this, so you are the person with either the title that is closest related, or you're a lawyer, or you're somebody who has some extra time, and you're now the privacy person. Before that, I think privacy really sat within the legal function, so it was mostly attorneys who were reading through these laws and helping their organizations get in line with what the privacy practices were. And so now, what's really exciting, we're six years, almost seven years, out from when GDPR went into effect, and GDPR was the first real comprehensive privacy law that really outlined rights for individuals and made some really big requirements for everybody.
And so since then, in the last six years, we have people who are making a conscious effort to join a privacy role, people who are interested in privacy, because it's actually something to be interested in now. Before, privacy was kind of nebulous, and it sat in different roles all the way across the organization, little pieces here and there. Now that there is a need for a dedicated individual or team to focus on privacy, people are making the decision to join the team. And that's really exciting, because it means that organizations are taking it seriously, as they should be.
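For readers curious what "consent management" can mean in code, here is a minimal sketch of recording consent events per user and purpose, one of the basics behind a GDPR-style cookie banner. The purpose names and data model are hypothetical; real systems also record things like the version of the notice users saw.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Minimal sketch of consent management: record, per user and purpose,
# whether consent was granted or withdrawn and when. Purpose names are
# hypothetical; real systems also version the notice text users saw.

@dataclass
class ConsentEvent:
    user_id: str
    purpose: str          # e.g., "analytics_cookies", "marketing_email"
    granted: bool
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

class ConsentLog:
    def __init__(self):
        self._events: list[ConsentEvent] = []

    def record(self, event: ConsentEvent) -> None:
        self._events.append(event)

    def has_consent(self, user_id: str, purpose: str) -> bool:
        """Latest event wins; no event at all means no consent (opt-in)."""
        for event in reversed(self._events):
            if event.user_id == user_id and event.purpose == purpose:
                return event.granted
        return False

log = ConsentLog()
log.record(ConsentEvent("u1", "analytics_cookies", granted=True))
log.record(ConsentEvent("u1", "analytics_cookies", granted=False))
print(log.has_consent("u1", "analytics_cookies"))  # False: consent withdrawn
```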
Sophie White: Yeah, that is really exciting. In reviewing the Horizon Reports, I've definitely seen the recommendation for a chief privacy officer, at least at some institutions, for that responsibility to sit with one individual. I think my more formal introduction to privacy was at our Cybersecurity and Privacy Professionals Conference a few years ago; Pegah Parsi did a full-day workshop on it, and I was really unfamiliar. Honestly, I didn't know what to expect and thought, okay, I need to learn about privacy, I'm going to go to this full-day thing. I was expecting to be bored, and I thought it was fascinating. It was the combination of legal, like you said, but also ethics, the cybersecurity elements, IT, stakeholder collaboration. There's just so much that goes into it, and you have to be so thoughtful in the work that you do. So I'm really encouraged to see how much it's growing and becoming a serious office and priority for institutions.
Jenay Robert: Can I just say, and I don't mean to sound like a fangirl here, but being introduced to privacy by Pegah Parsi is kind of like being introduced to classical music by Mozart. That's not easy to top.
Stephen Collette: Pegah is amazing. There's a handful of really amazing privacy professionals who are really good at explaining this in an accessible way. She's one of the best.
Sophie White: We'll get her on for another Shop Talk, but I agree. Yeah, Pegah, if you're listening to this...
Jenay Robert: This, kind of enjoyed it.
Sophie White: Well, I loved the cybersecurity versus privacy debates that we've had a few times. Should they be the same office? Should they be different? That's always a really fun conversation to think about and look at too.
Stephen Collette: Yeah, I think it was unfair to put cybersecurity up against Pegah last year.
Sophie White: That's true.
Jenay Robert: Anyone against Pegah is doomed to fail. Well, now we have to put Stephen and Isaac on the spot and say, what's your opinion on that debate question?
Stephen Collette: I have no comment on the matter.
Jenay Robert: Very safe. Isaac?
Isaac Galvan: I think that they have a lot of overlapping functions and a lot of dependency on each other. I think they're two interdependent processes. I mean, they really have got to work together to achieve the ultimate goals of the institutions and the people. So that's my very political way of saying they're both important, and they're both in my job title, so I've got to represent a lot of perspectives. But a big reason why I took this position is I do think that the better we can communicate privacy to people in a real way, outside of the legal documents and the legalese of it, and help people recognize that they also contribute to protecting privacy every day, the better off we are. That is so important, and it doesn't take a lot of time to do, but we've got to have good communicators and people who are willing to share that idea.
Stephen Collette: Yeah, I think that was a very safe answer. But in my career in privacy, I've reported into IT. I've reported into security functions. I've reported into audit, accounting, finance, all sorts of compliance, different departments, very distinct areas where privacy kind of touches each one of those but really isn't that. There are two that I think privacy really fits in well with: the counsel's office, or the legal department, and IT security. There's a lot of overlap between those two areas and privacy. But the risk with being aligned with or in the counsel's office or the legal department is that you become seen as being one of the attorneys, or the law, and it's a very different kind of relationship, and it can sometimes get in the way. Right now my team at the university is on a shared team with our cybersecurity team, and that has been amazing. We work really closely together. We are aligned in what we're trying to do, and we can lean on each other to get things done. I think it makes a lot of sense with how our organization is set up to be in that kind of relationship. For other organizations that may not work out, but I think here at the university, it's a really good setup that we have.
Sophie White: That made me think of one question. We do have a lot of folks who are in more IT leadership positions whom we support at EDUCAUSE. I guess as a privacy officer, how do you like to work with IT? How do you collaborate together, and what is IT's role in privacy, in creating a matter of trust, in all of these conversations we're having?
Stephen Collette: Oh, wow. I think, at least in my experience, security seems to be the first one invited to the party. They get brought in because security has been around for decades; they're established, they have good frameworks, and they're doing all sorts of things that people are aware of now. With privacy, there are still some areas where people don't really understand what privacy is supposed to be doing. And so security getting in the door first has been very helpful for privacy, because, like Isaac said earlier, we overlap with security so much; a lot of our core principles are almost identical. It really helps us get into the conversation as well, because they've already done some of the groundwork. They've laid that down for us, and we're able to build on that. And that's why being part of that group here at the university has been so helpful: we have such a strong security team and security function, and that's really helped us make those connections.
Jenay Robert: I think this was an element of one of the pieces, an EDUCAUSE Review article you shared, Sophie, that'll be in this showcase. I was reading the transcript today, and this idea that perhaps privacy is a good way to get people to care about cybersecurity really resonated with me. And again, exposing embarrassing things about myself, but I think I always kind of felt like, okay, if a university gets hacked, they messed up; it never felt personal to me. Isaac, I think you mentioned something about that earlier in this conversation too. So when the privacy element comes in, and you understand how that trickles down to, well, what's going on with my data, and what kind of data does the institution have about me, then you understand that both pieces are important.
Isaac Galvan: What's a little scary to think about is that with every one of these privacy breaches, the data doesn't breach into nothing. That data breaches into the hands of bad actors who use it in different ways depending on the age of their target. Sometimes you hear about even young people's information being taken from school systems and things like that; unfortunately, those people will probably be targeted for decades based on that information. So these privacy breaches, and cybersecurity breaches too, have real longstanding impacts on the victims. They're going to have to learn the extent to which their information was compromised and then have some understanding of what that means for how they're going to have to react.
Sophie White: Yeah, I noticed in this Top 10 article discussion that, with A Matter of Trust, we used this theme of rebuilding trust for the entire report: how to rebuild trust in higher education as the industry is being put under fire in a lot of ways. But as it relates to data privacy and security, it's a matter of trust of individual people. Our institutions are stewards of their data. So if we sacrifice that, if we let it get out to bad actors, then they lose trust in the institution as a whole. So there's a massive reputational risk.
Stephen Collette: Definitely. One of the things you said in my intro was that my team focuses on the ethical side of privacy. The way we read that is, we have laws and regulations that tell us what needs to be done, and we take that as the bare minimum. Just because it's legal doesn't mean it's right. Where we can go beyond that, we do, because it's what we should be doing. We should be treating our students' data with respect, and that builds the trust that we were talking about earlier on. Having that transparency there, going above and beyond just what the requirement is, really helps set the stage for being a trustworthy institution. At some point, people will make decisions based on whether or not they trust a university to handle their data correctly or appropriately. And I think every university is probably one big breach away from losing that reputation for keeping data secure. So it's really important that we do everything we can to protect the data.
Jenay Robert: As we're getting closer to the end here, I wanted to leave off on a note of something actionable. Not that you haven't all said a bunch of actionable things, but just to leave people with something. Some of our listeners will be privacy and security professionals, some will be technology professionals, and, I hope, fingers crossed, we have some teaching and learning professionals listening to this too; it's so important to spread this word. But for the people listening or watching, if they say, okay, I want to do something today or this week that helps build trust at my institution: what is a small action, or maybe a big action, but what's something people can sink their teeth into right away?
Stephen Collette: I've been calling myself a privacy professional, but I hold the opinion that everybody who handles data is a privacy professional; it just happens to be in my work title. I think everybody has a responsibility to treat the data with respect, like I said. But it's also about seeing the person behind the data, behind the personal information that you're handling. Who is that person? It's easy to look at a spreadsheet and not think about the people whose information you're looking at, but think about the people who have given that data, whose data it is, and ask, are you doing everything you can to make sure that you're treating it respectfully, keeping it secure, not sharing it with people who don't need it, all of those kinds of things? Having that kind of mentality is a little bit of a switch from how we normally operate, but it doesn't add much to the thought process; it's just a change in approach.
Sophie White: Yeah, I think that's really essential. Thinking about how all of this data is maybe not just a bunch of ones and zeros, but that there is a person represented behind it, is so key to what we do. So thank you, Stephen. Isaac, anything you want to add?
Isaac Galvan: Yeah, I'd like to add, I think one thing you could do right away, if you're at an institution, is get to know your privacy office. Even if you don't think you need them or need to interact with them today, you are going to have a question one day about your business process. And I'll tell you, the privacy people are very approachable. They're not scary, and they like people; that's usually why they got into the business. They like to protect others around them, and the institution as well. So get to know 'em. And for institutions: have a privacy person, a privacy contact, someone designated whom people who are interested can reach out to. That's a great way to demonstrate that you're aware that this is important.
Sophie White: I love that. Yeah. Privacy people aren't scary. I feel like if there was one underlying thing to take from this conversation, that's it. They are great people.
This episode features:
Isaac Galvan
Community Program Director, Cybersecurity and Privacy
EDUCAUSE
Stephen Collette
Manager, Data & Privacy Analysis and Operations
University of Illinois Urbana-Champaign
Jenay Robert
Senior Researcher
EDUCAUSE
Sophie White
Content Marketing and Program Manager
EDUCAUSE