Hotline: Cybersecurity and Privacy | May 2025

This Month: Training, Trust, and Procurement


Welcome to "Hotline: Cybersecurity and Privacy." This new monthly column will tackle the philosophical, moral, strategic, and organizational quandaries related to higher education cybersecurity, privacy, and data. This month, Mike answers your questions about required security training, trust, and procurement.

Stacked tiles with various icons: key, lock, shield, etc. The top one shows a phone in use.
Credit: HowLettery / Shutterstock.com © 2025

Dealing with Tedious Training

Dear Hotline: Why are security trainings so often generic and disconnected from my actual job as an instructor or staff member? Can't they be more relevant to our day-to-day work? It feels like only senior leadership gets a say in privacy and data policy decisions. What about those of us who use the systems every day?

Seeking Relevance

Dear Seeking: I really hate running. But when I read your question, the first thing I did was look at my dog and think, "He's getting a little pudgy." So, we went to the park and ran loops. Now that I'm back home, showered, and the dog is snoring at my feet, I feel the fortitude to answer your question. You have sneakily asked two questions, and because I respect an overachiever, I'll answer both: (1) Why is security training so underwhelming, and (2) why does only senior leadership get a say in privacy and data policy decisions?

If it makes you feel any better, cybersecurity training everywhere is underwhelming and, as you say, generic. Perhaps that's why there isn't any reliable data showing that cybersecurity training has a long-term impact on people's behavior. At my last institution, we had a postdoc analyze all of our incident and training data, and the only thing she found was that no correlation exists between the two. It's embarrassing that this is the best higher education institutions can do. But you should be kind to the good people trying to create interesting and effective materials. The sheer scale and diversity of users make it virtually impossible to tailor the material to everyone. Do you tailor training by age? Profession? Year in school? Education? You see the problem. I'm a big believer that we need to cull the "awareness" material from these courses and focus more on training individuals in specific practices, perhaps through a "just-in-time" delivery system. Everyone needs to know how to send a spreadsheet securely or use a password manager. Simply reminding them that the internet is a dark and scary place only reinforces a negative association with cybersecurity.

As to your second question, why do the senior leaders, who were hired specifically to create institutional policies and manage the organization, get to, um, set policy and make decisions? I can tell you from experience that policies are rarely created by fiat; they are developed with input from a host of institutional stakeholders, including legal counsel, the faculty senate, and subject matter experts. Institutions often have mechanisms for public input as well. I suspect, however, that you might be referring to what I call little 'p' policy rather than big 'P' policy. Little 'p' policies stem from the host of operational decisions implemented by administrative units, such as the IT department or the cybersecurity and privacy office. If you haven't already done so, claim some agency in the matter. Don't wait for someone to come knocking on your door and ask, "Please, may I have your opinion?" Reach out and ask how you can be involved. Of course, I acknowledge that organizational culture or unreceptive management can be a genuine barrier, even at the little 'p' level. For those who run an operational shop, I'll share one tactic my department used earlier in my career: we advertised an open-door policy for our weekly staff meetings. Having a random student, faculty member, or staff member show up and share their views on where they intersected with the office's initiatives was always refreshing.

Working Through Trust Issues

Dear Hotline: Why does it feel like IT and cybersecurity teams don't trust us (faculty and staff)? How can we build a relationship that feels collaborative rather than punitive? As a faculty member, how can I teach students about digital responsibility when I don't fully understand the risks myself? Is that even my role?

Puzzled Professor

Dear Puzzled: You're not actually puzzled at all; you're perceptive. If you're stuck waiting your turn at the DMV to get your REAL ID so you can travel, I'd suggest taking a peek at a recent Showcase Webinar hosted by EDUCAUSE on this question (full disclosure: I was one of the presenters).1 Be sure to keep your phone volume up and your earbuds out to give everyone around you the full airport experience.

The feeling you describe is likely more real than imagined. Enterprise IT shops tend to view the world as a pretty homogeneous place. (We support tens of thousands of users; why can't you just use Windows?) Faculty and research environments are fluid, entrepreneurial, and opportunistic. Security practitioners often struggle to understand why they face pushback when deploying the same technologies used by Apple, Microsoft, and others. Nobody understands anyone else. But I suspect you're not looking for a philosophical discussion on trust but are instead seeking some practical advice.

Exercise and insist on radical transparency. Place senior cybersecurity and privacy leadership on faculty senate committees (ex officio if need be) and workgroups. Request regular presentations on cybersecurity and privacy plans. Identify formal and informal liaisons to the security and privacy offices. Try to attend regular staff meetings. Ask the cybersecurity office to review any cybersecurity plans submitted to funding agencies. Buy your CISO lunch once a month. It's astonishing how effective regular communication can be at building trust. Mistrust, like conspiracy theories, breeds in darkness.

Bring empathy to the table. Your cybersecurity team isn't trying to surveil you, despite using surveillance-adjacent technology. Often, regulation or compliance drives cyber initiatives. The same tools that get pushback from faculty are not just commonly used today; they are increasingly necessary to ensure your institution can function. The security team is simply a bunch of professionals trying to do their jobs, although they may not fully appreciate your sensitivity to these issues or the practical reasons for that sensitivity. For any security practitioners reading this, remember that faculty are not merely "staff" in the ordinary sense but are a formal part of the governance structure at most colleges and universities. Their role as the engine of the institution's teaching and research mission has to be respected. Remember, while your job may be coaching the institution about risk, trust often begins with understanding, and understanding starts with empathy.

Oh, I almost forgot your last question about whether it's your job to "teach students about digital responsibility." While I believe higher education institutions have an obligation to ensure students in 2025 learn digital responsibility, it seems unfair to say, "OK, chemistry professor, teach cybersecurity and privacy." But it is fair to ask, especially given the ubiquity of digital tools in the sciences and humanities, how cybersecurity and privacy practices are integrated into the disciplinary workflows students are being taught. That topic is too broad to discuss here, but it's worth an extended conversation involving faculty and cyber professionals at your college or university.

Going Rogue (and Why It's a Bad Idea)

Dear Hotline: By the time we add all the security and privacy controls to software, our faculty and staff have to jump through so many hoops to use it that they simply buy their own accounts and use those instead. How can I help them better understand why we do what we do and why it's in their best interest to use the software we provide? Is there a better way?

Frustrated at Some R1 University

Dear Frustrated: I took over managing my parents' finances when they began to need more support. Despite living halfway across the country, I could do this using my phone and my thumb (my right thumb, that is, as the left one has a mind of its own). Whether it was related to banking, utilities, or caregivers, my right thumb came to the rescue. For better or worse, this is real life for your faculty and staff. When they come to work and find they can neither use these same tools readily nor adopt alternatives without running a gauntlet of security, privacy, and procurement protocols, it's no wonder they look at us like we're mad. I've run that process at multiple institutions, and even I got frustrated.

So, your question is spot on: "How can I help them better understand why we do what we do . . . ?" Unfortunately, part of the challenge is that the explanation is highly nuanced, and nuance doesn't fit in an email signature very well. One reason is to ensure the institution has liability protections; another is to ensure certain cybersecurity or PII-handling protections are included; a third is to guarantee that cybersecurity staff have access to the application data needed to detect or investigate account compromises or misuse.

Despite these nuances, I recommend giving presentations, whenever new services are licensed, that itemize the benefits of the institutional service versus the commercial or free alternative. Work with your procurement office to integrate security and privacy requirements deeply and seamlessly into the procurement workflow so they seem less tacked on. Develop materials that highlight all the positives of institutional licensing agreements. While these strategies are helpful, they likely won't be enough to satisfy those who are most frustrated by current processes. Therefore, in the spirit of Vilfredo Pareto (80 percent of the complaining comes from 20 percent of the users), invite some of the more vocal critics to participate in procurements in which they have a vested interest. Not only will that help them understand the why of your process, but their input may also give you a novel perspective on the risk you're controlling for, and it may even change your approach to the procurement. After all, risk acceptance is too often overlooked.

Have a cybersecurity or privacy dilemma you'd like Mike to unpack? Submit your question through our anonymous form.

Note

  1. For more on this, see Michael Corn, "A Matter of Trust," Michael Corn (blog), Substack, March 17, 2025.

Michael Corn is a Consultant at Argos Consulting.

© 2025 Michael Corn. The content of this work is licensed under a Creative Commons BY-NC-SA 4.0 International License.