Ed Hudson and Michele Norin on Cybersecurity [podcast]

Community Conversations | Season 1, Episode 9 | Originally recorded on 7/12/21

Two IT leaders discuss a cyberattack that occurred at each of their institutions and share insights into preparing for future threats.



John O'Brien: Welcome to a new and special community conversation. Today, I'm joined by Brian Kelly, who's director of our cybersecurity program, and we're joined by two technology leaders who have not only survived but powered through security incidents and are here to talk about them. So, a welcome to Ed Hudson, who's the CISO for the California State University System, and Michele Norin, who's a former EDUCAUSE Board Chair and the CIO at Rutgers University. Take it away, Brian.

Brian Kelly: In cybersecurity at EDUCAUSE, we often talk about the importance of sharing what we've learned with each other, and today we're going to talk about campus security incidents, what happened, and what we learned during those incidents. So, Michele, we'll start with you. Could you talk a little bit about what happened during the incident and, more importantly, what you, or as Ed had said, what we, learned during it?

Michele Norin: Brian, I'm glad to be here and glad to talk about that without being too specific, because it's a little bit of a fresh circumstance. Rutgers was involved in a ransomware attack situation. It did not affect our data per se; it affected data of one of our partners, but it did affect our infrastructure. So, we had a role to play in responding to the circumstance. I will say that in my career this is the first time I've been this close to that kind of a cybersecurity incident. So, for us as an institution, as a support organization, we learned a lot and we were able to observe. Even though we were involved in it to some degree, we were able to observe and participate a little bit at arm's length, which, I hate to say it this way, but it really gave us an opportunity to observe how this situation occurs, how we responded, and how the players responded.

In the end, we recognized that that kind of an attack really does warrant a different response. We had to bring different players to the table. We weren't necessarily driving the response; there was another entity driving the incident response. There was a playbook that they were operating around, and so even the idea of having a playbook was something that we learned a lot about. I'll stop there, but that was the circumstance that really brought us to a point of, first of all, thinking through what we would do in that situation, and secondly, just comparing that to other kinds of incidents, because, like any other organization, we've had different kinds of incidents. So, that is kind of the reference point for me right now in terms of cybersecurity.

Brian Kelly: Ed, you want to go?

Ed Hudson: I'll just preface it by saying that higher education is experiencing an unprecedented increase in attacks right now, particularly ransomware attacks, but attacks on our institutions in general, and the CSU is no different from universities across the country. We certainly see that increase; it's unprecedented in my career as well. You just see a very different kind of attack. For us, we are the largest four-year public university system in the country, so my scope is 23 different campuses and our chancellor's office. We have had some successful attacks. By successful, I mean we have had ransomware attacks that have locked up a few systems.

We also had a fairly significant event at the end of last year that, while the ransomware was not detonated, had a significant impact on IT operations. We were able to maintain academic and business operations at the campus, but the amount of effort that the particular IT team at this campus went through over about a four- to six-week period was unprecedented for us. In that case, what ended up happening was the threat actor, which we were able to tie to a nation state, came in and appeared to be doing some extended reconnaissance. So, while the malware wasn't detonated, we found them in the system; the protections we had in place alerted us to them. But they came back several times, and so it was a very laborious process to root them out. Figuring out where they were in the network, how they'd gotten in, and where they had gotten in, and then trying to parse that out, was an extremely impactful event for us.

Out of that came a system-wide, what we call the data hygiene project, where we looked across the entirety of the CSU in some key areas around network architecture, user access control and account management, accounts for students, accounts for faculty and staff, but also service accounts, and so it was incredibly impactful to us when we went through that.

Brian Kelly: So, I know, Ed, when your incident was occurring, you took the time to step out of that room and let me know that you were experiencing an incident and what you could share, so that we could work to get that out, sort of anonymously, to the broader cybersecurity community so they could learn from your experience. I think that's hugely important. We talk about collaboration. So, what takeaways are most urgent when we think about communicating to campus leadership and also across institutions to our peers?

Ed Hudson: To put some context to that, part of what happened when we were dealing with that event at one campus was that we were obviously sharing with all 22 other campuses what we were finding, because we're all in higher ed and have some similarities in the way that we architect our networks. But we also found code from the threat actor that referenced another university outside of California. So, while the campus was working that particular event, my job was to orchestrate the resources from a system-wide perspective, and we thought it was important to get the word out to the broader higher education community with what we could share at the time and what we knew the threat actor's actions were. We talk all the time about how the threat actors, the bad guys, the hackers, are sharing information all the time.

John, pulling this whole conversation together with Michele and I and Brian and as we share with our colleagues and counterparts across the country, I think it's really important that we talk about what's happening, what are the kinds of attacks that we're facing, what are the indicators of compromise so that we can help each other more effectively battle this.

Brian Kelly: And Michele, do you want to give us the perspective from the CIO seat during an incident?

Michele Norin: Be glad to. I completely agree with what Ed was pointing to there, in terms of sharing information. In the circumstances and the situation that we had, there were some very fundamental key takeaways. One is to think through what you would do in that circumstance. I mean, those of us who have been in the cybersecurity space for a long time typically come up with a game plan, right? How are you going to respond? Who's on point? How do you structure it? So there are foundations of that which I think are important for some of these new threats: ransomware, any other kind of threat that comes through. So, revisit those to be sure: do we remember what we need to do here? Secondly, these new situations, like a ransomware attack, in my view are different.

The players are different. You start in a different place. You need to have your legal team ready and prepped for what that might look like. Likely, you'll need external resources to help you investigate and do the forensics. I know for us, we have a great team here at Rutgers, a cybersecurity team that's on par, but they're not necessarily able to dig in in a deep forensic mode. And so thinking about who we would go to for that kind of work, understanding what that would look like, having the conversations with institutional leadership about, "Look, if we have one of these situations, this is how we're going to have to approach it." We're going to need leadership to engage, to think about things like the legal guidelines to follow here. What can be said, what can't be said? Are we going to pay a ransom or not? Who do we need to be working through?

So, priming that kind of conversation, I think, is important so that it's not a big surprise one day if I have to walk into the president's office and say, "Ah, we've got to have a conversation." So, I think coming up with a blueprint or some kind of a playbook specific to that circumstance is important. Like I said, we learned a lot in being able to observe: okay, there are some things that are pretty routine, we've done these under other circumstances, but there's a whole other lane here that was definitely new. Some of it even boiled down to who's on point for the incident; it wasn't the cybersecurity team in this case, it was the legal team that ended up being on point, which is different.

It's a different set of questions and circumstances. So, I would recommend doing a tabletop exercise, trying to understand how that looks, and working with other entities who've gone through it to ask, "Hey, what did you have to do? What should we think about? Who would we line up here?" Just try to learn as much as possible so that you're not caught off guard if you ever end up in that kind of a situation.

Brian Kelly: I think it's interesting, you both have sort of come to the value of the planning, the value of the playbooks.

Ed Hudson: It can't be overstated, the importance of having a playbook, having an incident response plan that contemplates this and that is in line with your business continuity and disaster recovery efforts. We, in part as a result of this, reached out to our emergency managers at all campuses and have brought them into the fold now. Typically in California, we're focused on things like fires and earthquakes, right? So, now we've brought them in to say, "Let's take advantage of that existing incident response infrastructure," and we just recently did a multi-campus tabletop exercise around ransomware so that we could test that out, be prepared, and respond most effectively.

Michele Norin: And just to add to Ed's description there: go through the scenarios, bring the different roles that you think you're going to need in a situation to the table to practice, and help people understand what their roles could or should be in an incident. I absolutely agree; with [inaudible 00:12:18] emergency operations, incident response procedures are very similar, and they do the same thing. They come up with their scenarios: "Okay, what are we going to do if it's this? What are we going to do if it's that?" So, spend the time and go through those. You don't have to get it to the nth degree; it's just getting enough to where you know the major steps and you're not having to struggle with that part when you're responding.

Ed Hudson: You know, I think the takeaway for us is that the way we have, in the past, architected networks in higher education, and the way we've administered things like user accounts and system accounts, can leave us more susceptible. So, our takeaway from what happened to us last fall was to look holistically across the entire system at those key areas around network architecture, network administration, user accounts, and system account management, and it accelerated our rollout of multifactor authentication. We were already doing that, but those were key takeaways for us. It really accelerated things in our system.

Michele Norin: And I'll add to that as well; I'm glad you mentioned multifactor. It spurred us too. We thought, that's it, we've got to move a little more quickly to two-step or two-factor login for our campus, and we were able to do so pretty rapidly, actually.

Brian Kelly: The next question we'll jump to, for both of you, is around what you wish you had known pre-incident. So, magic wand, time travel, you go back: what do you wish you would have done?

Michele Norin: Maybe a couple of things here. I wish we had moved more quickly with multifactor. I wouldn't say it would have completely diminished the threat, but it could have helped in the circumstances. When we actually learned how this occurred and how it started, I wished we had moved on multifactor more quickly. So, that was definitely a key takeaway, a key thing I wish we would've known. The second thing I'll point to is that dynamic in terms of the legal ramifications of the situation. We were learning that on the fly. I wish we had thought that through a little more specifically, so we would know what to expect there.

Ed Hudson: Thanks, Michele. I would echo that. We had started rolling out MFA and had made great progress in a lot of places, and multifactor is not a panacea, but it does inject a pretty fair amount of protection. So, if I could go back, I think two things. One would be understanding how critical it was to have MFA in place, particularly for students. We had been focusing on what we all consider the high-risk kinds of situations: HR, finance, and legal. But what we've seen is a dramatic increase in attacks on students, the diversion of financial aid and those monies, which oftentimes are literally the food and roof over a student's head, and oftentimes it's our most vulnerable students who are receiving that financial aid. So, I wish that we would have accelerated that.

The other thing, I think, if I could go back, would be the implementation and maturation of endpoint detection and response. If we'd had that more holistically, it would have allowed us to respond, I think, a little more quickly than we did. We responded effectively, but we could have been much quicker if we'd had good endpoint detection, response, and monitoring in place at the time.

Brian Kelly: Before we wrap up, Ed, I know you may want to talk a little bit about professional development and cybersecurity workforce development, so I'll give you and Michele sort of an opportunity to freeform on those topics or anything we've missed in this conversation.

Ed Hudson: Yeah. Thanks, Brian. Those of us who have a certain number of birthdays behind us came up through the technology and information security ranks in environments where we didn't have the formal education opportunities that we have now. We're super proud in California that 10 of our campuses now offer baccalaureate, master's degree, or certificate-level courses in cybersecurity. Four of our campuses have been recognized as National Centers of Academic Excellence in Cyber Defense Education by the Department of Homeland Security and the NSA. So, I think an opportunity for us is to close the gap between the academic and the operational: have those students participate or be aware, or have the CISOs and ISOs of campuses work with those students. As a result of all of this, we're seeing research projects by students on ransomware and how to more effectively deal with it. So, we all in higher education have access to an incredible brain trust of innovation and education out there, and I think we can do a more effective job of working hand in hand with the academic side of the house.

Michele Norin: Just to chime in, I completely agree with Ed on that. Any way we can expand the profession with more expertise, and more people with that skill set, the better. All of us can observe that this is a big topic. It's very visible. All of us are reading every week, sometimes daily, about some threat, some hack, some compromise somewhere, so there's a lot of visibility around it. We need to have professionals who can not only understand, help, and defend, but try to position us in a way that's safer. So, any way we can groom professionals, I am totally, totally supportive of doing so. I just think that we're in a different place. This is a long-haul topic. It might ebb and flow, it might get quiet for a while, but we're not in quiet mode right now, and so I think any time we can raise the visibility and raise awareness of the importance, the better.

Ed Hudson: Michele, I'm going to give a shameless shout-out to EDUCAUSE; full disclosure, I'm one of the faculty members. We're launching the first-ever cybersecurity-oriented New IT Managers program this fall, and that comes from a direct request from universities that say we need those pathways to educate. So, we're really looking forward to that. We've had some other faculty members who are just amazing thought leaders in cybersecurity and higher education, and I'm really happy to see that we're doing that in partnership with EDUCAUSE; that aspect of professional development is really terrific.

John O'Brien: We will let the record show our first product placement in a community conversation; definitely a keeper. Having been leaders who have been through a major incident and lived to tell about it, I really don't know how you'll answer this question, but are you more worried and more agitated as a result of having been through it? Or are you actually calmer having been through it?

Michele Norin: Well, maybe a little bit of both. I mean, there's something to be said for learning on the fly. Not that you would orchestrate this just to learn it on the fly, but there's something to be said for being in it and then you come out the other side and then you have the opportunity literally to think what would we have done differently? What worked, what didn't work, what would be changed? So, to some degree, it is a little bit of, "okay, so we got through that." On the other hand, I just think this whole topic, there's that constant pressure of what is it that we don't know that is going to come back to bite us on anything. It's just a constant theme in the threat situation.

Ed Hudson: I would echo what you just said, Michele. Certainly, having gone through such a significant incident, it's the most significant incident we've had in my career in terms of the impact on the IT team in particular. As I said, we were able to stave off any significant outages or anything like that, but I watched this IT team work 16 to 18 hours a day for five or six weeks. So, you have a lot of missed opportunity at the campus, and you have all of that impact at a time when we're already very impacted by COVID and the pandemic, working through all of those issues as well. So, I think there is a little bit of, "Okay, I've gone through that," and we've done it successfully and come out the other side, and we were able to identify some gaps, areas where we could do better or wish we'd known. But I also have angst, because I don't see it abating. I only see it increasing.

I don't think we've seen the worst of it. As you look at the advent of more technologies that make our environments more complex, I'm concerned about what the threat actors will avail themselves of. If we start seeing machine learning used to extend their reach into our environments, I think we don't know what we're going to have to combat a year from now or two years from now. So, it's great to have gone through it, learned those lessons, and be able to say we're more effectively prepared, but not knowing what's going to happen tomorrow and what the threat actors are going to do certainly keeps me on my toes as a system.

This episode features:

Ed Hudson
Chief Information Security Officer
California State University System

Brian Kelly
Director of the Cybersecurity Program
EDUCAUSE

Michele Norin
Sr. Vice President and Chief Information Officer
Rutgers, The State University of New Jersey

John O'Brien
President and CEO