Key Takeaways
- Models such as "cyberwar" and "cybercrime" have not proven effective for information security in higher education, suggesting that a different analogy could provide unique insight into key problems.
- Healthcare may be a more relevant model for information security in higher education, a fundamental premise of which is that bad things will happen, and, in the absence of total prevention, "survivability" is an important and valid objective.
- Infosec work is ongoing and evolving rather than discrete and bounded, which suggests a focus on continual improvement and survivability; the study of successful strategies in healthcare potentially can enhance our ability to achieve both.
Joshua Beeman, Chief Information Security Officer, University of Pennsylvania
We have witnessed some spectacular information security events in the past 18 months: the Target and Sony breaches, the Heartbleed vulnerability, and the Snowden revelations, to name a few. With information security challenges described almost daily on the front page of major newspapers, you no longer have to be in IT to be aware of them.
Unfortunately for security practitioners, the added attention can sometimes foster a sense of urgency among their managers that undermines thoughtful planning and may lead to misunderstandings about what is possible.
Remaining Optimistic in a "Thankless Job"
Last May at the EDUCAUSE Security Professionals Conference 2014, Charlie Miller posed a provocative question.1 Comparing and contrasting the state of information security in 2007 and 2014, he asked, "Are we any better off after seven years of applied effort?" In large measure, he concluded that we were not.
Similarly, other practitioners2 have used familiar and compelling examples to describe the Internet and its applications as a troubling technical dystopia, where years of investment have failed to make an appreciable difference in our ability to prevent misuse and disclosure. Perhaps it was for this reason that the New York Times described the chief information security officer role as a pitiable and thankless job.3
Higher education has had as bad a year as any industry, with numerous high-profile reports of security compromises. And with limited funding, adequately staffing the infosec function is a growing concern even for the big players,4 let alone cash-strapped EDUs.
I do not dispute the facts and figures, or the technical, economic, and political realities of our current limitations. But I propose that models such as "cyberwar" and "cybercrime" have not proven effective at informing the discipline of information security in higher education. A more apt analogy can provide unique insight into key problems, helping us focus our attention and efforts where they matter most.5 Ideally, it would also bring a more realistic perspective to the discipline, allowing us both to acknowledge our many shared successes to date and to retain our optimism for the future.
Alternative Models
An analogical model improves our understanding of one system or environment by comparing and analyzing another existing — and hopefully better understood — system or environment. Models can be a powerful tool to bridge the comprehension gap that often exists between specialists and laypeople. They can also provide unique insight into, and solutions for, complex or intractable problems.
Governments have put forth a model, further propagated by the media, of "warfare" to characterize the lengthy struggle taking place in cyberspace. This analogy is appealing at an emotional level: conflict between two parties, waged until one side is defeated, with the commendable goal of an eventual return to peace. Another popular model, "cybercrime," resonates because it reflects the real loss and damages felt by companies and the apparent opportunistic vandalism that sometimes characterizes these attacks. Lastly, models based on "predator/prey" relationships are often invoked, whereby targets (prey) respond with a variety of defenses against attackers (predators), similar to those seen in nature.6 Camouflage (security through obscurity), mimicry (honeypots), and defensive or warning adaptations (tarpits, blackholing) are a few examples.
Each of these models describes some aspects of the infosec discipline well, and they most certainly play a role in improving our capabilities. However, I frequently encounter their limitations when trying to apply them in my own environment. The warfare analog breaks down when applied to most of my user base. The university's faculty, students, and staff cannot, and should not, be counted on to act as conscripts in my cyber "army." They lack the necessary training, time, and interest; my priorities are not their priorities and never will be. Similarly, the criminal model feels incomplete. Even if I were to deputize my users, we would make a poor police force, with low arrest numbers and high rates of recidivism.7 Lastly, the predator/prey model is fascinating, but if I'm being honest, some of the systems on my network are as likely to be the cheetah as they are the gazelle.
"Survivability"
The higher education environment provides several interesting challenges. Traditionally, the mission of most colleges and universities — teaching, research, and service — embraces collaboration, openness, and freedom of expression. In addition, many EDUs are highly decentralized, with diverse technologies selected to be inexpensive and tolerant of individualism rather than uniformly managed and tightly controlled. And increasingly, many higher ed entities perform more than just the functions of education, scholarship, and professional services such as medicine or law. They must also serve as a landlord to thousands of tenants (students), be a competitive retailer and hotelier, and host major sporting events, all while running an independent police force, facilities operations, and waste management services. This complexity is more akin to that of a small city than a single monolithic company.
I propose that healthcare may be a more relevant model for information security in higher education than our current approach. To be clear, I am not talking (exclusively) about HIPAA and the increasing regulatory compliance associated with information security. Similarly, I mean to broaden the metaphor far beyond common discussions regarding "viruses" and epidemiology. I refer to the entire discipline surrounding human medicine, a fundamental premise of which is that bad things will happen, and, in the absence of total prevention, "survivability" is an important and valid objective.
Some aspects of healthcare that have strong infosec corollaries include the following:
- Illness and accidents happen all the time. Diseases are complex and evolving, and people will always get hurt. This is a fact of life.
- Any given diagnosis may have multiple treatments. The selected treatment will depend largely on context (e.g., your doctor's personal preference, your overall health or budget, and the current state of the science). Just because one treatment works for me doesn't mean it is the right one for you.
- Vaccination and immunization programs can effect significant change. Some problems are serious enough to warrant preventive measures and, with dedicated effort, are amenable to attempts at eradication.
- Hygiene has a big impact. Following basic rules and guidelines can go a long way toward eliminating the most common problems.
- Most people are not doctors; they're patients. While they absolutely have a responsibility to monitor and ensure their own health, the majority of people in the health system are not medical experts, nor are they expected to be.
- Healthcare (and the surrounding environment) is complex. Hospitals are themselves like small cities, with many different departments and functions, ranging from medical services to retail, maintenance, and infrastructure.
My discussions with colleagues on this topic have informed this list and also led me to believe that there are many more similarities and comparisons to draw.8 If we accept the analogy, then the next step is to examine how it helps inform our own work.
Applying the Model
Healthcare has some immediate lessons when applied as an analog for information security.
The Value of Evidence-Based Medicine (EBM)
EBM is ultimately about helping clinicians evaluate risks and make good decisions. With limited resources, information security in higher ed must be effective at triage and diagnosis, directing its spending toward the solutions with the highest return on investment. This argues for the collection and sharing of data and practical metrics, and for a focus on addressing the highest risks with the most effective controls.
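To make this concrete, here is a minimal sketch of how such triage might be expressed; the control names, loss estimates, and costs are hypothetical and would in practice come from an institution's own incident data and metrics.

```python
# A minimal sketch of evidence-based triage: rank candidate security controls
# by the estimated risk reduction they deliver per dollar spent. All control
# names and dollar figures are hypothetical, for illustration only.

controls = [
    # (control, estimated annual loss avoided in $, annual cost in $)
    ("Campus-wide two-factor authentication", 400_000, 120_000),
    ("Automated server patch management", 250_000, 60_000),
    ("Full-disk encryption for laptops", 150_000, 90_000),
    ("Annual penetration test", 80_000, 50_000),
]

def return_per_dollar(loss_avoided, cost):
    """Estimated risk reduction per dollar invested."""
    return loss_avoided / cost

# Rank the candidates so limited funds go to the highest-yield controls first.
for name, avoided, cost in sorted(controls, key=lambda c: return_per_dollar(c[1], c[2]), reverse=True):
    print(f"{name}: {return_per_dollar(avoided, cost):.1f}x return, ${avoided - cost:,} net benefit")
```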
Good Hygiene, Clearly Communicated
Every year, it seems, the Verizon Data Breach Investigations Report identifies unpatched software as a fundamental attack vector.9 Of course, everyone knows that they should keep their machine patched, leave their firewall on, and have a strong password, right? Unfortunately, this information can get lost or deprioritized among all the other security messages people receive. And it's human nature to become less diligent about commonly repeated tasks. (Even doctors need to be reminded about the proper technique for washing their hands.) The community must agree on the four or five most critical "basics" of what people need to know and do to safely own a computer, and create universal and oft-repeated campaigns to make sure they are not forgotten.
Know When to Get a Consultation
Your doctor doesn't ask you to know chemistry, biology, radiology, or anything else about the broad discipline of medicine. But he or she does expect you to come in if you are hurt or sick, or plan to do something unusual, like travel abroad. Drawing on the lessons of Atul Gawande, we should arm our "general physicians" (IT support personnel) with effective checklists to ensure that any such visit focuses on the most critical topics, uncovers the most serious problems, and catalogs the information so that it can be referred to year after year.10
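As a rough illustration, not a prescribed tool, the sketch below encodes a handful of hypothetical consult questions and stores each department's answers by year, so they can be revisited at the next visit; the questions, function name, and file path are all assumptions.

```python
# A rough sketch of a yearly "consult" checklist: IT support staff record each
# department's answers, and prior years remain available for comparison.
# The questions, function name, and file path are hypothetical.
import json
from datetime import date

CHECKLIST = [
    "Is the operating system receiving security patches automatically?",
    "Is the host firewall enabled?",
    "Is sensitive data stored on this machine, and is it encrypted?",
    "Are backups running, and have they been tested recently?",
    "Does the owner know whom to contact about a suspected compromise?",
]

def run_consult(department, answers, path="consults.json"):
    """Append this year's checklist results so they can be reviewed next year."""
    try:
        with open(path) as f:
            history = json.load(f)
    except FileNotFoundError:
        history = {}
    history.setdefault(department, {})[str(date.today().year)] = dict(zip(CHECKLIST, answers))
    with open(path, "w") as f:
        json.dump(history, f, indent=2)

# Example usage during a visit to a hypothetical department:
# run_consult("Chemistry", [True, True, False, True, False])
```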
Quarantine/Segregation
The need to separate newborn babies from patients with infectious diseases seems obvious. But many campus networks still assume that "one size fits all." The best way to preserve open, high-speed research networks ("science DMZs") is to aggressively pursue technologies and architectures — such as VLANs, multiprotocol label switching (MPLS), or software-defined networking (SDN) — that support different networks for different purposes. This segregation protects populations from one another and allows you to prescribe effective treatments that otherwise might not be tolerable to some users.
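A minimal sketch of what purpose-based segment assignment might look like follows; the device roles, VLAN numbers, and address ranges are hypothetical, and unknown devices fall into a restricted quarantine segment by default.

```python
# A minimal sketch of purpose-based segmentation: each device role maps to its
# own network segment, so residence-hall laptops, building controls, and
# research instruments never share a network. Roles, VLAN IDs, and address
# ranges are hypothetical.
SEGMENTS = {
    "residence_hall":    {"vlan": 110, "prefix": "10.10.0.0/16"},
    "science_dmz":       {"vlan": 210, "prefix": "10.20.0.0/16"},
    "building_controls": {"vlan": 310, "prefix": "10.30.0.0/16"},
    "point_of_sale":     {"vlan": 410, "prefix": "10.40.0.0/24"},
}

def assign_segment(role):
    """Return the segment a newly registered device should be placed in."""
    # Unknown or unmanaged devices land in a restricted quarantine segment.
    return SEGMENTS.get(role, {"vlan": 999, "prefix": "10.99.0.0/24"})

print(assign_segment("science_dmz"))
print(assign_segment("unknown_iot_gadget"))
```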
Different Conversations for Different People
Healthcare is a necessity, but it is also a big business. The healthcare model underscores that there is more than one conversation taking place, and that we only succeed when we can communicate effectively across multiple levels and populations. Healthcare practitioners have many different types of conversations, including those among physicians, between physician and patient, and between hospital workers and administration. To be effective, infosec practitioners must be equally skilled at communicating with peers, clients, and the administration, each on their own terms and with the proper vocabulary. Failure to engage across all three arenas will greatly impede success11 — that is, the reduction of risk and the likelihood of survivability.
Peer Review and Benchmarking
Because of the life and death consequences, hospitals and physicians are held to a high standard. Peer review and benchmarking are used effectively to help correct mistakes and ensure continual improvement. Higher education has excellent peer networks available,12 and they should be used to promote more regular and formal peer reviews. Similarly, an increasing number of tools are available to help assess a program's overall maturity and benchmark it against peers (for example, the EDUCAUSE Information Security Program Assessment Tool and data from the EDUCAUSE Core Data Survey, or CDS 13). Such reports are often more accessible to management, and can be very effective in clearly communicating areas that require additional investment.
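As a simple illustration, the sketch below compares one program's maturity scores against a peer-group median and flags lagging domains; the domains, scores, and peers are invented, standing in for the kind of data a participant might pull from such assessment and benchmarking tools.

```python
# A simple benchmarking sketch: compare one program's maturity scores (1-5 per
# domain) against the median of a peer group and flag domains that lag.
# The domains, scores, and peers are invented for illustration.
from statistics import median

our_scores = {"governance": 3, "awareness": 2, "incident_response": 4, "vendor_risk": 1}

peer_scores = {
    "Peer A": {"governance": 4, "awareness": 3, "incident_response": 3, "vendor_risk": 2},
    "Peer B": {"governance": 3, "awareness": 4, "incident_response": 4, "vendor_risk": 3},
    "Peer C": {"governance": 5, "awareness": 3, "incident_response": 5, "vendor_risk": 2},
}

for domain, ours in our_scores.items():
    peer_median = median(peers[domain] for peers in peer_scores.values())
    flag = "  <-- consider additional investment" if ours < peer_median else ""
    print(f"{domain:17s} ours={ours}  peer median={peer_median}{flag}")
```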
From Theory to Practice
I plan to test the applicability of some of these lessons at my own institution in the hope that they can further enhance the security of our electronic assets. During "consults" I plan to promote the use of checklists linked to our existing risk-assessment program; to define and broadly communicate "good hygiene"; to more carefully study key outcomes in hopes of informing future investments; and to regularly benchmark our program against our peers. I will also look for further instances of this model's application or extension, and for additional specific examples from healthcare that can help us avoid potential pitfalls as our practices mature.
Information security is still a new and evolving discipline, and looking ahead we should expect major shifts in our understanding and capabilities. Our own "germ theory" moment is hopefully still to come — maybe an adjustment to software development or the widespread adoption of virtual desktop computing — and will revolutionize how we approach and implement security. We must continue to be thoughtful and rigorous with the tools that we have and learn from the mistakes we have already made, while remembering that, at least for the foreseeable future, our work is ongoing and evolving rather than discrete and bounded. We must therefore focus on continual improvement and survivability. The study of successful strategies in the field of healthcare can greatly enhance our ability to achieve both.
- Charlie Miller, "Failures of the InfoSec Community," EDUCAUSE Security Professionals Conference 2014.
- Quinn Norton, "Everything Is Broken," May 20, 2014; and Peter Welch, "Programming Sucks," April 27, 2014.
- Nicole Perlroth, "A Tough Corporate Job Asks One Question: Can You Hack It?" New York Times, July 20, 2014.
- Violet Blue, "Cybersecurity Hiring Crisis: Rockstars, Anger, and the Billion Dollar Problem," ZDNet, August 26, 2014.
- Marion Poetz, Nikolaus Franke, and Martin Schreier, "Sometimes the Best Ideas Come from Outside Your Industry," Harvard Business Review, November 21, 2014.
- See, for example, Hemraj Saini, S. B. Dash, T. C. Panda, and A. Mishra, "Prediction of Trustworthiness in the Cloud Computing Environment Using Predator-Prey Model," International Journal of Cloud Computing, Vol. 2, No. 5 (2013).
- This is not to downplay the important and impressive work being done by some companies working with law enforcement. For example, see Mohit Kumar, "Microsoft's Digital Crimes Unit Successfully Disrupted the ZeroAccess Botnet," The Hacker News, December 7, 2013.
- Joe St Sauver, "We Need a Cyber CDC or a Cyber World Health Organization," Anti-Phishing Working Group Counter e-Crime Summit, San Francisco, May 31, 2007.
- Michael Mimoso, "DBIR: Poor Patching, Weak Credentials Open Door to Data Breaches," Threatpost, April 22, 2014.
- Atul Gawande, The Checklist Manifesto (New York: Metropolitan Books, 2009).
- Ashley Carman, "Report: 31 percent of IT security teams don't speak to company execs," SC Magazine, July 17, 2014.
- For example, EDUCAUSE Security and REN-ISAC.
- Data from the EDUCAUSE Core Data Survey is available in a self-service tool for CDS participants called CDS Reporting. This tool allows institutions to create peer groups made up of other institutions, interact with data in dashboards, and review metrics on IT staffing, funding, and services.
© 2015 Joshua Beeman. The text of this EDUCAUSE Review article is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 license.