Metrics Mania! Review
Last month, EDUCAUSE hosted a webinar called Metrics Mania! Using Metrics to Bolster Your Higher Education Information Security Program. A key recommendation was that metrics be institutionally relevant. For instance, they should indicate the degree to which the institution's information security goals are being met. Another important observation at the August event was that metrics are not the same as goals. One presenter said it best: metrics are like the walking dead when they morph into organizational goals or strategic initiatives, because they stop functioning within a measurement framework.1

Navigating the wide space between collecting data and creating institutionally relevant metrics, all while not allowing those metrics to become goals, can be overwhelming. About 62% of institutions track information security metrics.2 Of those institutions, the five most commonly collected metrics are:

  • Vulnerability scan coverage (40%)
  • Patch management coverage (32%)
  • Patch policy compliance (32%)
  • Number of known vulnerability instances (30%)
  • Incident rate (27%)

When tracked over time, the metrics above could help measure progress toward a goal of ensuring that all institutional computing resources are securely configured, deployed, and maintained. Other institutional information security goals and related metrics might include the following.

Goal: Provide the resources needed to properly maintain an institutional information security program function.

  • Metric: Percentage of institutional IT budget spent on information security, measured over time
  • Metric: Ratio of information security staff to total institutional IT staff, measured over time
  • Metric: Percentage of all IT staff who receive training on how to incorporate information security best practices into their daily job function
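The resourcing metrics above are simple ratios and percentages tracked year over year. As a rough illustration, with entirely hypothetical budget and staffing figures, they might be computed like this:

```python
# Hypothetical yearly figures; real values would come from institutional records.
years = {
    2015: {"it_budget": 10_000_000, "infosec_budget": 350_000,
           "it_staff": 120, "infosec_staff": 4, "it_staff_trained": 60},
    2016: {"it_budget": 10_500_000, "infosec_budget": 480_000,
           "it_staff": 125, "infosec_staff": 6, "it_staff_trained": 85},
}

for year, d in sorted(years.items()):
    # Percentage of institutional IT budget spent on information security
    budget_pct = 100 * d["infosec_budget"] / d["it_budget"]
    # Ratio of information security staff to total institutional IT staff
    staff_ratio = d["infosec_staff"] / d["it_staff"]
    # Percentage of IT staff who received security training
    trained_pct = 100 * d["it_staff_trained"] / d["it_staff"]
    print(f"{year}: {budget_pct:.1f}% of IT budget, "
          f"1 security FTE per {1 / staff_ratio:.0f} IT FTEs, "
          f"{trained_pct:.0f}% of IT staff trained")
```

The point is less the arithmetic than the trend: each figure only becomes a metric when it is computed the same way every period and compared against the goal.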

Goal: Ensure that institutional personnel are trained to carry out assigned information security duties and responsibilities.

  • Metric: Percentage of institutional personnel who have received information security training, measured over time
  • Metric: Reduction over time in information security incidents due to preventable human error

Goal: Ensure that all institutional IT resources are securely configured, deployed, and maintained.

  • Metric: Percentage of IT resources deployed with security software properly installed
  • Metric: Percentage of IT resources that undergo information security maintenance according to formal maintenance schedules
  • Metric: Percentage of reported incidents involving institutional IT resources, classified by incident type (e.g., unauthorized access, malicious code, improper usage)
  • Metric: Exploits prevented at the network, e-mail gateway, or machine level because IT resources are securely configured, deployed, and maintained

Goal: Significantly reduce the amount of sensitive data stored on institutionally owned computing devices.

  • Metric: Percentage increase over time of institutionally owned computing devices on which a sensitive data scanning tool has been deployed
  • Metric: Number of instances of unapproved storage of sensitive data found on institutionally owned computing devices over time
  • Metric: Reduction in sensitive data exposures due to stolen or vulnerable institutionally owned computing devices

Goal: Require all departments to have business continuity plans in the event of an incident impacting IT systems.

  • Metric: Percentage increase over time of departments with business continuity plans
  • Metric: Percentage of departments with updated, tested business continuity plans
  • Metric: Outcome of a 48-hour power outage on departmental IT systems (e.g., did the plans work, and how quickly were systems brought back online?)
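The first two continuity metrics are coverage percentages compared across reporting periods. A minimal sketch, using made-up departmental counts:

```python
# Hypothetical counts; real data would come from an annual departmental survey.
total_departments = 40
with_plan = {2016: 18, 2017: 26}          # departments with a continuity plan
with_tested_plan = {2016: 10, 2017: 17}   # plans that are updated and tested

pct_2016 = 100 * with_plan[2016] / total_departments
pct_2017 = 100 * with_plan[2017] / total_departments
increase = pct_2017 - pct_2016            # percentage-point increase over time
tested_pct = 100 * with_tested_plan[2017] / total_departments

print(f"Coverage rose from {pct_2016:.0f}% to {pct_2017:.0f}% "
      f"(+{increase:.0f} points); {tested_pct:.1f}% have tested plans")
```

Reporting the change in percentage points, rather than a raw count, keeps the metric comparable even if the number of departments shifts between surveys.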

Institutions must know why they are collecting certain metrics and what those metrics purport to show about the state of an institutional information security program (or the goals within it). Well-chosen metrics can help answer the hardest questions: Do our information security investments further the institutional mission and goals? Are we more secure today than we were before? How do we compare with our peers in this regard? Are we secure enough? The last is perhaps the most important question in today's environment.


  1. Charles Dziuban from the University of Central Florida kicked off the August 2017 Metrics Mania! event with the presentation Considering Measures, Metrics, and Benchmarks.
  2. The figures for commonly collected metrics are derived from previously unpublished 2015 data from the EDUCAUSE Core Data Service.

Joanna Lyn Grama is the director of cybersecurity and IT GRC programs at EDUCAUSE.

© 2017 Joanna Lyn Grama. This EDUCAUSE Review blog is licensed under Creative Commons BY-NC-SA 4.0.