As costs plummet, VR/AR use is escalating on campuses, bringing security issues that must be addressed.
The increasingly rapid pace of technological advancement presents continual opportunities — and challenges — for the research and education communities. Most recently, advances in head-mounted displays (HMDs) for both virtual reality (VR) and augmented reality (AR) have dramatically improved the devices' efficacy and affordability.
Since the early 1990s, higher education has been experimenting with VR, which is a computer-generated environment that simulates a realistic experience. Historically, however, these efforts have been focused on large room-scale systems driven by dozens of displays and computers (such as cave automatic virtual environments). Because these efforts were extraordinarily expensive and required experts to operate them, their deployment was primarily limited to large research institutions. AR, which offers a live view of a physical, real-world environment that has computer-augmented elements, has been an area of interest since Harvard's Ivan Sutherland created a rudimentary AR headset in 1968. AR has been difficult to implement, however, and the required processing power, real-time 3D spatial mapping, and display technology have all been historically insufficient to create high-quality AR experiences.
Today, new HMDs can provide these high-quality immersive experiences at consumer price points, reducing costs by almost two orders of magnitude. Because of this paradigm shift, VR and AR are poised to become an integral part of the higher education technology environment; on some campuses, this is already the case.
Integrating VR and AR in higher education makes possible many applications. However, their use also raises security issues. Researchers at the University of New Haven, for example, have demonstrated a vulnerability that let an attacker trick a VR user into crashing into a wall. In another case, a University of California, Davis, researcher showed that VR tracking sensors can be compromised to allow attackers to peek into the user's physical space. To successfully address both existing and forthcoming VR/AR security risks, institutions must understand those risks and apply general security principles to mitigate them.
VR and AR in Higher Education
As drastic cost reductions lead to broader use of VR/AR systems in education, their use cases move from lab demonstrations to production environments that sometimes involve sensitive data. To manage the potential risks, it is important for a campus information security team to ensure that appropriate security protections are in place. As always, the risks will depend heavily on the use cases and whether the systems process or use sensitive data. Further, as institutions use these systems more pervasively on their campuses, effectively managing the risks becomes all the more critical. Among the factors an information security team may need to assess are basic information security requirements, privacy, and policy.
The research and education community has not been shy about experimenting with emerging technologies to increase efficacy and outcomes, and VR and AR are no exception. Following are examples of how campuses are using VR and AR today. Although each use case has its own set of potential security issues and proposed solutions, commonalities exist among the use cases. Addressing these commonalities, sometimes by simply extending existing security approaches, could become standard practice for new projects.
Field Trips
Perhaps most obviously, VR and AR let students virtually visit locations that they cannot visit physically — from the Smithsonian museums to the Amazon rain forest to the surface of Mars. While the education community has long conducted virtual field trips using video technology, VR enhances student engagement and outcomes through increased immersion. Some institutions, for example, are offering prospective students virtual visits to get a better feel for their campus. Others are taking the concept a step further. The University of Michigan football program, for example, uses VR to give potential recruits a chance to feel what it's like to play in America's largest stadium in front of more than 107,000 screaming fans.
Potential security issues. Field trips are a general use case aimed at sharing static public information, so the data are not confidential. This limits the risk. However, digital vandalism to the environment that affects the data's integrity could be a risk. Consider a website defacement in the VR/AR context. Do you want your virtual tour to potentially expose the student to inappropriate content? Do you want your rival campus to tell your potential athletic recruits that it has a better program? Should field-trip registration lists include only students who have not opted out of Family Educational Rights and Privacy Act (FERPA) directory information disclosure? Depending on the system's complexity, it may have multiple servers or devices connected to your network, and all of them may need to be secured.
Proposed solution. Every device involved in the use case will need to be secured. At a minimum, devices must be kept up-to-date with patches and follow effective security practices. You also need to control access to content and content updates. If the system is hosted by a third-party cloud provider, all cloud security aspects should be addressed, and, most critically, you should manage access to your content.
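As a simple illustration of controlling content updates, the sketch below (in Python) checks that an update comes from an approved publisher account and that the package matches a known checksum before it is published. The allow-list, file names, and checksum are hypothetical placeholders, not part of any particular VR/AR product.

```python
import hashlib
from pathlib import Path

# Hypothetical allow-list of accounts permitted to publish virtual-tour content.
APPROVED_PUBLISHERS = {"tour-admin@example.edu", "admissions-web@example.edu"}

def verify_content_update(publisher: str, package: Path, expected_sha256: str) -> bool:
    """Return True only if the publisher is approved and the package is unmodified."""
    if publisher not in APPROVED_PUBLISHERS:
        return False
    digest = hashlib.sha256(package.read_bytes()).hexdigest()
    return digest == expected_sha256

# Example usage (path and checksum are placeholders):
# ok = verify_content_update("tour-admin@example.edu",
#                            Path("campus_tour_v2.zip"),
#                            "3f5a...")
```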
Distance Learning and Training
After years of working with video conferencing, we're excited by the ways in which VR is enhancing distance learning.
- A professor at the University of British Columbia has already delivered lectures in VR, complete with full-body motion capture mapped to an avatar.
- The Stanford School of Business is offering an online executive education program delivered entirely through VR.
- Penn State students and faculty have shown that VR can improve learning outcomes compared with traditional online methods.
Many people learn more effectively by doing rather than just seeing or hearing. VR gives these and all students the opportunity to experience the activities they are learning about, whether that is working at an archeological dig site, guiding airplane landings on an aircraft carrier, or conducting surgical procedures.
Another great example of learning by doing is architectural design. Being able to visit and explore a building before any construction actually begins is a huge step forward for this field. Drury University's Hammons School of Architecture is among the many programs beginning to take advantage of this technology. Such an application isn't limited to the architecture field, however; one of our favorite examples involves primary school students in Ireland who are using VR to recreate and visit Irish historical sites.
Potential security issues. In addition to the issues raised in the previous use case, these applications are used in the classroom and thus entail potential sensitivities around student data, which must be secured. Further, if students bring their own VR/AR devices to the classroom, the complexity around security increases because the devices' security aspects and integrations may be unknown. User authentication is also important for determining participant identities, which helps prevent potentially serious problems such as students attending classes without registering or anonymously harassing one another in the VR/AR classroom; with authentication records, an institution can identify harassers. Such authentication can also protect against having someone copy an avatar to impersonate the instructor. Availability is a further key concern, as a network outage or DDoS attack would disrupt a live VR/AR class far more seriously than an asynchronous online class.
Proposed solution. In addition to the previously proposed solutions, integrating with campus identity management and authentication systems may be critical here. System usage may also need to be logged to a campus security information and event management (SIEM) system so that the logs can be correlated with other campus logs during incident response. Depending on the data and the system, network connections might also need to be encrypted.
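As one way to get VR/AR classroom events into campus logging, the sketch below (Python standard library only) forwards session events as syslog messages that a SIEM can correlate with other campus logs. The SIEM hostname, port, and event fields are assumptions for illustration.

```python
import json
import logging
import logging.handlers

# Assumed campus SIEM/syslog collector; replace with your institution's endpoint.
SIEM_HOST, SIEM_PORT = "siem.example.edu", 514

handler = logging.handlers.SysLogHandler(address=(SIEM_HOST, SIEM_PORT))
logger = logging.getLogger("vr-classroom")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

def log_session_event(user: str, event: str, detail: dict) -> None:
    """Send one structured VR/AR session event to the campus SIEM."""
    logger.info(json.dumps({"user": user, "event": event, **detail}))

# Example: record a successful authentication into a VR classroom session.
log_session_event("student123", "auth_success", {"course": "HIST-201", "client_ip": "10.0.0.5"})
```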
Collaboration
Collaboration is perhaps the most important VR/AR application because its implications extend far beyond research and education. VR and AR will change the way we collaborate over distance. Both University College London's Immersive Virtual Environments Laboratory and the University of California, Davis, have demonstrated how interactive virtual avatars can be mapped onto local physical spaces using AR. Case Western Reserve University has been working with Microsoft to prototype collaborative medical applications for the HoloLens AR platform. All of these pieces are coming together to help realize VR's collaborative promise.
Potential security issues. This may be the most complicated example, as the data involved in a collaboration can be sensitive and the collaborations may cross campuses. If you're collaborating on something with complex and extensive security requirements, such as research or medical cases, you need to implement strong security controls to protect the data. Collaborations in VR and AR may even be integrated into existing campus collaboration tools. In that case, you must ensure that someone in a VR/AR environment cannot use someone else's access to a collaboration tool. You may also want to consider how ransomware might work in a VR/AR environment so that you can protect against a compromised user encrypting the data in the collaboration tool.
Proposed solution. When possible, secure all integrations to ensure that security controls are consistent across the environment. You may also need to review the usage logs to verify that access is appropriately logged. You may even need to back up the data.
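As a sketch of the kind of log review mentioned above, the Python snippet below flags collaboration access events from accounts that are not on the project roster. The log format (one JSON event per line) and the roster itself are assumptions.

```python
import json
from pathlib import Path

# Hypothetical roster of accounts authorized for this collaboration.
ROSTER = {"pi@example.edu", "postdoc@example.edu", "partner@otheru.edu"}

def flag_unexpected_access(log_path: Path) -> list:
    """Return access events whose account is not on the collaboration roster."""
    flagged = []
    for line in log_path.read_text().splitlines():
        event = json.loads(line)          # assumes one JSON event per line
        if event.get("event") == "access" and event.get("user") not in ROSTER:
            flagged.append(event)
    return flagged

# for e in flag_unexpected_access(Path("collab_access.log")):
#     print("review:", e["user"], e.get("timestamp"))
```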
Security Principles for Virtual Reality and Augmented Reality
To address common issues across use cases, you can start with core information security principles that build on what your information security teams may already be doing in existing programs. In planning for security, you might also need to account for highly publicized security issues around malware, ransomware, DDoS attacks, and so on, which can impact VR/AR systems in unique ways. Performing a risk assessment during a project can help identify which security controls a VR/AR system needs.
Physical Security
One challenge with current VR HMDs is that they completely block users' visual connection to the outside world and can also block most of their auditory connection. This can dramatically diminish situational awareness, which is something colleges and universities always encourage their students to practice. Students using VR alone in an open space, such as a computer lab, could be vulnerable to having their possessions stolen literally from under their noses, or worse, they could be vulnerable to assault. One easy solution here is for VR to be used in pairs so that one user can maintain situational awareness while the other uses the HMD. Use in pairs is not always possible, however, so other solutions (such as locking lab spaces) should be considered in space planning.
Physical Safety
Along with physical security, VR/AR systems can entail basic physical safety concerns. Users may be prone to accidents during or after use. When first using a system, people can be disoriented; they might fall over when trying to regain balance or accidentally hit someone with the controller when changing directions. Balance can also be an issue when exiting the environment. When using VR/AR systems in the real world, it is important that users continue to maintain situational awareness to avoid physical dangers such as traffic, buckled sidewalks, or other potential hazards.
Incident Response
Institutions and users must be prepared for and capable of responding to any untoward incidents in a VR/AR environment. Most campuses have existing incident-response capabilities, so you can often simply extend these into the new environments. Access to logs from the system will be critical when responding to an incident. It might also be necessary to monitor the system in real time to detect inappropriate usage that requires investigation.
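To give a flavor of such real-time monitoring, the sketch below (Python) follows a VR/AR system log and prints an alert when it sees event types flagged as suspicious. The log path and event names are assumptions; a production setup would feed alerts into existing incident-response tooling rather than print them.

```python
import json
import time
from pathlib import Path

# Hypothetical event types the incident-response team wants to hear about immediately.
SUSPICIOUS_EVENTS = {"auth_failure", "avatar_mismatch", "content_tamper"}

def follow(log_path: Path):
    """Yield new lines appended to the log file (a minimal 'tail -f')."""
    with log_path.open() as f:
        f.seek(0, 2)                      # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(1)
                continue
            yield line

def monitor(log_path: Path) -> None:
    for line in follow(log_path):
        event = json.loads(line)
        if event.get("event") in SUSPICIOUS_EVENTS:
            print(f"ALERT: {event.get('event')} for user {event.get('user')}")

# monitor(Path("vr_system.log"))   # runs until interrupted
```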
Data Security
You should also implement basic data security. VR/AR systems often haven't implemented encryption for network connections, which is standard practice in more traditional communication tools such as instant messaging apps. Many VR/AR systems also rely on third-party apps or integrations with dubious security. As with other collaboration applications, the system might cache information on a local computer or network server; those data also might need to be secured, which could mean encrypting the data. These same systems can also be used as a jumping-off point for accessing the rest of your network. Further, a DDoS attack can create unexpected results in a VR/AR system, so be prepared with a business-continuity and disaster-recovery plan if a system is critical for a business process or classroom. The impact of compliance and legal requirements may be difficult to determine in these new environments, but sensitive data will still need to be adequately protected!
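As one example of protecting locally cached data, the sketch below uses the third-party cryptography package's Fernet recipe to encrypt a cache file at rest. The cache path and key handling are assumptions; in practice the key belongs in a proper secrets store, never alongside the data it protects.

```python
from pathlib import Path

# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_cache(cache_file: Path, key: bytes) -> None:
    """Encrypt a locally cached VR/AR data file in place."""
    f = Fernet(key)
    cache_file.write_bytes(f.encrypt(cache_file.read_bytes()))

def decrypt_cache(cache_file: Path, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted cache file."""
    return Fernet(key).decrypt(cache_file.read_bytes())

# key = Fernet.generate_key()   # store this in a secrets manager, not next to the cache
# encrypt_cache(Path("session_cache.bin"), key)
```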
Identity and Access Management
VR and AR can potentially improve identity and access management. You might, for example, be able to identify other people you're working with by looking at their avatars during interactions — assuming the avatar wasn't copied or accessed by an unauthorized party.
Linking avatars, accounts, and access control in a VR/AR environment to your existing identity and access management system could allow users to bypass the need to use and recall new usernames and passwords. The account could even be linked to biometric data collected by the VR/AR system to provide greater identity assurance for accounts. These same biometrics, including the tracking of physical movements, introduce new possibilities for impersonation.
Unauthorized users could copy an avatar and even mimic the rightful user's physical movements, which could make it difficult to identify an impostor. When system use and access are logged (for example, who used an avatar, when, and from where), the information can be used to investigate incidents or to monitor sessions for unauthorized usage.
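A minimal sketch of that kind of logging, with assumed field names, might record which authenticated account an avatar was bound to at the start of each session, so a later investigation can tell whether an avatar was used by its rightful owner.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("avatar_audit.log")      # hypothetical local audit trail

def record_avatar_binding(account: str, avatar_id: str, client_ip: str) -> None:
    """Append an audit record binding an avatar to an authenticated account."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "account": account,
        "avatar_id": avatar_id,
        "client_ip": client_ip,
    }
    with AUDIT_LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")

# record_avatar_binding("prof.smith@example.edu", "avatar-7c2f", "10.1.2.3")
```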
Privacy
VR/AR systems can collect far more personal information than traditional systems, and this can considerably impact user privacy. For example, VR headsets with live mics can record all conversations, while tracking systems/HMDs with always-on cameras can record video of private spaces. Further, eye-tracking technology can record what a person looks at. Add in the potential biometric data collected, and you have a treasure trove of personal information that might need protection.
To limit the impact of unauthorized access, data might need to be regularly purged. Also, the institution's data retention and privacy responsibilities might require clarification. For example, if a campus records eye movements and what individuals look at during a VR/AR experience to see if students are paying attention in class, should the campus retain that information as a student record and/or store it in a student's academic advising portfolio for evaluating student success?
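As an illustration of routine purging, the sketch below deletes VR/AR session recordings older than a retention window. The recording directory, file pattern, and 30-day window are assumptions and should follow your institution's retention policy.

```python
import time
from pathlib import Path

RETENTION_DAYS = 30                          # assumed window; set per institutional policy
RECORDINGS_DIR = Path("/srv/vr/recordings")  # hypothetical storage location

def purge_old_recordings() -> None:
    """Delete session recordings older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60
    for path in RECORDINGS_DIR.glob("*.rec"):
        if path.stat().st_mtime < cutoff:
            path.unlink()

# Typically run daily from cron or a scheduled task.
# purge_old_recordings()
```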
Future Challenges, Final Thoughts
In addition to these immediate challenges, other issues are likely to arise in the future. With AR, for example, malicious software or a DoS attack could temporarily "blind" a user, blocking the view of an oncoming car or hiding the face of an assailant. AR use in real-world environments, such as in medicine and industry, creates opportunities for malicious attackers to impact life and safety. It will also be important to consider whether to conduct sensitive business — such as high-security research or conversations with your board — using these systems. Such considerations could impact system usage policies.
Challenges aside, for technologists, it's an exciting time to be working in higher education! Many of these VR/AR advancements have the potential to improve student outcomes and collaborations in research and education. None of the security items described here should stop you from experimenting with or implementing these advancements on your campus. The issues simply highlight the need for educators, technologists, and information security teams to partner to ensure that they implement the new technologies in a way that meets their institution's risk tolerance. Such a partnership is essential because each group brings its unique perspective and experience and few have the depth to work through potential solutions alone.
Ben Fineman is a Program Manager for Cloud Services at Internet2.
Nick Lewis is a Program Manager for Cloud Services at Internet2.
© 2018 Ben Fineman and Nick Lewis. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.