© 2009 James Hilton. The text of this article is licensed under the Creative Commons Attribution 3.0 License (http://creativecommons.org/licenses/by/3.0/).
EDUCAUSE Review, vol. 44, no. 3 (July/August 2009): 8–9
The following excerpt is based on a conversation with Gerry Bayne, EDUCAUSE multimedia producer.
To listen to the full interview, go to http://www.educause.edu/ir/library/multimedia/hiltoninterview.mp3
When I look at the IT budget, I always try to recognize that there are two broad ways to think about information technology. A large part of information technology is essential, like a utility. Those of us in higher education absolutely have to have it. We have to have Internet connections, we have to have ERPs, we have to have registration systems, and we have to have communication services. Investments in these essential IT services are the largest part of most IT budgets. But there are also strategic IT investments — that is, investments in those IT services that help provide a college or university with a competitive advantage in its teaching and research missions.
At the University of Virginia, for example, I would make a distinction between the public computing labs and e-mail services, both of which are essential and consume a good deal of resources, and the high-end technical support that is needed for computationally intense scholarship and research. The essential services tend to be those that have become commoditized, meaning that they have become standardized and their prices are generally starting to drop. The strategic services are experimental: they are higher risk, and they are often one-off.
So as we look at how to approach budget challenges, the following key questions arise: How do we drive down the cost of essential services? How do we ride the commoditization wave and take advantage of economies of scale? How do we let go of services we no longer need to provide? How do we continue to find money, even in constrained budget times, to invest in new, strategic areas of information technology?
Perhaps we need to change some of the assumptions under which colleges and universities, especially IT departments, have operated for a long time. In the early days of IT services, colleges and universities were the source and the providers of most technology on campus, and they were also the source and the providers of the demand for that technology. Twenty-five years ago, if you were using computers and technology in a college/university environment, odds are you were using them to accomplish something directly related to your schoolwork or your research. Today, my seventeen-year-old daughter does her schoolwork on her computer. If that computer failed, she would of course be concerned about getting her schoolwork done, but that would not be the reason for her primal scream of pain. She would scream because her social life is inextricably connected to technology. She lives and dies by her online connection, independent of the demands of her schoolwork.
IT organizations typically assume that they should be responsible for making sure that all of the technology works. My question is: in a world where technology has become consumerized — where there are multiple vendors, multiple providers, and multiple sources of help — what is the correct role for the central IT organization in help-desk support? Put another way, what level of end-user support is the right level, and how much responsibility and cost should the IT organization assume?
It is not that I think IT organizations should get out of the help-desk business. Instead, I think that we need to figure out which services we are — and are not — going to support, and we need to find ways to drive support toward more specialized services. For example, how do we help biologists who suddenly need to parallelize their code and have access to high-performance networks and computational cycles? How do we free up our time and attention to get them the assistance they need rather than being distracted and buried by all these consumer devices and services that come with their own antivirus and connectivity problems?
At the University of Virginia, we are starting to rethink how we provide help-desk support. We are also starting to rethink our public computing labs. We have a large installation of public computing labs, and the demand — measured by the amount of time and the number of seats needed — continues to go up. And yet, when we look at how the public computing labs are being used, we find that 95 percent of the use is for commodity software that is available on the computers that students already own. So at Virginia, even though 99 percent of last year's incoming class owned laptop computers, students are using the public computing labs almost exclusively for access to browsers and office productivity software. We have to ask ourselves: should we be reallocating the resources from the public computing labs to other, more strategic purposes?
Arguably, it is convenient for students to be able to leave their laptops at home and simply stop at the public computing labs to check their e-mail and browse the web. Of course, when they leave their technology at home, it is not present and available in the classroom. If we quit enabling students to leave their laptops at home, professors might start taking more advantage of that technology in class. (Of course, that will also cause controversy, because some faculty don't want technology in the classroom.)
The point is that as we all look at budget cuts, we need to recognize that the world, especially the IT world, changes very quickly. A world in which the higher education institution provides all technology services just doesn't — or, perhaps more to the point, shouldn't — exist anymore. We need to start looking at our fundamentals, and we need to find ways to shift our resources away from investments in essential IT services and toward more strategic investments in technology.