© 2010 Jerrold M. Grochow. The text of this article is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License (http://creativecommons.org/licenses/by-nc-nd/3.0/).
EDUCAUSE Review, vol. 45, no. 1 (January/February 2010): 58-59
Although the future of information technology in higher education may not be on the list of important issues for many senior academic leaders, it should be. At a time when fluctuating endowments, increasing government regulation, competition for faculty and students, and new approaches to research publication — indeed, new approaches to scholarship — are unleashing a tsunami of change, it is no surprise that information technology does not always receive the same attention. Yet the fast-paced evolution of information technology affects all of these issues, and that makes its future an important issue too. Information technology is essential, and it can be strategic; how colleges and universities harmonize these two attributes of the IT landscape will make all the difference to their future.
There is no question that higher education is heavily reliant on information technology. Virtually every function and process in institutions of higher education is controlled or operated by IT systems — whether laboratory experiments, research simulations, classroom teaching, building management, or general administration. Researchers in almost every field are using advanced computational systems to solve fundamental and practical problems, ranging from energy to economics to medicine to the nature of the universe. IT systems are improving education through the use of feedback systems, interactive simulations, and new approaches to delivering information to students throughout the world. Although a few individuals usually spearhead any particular local IT advance, integrating these advances to benefit the college or university requires active cooperation among faculty, administration, and IT leadership, as well as significant investment in IT infrastructure.
IT investment is among the largest budget items on college and university campuses. But that investment is not always coordinated with other activities across the institution. Consider these examples:
- Many institutions spend millions of dollars to upgrade their wireless networks to allow faster, more ubiquitous access to programs, databases, and the Internet; yet at the same time, some faculty members are banning laptop computers from their classrooms.
- Faculty subject grant proposal after proposal to extensive peer review to secure external funding for their computational research projects; yet research computers that are state-of-the-art at the beginning of a project may be dated by its end because "technology refresh" wasn't planned.
- Administrators deal with internal steering committees for funding new administrative systems; yet many of these administrative systems implement procedures that have been in place for decades, often costing hundreds of thousands of dollars more than necessary.
- Students (and faculty!) manage their lives using "Web 2.0" services on their smartphones, and some even develop applications (on the Internet, on their smartphones, and on the college/university computers) that could be of value throughout the institution; yet there is often no mechanism for supporting these applications once students or faculty move on.
What is missing in these examples is the integrated view of the role of information technology across all aspects of campus activity. What is missing is the recognition of the impact that information technology can have if individual IT decisions are made in the context of broader initiatives of scholarship, research, and cost-effective administration. To paraphrase a CIO colleague: "Building a bunch of buildings doesn't magically make an efficient or attractive campus; offering a bunch of courses doesn't magically make majors and degrees." Coordinated leadership is necessary to integrate individual activities to create strategic value.
For example, over the past decade, information technology has made it possible for faculty (and students) to work from any location, to collaborate with dozens of colleagues all over the world, to teach in multiple locations simultaneously, and to publish their work online in record time. These are important advances with significant strategic impact for the institution. If faculty and students can work from almost anywhere, that changes the meaning of campus (and certainly the meaning of office space and study space), and if faculty can publish their work online, that changes the basis for evaluation and tenure decisions. In some ways, information technology is changing the very definition of the university and of scholarship — changes that need to be carefully considered as an institution thinks about its future. And to do that, institutional leaders need to take the same multidisciplinary approach to the future possibilities of information technology as they now take when considering the future of scholarship and teaching. Creating the future isn't the work of a single researcher or a single discipline. It takes a university (many universities!); it takes integrated planning and decision-making; and it takes advances in the use of information technology.
James Hilton, vice president and CIO at the University of Virginia, makes the distinction between "essential" and "strategic" IT services: "The essential services tend to be those that have commoditized [such as telephone and e-mail], meaning that they have standardized and their prices are generally starting to drop. The strategic services are experimental: they are higher risk, and they are often one-off."1 Of course, today's essential services were at one time experimental, higher risk, and developed one-off. But the case still needs to be made for more experimentation with higher-risk, potentially strategic IT services to generate the essential services of the future.
Every IT leader recognizes that his or her first job is to keep the essential services running — to keep the network connected, the e-mail flowing, the course management system available, and administrative systems operating — with decreasing budgets. But every IT leader also recognizes that a close second is to make sure that faculty, students, and staff have access to the kind of information technology that will help faculty in their research, that will engage students and teachers in different approaches to learning, and that will improve community life across the campus (and now across the "virtual campus") — services that are strategic and constantly evolving. The job for all IT leaders is to appropriately balance these two goals, engaging both the essential and the strategic nature of information technology. As a recent IBM study of 2,500 CIOs concluded, the job of the CIO is to balance the need for driving innovation to expand the business impact of information technology with the need for managing costs and creating value to raise the return on investment for information technology.2
IT leaders can't do that balancing act alone; academic leaders can't do that balancing act alone. Many IT and academic leaders are working together within their institutions to find ways both to stay at the leading edge and to ensure that priorities are set in a time of decreasing budgets:
- They are organizing "technology showcases" that engage faculty, students, and IT staff in demonstrating what information technology can do now and what it is likely to do in the near future. "Presenting the possibilities" will transform the way in which the college/university community operates. Ideas (and even implementations) that can be useful in the near-term already exist — they often just need a forum to become more widely known and used, whether that forum is a course in smartphone applications development, a program to nurture student-faculty projects so that more of them result in entrepreneurial activities, or an old-fashioned IT science fair to showcase new-fashioned IT ideas. Remember that the Mosaic web browser was developed by an undergraduate at the University of Illinois, and that changed everything!
- They are designating a portion of administrative budgets for experimenting — not in the scientific research sense but in the very practical prototyping sense: "let's try something out and see how it works." The best innovations in business processes using new technology do not spring full-blown from well-ordered IT development projects; rather, they emerge from somewhat messy, incremental, iterative prototyping in which faculty, administrative staff, and IT staff work side by side.
- They are developing future IT plans as part of a collaborative forum including academic and administrative leadership at the highest levels. The modern view is that IT plans are best developed as part of general institutional planning, rather than being developed independently after institutional plans are already set.3
These approaches and others bring college and university leaders together and increase their understanding of the influence that the constantly changing IT landscape has on the overall campus landscape. How our universities evolve over the coming decades will be significantly influenced by how institutional leaders incorporate information technology into their visioning processes. And that is why the future of information technology should be on every academic leader's list of important issues.
1. James Hilton, "Essential versus Strategic IT Investments," EDUCAUSE Review, vol. 44, no. 3 (July/August 2009): 8-9, <http://www.educause.edu/er/HiltonInterview>.
2. "The New Voice of the CIO," IBM Global CIO Study (2009), <http://www-935.ibm.com/services/us/cio/ciostudy/>.
3. Jeanne W. Ross, Peter Weill, and David C. Robertson call this process "developing an enterprise architecture" (not to be confused with an "IT architecture") in their book Enterprise Architecture as Strategy: Creating a Foundation for Business Execution (Boston: Harvard Business School Press, 2006).