As a university president who was also the institution’s vice president for information technology and CIO for ten years, I am often asked: “What do you now think about technology? From your point of view as a president, what are the major issues in information technology today? What has changed in your thinking?”
So in my fifth year as president of Indiana University, I would like to talk about what is important to me regarding information technology in higher education. I will focus on three areas: the mission of a university or college; the funding challenges of our institutions; and the centralization and commoditization of information technology.
The Institutional Mission of Universities and Colleges
Higher education institutions have three fundamental missions: (1) the creation of knowledge (i.e., research and scholarship); (2) the dissemination of knowledge (i.e., teaching and learning); and (3) the preservation of knowledge. Although the last mission—the process of preservation—may not always be foremost, it has been one of the hallmarks of colleges and universities throughout history.
All three of these were missions for Plato's Academy, founded in 387 BC. Members of the Academy wanted to disseminate the knowledge, learning, and philosophy of Socrates. They wanted to use that philosophy to create new knowledge. They also wanted to preserve that knowledge, as was done in the Platonic dialogues and elsewhere.
A remarkable fact about institutions of higher education is that they are among the oldest forms of human organization. Perhaps the oldest, dating to before Plato’s Academy, is Taxila University, founded near Islamabad in Pakistan in the sixth or fifth century BC. Nalanda University, established in India in AD 427, existed for over 700 years. The University of Nanjing claims to have had uninterrupted operations, in some form, since AD 258. Al Karaouine University in Fes, Morocco, was founded in 859, and in Egypt, Al Azhar University was established in 970. The University of Bologna was arguably the first in Europe, founded in 1088, followed by Oxford University in 1096, the University of Paris in 1150, and Cambridge University in 1209. Harvard University, which claims to be the first college in the United States, was established more than 400 years later.
Why is this point about longevity relevant? When I was still the CIO at Indiana University, I put together a committee to determine what it was that the very best researchers wanted in terms of information technology. I asked our most distinguished faculty from a variety of disciplines across the university to tell me what was really important to them. What did they feel would be key to their work in the future? Would it be high-performance network connectivity? But this was seen as being well provided for by Internet2 and other organizations. High-performance computing? But this was no longer as exotic as it had been in previous years and was now basically just a matter of dollars: X dollars buys Y teraflops. The answer that came back repeatedly was “storage.” Regardless of their field of expertise, these researchers wanted to be able to store vast amounts of digitized data, whether generated by their research facilities or otherwise. And they also wanted to store the results of the analysis of this data.
Two particularly telling remarks came out of that report, and they are as relevant today as they were at that time. First, a faculty member who has since become a National Academy member said: “I want to be able to store the data not just for my students, but for my students’ students.” Second, another respondent said that he wanted to be able to store the data, noise and all, without compression that eliminated noise, because “today's noise is tomorrow's information.”
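That second remark points to a concrete technical distinction: lossless compression reduces storage costs without discarding anything, so the noise survives bit for bit, whereas lossy schemes throw information away permanently. A minimal sketch using Python's standard zlib module illustrates the guarantee (the simulated instrument data here is purely hypothetical):

```python
import os
import zlib

# Simulated instrument output: a regular signal plus random "noise."
raw = bytes(i % 251 for i in range(10_000)) + os.urandom(10_000)

# Lossless compression: the archive is smaller, but decompression
# reproduces every byte -- including the noise -- exactly.
archived = zlib.compress(raw, level=9)
restored = zlib.decompress(archived)

assert restored == raw  # bit-exact round trip: today's noise is preserved
```

The structured half of the data compresses well while the random half barely compresses at all, yet nothing is lost either way; that asymmetry, not data loss, is the only price of keeping the noise.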
Unfortunately, researchers are not always very good at the long-term preservation and curation of data. It is not, after all, what they are paid to do. This work is often done, for example, by graduate students, who then graduate and move on. Thus, the expertise involved in organizing and storing the data tends to disappear with the graduate student. When this data is not replicable (and the great bulk of data generated in many of the sciences is not), the problem of the long-term storage of the data starts to become particularly serious.
The question then becomes: to whom should we entrust our data if we want to be able to access it in another forty, fifty, or one hundred years? The IT industry? But the IT marketplace is the very opposite of long-term stability—and that instability, of course, is one of the industry’s strengths: innovation has been a constant product of the Schumpeterian cycle of creative destruction. Consider the following list of once-dominant IT companies that no longer exist as independent firms: Sun, Wang, DEC, StorageTek, Compaq, Control Data, Thinking Machines, Commodore, Sequent, Atari. An industry fueled by creative destruction is not promising as the long-term custodian of the data and knowledge of universities and, hence, of civilization.
As I have already indicated, universities are institutions of extraordinary longevity. I believe that the best option for the long-term preservation and curation of digital data—and by “long-term” I mean not just decades but centuries—will be universities. Within universities, it is the information technology and information systems staff who have developed the technologies and the methodologies for, and who are the most proficient in dealing with, the long-term preservation and curation of data. The crucial importance of such preservation is no different for us now than it was for those in Plato's Academy to preserve the Platonic dialogues, which still exist twenty-five centuries later.
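Much of that long-term curation rests on unglamorous routine work such as fixity checking: recording a cryptographic checksum for each file at ingest and periodically recomputing it to detect silent corruption. A minimal sketch in Python, assuming a simple in-memory manifest (the file name and contents are hypothetical):

```python
import hashlib
import tempfile
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def audit(manifest: dict[str, str], root: Path) -> list[str]:
    """Return the names of files whose current digest no longer matches."""
    return [name for name, digest in manifest.items()
            if checksum(root / name) != digest]

# Record a digest at ingest, then verify it on later audits.
root = Path(tempfile.mkdtemp())
(root / "observations.dat").write_bytes(b"raw readings, noise and all\n")
manifest = {"observations.dat": checksum(root / "observations.dat")}

assert audit(manifest, root) == []  # nothing has drifted

# Simulate bit rot: any change to the file is caught on the next audit.
(root / "observations.dat").write_bytes(b"silently corrupted")
assert audit(manifest, root) == ["observations.dat"]
```

Production archives typically persist such manifests alongside the data (the BagIt packaging format is one common convention) and schedule audits across replicated copies, so that a failed check can be repaired from a good replica.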
Funding and Cost Savings
The recession has brought major financial pressures to bear on all of us in higher education—in both public and private institutions, in the United States and across the globe. Its direct effects vary from institution to institution, but we are all suffering the effects of the global financial downturn.
Colleges and universities have five main sources of income: (1) tuition; (2) state appropriations; (3) funding from the federal government; (4) philanthropy; and (5) clinical and other income from auxiliary operations. Every one of these sources is under severe pressure. Tuition income is under pressure because of the widespread concern that higher education is becoming unaffordable. The states that have so generously funded their institutions since World War II are no longer able to do so. Federal funding also appears to be in irrevocable decline, as both the NIH and the NSF prepare for cuts in funding or, at best, level funding. Philanthropy too is under pressure, though it may be the one bright spot as Baby Boomers begin to distribute, in philanthropic ways, the wealth they have accumulated (though philanthropy has a major impact on probably only about one hundred higher education institutions in the United States). Finally, clinical income—in particular, income from university hospital systems—is under strain as a result of health care reform and other pressures.
I do not believe we are going to see this situation change—certainly not in the near future. This really is “the new normal” for higher education. We will be under continuing and increasing pressure to become more productive. Becoming “more productive” means decreasing the time to degree and improving graduation rates. We will also be under more pressure to move from what was effectively an agrarian model of education, with a long summer break, to educating year-round.
To these ends, incidentally, Indiana University initiated a 25 percent discount for tuition over the summer semester. This discount will also enable IU to increase the utilization of the university’s resources, which are substantially underutilized during this period. (See http://newsinfo.iu.edu/news/page/normal/20114.html for more information.)
Since World War II, much of the productivity in the United States, and probably in much of the rest of the world, has been due to advances in technology. Most industries have been restructured due to the prodigious impact of technology, with at least two exceptions. One of these is health care. The notion of the electronic health record, for example, has still not had any kind of substantive impact, though much is hoped for. A second area where the full impact of technology is yet to be felt is in higher education. At the moment, for example, it is arguable whether online education is more expensive than traditional, face-to-face education. If and when the issue of its expense changes—and there are many people working on this issue—online education will have a significant impact on the economics of higher education.
Another development that has come out of the IT community should have a very significant impact on the cost of higher education and on the cost of running a college or university: open-source, or community-source, software. The savings that Indiana University has realized, and the costs it has avoided, through the widespread deployment of a number of major community-source software systems have been in the tens of millions of dollars. IU has played a significant role in the community-source software initiative, which I both endorse and commend.
I should note that this is a model that has been—and still is—widely used in science, certainly in my own discipline of computer science. When I started as a graduate student over thirty years ago, much of the software was community-source software that was developed, disseminated, and used by researchers around the world. This model has worked very well in the scientific community, for a variety of complicated reasons. It has not worked so well in the broader IT professional community.
All that changed, I think, with Sakai (http://sakaiproject.org/), a community that was founded in 2003 to produce open-source Collaboration and Learning Environment (CLE) software. Initially a collaboration among the University of Michigan, Indiana University, MIT, Stanford, the OKI Project, and uPortal, Sakai is now used by more than 350 organizations and institutions worldwide. This is a remarkable success story. The cost savings have been immense for higher education. Kuali (http://kuali.org/history), announced in 2004, is a community-source initiative to build a financial accounting system for higher education. In 2005, the Kuali Financial System was begun with founding partners Indiana University, the University of Arizona, the University of Hawaii, Michigan State University, San Joaquin Delta Community College, Cornell University, NACUBO, and the rSmart Group. Today Kuali comprises eight community projects and is already deployed by dozens of organizations at research universities, community colleges, public and private institutions, and commercial affiliates internationally. I was pleased to have approved Indiana University’s participation in both these initiatives while I was CIO.
Sakai and Kuali are excellent examples of the cost savings that can be generated by institutional initiatives. They also enable higher education institutions to take control of their own destinies, to have direct control over a part of their infrastructure and a fundamental part of what they do, at least for the foreseeable future: the management of their teaching and financial resources.
Both of these projects received early funding from the Andrew W. Mellon Foundation. This investment in community-source systems was critical to enabling the efforts and provided the independent validation for the projects to become successful. The Mellon Foundation deserves enormous credit for this. I know there is considerable debate (I hear it from presidents and others) over the importance of community-source software and how viable it is as a model, but I think the evidence of its success and viability is now overwhelming, and I certainly endorse it.
There are other cost-savings models. For example, Internet2 has provided substantial bandwidth to all the institutions that are connected to its Internet backbone to support research computing and cyberinfrastructure more generally in this country. E-texts are another example: Indiana University recently announced an eText Initiative consisting of agreements with a range of publishers, and others have similar initiatives. These agreements will result in lower-cost options to save students money on required course materials and provide new tools for teaching and learning.
Centralization and Commoditization
I am—and have been for most of my career—a firm believer in the centralization of information technology, where appropriate. The phrase “where appropriate” is not a qualifier meant to let me quietly negate the assertion. I am mindful of the importance of striking the proper balance: centralizing services and infrastructure where that makes sense for an institution, where it is technically feasible, or where the savings are demonstrable. We must also consider the specific needs of those who use the services generated by, or deployed through, those central services.
The basic argument for centralization has not changed from what it was when I first came to IU about fifteen years ago. At that time this argument was just starting to take hold. At the risk of grotesquely oversimplifying a highly complex situation, until roughly the mid-1990s, there was a heterogeneous mix of operating systems. Now there are basically only three: UNIX (in its various forms), MacOS, and Windows. There were a number of competing networking protocols. Now there is basically one: TCP/IP. There was a bazaar of desktop machines. Now there are only a relatively small number of suppliers.
What we have seen is a relentless commoditization of information technology driven by Moore’s Law, and hence the very substantial reduction of alternatives and extensive standardization. All this leads to very substantial savings as economies of scale are leveraged more and more.
Because of this, it is much harder—almost impossible, in fact—for university departments and units to claim uniqueness when it comes to their IT environments. Institutional savings, as well as improvements in institutional research and educational capabilities and in productivity, therefore flow from leveraging an increasingly homogeneous IT environment.
Certainly at IU, we have progressively centralized the server infrastructure. There are good economic reasons for doing so, but there are also critical security reasons. Often colleges and universities have members of their governing boards who come from the commercial sector. They look at the centralization of IT services in industry, and they ask, “Why are you different?” Although there are exceptions in areas such as embedded processors and experimental architectures, it is very hard to make a case for universities being that much different from businesses in terms of their fundamental IT infrastructure.
With greater centralization of IT resources, the savings are immense. For example, when Indiana University moved to the consolidated purchasing of desktop machines and software—rather than allowing each department to make its own purchasing agreement—we saved tens of millions of dollars annually. These were genuine savings that flowed back directly to the various schools and departments of the institution.
Of course, this commoditization of information technology has, in turn, driven the revolution in mobile computing and communications. By some estimates, there are more cell phone users in Africa than in the whole of the United States and Europe combined, and iPhone and iPad apps are being developed for children as young as two or three years old.
Mobile computing and communication and associated phenomena such as social networking are already having a profound impact not only on higher education but also on the political structure of the world. Much of this impact in higher education is still becoming clear, but it will be enormous. How it is harnessed productively to support the mission of higher education institutions will be the task and challenge for the next generation of college and university CIOs.
Conclusion: The Continuing Importance of the CIO
It has been said in some quarters—and I have certainly heard this from many people—that the role of the CIO in a university is diminishing. I disagree: there could be no more important time than now for the CIO to be a senior officer at a higher education institution and to have a role that reports either directly to the president or, at the least, to the provost or equivalent.
The CIO has responsibility for vital resources in the college or university. The importance of those resources can be seen through a simple thought experiment: imagine that information technology suddenly stopped working at an institution. That unfortunate institution would come to a complete halt in every aspect of its education, research, and business activities. The CIO can have a major impact on the research and educational missions of a university through the timely deployment and adoption of the most innovative developments in information technology. He or she can also have a major impact on the financial bottom line of the institution through some of the areas I have mentioned above and in hundreds of others. The institution requires the oversight and direction of someone with the strategic vision to see how to pull all of the pieces together, someone who can communicate not only down to his or her fellow technologists but also up to people who are not necessarily knowledgeable about the intricacies of the technologies.
This is as important a time as there has ever been for the CIO to have a seat at the table with the president or provost of an institution. It is also a key time for the IT organization to be seen as a partner. It is absolutely essential for the IT organization to run first-class, cost-effective, efficient services to support the institutional missions. By providing such services, IT professionals—many of whom are highly skilled technologists with advanced degrees—can become partners with the institution. The IT organization can help the college or university balance centralization and commoditization, address funding challenges, and preserve knowledge.
I started this article by trying to answer the question I am often asked as to what has changed in my thinking since I became a university president, and I began by reflecting on the history of the university. Let me end, then, as I began.
Universities are magnificent institutions—at their best, among the finest the human race has ever devised. They are also institutions of ancient lineage, dedicated to the creation and dissemination of knowledge and to preserving the knowledge of past centuries for future centuries. Information technology has sprung from almost nowhere to become a fundamental force in human society in less than half a century. As we harness this brash and powerful new force within higher education, we must never lose sight of using it to preserve and enhance all that is great about and within universities.
This article is based on Michael A. McRobbie’s keynote session delivered at the 2011 EDUCAUSE Annual Meeting, Philadelphia, Pennsylvania, October 21, 2011.
© 2012 Michael A. McRobbie. The text of this article is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License (http://creativecommons.org/licenses/by-nd/3.0/).
EDUCAUSE Review, vol. 47, no. 1 (January/February 2012)