From the Dark Ages (~476 to 1000 AD) through the Renaissance (~fourteenth through seventeenth centuries), the Age of Reason/Enlightenment (~1650 to ~1800), and right on up to the Industrial and Information Ages, the history of man has been an epic quest to expand knowledge. Sitting at ground zero of this Sisyphean battle against ignorance was the university/college. Toiling quietly, heroically, and for the most part invisibly in the background was the university/college IT department, the secret ingredient in the recipe for success of the contemporary educational enterprise. And then “something happened.”1
The first decade of the third millennium has not been kind to the modern university or its IT department. Society has become, in stages, aware of, concerned about, and semi-outraged at the trajectory of costs associated with accessing a university/college education.2 No less an authority on innovation than Clayton M. Christensen, the Robert and Jane Cizik Professor of Business Administration at the Harvard Business School, believes that the modern university suffers intensely from “The Innovator’s Dilemma” and is vulnerable to disintermediation by more agile and lower-cost educational alternatives.3
In this article, I will propose that the informed and affordable use of advanced analytics can materially improve the future of University 3.0 and its IT department. My hypothesis goes as follows: To be an exceptional university/college, a higher education institution needs world-class information technology. To be a world-class IT shop in the very near future, the IT department needs a mastery of advanced analytics. The path forward requires (1) effecting a rapprochement between information technology and the rest of the university and (2) the aggressive embrace, by IT leadership, of the tools and techniques of advanced analytics.
Barbarians at the Gate—There’s Nothing New Here
Universities are not strangers to controversy and existential attacks. Historians of higher education may recall Clark Kerr, chancellor of the University of California, Berkeley, and ultimately president of the UC system, who quipped after his dismissal that he had left the university the same way he had entered it: “fired with enthusiasm.”4 Information technology, too, is periodically (some might argue perpetually) assaulted by controversy and existential attack.
If chroniclers from another planet were to visit Earth a thousand years from now and write the history of our technology, the ten years from 2000 to 2010 would stand out as a rough patch. The large-scale enterprise application projects launched in anticipation of the Y2K bug expanded in price and contracted in benefit, materially reducing the credibility of IT leadership. These massively complex systems of record were initiated with the ambition of optimizing the capture and analysis of process knowledge, but as deadlines slipped and costs grew, the goal became not to create better processes but simply to get the systems implemented.
This was also the decade when Nicholas Carr penned his screed “IT Doesn’t Matter.”5 Every year the world spends about $3.7 trillion on information technology, and yet many organizations, including many colleges and universities, operate on the edge of chaos.
Analytics Is to Technology What the Stirrup Was to Warfare
Today, advanced analytics is the most affordable, most scalable, and lowest-risk path to performance improvement available to management teams around the globe. A rich set of tools and methodologies exists. For the past five years, analytics (also referred to as “Business Intelligence/BI”) has featured prominently first in the “Technologies to Watch,” then in the “Technologies Being Evaluated,” and currently in the “Technologies Being Deployed” lists of the various subscription research firms monitoring the IT industry. And yet despite the headlines, analyst attention, and coverage by the trade press, analytics remains fundamentally misunderstood by business and campus communities alike.
Using the Verstehen techniques championed by the nineteenth-century German philosopher Wilhelm Dilthey, I have spent the last two years doing an ethnographic study of analysts.6 Advanced analytics encompasses several related but frequently separated disciplines to which I apply the mnemonic acronym F.O.D.D.R.S.:
- Forecasting
- Operations Research
- Data Mining
- Data Integration
- Reporting
- Statistics
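To make the taxonomy concrete, the following is a minimal sketch, in Python, of the two disciplines most campuses start with: data integration and reporting. The tables, departments, and figures are hypothetical, invented purely for illustration.

```python
# A minimal illustration of the "Data Integration" and "Reporting"
# entries in F.O.D.D.R.S., using two invented campus data sources.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrollments (student_id INTEGER, dept TEXT);
    CREATE TABLE help_desk   (student_id INTEGER, tickets INTEGER);
    INSERT INTO enrollments VALUES (1, 'Biology'), (2, 'Biology'),
                                   (3, 'History'), (4, 'History');
    INSERT INTO help_desk   VALUES (1, 2), (2, 0), (3, 5), (4, 1);
""")

# Integration: join two systems of record on a shared key.
# Reporting: a descriptive roll-up, one row per department.
rows = conn.execute("""
    SELECT e.dept,
           COUNT(*)       AS students,
           SUM(h.tickets) AS tickets
    FROM enrollments e
    JOIN help_desk h ON h.student_id = e.student_id
    GROUP BY e.dept
    ORDER BY e.dept
""").fetchall()

for dept, students, tickets in rows:
    print(f"{dept}: {students} students, {tickets} help-desk tickets")
```

The same pattern scales up: in most institutions the hard, valuable work is getting the joins right across dozens of systems of record, after which the reporting layer is comparatively straightforward.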
The analytics landscape spans a spectrum, from basic capabilities that deliver modest value to extraordinary competitive differentiation. Base-level business analytics is descriptive:
- What happened?
- How many, how often, where?
- Where exactly is the problem?
- What actions are needed?
Basic analytics helps us react to the world as it happens.
Advanced analytics is both predictive (i.e., what will happen) and prescriptive (i.e., what should be done about it). Advanced analytics elevates decision-making into the all-important realm of understanding:
- Why is this happening?
- What if these trends continue?
- What will happen next?
- What is the best that can happen?
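A minimal sketch can ground the distinction. In the Python fragment below, an invented series of monthly help-desk ticket counts answers the descriptive questions directly, while a simple least-squares trend line, just one of many possible predictive models, hazards an answer to “What will happen next?” The numbers are hypothetical.

```python
# Descriptive vs. predictive analytics on an invented monthly series
# of help-desk ticket counts (illustrative numbers only).
tickets = [310, 335, 362, 390, 421, 455]

# Descriptive: what happened? how many, how often?
print("total:", sum(tickets), "| monthly mean:", sum(tickets) / len(tickets))

# Predictive: what will happen next? Fit an ordinary least-squares
# trend line y = a + b*x and extrapolate one month ahead.
n = len(tickets)
x_bar = (n - 1) / 2                      # mean of x = 0, 1, ..., n-1
y_bar = sum(tickets) / n
b = (sum((x - x_bar) * (y - y_bar) for x, y in enumerate(tickets))
     / sum((x - x_bar) ** 2 for x in range(n)))
a = y_bar - b * x_bar
print("trend forecast for next month:", round(a + b * n))
```

Prescriptive analytics takes the final step, turning the forecast into a recommendation: for example, how many help-desk staff to schedule next month.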
In the buildup to the era we now occupy, there was a widespread desire for knowledge. Leading elements of society wanted to know. There was a general presumption that a best effort had been made, that those in authority knew what was knowable. That presumption, in the face of widely publicized and monumental failures of information/knowledge management, has been shaken. Society, in all its aspects, is now demanding that leaders actually know what they are doing. Leadership success is now tied to a rapidly expanding requirement for knowledge.
Martin Wolf, the associate editor and chief economics commentator at The Financial Times, summarized the general zeitgeist by explaining that most people “no longer believe that they [the executives running organizations] know what they are doing.” David Brooks, a PBS NewsHour commentator, expanded this general lack of trust to the public sector: “People have lost faith in the government—both parties . . . and [in] the ability of government to handle problems.”7 The quickest way executives can win back the trust of those they would lead is to master advanced analytics.
Data Has Been Rediscovered
Stakeholders now expect, particularly in regard to resource-allocation decisions, transparency about how decisions are made. Somewhere, sometime, someplace, somehow, we rediscovered data.
There is a lot of data. The analyst firm IDC estimated that in 2010 the world's data would total approximately 1.2 million petabytes, or 1.2 zettabytes. A petabyte is a million gigabytes; 1.2 zettabytes is roughly the amount of information that would fit on a stack of DVDs reaching from Earth to the moon and back.8 Collecting and analyzing data has never been more affordable. Data-management proficiency has emerged as a major ingredient of world-class operation. The rarely articulated implication of all of this data floating around is that unaugmented human cognition (the old “know”) is no longer sufficient.
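The arithmetic at these scales is easy to mangle, so a quick back-of-the-envelope check is worthwhile. The DVD capacity and thickness below are rough, commonly cited figures, assumed only to confirm the order of magnitude.

```python
# Sanity-checking the data-volume claims with decimal (SI) units.
GB, PB, ZB = 10**9, 10**15, 10**21   # bytes per gigabyte/petabyte/zettabyte

print(PB // GB)                      # 1,000,000: a petabyte is a million GB
print(1.2 * ZB / PB)                 # ~1,200,000: 1.2 ZB = 1.2 million PB

dvd_bytes = 4.7 * GB                 # single-layer DVD, approximate
dvd_mm = 1.2                         # disc thickness, approximate
stack_km = (1.2 * ZB / dvd_bytes) * dvd_mm / 1e6
print(round(stack_km))               # ~306,000 km: on the order of the
                                     # Earth-moon distance
```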
Every day there is more to know, more ways to know it, and heightened expectations (by students, faculty members, alumni, football coaches, trustees, regulators, and elected officials) that senior managers will do something efficacious with what they know. Very recently a switch was toggled in the collective mind: “information overload” migrated from a problem to be dealt with to an opportunity to be exploited. Paul Otellini, president and CEO of Intel, was asked last year: “What is going to be obsolete next?” Otellini responded: “Ignorance.”9 The best tool in the battle against ignorance is advanced analytics.
The new law of the new jungle can be stated simply. Know: thrive and prosper; do not know: wither and die. We are standing on a hinge of history. The choice is ours.
1. “Something must have happened” was how one of the world’s top art historians explained the fundamental change that occurred in mindsets during the Renaissance. See Erwin Panofsky, Renaissance and Renascences in Western Art (Stockholm: Almqvist & Wiksell, 1960).
2. Annalyn Censky, “Surging College Costs Price Out Middle Class,” CNNMoney, June 13, 2011, <http://money.cnn.com/2011/06/13/news/economy/college_tuition_middle_class/index.htm>.
3. See Clayton M. Christensen, The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail (Boston: Harvard Business School Press, 1997); Clayton M. Christensen, Michael B. Horn, and Curtis W. Johnson, Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns (New York: McGraw-Hill, 2008); and Clayton M. Christensen and Henry J. Eyring, The Innovative University: Changing the DNA of Higher Education from the Inside Out (San Francisco: Jossey-Bass, 2011).
4. Clark Kerr, The Gold and the Blue: A Personal Memoir of the University of California, 1949-1967 (Berkeley: University of California Press, 2001).
5. Nicholas G. Carr, “IT Doesn’t Matter,” Harvard Business Review, May 2003. For one response from higher education, see Jack McCredie, “Does IT Matter to Higher Education?” EDUCAUSE Review, vol. 38, no. 6 (November/December 2003), pp. 14–22, <http://www.educause.edu/library/erm0360>.
6. The Verstehen approach, which in many ways is similar to Toyota’s Genchi Genbutsu (“Go and See”) approach, involves chronicling the first-person participatory perspective that agents have on their individual experience as well as their culture and society. For more on analysts, see Thornton May, The New Know: Innovation Powered by Analytics (Hoboken, N.J.: Wiley, 2009).
7. Martin Wolf, interview, Charlie Rose, April 7, 2011, <http://www.charlierose.com/view/interview/11603>; David Brooks, PBS NewsHour, April 22, 2011.
8. Dylan Tweney, “Here Comes the Zettabyte Age,” Gadget Lab Blog, Wired, April 30, 2010, <http://www.wired.com/gadgetlab/2010/04/here-comes-the-zettabyte-age/#more-39224>.
9. Paul Otellini, interview, Charlie Rose, February 26, 2010, <http://www.charlierose.com/view/interview/10883>.
© 2011 Thornton A. May
EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)