Since the first use of the term chief information officer 37 years ago, the CIO's role inside and outside of academia has evolved—along with the technology—through three distinct eras.
In 1984, there were fewer than 20 chief information officer (CIO) positions in higher education in the United States. By 2017, 68 percent of US higher education institutions had a CIO position, and many others had a similar role with a different title.1 These 30-plus years included rapid technological evolution, which can be divided into three distinct eras: the mainframe era, the distributed era, and the web era.2 Because technology, vendors, and leadership are all intertwined, the CIO role evolved during these three periods from that of operational resource allocator to executive entrepreneur.
In this modern entrepreneurial role, CIOs are expected to be business strategists, contract negotiators, information architects, and enterprise systems leaders, while still engaging in public relations and financial planning for the institution.3 Such an expansion in duties is a departure from the role's original, narrow purpose: to create policy and manage data-processing procedures.
The evolution in the CIO role mirrors the increase in technology's strategic importance in higher education.4 Most recently, CIOs are at the forefront of organizational challenges—both internal and external to the institution—including disruptive innovations (e.g., MOOCs) that have brought globalization to higher education.5
William Synnott and William Gruber first articulated the term chief information officer in their 1981 book Information Resource Management: Opportunities and Strategies for the 1980s.6 They viewed the CIO as a role that would naturally evolve given the growing prevalence of technology resources in an organization. In their vision, the CIO would be a C-level executive, sharing power equally with the chief executive officer (CEO) and the chief financial officer (CFO). Whether that equally shared power within the C-suite has been achieved is subject to debate, but the CIO role is now ubiquitous,7 and Synnott and Gruber's seminal book—which focused on CIOs in finance—turned out to be foundational for CIOs in many fields.
Mainframe Era
The mainframe era covers the 1960s through the early 1980s, when large mainframe computers dominated the IT landscape. During this era, the primary function of computers was to reduce the company's clerical load, and more than 90 percent of installed systems were classified as transaction processing systems. Because the important technology metric was return on investment (ROI) in relation to clerical staff, the focus was on output growth and quantity, not output quality.
Mainframe era applications were mostly accounting related, so most companies put systems under the CFO's purview. IBM dominated the vendor landscape, leading to the common refrain that nobody ever got fired for buying IBM. However, this dominance also meant that information system managers were viewed as unimaginative because they simply waited for the outside vendor to introduce or update products.
Information managers—the precursors to the CIO role—were frequently neglected and misunderstood by management.8 Indeed, one late-1970s survey of corporate America showed that 50 percent of information managers had been replaced in the previous 18 months. Surveys also showed a lack of clarity around job expectations for information managers, while business leaders saw projects with runaway costs and low ROI. Although still hired for technical skills, information managers increasingly had to manage budgets, projects, and subordinates—without an accompanying increase in respect or opportunity to provide input to top-level management.
As the mainframe era gave way to the distributed era, the new data processing manager role relinquished sole control over the various data projects, ceding ground to local information managers within departments who managed their own systems. The esoteric and reclusive engineers of the mainframe era slowly gave way to a more standardized business function as a growing number of line managers became computer literate, making computing a less specialized skill. By the early 1980s, there was a growing belief that nontechnical users could eventually manage their own systems. Moreover, large centralized systems—as well as the people who managed them—proved slow, unresponsive, and expensive.
It was around this time that the first academic CIOs entered the landscape. Anne Woodsworth was the first to study and describe the higher education CIO in the academic literature. She found that "while no ideal model has emerged for information management roles in universities," the higher education management model for the CIO "is very similar to the hypothetical one proposed for the corporate sector by Synnott and Gruber." In higher education institutions, the CIO typically evolved from the library; based on her research, however, Woodsworth noted that the CIO role would likely evolve independently and that the library's authority over technology would diminish. This is largely what happened: the position evolved away from being an "information czar" overseeing basic data uses in the library to its wide breadth of responsibilities today.9
Distributed Era
The distributed era, which covers the late 1970s through the mid-1990s, was marked by advances in microprocessors, which made possible the personal computer's invention and the broad distribution of computers throughout an organization. These advances—heralded by the famous 1984 Macintosh ad—also let companies transition away from using technology simply to increase clerical processing speed and move toward using it to gather and analyze information to support decision making. During this time, management of and direct control over systems became decentralized. In addition, advances in programming languages, telecommunications, networks, and databases provided new opportunities for technology use throughout the organization.10
In the distributed era, individual applications did not change significantly; they continued to focus on tasks similar to those of the mainframe era. Applications did cater more to local department needs and could be customized to a particular local purpose. However, this localized software created challenges of integration and coordination in large organizations and eventually led to technology silos. Further, the convergence of technology services began to disrupt the traditional separation of roles and responsibilities. The phone had been a utility with wires, the typewriter a piece of equipment, and data storage (that is, a file cabinet) furniture, each with its own suppliers and support structures; now all three came together in a single converged device. This convergence created organizational uncertainty over who was responsible for the new device.
As the technology moved out of a centralized data center, managers had to make local decisions about technology. At the same time, networking advances required them to integrate and coordinate their decisions with a central hub of some kind. This new local experience with technology's power facilitated collaboration and innovation across the organization; it eventually turned technology into a tool for communication and, later, competitive advantage.11
Synnott and Gruber viewed the CIO as a solution to the increasing complexity in organization, management, and technology, noting that only the most advanced organizations could understand the significance of a C-level IT manager.12 To be successful as a peer to existing executives, they said, CIOs would need diverse qualifications as communicators, strategic thinkers, and integrators. In 2015, Brad Davis and Joe McDonagh further specified that "CIOs would have to be a business person first, managers second, and technologists third."13 Indeed, by the start of the 1990s, CIOs were no longer technicians; they were top managers with strategic importance.
Although largely driven by internal forces, the CIO role grew synergistically with the changing world of business and consumer technology through the early 1990s. The practitioner press began to follow and promote technology strategies to business executives, while books and articles began to push the idea of process re-engineering.14 Rather than promoting technology to facilitate business, writers urged businesses to reorganize around technology and promised exponential—rather than incremental—leaps in performance and profits.15 Technology's evolving demands and promises meant CIOs had to take on new roles and new challenges.
To meet these demands, new and existing suppliers created two "cutting-edge" options for companies: IT outsourcing and enterprise resource planning systems, which were large integrated systems sold as a comprehensive package. These new technology options were advertised as offering a competitive advantage; in reality, however, they were costly and unreliable. While some companies thrived in this technology-focused endeavor, others declined to invest heavily in restructuring for technology and found that many of their business process re-engineering efforts failed. To survive and deal with skeptical executives and ever-rising technology costs—without clear benefits—CIOs learned to be organizational designers, technology advisors, technology architects, and (above all) informed buyers. Academic studies of the day found that, by the early 1990s, CIOs already more closely resembled CEOs than information managers.
As the distributed era gradually replaced the mainframe era, higher education's management and leadership requirements evolved as well. Technologically, computers became less expensive and more broadly distributed across campus, and, as in the private sector, local managers gained independent control over the distributed systems. As desktop computers replaced centralized data processing centers, CIOs in higher education evolved to be more manager and less technician; more strategist and less data processor. However, with this change, these CIOs also felt tremendous pressure both to help the institution and to impact learning. Describing this increase in CIO pressure, Robert Gillespie, the former vice chancellor for computing at the University of Washington, noted jokingly: "The only thing computer czars have in common with Russian czars is a high probability of being assassinated."16
Web Era
The web era started in the mid-1990s and was marked by expanded use of networks and the web to drive new internal and external technology in organizations. By this point, information technology was firmly rooted in organizations, and its value was widely recognized by executives who "proclaimed their businesses had embraced e-commerce."17 While networks existed in the distributed era, they were typically localized to a place or organization. In contrast to these proprietary in-house networks, web era services were open on the web. This new form of engagement with customers via the web was seen as an important marketplace disruption and innovation.
Key vendors in the web era included both established firms, such as IBM and Microsoft, and startups that offered web-based applications or expanded services for customer-facing interfaces. One of the web era's defining characteristics was the proliferation of access and the low barriers to entry for new vendors, some of whom made billions of dollars, seemingly overnight. As small, nimble startup companies evolved around a tech product, the web era meant customers now expected existing companies to display continued strategic agility as well. With the proliferation and ubiquity of technology and vendors, Nicholas Carr argued that IT purchases no longer presented a strategic value: to many vendors' chagrin, he noted, "studies of corporate IT spending consistently show that greater expenditures rarely translate into superior financial results."18 However, many still believed technology did matter, and the tech firms continued to receive large investments.
With technology systems continuing to cover internal employees and corporate infrastructure—and now adding external customer-facing applications—CIOs were required to add even more skills to their repertoire. James Spitze and Judith Lee best summed up the challenge of the web era CIO:
The role remains immature… the technological products and services used by CIOs are rapidly changing, as they have been for over fifty years; and the cross-functional nature of the role continues to expand and now frequently includes end-customers (looking "downstream") and vendors/suppliers (looking "upstream").19
Still, despite this expanded portfolio, CIOs were expected to provide more services at a reduced cost because executives believed technology could deliver more for less.
Like their corporate counterparts, academic CIOs in the early web era were "being challenged to embrace and drive business change and replace old and inflexible computational infrastructure with more robust and integrated ones appropriate for the modern university."20 The first fully online programs appeared in 1994, and many institutions felt the pressure from this new medium. Reuben Dlamini's research confirmed that the web era academic CIO was a change agent and strategist who had to constantly learn, adapt, and develop professionally as institutional needs evolved. Still, despite the acceptance of the CIO as a feature of the modern higher education institution in this era, specifics of the role across higher education—such as a centralized-versus-decentralized management structure, strategic technology implementation, and best practice governance—remained unsettled.
What's Next?
As the web era comes to a close, it is interesting to look at the landscape and wonder: What's next? MOOCs brought instant globalization to education but failed to generate persistent interest. Mobile devices extended the web into our classrooms and every aspect of our lives, but is mobility simply the last extension of the web era? Will augmented and virtual reality usher in the immersive era? Or are large-scale cloud-based service providers rebooting the cycle and bringing us back to a new mainframe 2.0 era? Regardless of what happens, the academic CIO's role will continue to evolve to tackle the challenges and lead higher education into the new frontier.
Notes
1. Anne Woodsworth, "The Chief Information Officer's Role in American Research Universities" (PhD diss., University of Pittsburgh, 1988), 48; Jeffrey Pomerantz, IT Leadership in Higher Education, 2016: The Chief Information Officer (Louisville, CO: ECAR, 2017).
2. Brad Davis and Joe McDonagh, "The Evolving Role of the Chief Information Officer (CIO)," in Manish Wadhwa, ed., Technology, Innovation, and Enterprise Transformation (Hershey, PA: IGI Global, 2015): 207–232.
3. Adam Marks and Yacine Rezgui, "IT Leadership in Higher Education: The CIO Candidate," IT Professional Magazine 13, no. 3 (2011): 52–56.
4. Reuben Dlamini, "The Role of the Strategic and Adaptive Chief Information Officer in Higher Education," Education and Information Technologies 20, no. 1 (2015): 113–140.
5. William G. Tierney and Michael Lanford, "Conceptualizing Innovation in Higher Education," in Michael Paulsen, ed., Higher Education: Handbook of Theory and Research 31 (Cham, Switzerland: Springer International Publishing, 2016).
6. William R. Synnott and William H. Gruber, Information Resource Management: Opportunities and Strategies for the 1980s (New York: John Wiley & Sons, 1981).
7. Wayne A. Brown, with Teresa Brown, The Chief Information Officer in Higher Education (Albany, NY: CHECS, 2015).
8. Blake Ives and Margrethe H. Olson, "Manager or Technician? The Nature of the Information Systems Manager's Job," MIS Quarterly 5, no. 4 (1981): 49–63.
9. Woodsworth, "Chief Information Officer's Role."
10. James L. McKenney and F. Warren McFarlan, "The Information Archipelago—Maps and Bridges," Harvard Business Review 60, no. 5 (September–October 1982): 109.
11. Blake Ives and Gerard Learmonth, "The Information System as a Competitive Weapon," Communications of the ACM 27, no. 12 (December 1984): 1193–1201.
12. Synnott and Gruber, Information Resource Management.
13. Davis and McDonagh, "Evolving Role of the CIO."
14. Michael Hammer and James Champy, Reengineering the Corporation: A Manifesto for Business Revolution (New York: Harper Business, 1993).
15. Thomas H. Davenport and James E. Short, "The New Industrial Engineering: Information Technology and Business Process Redesign," Sloan Management Review 31, no. 4 (1990): 11.
16. John Donnell and Isadore Newman, "The Changing Role of the Chief Information Officer in Higher Education," in Delmus E. Williams and Edward D. Garten, eds., Advances in Library Administration and Organization, vol. 17 (Bingley, UK: Emerald Group Publishing Limited, 2000): 155–185.
17. Jeanne W. Ross and David F. Feeny, "The Evolving Role of the CIO" (working paper, MIT Sloan School of Management, Center for Information Systems Research, 1999), 11.
18. Nicholas G. Carr, "IT Doesn't Matter," Harvard Business Review 81, no. 5 (2003): 41–49.
19. James Moffat Spitze and Judith J. Lee, "The Renaissance CIO Project: The Invisible Factors of Extraordinary Success," California Management Review 54, no. 2 (2012): 72–91.
20. Dlamini, "Role of the Strategic and Adaptive CIO," 114.
Jonathan Blake Huer recently earned his doctorate in education leadership from Lamar University. His research focused on the higher education CIO.
© 2018 Jonathan Blake Huer. The text of this work is licensed under a Creative Commons BY-NC 4.0 International License.