Bring Data: A New Role for Information Technology after the Spellings Commission


© 2006 Kenneth C. Green

EDUCAUSE Review, vol. 41, no. 6 (November/December 2006): 30–47

Kenneth C. Green, founding director of The Campus Computing Project (http://www.campuscomputing.net), is a visiting scholar at The Claremont Graduate University and also host and co-producer of the Ready2Net series at California State University, Monterey Bay (http://www.csumb.edu/ready2net). Green received the first EDUCAUSE Award for Leadership in Public Policy and Practice in 2002. Comments on this article can be sent to the author at [email protected].

Summer 2006 was long and hot across the United States. Data from the National Weather Service suggest that this was one of the warmest summers on record. (Perhaps there really is something to the claims about global warming.) Of course, the weather was not the only source of heat—and of discomfort—for the higher education community this past summer. In offices across the country, academic leaders, association executives, news reporters, and policy experts—plus polemicists from the left, right, and center—functioned as metaphorical meteorologists, tracking what many viewed as the impending storm brewing in the interim drafts and forthcoming final report from the Secretary's Commission on the Future of [U.S.] Higher Education.

Education Secretary Margaret Spellings convened the Commission in September 2005. During her inaugural tour of education conferences in winter 2005, Secretary Spellings, appropriating a quip from (the easily Googled) W. Edwards Deming, the widely recognized father of statistical quality control, told education audiences: "Back in Texas we like to say, 'In God we trust; all others bring data.' " Secretary Spellings's appropriation of Deming's bring data message was a charming and disarming way to describe a key aspect of the George W. Bush administration's priorities in education: evidence and assessment.

These priorities were clearly apparent in the Bush administration's first-term focus on K-12 education, culminating in the No Child Left Behind (NCLB) Act of 2001. Although some education leaders praise the NCLB Act, teachers unions have been very critical, and several states have sued the U.S. Department of Education over various provisions of the implementation of the NCLB legislation. Undoubtedly, the Spellings Commission report—officially released on September 19 under the title A Test of Leadership: Charting the Future of U.S. Higher Education—will also have its fair share of advocates and antagonists. But as with the NCLB legislation, one fact is indisputable: the Spellings Commission report sends a strong message that the Bush administration is serious about assessment and accountability in higher education.

Plus ça Change

Secretary Spellings charged the Commission, chaired by former Texas Regents Board Chairman Charles Miller, to "think boldly" about the future of higher education in the United States. Yet despite this mandate, A Test of Leadership may well generate a sense of plus ça change (the more things change, the more they stay the same) for many in the higher education community. Even though the individual commissioners may not have been happy with each and every element in the final document, the only objecting vote to the third and final prerelease draft came from Commission Member David Ward, the president of the American Council on Education, who also refused to sign the final report.1 (For the record, and of special interest to the higher education IT community, Commission Member Gerri Elliott, corporate vice president of Microsoft's Worldwide Public Sector division, lodged a post-final-vote objection, "vigorously" opposing the report's language supporting open-source software and open-content projects in higher education.2)

Reporters at the Chronicle of Higher Education and Inside Higher Education spent a good part of their summer parsing the sometimes small, sometimes semantic, and sometimes significant changes in the tone, tenor, and text of the evolving drafts. Inside Higher Education called the first draft a "stinging critique" of the nation's colleges and universities.3 The initial text, described by Commission Chairman Miller as "very rough," characterized U.S. higher education as offering "equal parts meritocracy and mediocrity." But as the document evolved over the summer, the subsequent drafts reflected a softer, if still critical tone. Of course, conspiracy theorists might view the moderating criticism in subsequent drafts as an explicit strategy on the part of some commissioners or the Commission staff: by moving toward "moderation" in the later versions of the document, the most critical members of the Commission and its staff could place on the public record the evolution of the Commission's work—taking a proverbial "shot across the bow" without having a heavy-handed critique stand as the final word.

Still, despite what some observers in the press and the higher education community have characterized as the critical tone and tenor of the report, the final text of A Test of Leadership pales in comparison to the terse opening paragraphs of A Nation at Risk, the 1983 report presented to President Ronald Reagan and the American people by the National Commission on Excellence in Education, chaired by David P. Gardner, president of the University of Utah and president-elect of the University of California:

Our Nation is at risk. Our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world. This report is concerned with only one of the many causes and dimensions of the problem, but it is the one that undergirds American prosperity, security, and civility. We report to the American people that while we can take justifiable pride in what our schools and colleges have historically accomplished and contributed to the United States and the well-being of its people, the educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people. What was unimaginable a generation ago has begun to occur—others are matching and surpassing our educational attainments.
If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war. As it stands, we have allowed this to happen to ourselves. We have even squandered the gains in student achievement made in the wake of the Sputnik challenge. Moreover, we have dismantled essential support systems which helped make those gains possible. We have, in effect, been committing an act of unthinking, unilateral educational disarmament.4

By comparison, the opening paragraphs of A Test of Leadership seem surprisingly tame:

Three hundred and seventy years after the first college in our fledgling nation was established to train Puritan ministers in the Massachusetts Bay colony, it is no exaggeration to declare that higher education in the United States has become one of our greatest success stories. Whether America's colleges and universities are measured by their sheer number and variety, by the increasingly open access so many citizens enjoy to their campuses, by their crucial role in advancing the frontiers of knowledge through research discoveries, or by the new forms of teaching and learning that they have pioneered to meet students' changing needs, these postsecondary institutions have accomplished much of which they and the nation can be proud.
Despite these achievements, however, this commission believes U.S. higher education needs to improve in dramatic ways. As we enter the 21st century, it is no slight to the successes of American colleges and universities thus far in our history to note the unfulfilled promise that remains. Our year-long examination of the challenges facing higher education has brought us to the uneasy conclusion that the sector's past attainments have led our nation to unwarranted complacency about its future.
It is time to be frank. Among the vast and varied institutions that make up U.S. higher education, we have found much to applaud but also much that requires urgent reform. As Americans, we can take pride in our Nobel Prizes, our scientific breakthroughs, our Rhodes Scholars. But we must not be blind to the less inspiring realities of postsecondary education in our country.
To be sure, at first glance most Americans don't see colleges and universities as a trouble spot in our educational system. After all, American higher education has been the envy of the world for years. . . . For a long time, we educated more people to higher levels than any other nation.
We remained so far ahead of our competitors for so long, however, that we began to take our postsecondary superiority for granted. The results of this inattention, though little known to many of our fellow citizens, are sobering. We may still have more than our share of the world's best universities. But a lot of other countries have followed our lead, and they are now educating more of their citizens to more advanced levels than we are. Worse, they are passing us by at a time when education is more important to our collective prosperity than ever.
We acknowledge that not everyone needs to go to college. But everyone needs a postsecondary education. Indeed, we have seen ample evidence that some form of postsecondary instruction is increasingly vital to an individual's economic security. Yet too many Americans just aren't getting the education that they need—and that they deserve.5

Even with the major differences in tone and tenor, the opening paragraphs of these two reports sound a common set of alarms: education is essential for the general welfare and economic development of the nation; the world is increasingly competitive; the declining quality of U.S. educational institutions poses significant challenges to the U.S. economy, security, and quality of life; and other nations are surpassing the United States on key educational metrics and attainments. (Those readers with really long memories will recall the same set of issues and urgency in the wake of the launch of the Sputnik satellite by the Soviet Union some fifty years ago, which led to the U.S. National Defense Education Act of 1958.)

Interestingly, few observers referenced the dire tone and tenor of A Nation at Risk when discussing the summer drafts from the Spellings Commission on the Future of Higher Education. And even fewer acknowledged A Nation at Risk's "companion" document focusing on higher education: the "kinder, gentler" Involvement in Learning, a 1984 report from the Study Group on the Conditions of Excellence in American Higher Education, also chartered by the Department of Education during the Reagan presidency.6 Involvement in Learning is an early public source for the concept of "value added," stemming from the work of UCLA's Alexander W. Astin, who served as a member of the Study Group.7 The "value added" concept, which has become part of the conversational lexicon about higher education outcomes over the past two decades, also received a strong endorsement from the Spellings Commission.

Focused on the Three A's

Given the "accessibility, affordability, and accountability" mantra often heard from Republican congressional leaders during the legislative debate on the renewal of the Higher Education Act of 1965 over the past two years, it should come as no surprise that the Spellings Commission report speaks directly to these three issues:

  • Accessibility "is unduly limited by the complex interplay of inadequate preparation, lack of information about college opportunities, and persistent financial barriers. . . . Too few Americans prepare for, participate in, and complete higher education—especially those underserved and nontraditional groups who make up an ever-greater proportion of the population."
  • Affordability is a consequence of the "seemingly inexorable increase in college costs, which have outpaced inflation for the past two decades. . . . Our higher education financing system is increasingly dysfunctional. . . . Affordability is directly affected by a financing system that provides limited incentives for colleges and universities to take aggressive steps to improve institutional efficiency and productivity."
  • Accountability suffers from "a remarkable shortage of clear accessible information about crucial aspects of American colleges and universities. . . . There is inadequate transparency and accountability for measuring institutional performance, which is more and more necessary to maintaining public trust in higher education."8

The other focal issues in A Test of Leadership include institutional innovation, financial aid, and student learning. Perhaps its three most controversial recommendations involve (1) "replacing the current maze of financial aid programs . . . with a system more in line with student needs and national priorities"; (2) measuring and reporting "meaningful student learning outcomes"; and (3) developing "a privacy-protected higher education information system that collects, analyzes and uses student-level data as a vital tool for accountability, policy-making, and consumer choice."9

Nevertheless, as noted above, many readers may find the key critiques and accompanying recommendations strikingly familiar. With due respect to the commissioners and the Commission staff, A Test of Leadership is really more a compilation of continuing criticisms than a bold, breakthrough document that charts a new path for U.S. colleges and universities and for public policy in American higher education. Indeed, a small team of undergraduate or graduate students working online and in a good research library could quickly generate a well-documented bibliography that cross-references the Spellings Commission's key findings and policy recommendations with previous reports from a variety of academic groups, government task forces, and professional organizations. Even a quick pass through the headline stories published by the Chronicle of Higher Education during the 1980s confirms the recurring nature of most of the issues noted in the report: accountability, rising college costs, quality, and globalization.10

Admittedly, some will view these comments about the familiarity of the Spellings Commission's work as cheap shots—as self-serving "academic" criticism that fails to acknowledge the larger message, and potential consequences, of the Commission report. And that is a fair counter-critique. Like the NCLB legislation, A Test of Leadership could indeed have major consequences. Early signals suggest that the Bush administration and the Republican congressional leadership are serious about the Commission's work: press reports suggest that Education Secretary Spellings began exploring options to implement some recommendations via department regulations even ahead of the formal release of the report on September 19.11 And at her September 26 press conference about the Commission report, Secretary Spellings announced several initiatives—from convening accrediting agencies for a November 2006 meeting to revising the federal student-aid application form and earlier notification about federal student-aid grants—that reflect her plans to move quickly on various aspects of the Commission's recommendations.12

Attention Must Be Paid

If the critiques in A Test of Leadership are familiar, why should IT officers pay attention to or care about yet another report criticizing U.S. colleges and universities? Does the Spellings Commission report offer a particular message for college and university IT leaders or present a special opportunity for them to engage with faculty, other institutional officials, and various on- and off-campus constituencies about key—indeed, vital—campus planning and policy issues identified in the report?

Let's chart the terrain, looking at the key functions of information technology in higher education and how these functions map back to the Commission report. Most CIOs—and the presidents and provosts they report to across all sectors of higher education—would agree that vital institutional IT functions include infrastructure, instruction, and management. Two or three decades ago, only a small number of students and faculty were "consumers" of campus IT resources for research and instructional activities. In contrast, today, information technology is ubiquitous across the campus and across all sectors of American higher education: virtually everyone on or affiliated with a college or university is a user/consumer of IT resources and services (e.g., e-mail, online services, portals, online content, learning management systems). Administrators, too, depend on increasingly complex information systems to manage student records, institutional finances, development efforts, student recruitment, and alumni initiatives.

Back to the Spellings Commission report. The final document makes some modest, yet critical, references to the role of information technology in U.S. colleges and universities. The report complains that American higher education "has taken little advantage of important innovations that would increase institutional capacity, effectiveness, and productivity," commenting that with "the exception of several promising practices, many of our postsecondary institutions have not embraced opportunities for innovation, from new methods of teaching and content delivery to technological advances to meeting the increased demand for lifelong learning." Discussing college costs and affordability, the Commission recommends that state and federal policymakers support "the spread of technology that can lower [institutional operating costs]." Elsewhere in the report, the Commission urges postsecondary institutions "to make a commitment to embrace new pedagogies, curricula, and technologies to improve student learning."13

No doubt some (many?) in the higher education community will take issue with the Commission's assessment that institutions have taken little advantage of advances in information technology to improve instruction, enhance operations, and increase productivity. Over the past two decades, colleges and universities have spent billions on information technology. Unfortunately, much of the "evidence" about the impact of information technology amounts to little more than individual or institutional epiphany. We lack good data—hard data—to document the beneficial effect of these investments on instruction, operations, and productivity.

Moreover, although CIOs strongly support the notion of evaluating the impact and benefit of institutional investments in information technology, fewer than one-third of the institutions participating in the 2006 Campus Computing Survey do so on a regular basis (see Figure 1). Alas, this is all too common across higher education: even though many will agree on the need for evaluation and assessment, few units—IT or otherwise—engage in thoughtful, effective, and informative assessment activities.

Figure 1. What We Say About Evaluation

Productivity has long been a conundrum for higher education; the link between technology and "academic, instructional, or institutional" productivity has always presented special problems. In the United States, corporate investment in information technology surpassed corporate investment in manufacturing technology in 1994. However, it was not until the last years of the 1990s that economists were able to document any productivity bang for the billions of corporate IT bucks spent during the 1980s and early 1990s. Moreover, in the run-up to the 2004 presidential election, the often political conversation about the "jobless recovery" brought the issue of information technology and productivity back into public focus. No less an authority than Federal Reserve Chairman Alan Greenspan confirmed the economic recovery early in 2004, explaining that the jobless recovery was a "benefit" of rising productivity and that the catalyst for rising productivity was the corporate investment in information technology.

No wonder, then, that the Spellings Commission members (and others) would turn from corporate America to campus America to ask when the institutional investment in information technology would begin to affect—that is, improve—"academic" productivity: instruction, learning, and institutional operations. Most definitely, the higher education community is being asked to document the productivity returns from investments in information technology. Unfortunately, unlike the corporate sector, higher education does not have clear and concise definitions for productivity, nor does it have consensual agreements about the kinds of data that are needed to measure "academic" productivity—for instruction, for learning, or for institutional operations.14

An Emerging IT Responsibility: Assessment

The growing—indeed, pressing—need for hard data about institutional performance and outcomes leads directly to an emerging function for information technology in today's colleges and universities: assessment. Today, information technology is clearly linked to infrastructure, instruction, and management. But should assessment also be a key function of information technology? Yes, absolutely. Higher education IT leaders should—indeed, must—recognize assessment as a new critical and core function of institutional IT services.

We now have the analytical tools (business intelligence/analytics, data mining, and data warehousing, among others) to use routine, unobtrusive institutional data (high school transcripts, students' test scores, students' records from ERP modules, transactional data from learning management systems, college/university transcripts, and more) to address the critical assessment and outcomes issues that affect colleges and universities. Referring back to Deming (and Spellings, and the Commission report), information technology brings data to both the campus conversations and the public policy discussions about institutional performance, academic productivity, and student outcomes.
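To make this concrete, here is a minimal sketch, written in Python with pandas, of how routine, unobtrusive data from an ERP and a learning management system might be joined and rolled up for an assessment question. The file names, column names, and the simple early-warning rule are illustrative assumptions, not a recommended model or any particular vendor's API.

```python
# Hypothetical sketch: combining routine institutional data (ERP student
# records and LMS transaction logs) to surface a simple assessment signal.
# File names, columns, and thresholds are illustrative assumptions.
import pandas as pd

# Student records exported from the ERP student module.
students = pd.read_csv("erp_student_records.csv")   # columns: id, cohort, major, hs_gpa

# Transactional data exported from the learning management system.
lms = pd.read_csv("lms_activity.csv")                # columns: id, course, logins, assignments_submitted

# Aggregate LMS activity per student for the current term.
activity = (lms.groupby("id")
               .agg(logins=("logins", "sum"),
                    submitted=("assignments_submitted", "sum"))
               .reset_index())

# Join the two unobtrusive data sources on the student identifier.
merged = students.merge(activity, on="id", how="left").fillna(0)

# A deliberately simple, illustrative early-warning flag.
merged["flagged"] = (merged["logins"] < 5) & (merged["submitted"] < 2)

# Roll the student-level flag up to a cohort-level view for institutional reporting.
summary = merged.groupby("cohort")["flagged"].mean().rename("share_flagged")
print(summary)
```

The point of the sketch is not the particular rule but the turnaround: the data are already in hand, and the aggregation is a few lines of work rather than a multi-month study.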

Unfortunately, the powerful bring data analytical tools that are increasingly deployed in the corporate sector are only beginning to be used by colleges and universities. Consider the case of Wal-Mart and the fall 2004 hurricanes that hit Florida. As reported in the New York Times, with two hurricanes down and one coming, Wal-Mart executives assumed that they knew what their customers were buying in the hours before the storms hit: diapers and duct tape, bottled water and snack foods. But just to be sure, they did a quick analysis of the sales data in the hours leading up to the two prior storms. And although Wal-Mart shoppers did indeed buy massive amounts of diapers, duct tape, bottled water, and snack foods, Wal-Mart execs also discovered a surprising run on beer in the hours before each storm hit. Eager to serve shoppers and shareholders, Wal-Mart execs rushed additional supplies of beer to the stores in the path of the impending third storm.15

Note that the Wal-Mart execs did not assign the analytical task to a research office, hoping to receive a report in weeks or months. They gave it to the IT shop and got the data back fast, in time to discover an opportunity and move a desired product—beer!—into stores ahead of the third storm. Compare the Wal-Mart experience, where the company's information system knows every customer in and every product out at the end of the day, with the experience that is all too common at all too many colleges and universities, where three to six weeks are required to produce an accurate class roster after the beginning of each term.

Similarly, consider Amazon.com. Although Amazon still sells books and lots of other products, Amazon is now really a service business—effectively mining data about the interests, preferences, and purchases of the millions of people who shop at Amazon each day. For example, Amazon offers the shopper multiple ways to buy a book: new from Amazon or used (at a lower price) from dozens of Amazon-affiliated resellers. Why would Amazon "sacrifice" a sale to become a portal for its potential competitors? Because Amazon executives know that gaining additional data about shoppers' purchases will help the company customize its services to better serve the Amazon customer.

ERP analytics, digital dashboards, data marts, data mining, and data warehouses are emerging technologies that bring data to the planning, policy, productivity, and outcomes discussions that confront all postsecondary institutions today. Moreover, this is not just an abstract, "white-board" conversation about a coming world of IT resources. Well ahead of the Spellings Commission report, the University System of Georgia (USG) began planning a comprehensive outcomes project to exploit a rich array of student, academic, and institutional data to address assessment, outcomes, and productivity issues affecting Georgia's thirty-four public postsecondary institutions that serve undergraduates, from community colleges to research universities. Drawing on unobtrusive data—data already in hand, already in campus information systems—Georgia officials are planning a unique and comprehensive initiative that will help state agencies and individual institutions (as well as the rest of the campus community) address critical assessment and outcomes issues. Georgia officials, under the leadership of USG Chancellor Erroll B. Davis Jr. and Interim USG CIO Thomas L. Maier, intend to bring data to the policy discussions about student retention, student development, student enrollment in critical majors (science, technology, engineering, and health care), the impact of information technology on student learning, and student preparation and financial resource issues that also affect persistence, completion, and learning outcomes.
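As a rough illustration of the kind of question such an initiative can answer with data already in campus systems, the sketch below (again Python with pandas, with hypothetical file and field names) computes first-to-second-year retention rates by institution from ordinary enrollment records; it is an assumption-laden example, not the USG design.

```python
# Hypothetical sketch: first-to-second-year retention computed from
# enrollment records already held in campus information systems.
# The file layout and field names are assumptions for illustration.
import pandas as pd

enroll = pd.read_csv("enrollments.csv")   # columns: student_id, institution, term_year, cohort_year

# Each student's entering record: the institution and year in which they first enrolled.
cohort = (enroll[enroll["term_year"] == enroll["cohort_year"]]
            [["student_id", "institution", "cohort_year"]]
            .drop_duplicates("student_id"))

# Students who show up enrolled (anywhere in the system) the year after entry.
returned = set(enroll.loc[enroll["term_year"] == enroll["cohort_year"] + 1, "student_id"])

# Share of each entering cohort retained into the second year, by institution.
cohort["retained"] = cohort["student_id"].isin(returned)
retention = (cohort.groupby("institution")["retained"]
                   .mean()
                   .sort_values(ascending=False))
print(retention)
```

The same pattern extends, by changing the grouping variable and the outcome column, to the other questions listed above, such as enrollment in critical majors or persistence and completion.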

The Expanding Technology "Evolution"

Three decades into the much-discussed (much-hyped?) "computer revolution in higher education," we have learned (or should have learned) that (1) the past two decades have been marked by the rapid evolution of information technology in U.S. colleges and universities, and (2) the real IT issues are not about products but rather are about the effective use of resources and the effective delivery of services—about how information technology aids and advances the institutional mission.

The emerging analytical tools—badly needed and long overdue—will help colleges and universities address the assessment conundrum that began in 1980 when the Southern Association of Colleges and Schools became the first of the six regional accrediting agencies to mandate institutional outcomes as part of accreditation. These mandates remain today, and there is broad agreement across all sectors of higher education on the need for outcomes assessment. The strong support for assessment in A Test of Leadership reaffirms this need. But the devil is in the details, as are the politics. Left unresolved is the methodology for the assessment: there is as yet no consensual, effective methodology for this work.

Moreover, as Margaret Miller, at the University of Virginia, notes in a thoughtful and informative Chronicle of Higher Education "Point of View" article, the individual student testing initiatives launched by the NCLB legislation are not appropriate for higher education. Unlike the assessment initiatives launched by the NCLB legislation, which are intended to monitor the educational progress of individual students as well as individual schools, the Spellings Commission's focus on accountability and assessment, says Miller, reflects the Commission members' interest in knowing "if the government's massive investment in higher education is paying off. They want to know if the nation is poised to take on the increasingly fierce economic competition it faces, and whether its citizens are prepared to navigate the complexities of modern life and make good collective decisions." These are good, fair questions we should be asking ourselves. These are, says Miller, "questions we want policy makers to ask [us]."16

In this context, a key component of the outcomes and assessment solution resides in the emerging analytical IT tools increasingly deployed in the corporate sector and now coming to higher education. These tools can, do, and should expand the mission of information technology in colleges and universities to include assessment.

Admittedly, this is not necessarily a responsibility that many central IT units—and many college/university IT officials—will want or will eagerly pursue. But the new technologies of data mining and warehousing mean that campus IT units and institutional IT leaders can respond to the bring data message—the bring data mandate—with real, timely, and useful data and information about aggregated student performance and institutional outcomes. Information technology now offers viable methodologies to address the mandates for outcomes assessment.

The question here no longer concerns whether information technology has a role to play in the campus conversations and public discussions about assessment and outcomes. Rather, the issue before us in the wake of the Spellings Commission report concerns when college and university IT leaders will assume an active role, a leadership role, in these discussions, bringing their IT resources and expertise—bringing data, information, and insight—to the critical planning and policy discussions about institutional assessment and outcomes that affect all sectors of U.S. higher education.

Notes

1. Kelly Field, "Uncertainty Greets Report on Colleges by U.S. Panel," Chronicle of Higher Education, September 1, 2006.

2. Doug Lederman, "Changing the Report, After the Vote," Inside Higher Education, September 1, 2006, http://www.insidehighereducation.com/news/2006/09/01/commission.

3. Doug Lederman, "A Stinging First Draft," Inside Higher Education, June 27, 2006, http://www.insidehighereducation.com/news/2006/06/27/commission.

4. U.S. Department of Education, by the National Commission on Excellence in Education, A Nation at Risk: The Imperative for Educational Reform, a Report to the Nation and the Secretary of Education, April 1983, http://www.ed.gov/pubs/NatAtRisk/index.html.

5. U.S. Department of Education, A Test of Leadership: Charting the Future of U.S. Higher Education, a Report of the Commission Appointed by Secretary of Education Margaret Spellings (Washington, D.C., 2006), pre-publication report, September 2006, http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/pre-pub-report.pdf, pp. vi–vii (emphasis in original).

6. National Institute of Education, Involvement in Learning: Realizing the Potential of American Higher Education, Final Report of the Study Group on the Conditions of Excellence in American Higher Education (Washington, D.C.: U.S. Department of Education, 1984). Interestingly, whereas the 1983 Nation at Risk is available on the U.S. Department of Education Web site, the 1984 Involvement in Learning report is not. In fact, several Google and Ask.com searches in September 2006 suggest that the 1984 Involvement report is nowhere to be found on the Web.

7. See, for example, Alexander W. Astin: Four Critical Years (San Francisco: Jossey-Bass, 1977); Minorities in American Higher Education (San Francisco: Jossey-Bass, 1982); and What Matters in College? Four Critical Years Revisited (San Francisco: Jossey-Bass, 1993).

8. A Test of Leadership, 7, 2, 9, 4, 13.

9. Ibid., 3, 23, 21.

10. "Higher Education in the 80's: Some Highlights," Chronicle of Higher Education, December 13, 1989.

11. Doug Lederman, "Carrying Out the Commission's Ideas," Inside Higher Education, August 17, 2006, http://insidehighered.com/news/2006/08/17/commission.

12. Office of the Secretary, U.S. Department of Education, "Secretary Spellings Announces Plans for More Affordable, Accessible, Accountable and Consumer-Friendly U.S. Higher Education System," press release, September 26, 2006, http://www.ed.gov/news/pressreleases/2006/09/09262006.html. The video archive of Secretary Spellings's September 26, 2006, press conference is also available online: http://www.connectlive.com/events/deptedu/.

13. A Test of Leadership, 14, 15, 2, 4.

14. See Kenneth C. Green, "The 'New Computing' Revisited," EDUCAUSE Review, vol. 38, no. 1 (January/February 2003): 32–43, http://www.educause.edu/LibraryDetailPage/666?ID=ERM0312.

15. Constance L. Hays, "What They Know about You," New York Times, November 14, 2004, late edition, sec. 3, p. 1.

16. Margaret A. Miller, "The Legitimacy of Assessment," Chronicle of Higher Education, September 22, 2006.