Performance measurement can be a difficult political as well as technical challenge for educational institutions at all levels. Performance-based budgeting can raise the stakes still higher by linking resource allocation to a public "report card."
The 23-campus California State University (CSU) system accepted both of these accountability challenges beginning in 1999. CSU agreed to institutionalize a comprehensive data-collection process designed to measure progress toward a series of technology policy goals, and the state legislature would receive annual reports on those measures of success. In exchange, the legislature agreed to support the technology infrastructure buildout on each campus. The agreement runs through 2008.
This research brief provides an overview of the process and methodology underlying the Measures of Success (MOS) reports. The tools and approaches described here may apply to other public institutions interested in striking a "negotiated accountability" agreement with the state government in exchange for a predictable base of technology funding.
Funding the technology infrastructure of a campus through traditional means (operational budgets) is often uneven and inadequate. Telecommunications pathways, spaces, and media can and perhaps should be treated the same as other forms of physical infrastructure, such as electrical, water, and sewer systems, and funded through capital investment.
The academic and administrative benefits derived from technology depend on a robust telecommunications infrastructure. Therefore, executive management in the CSU system determined that this infrastructure should be given priority—often above new buildings. Voter-approved bonds provided the funding to build the infrastructure.
Before approving the CSU plans to expend capital dollars on technology infrastructure, the state legislature required assurances that this utility would produce the benefits identified in the system-wide master plan for information technology, known as the Integrated Technology Strategy (ITS). The 10-year reporting requirement allows the CSU to show that, as the infrastructure is extended to a growing number of campuses, ITS goal attainment improves commensurately.
Background
The first MOS report in November 1999 outlined the framework and metrics for success to be used throughout the 10-year period. The November 2000 MOS study presented baseline data against which progress could be measured in subsequent reports.
Data are presented in four major outcome categories:
- Excellence in Learning and Teaching: The ITS academic initiatives seek to improve academic quality, increase student access, and contain costs.
- Quality of the Student Experience: The goal of the student services initiative is to use IT to facilitate interactions with the university (communication, admission, registration, scheduling) for students, potential students, parents, and counselors.
- Administrative Productivity and Quality: The administrative initiatives seek to increase the accessibility and utility of major administrative information systems to students, faculty, and staff while improving the efficiency and quality of administrative services. To achieve this, the Common Management Systems (CMS) initiative aims to have all campuses and the Chancellor's Office use common PeopleSoft applications in full production mode by 2007, supported by a consolidated data center.
- Personal Productivity: The information technology infrastructure initiatives seek to provide each campus with a baseline capability, sufficient in the quantity and quality of computing and network resources, to enhance the personal productivity of individual students, faculty, and staff.
The MOS data collection and reporting process yields information about extensiveness, or the amount of usage of IT services; effectiveness, or the degree to which the ITS objectives are being met; efficiency, or the cost of the services provided; and quality, or the currency and capacity of IT resources and the satisfaction of users.
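As a minimal sketch of how these four reporting dimensions might be organized for analysis (the CSU's actual instruments and data schema are not published in this brief, so the field names, values, and thresholds below are hypothetical), each reported measure can be treated as a record tagged with its outcome category and scored along extensiveness, effectiveness, efficiency, and quality:

```python
from dataclasses import dataclass

@dataclass
class MOSMetric:
    """One reported measure, tagged with the four MOS reporting dimensions."""
    outcome_category: str   # e.g., "Personal Productivity"
    measure: str            # e.g., "student access to workstations"
    extensiveness: float    # amount of usage of the IT service
    effectiveness: float    # degree to which the ITS objective is met (0-1)
    efficiency: float       # cost of the service provided (dollars per unit)
    quality: float          # resource currency/capacity and user satisfaction (0-1)

def meets_baseline(m: MOSMetric, effectiveness_target: float, quality_target: float) -> bool:
    """Check a measure against hypothetical baseline thresholds."""
    return m.effectiveness >= effectiveness_target and m.quality >= quality_target

# Hypothetical record for illustration only
sample = MOSMetric("Personal Productivity", "student access to workstations",
                   extensiveness=12.5, effectiveness=0.82, efficiency=41.0, quality=0.77)
print(meets_baseline(sample, effectiveness_target=0.75, quality_target=0.70))  # True
```

A structure of this kind makes year-over-year comparison straightforward, since the same record layout can be filled in for each annual report.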
The academic initiatives expand student and faculty access to teaching and learning resources through collaborative acquisition, development, and distribution of technology-mediated instructional materials. Gains in efficiency made possible by the student services initiatives lower institutional costs for processing admission applications while making services to students much more convenient. The administrative initiatives contribute to containing costs over the long term by streamlining and integrating major campus support operations and automating labor-intensive processes. The infrastructure initiatives are the prerequisite for achieving all of the ITS goals. They will provide each CSU campus with a baseline telecommunications capability and personal productivity resources adequate to maintain institutional quality.
These improvements in access, quality, and affordability are significant. The resulting gains in productivity are offset, to some extent, by the costs of training, technical support, and periodic hardware and software replacement. Any large-scale economic benefit from IT can be obtained only through efficiencies in the university's core instructional programs.
The four outcome categories of the ITS remain unchanged, but the initiatives to achieve them are dynamic. The academic initiatives continue to evolve in scope and influence. They are expanding the types of learning opportunities available to students, increasing access, and providing significant cost savings in many areas. The administrative initiatives represent the largest enterprise resource planning (ERP) project in American higher education, and their implementations are on schedule and on budget. The campus infrastructure buildout initiatives are making steady progress toward baseline status for the physical plant, workstation hardware and software, networking, and end-user training and support.
The ITS framework and process were intended to respond to new needs and emerging technologies. The CSU is adding new initiatives to the ITS while retiring others, although new additions will not be part of the MOS reporting process.
Methodology
The CSU has conducted a wide range of data collection efforts to support the MOS process. Both institutional surveys and individual surveys of students, faculty, and staff have been administered over the past several years, and more are scheduled through at least 2008.
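The brief does not reproduce the survey instruments themselves, but as an illustrative sketch (with entirely hypothetical question names and ratings), responses from the individual surveys could be rolled up by respondent group and question for annual reporting and year-over-year comparison:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: (respondent_group, question_id, rating on a 1-5 scale)
responses = [
    ("student", "network_reliability", 4),
    ("student", "network_reliability", 3),
    ("faculty", "network_reliability", 5),
    ("staff",   "helpdesk_satisfaction", 4),
]

def summarize(responses):
    """Average ratings by respondent group and question for the annual report."""
    buckets = defaultdict(list)
    for group, question, rating in responses:
        buckets[(group, question)].append(rating)
    return {key: round(mean(vals), 2) for key, vals in buckets.items()}

print(summarize(responses))
# {('student', 'network_reliability'): 3.5, ('faculty', 'network_reliability'): 5, ...}
```

Summaries of this form can then be compared against the baseline data reported in the November 2000 MOS study.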
Conclusion
Information technology is a major investment and strategic resource of the CSU. The MOS series documents the pervasiveness and importance of information technology in the CSU. Surveys undertaken in connection with the reports make clear that technology touches every aspect of the university's operations. The data show that in almost all of the reporting categories, technology has had a generally positive influence, sometimes dramatically so.
Although the CSU is a data-rich system in many respects, prior to the MOS the CSU did not have access to this kind of outcomes-based information about technology. The MOS informs planning for and implementation of the ITS by alerting decision makers to what is working and what is not. In that sense, it is a vehicle for organizational feedback and learning, one that has potential for nurturing an institutionalized "culture of evidence" in the policy-making process.
From the state's perspective, the MOS is an example of public accountability in higher education. It is a model of negotiated accountability between a state government and the largest four-year higher education institution in the United States.
The scope and depth of the data collection effort require a significant annual investment. The data gathering and reporting activities undertaken to produce the MOS series are expensive in direct dollars and staff time. There are no shortcuts in either the design of the survey instruments or the methods used to implement the research. This attention to detail increases confidence in the validity and reliability of the findings.