Academic Analytics: A New Tool for a New Era

© 2007 John P. Campbell, Peter B. DeBlois, and Diana G. Oblinger

EDUCAUSE Review, vol. 42, no. 4 (July/August 2007): 40–57

John P. Campbell is an EDUCAUSE Learning Initiative (ELI) Scholar in Residence and Associate Vice President for Teaching and Learning Technologies at Purdue University, where he is responsible for all aspects of instructional technology. Peter B. DeBlois is Director of Programs and Media Relations at EDUCAUSE, where he coordinates member-engagement programs and media relations. Diana G. Oblinger is Vice President of EDUCAUSE, where she is responsible for the association's teaching and learning activities and for the EDUCAUSE Learning Initiative.

In responding to internal and external pressures for accountability in higher education, especially in the areas of improved learning outcomes and student success, IT leaders may soon become critical partners with academic and student affairs. IT can help answer this call for accountability through academic analytics, which is emerging as a new tool for a new era.

A New Era

Although A Nation at Risk was released by the U.S. Department of Education in 1983, in many ways it remains an apt description of current conditions.1 Consider the following statistics:

  • Western nations are looking over their shoulders at China and India. As a popular YouTube video highlighted, if you are "one in a million" in China, there are 1,300 other people just like you; in India, there are 1,100 others just like you. The 25 percent of the Chinese population with the highest IQs is greater than the total population of North America; in India, the 28 percent of the population with the highest IQs is greater than the total population of North America.2
  • Economies depend on a well-educated population. As a result, many countries are investing in strengthening their educational systems, with an emphasis on increasing the proportion of their population that has a postsecondary degree.3
  • In spite of the importance of postsecondary education, the percentage of the U.S. population with a postsecondary degree has slipped relative to that of other countries. Rather than leading the world, as it has for most of the past fifty years, in 2003 the United States ranked ninth among industrialized nations in the percentage of 25-to-34-year-olds who have completed at least an associate's degree.4 If current trends are left unchanged, the percentage of the U.S. workforce with a bachelor's degree will decline from 17.1 percent (as of 2000) to 16.4 percent (in 2020).5
  • Minorities are an increasing percentage of the population, as well as of the college-going population. For example, by 2015, Hispanics will be the second-largest student group (by race/ethnicity), growing to 15.4 percent of the nation's campus population.6 Yet the educational attainment for minorities is below that of other groups. High school graduation rates in the United States are around 70 percent but fall to 50 percent for black, Native American, and Hispanic students.7 Although the percentage of students who graduate from four-year institutions after six years is 57 percent overall, the rate is 41 percent for African Americans, 47 percent for Hispanics, and 39 percent for Native Americans.8
  • Just being able to hold a job may not be enough. Each year, more jobs are outsourced to countries with lower labor costs. The good jobs that remain in countries with higher labor costs are those that require education and more refined skills. For example, the U.S. Department of Labor projects that twenty of the thirty fastest-growing jobs in the United States will require education beyond high school and that 40 percent will require at least an associate's degree.9
  • If the current educational gaps remain, U.S. per capita income is projected to decline 2 percent from 2000 to 2020. By contrast, per capita income has increased 41 percent in the past two decades. And trailing other developed countries on education may reduce U.S. economic growth by as much as half a percent a year.10

As populations become concerned for their financial well-being and economic security, pressures increase on those individuals and institutions that might influence the outcome. In the information age, one of the most influential institutions is education. And in an era of accountability and liability, organizations that resist pressures for results, accountability, and action are suspect. With economic security at stake, how long will society accept that the percentage of the population with a college/university education is stagnant even as the demand has risen, that retention rates have not significantly improved in decades, and that graduates may not have mastered even basic competencies?11

A greater proportion of the U.S. population is being educated, but overall college/university graduation rates have remained relatively unchanged for decades. And even though the United States is holding its own in the percentage of the population with college/university degrees, other countries are moving ahead (e.g., Sweden, the United Kingdom).12 Blaming K-12 for inadequately preparing students does not deflect responsibility from higher education, since K-12 teachers are themselves products of higher education.

But thanks to enterprise-wide systems that generate massive amounts of data, data warehouses that aggregate disparate types of data, and processing power that sifts, sorts, and surfaces patterns, academic analytics is emerging as a new tool that can address what seem like intractable challenges.

A New Tool

Analytics marries large data sets, statistical techniques, and predictive modeling. It could be thought of as the practice of mining institutional data to produce "actionable intelligence." Just as an online retailer knows when to send someone an e-mail notice of a new book that he/she might be interested in buying, so does an admissions office know whether to invest in the printing and postage necessary to send a high school junior a glossy campus viewbook.

Today, analytics is most often used in higher education for administrative decisions—from delivering the targeted number and quality of a freshman class to cultivating likely donors. But the use of analytics will likely grow in high-stakes areas such as academic success. Whether the catalyst for adoption is a call for accountability from outside of higher education or the need for scorecards or decision-making models from within, analytics is in higher education's future. To prepare, IT and institutional leaders need to begin to understand analytics—as well as the changes that may be required in data standards, tools, processes, organizations, policies, and institutional culture.

Many institutions have implemented analytics to improve enrollment management. Institutional researchers collaborating with admissions staff have created complex formulas—based on standardized exam scores, high school coursework, and other information—to determine which applicants will be admitted. The "actionable intelligence" generated from statistical analyses of these diverse data sources can guide a more efficient use of limited admissions budgets and staff time. For some institutions, analytics means that the institution can provide applicants with an immediate response to their admissions applications. Analytical models and decisions have been refined over the years to produce fairly predictable enrollment rates, as well as balance in areas such as in-state/out-of-state students, enrollments within programs, and other demographic factors. Based on current data, models are refined annually to improve enrollment decisions.

Beyond enrollment management, analytics is increasingly used to inform fund-raising. By building a data warehouse containing information about alumni and friends, institutions can use predictive models to identify those donors who are most likely to give. Aside from academic and co-curricular history, information may include an individual's response to past solicitations; interest in particular college/university initiatives; employment in, contributions to, and honors received in fields related to institutional programs; and participation in institutional events.

With the increased concern for accountability, academic analytics has the potential to create actionable intelligence to improve teaching, learning, and student success. Traditionally academic systems—such as course management systems, student response systems, and similar tools—have generated a wide array of data that may relate to student effort and success. Early academic analytics initiatives are seeking to predict which students are in academic difficulty, allowing faculty and advisors to customize learning paths or provide instruction tailored to specific learning needs.

Sample Academic Analytics Initiatives

Baylor University, the University of Alabama, Sinclair Community College, Northern Arizona University, and Purdue University are among the pioneers in higher education analytics.

Using Enrollment Predictive Modeling at Baylor University

Since the late 1990s, Baylor University has pioneered and refined the gathering and analysis of massive amounts of data on prospective students to support a sophisticated admissions strategy. The traditional admissions cohort-action funnel (moving from inquiries to applications to acceptances to deposit payments through enrollments) analyzes factors from the application stage forward to the enrollment yield. Baylor, however, looks at a broad array of variables from the inquiry stage forward.

Out of all the variables available, Baylor identified eight that result in the best predictive model for Texas residents:

  • Attendance of a premier event
  • Campus visit
  • Extracurricular interest
  • High school attended
  • Mail qualifying score (Baylor level of interest)
  • SAT score (for non-Texas residents, this variable was replaced by the number of solicited contacts)
  • Number of self-initiated contacts
  • Telecounselor score (Baylor level of interest)13

The inquiry pool is updated once a week with additional prospective students and with additional data on existing inquirers. Scores from the predictive model are added to the student database, which admissions staff can then query to identify those students most likely to be admitted. Specific scores trigger who will receive various types of follow-up. For example, the top 75 percent of scored inquirers are sent the expensive-to-print-and-mail campus viewbook, whereas the bottom 25 percent receive only a reply card and the application form. The goal over time has been to refine the model so that no mailings go to those lower-scoring inquirers who are least likely to enroll.

The same model can be used to select the most likely enrollees for phone calls by telecounselors. Admissions counselors making high school visits use the model to identify students to seek out personally. At all points of contact, admissions staff and telecounselors enter new or updated interest scores into the prospect database; the system populates missing data with averages from past cohorts. The net effect has been the creation of information that lets Baylor segment its prospect pool, target the most likely enrollees, and more efficiently use human and financial resources to deliver the desired freshman class. One measure of this use of actionable intelligence was the increase in new student applications from approximately 15,000 for the fall semester of 2005 to approximately 26,000 in fall 2006.
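Baylor's scoring, imputation, and segmentation logic can be sketched roughly as follows. The cohort-average imputation and the 75/25 mailing split come from the description above; the function names, field names, and sample data are invented for illustration, not Baylor's actual system:

```python
def impute_with_cohort_means(records, fields, cohort_means):
    """Fill missing interest scores with averages from past cohorts."""
    for r in records:
        for f in fields:
            if r.get(f) is None:
                r[f] = cohort_means[f]
    return records

def segment_inquirers(records, top_fraction=0.75):
    """Rank by model score; the top fraction receives the costly viewbook,
    the rest only a reply card and application form."""
    ranked = sorted(records, key=lambda r: r["score"], reverse=True)
    n_top = round(len(ranked) * top_fraction)
    for i, r in enumerate(ranked):
        r["mailing"] = "viewbook" if i < n_top else "reply_card"
    return ranked

pool = [
    {"id": 1, "telecounselor_score": 4.0, "score": 0.91},
    {"id": 2, "telecounselor_score": None, "score": 0.35},
    {"id": 3, "telecounselor_score": 3.0, "score": 0.78},
    {"id": 4, "telecounselor_score": None, "score": 0.12},
]
impute_with_cohort_means(pool, ["telecounselor_score"],
                         {"telecounselor_score": 3.2})
segmented = segment_inquirers(pool)
```

The same scored pool could then drive the other touchpoints, such as selecting which inquirers telecounselors call first.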

Predicting and Improving Student Retention at the University of Alabama

As part of an effort to improve student retention from the freshman to sophomore year, the University of Alabama (UA) experimented with analytics. Graduate students in a data-mining course were given access to the data files of enrolled freshmen (identities were concealed) from 1999, 2000, and 2001 and were asked to develop predictive models of at-risk students. Using such statistical techniques as logistic regression, decision trees, and neural networks, the students developed a single refined model with eight significant variables:

  • UA cumulative GPA
  • English course
  • English course grade
  • Distance from UA campus to home
  • Race
  • Math course grade
  • Total earned hours
  • Highest ACT score (ACT or ACT-converted SAT score)14

Using the retention model along with pre-enrollment data and an aggregate cut-off score—a score developed in consultation with the university registrar and one below which students are considered in need of intervention—UA is able to identify 150–200 freshmen each year who are not likely to return for their sophomore year. The information is then shared with faculty and academic advisors for outreach efforts, counseling, or other action. Consistent with best practices in deploying analytics, a UA admissions research team is presently assessing and refining the retention model and will introduce a new one next year.
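A minimal sketch of how such a retention model might score students against a cut-off follows, assuming a logistic form over the eight variables listed above. The coefficients, intercept, and cut-off here are invented for illustration; UA's fitted values are not published in this article:

```python
import math

# Invented weights for illustration only -- not UA's fitted model.
COEFFS = {
    "cumulative_gpa": 1.6,
    "english_grade": 0.5,
    "math_grade": 0.4,
    "earned_hours": 0.05,
    "act_score": 0.08,
    "miles_from_campus": -0.002,
}
INTERCEPT = -6.0

def retention_probability(student):
    """Logistic estimate that a freshman returns for the sophomore year."""
    z = INTERCEPT + sum(w * student[k] for k, w in COEFFS.items())
    return 1.0 / (1.0 + math.exp(-z))

def flag_at_risk(students, cutoff=0.5):
    """IDs of students below the cut-off, to share with advisors."""
    return [s["id"] for s in students if retention_probability(s) < cutoff]

freshmen = [
    {"id": "A", "cumulative_gpa": 3.5, "english_grade": 4, "math_grade": 4,
     "earned_hours": 30, "act_score": 28, "miles_from_campus": 50},
    {"id": "B", "cumulative_gpa": 1.8, "english_grade": 2, "math_grade": 1,
     "earned_hours": 12, "act_score": 19, "miles_from_campus": 300},
]
```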

Developing a Student Success Plan and Early Alert System at Sinclair Community College

Since 2004, the Student Success Plan (SSP) at Sinclair Community College (SCC) has won seven national awards for its innovative and effective data-gathering for student advising and retention.15 The SSP is a Web-based counseling records management, reference, and reporting system that uses an SQL database to integrate demographic and admissions information from a data warehouse; real-time course registration, grades, and financial aid status from the student information system; and counselor risk-assessment notes and faculty-initiated early alerts.

Analytics generates a system alert for advisors to initiate an Individual Learning Plan (ILP) whenever any one of the following four criteria appears in a new student's profile:

  • Placement-test referrals into two or more developmental courses numbered below "100"
  • Individual or family income level below the federal poverty level
  • Full-time work
  • Undecided major
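The four triggers amount to a simple rule check. The field names and the poverty-level figure below are placeholders invented for this sketch (the actual federal level varies by year and household size):

```python
FEDERAL_POVERTY_LEVEL = 20_000  # placeholder figure for illustration

def needs_ilp(profile):
    """True if any one of the four SSP trigger criteria appears
    in a new student's profile."""
    return any([
        profile["developmental_referrals"] >= 2,   # two+ courses below "100"
        profile["household_income"] < FEDERAL_POVERTY_LEVEL,
        profile["works_full_time"],
        profile["major"] is None,                  # undecided major
    ])

new_students = [
    {"id": 1, "developmental_referrals": 0, "household_income": 35_000,
     "works_full_time": False, "major": "Nursing"},
    {"id": 2, "developmental_referrals": 2, "household_income": 35_000,
     "works_full_time": False, "major": "Accounting"},
]
alerts = [s["id"] for s in new_students if needs_ilp(s)]
```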

As the student and the counselor develop an ILP, additional information is captured, including Myers-Briggs Type Indicator, Learning and Study Strategies Inventory, personal life challenges, student satisfaction surveys, course enrollment planning, study plan and tutorial referrals, student progress markers, and counselors' notes. Aggregate factors that signal student success are achieving a GPA of 2.0 or better, passing all developmental courses, deciding on a major and a career, resolving child-care and transportation issues, and attending class regularly.16 The fall 2006 impact assessment of the ILP program from the SCC Office of Research, Analytics, and Reporting shows that first- to second-quarter new student retention was 93.3 percent for ILP completers, 76.0 percent for ILP-active students, and 65.7 percent for non-ILP students.17

The SSP, funded in part through a five-year federal Title III "Strengthening Institutional Programs" planning grant, has future plans to develop a student success course, create secure faculty-access modules, integrate the early-alert system into the campus portal, and develop new analytics reports to support prediction and analysis of student success.

Connecting Resource Utilization, Risk Level, and Outcomes at Northern Arizona University

Northern Arizona University (NAU) is in the third year of an initiative to use multiple data sources to identify at-risk first-year students and to assess which proactive interventions have the best influence on their academic success and retention. Like most other colleges and universities, NAU has a robust set of academic and student-life support resources, but members of a support task force knew that students and, by extension, staff were using services in a reactive way—after problems had surfaced and students were already at greater risk of failure or withdrawal. The task force set as its goal a predictive model that would identify which students would benefit from which resources.18 The model comprised three critical elements:
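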

  • Resource/Service Utilization: a distillation of NAU's numerous resources and services into five categories that would make the analysis manageable but not too broad-brush: (1) academic services (academic advising and tutoring); (2) recreational resources (recreation center usage, fitness program, intramural sports, and wellness education); (3) social resources (student organization membership, after-hours events, and social activities); (4) academic referrals (by centralized advising center staff to academic departments for curricular and degree program assistance); and (5) advising/career sessions (with centralized advising center staff and resources). Although gathering the data for some categories of utilization involved blending online records and manual rosters, much of the data gathering was facilitated by usage records created from student ID card swipes by service offices and at campus activities and events.
  • Levels of Risk: established by admissions test scores, high school GPAs, and psychosocial factors as measured by NAU's deployment of the ACT Student Readiness Inventory
  • Outcomes: measured by first-year student GPAs and enrollment retention status

Among the telling results: the GPAs of students who used one to three academic services increased 0.192 points, on average; those who used four services increased GPAs by 0.280 points; and students who were high-risk and used four services increased GPAs by 0.460 points. The utilizations that had the greatest impact on retention were academic referrals and advising/career sessions.19

The NAU researchers found that the most efficient use of advising and support resources occurs when interventions are focused on high-risk students who are engaged through academic referrals and advising/career sessions. They also recognized an important aspect of "intrusive advising": that despite the positive gains in performance and retention, the way in which students learn about institutional efforts on their behalf may affect their perceptions of privacy; consequently, the timing and the content of communications require careful planning.

In the future, NAU hopes to use prediction information to connect resources and services with students as early as possible and to continue tracking both resource use and student success to refine the predictive model. One possibility is to add new data sources to the model, such as UCLA's Cooperative Institutional Research Program (CIRP) Freshman Survey and the National Survey of Student Engagement.

Using Course Management System Data to Identify At-Risk Students at Purdue University

Purdue University is extracting data from the course management system (CMS) to build models that predict which students may be struggling academically and to provide proactive intervention. Purdue's premise is that student academic success is the result of the student's aptitude (as measured by standardized test scores and similar information) and the student's effort (as measured by participation within the CMS).

The CMS was selected as the initial focus because of broad campus adoption and the automatic collection of more than twenty activity variables, ranging from the time spent within the CMS to the number of discussion postings. Since it can be difficult to understand how each faculty member utilizes the CMS, an individual's data is compared with that of his/her peers. The resulting comparison provides a standardized value of student effort (e.g., the number of discussion postings is shown as the standard deviation from the class mean). If the faculty member decides not to utilize a tool, everyone in the course is "average."
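The peer comparison described above is essentially a z-score. A minimal sketch, with the zero-variance case standing in for a tool the faculty member does not use:

```python
from statistics import mean, stdev

def standardize_activity(counts):
    """Express each student's activity count as standard deviations from
    the class mean. If the tool is unused (or used identically by all),
    everyone in the course comes out 'average' (z = 0)."""
    values = list(counts.values())
    m = mean(values)
    sd = stdev(values) if len(values) > 1 else 0.0
    if sd == 0:
        return {s: 0.0 for s in counts}
    return {s: (v - m) / sd for s, v in counts.items()}

# Discussion postings per student in one course (invented data).
posts = {"alice": 12, "bob": 3, "carol": 6}
z = standardize_activity(posts)
```

Standardizing per course is what makes the measure comparable across sections whose instructors use the CMS very differently.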

Using factor analysis and logistic regression, a model was developed to predict student success within a course. Six variables were found to be significant:

  • ACT or SAT score
  • Overall grade-point average
  • CMS usage composite
  • CMS assessment composite
  • CMS assignment composite
  • CMS calendar composite

Two models were developed and validated: one for freshmen and one for the overall campus population. Both models included the same significant variables, but the variables differed in importance. In each model, the CMS data significantly contributed to predicting academic success. However, the freshman-only model correctly classified nearly 80 percent of the students, whereas the all-student model correctly classified only 67 percent.

Purdue's next step is to connect the results of the predictive models to existing student-intervention programs including help desks, supplemental instruction, and similar programs.

Building an Academic Analytics Initiative

Academic analytics relies on the extraction of data from one or more systems, such as the CMS or a student information system. The data, which may be stored in a data warehouse for ongoing use, is analyzed using statistical software, and a mathematical model is generated. Based on the model and predetermined values, a particular action may be triggered, such as sending the student an electronic notification or initiating a personal intervention by college/university staff.

For example, data extracted from a student information system provides baseline student demographic, academic performance, and aptitude information. The CMS provides a snapshot of the student's course-related efforts by providing real-time interaction information that allows for comparison with peers. The two sources of data are combined to predict the probability of student success. Using this probability, the institution can decide whether to take certain actions such as inviting a student to a help session via e-mail or calling a student with an invitation to meet with an advisor.
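A toy end-to-end version of this flow might look like the following. The coefficients, thresholds, and action names are invented for illustration; a real deployment would use a model fitted to institutional data:

```python
import math

# Invented weights: SIS baseline features plus a CMS effort z-score.
COEFFS = {"gpa": 1.2, "act": 0.05, "cms_usage_z": 0.8}
INTERCEPT = -4.0

def success_probability(sis_record, cms_record):
    """Combine SIS and CMS data in a logistic model of student success."""
    features = {**sis_record, **cms_record}
    z = INTERCEPT + sum(w * features[k] for k, w in COEFFS.items())
    return 1.0 / (1.0 + math.exp(-z))

def choose_action(p):
    """Map the predicted probability to a predetermined intervention."""
    if p < 0.3:
        return "advisor_phone_call"         # personal intervention
    if p < 0.6:
        return "email_help_session_invite"  # electronic notification
    return "no_action"

p = success_probability({"gpa": 2.0, "act": 20}, {"cms_usage_z": -1.5})
action = choose_action(p)
```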

Three characteristics of successful academic analytics projects are worth highlighting:

  1. Leaders who are committed to evidence-based decision-making
  2. Administrative staff who are skilled at data analysis
  3. A flexible technology platform that is available to collect, mine, and analyze data20

Any academic analytics effort begins with leaders who are committed to decision-making based on institutional data. Analytics can be used to examine key institutional issues, such as enrollment or retention, which by their nature are complex and often sensitive, but the decision to move forward with analytics depends on knowledgeable champions among senior administrators.

The second critical component to building an academic analytics initiative is staffing. Staff members involved in analytics efforts often include database administrators, institutional researchers, educational researchers, programmers, and domain specialists (e.g., student services, retention, development/advancement). Academic computing staff may be needed to collect information from various academic systems such as the CMS. The team must have the skill to build predictive models based on institutional data guided by educational research. Other staff may be needed to focus on policy development and clarify who has access to the data, how the data can be used, and which data-security models are required. Since analytics requires data analysis, institutions will need to invest in effective training to produce skilled analysis staff. Obtaining or developing skilled staff may present the largest barrier and the greatest cost to any academic analytics initiative. Whether such staff are added to existing institutional research units or are cultivated in the IT organization, student affairs divisions, or academic units will depend on the organizational culture and the locus of resources.

The third element in any academic analytics project is technology. A data warehouse is the key component of the technology infrastructure, housing information from a variety of sources in a common structure that enables data analysis. To populate the data warehouse, the institution will need to build a "bridge" between the application and the warehouse. For some applications, standard interfaces facilitate the transfer of data. For other applications, the interface development requires significant programming effort.
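A minimal illustration of such a "bridge," using SQLite in place of real enterprise systems; all table and column names here are invented:

```python
import sqlite3

# Source application database (e.g., a CMS) with its own schema.
app = sqlite3.connect(":memory:")
app.execute("CREATE TABLE cms_activity (student_id TEXT, logins INTEGER)")
app.executemany("INSERT INTO cms_activity VALUES (?, ?)",
                [("s1", 42), ("s2", 7)])

# Warehouse table with a common structure shared across sources.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("""CREATE TABLE fact_effort
                     (student_id TEXT, source TEXT, metric TEXT, value REAL)""")

# The "bridge": extract source-specific rows and load them into the
# warehouse's common shape, tagged with their origin.
rows = app.execute("SELECT student_id, logins FROM cms_activity")
warehouse.executemany(
    "INSERT INTO fact_effort VALUES (?, 'cms', 'logins', ?)", rows)
warehouse.commit()
```

For systems with standard export interfaces this transfer is largely configuration; for others, the transform step above is where the significant programming effort lands.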

Piecing together a coherent academic analytics effort can be difficult, requiring support from many units: enrollment management, institutional research, IT, the registrar's office, academic divisions, student affairs, and more. Standards must be agreed upon for the data (e.g., is enrollment based on headcount on day seven after the start of the semester or on day ten?). Extracting information from academic systems requires careful analysis and programming effort. Building the appropriate models requires staff with statistics and educational research backgrounds. Creating interventions requires domain knowledge (e.g., advising, retention) and advising/counseling skills. For institutions to be successful in academic analytics projects, IT leaders must build a coalition of people.

Considerations and Concerns

With analytics being used to address complex institutional issues, concerns are likely to arise that the complexity has been reduced to "a number," potentially resulting in oversimplification or insensitivity. In particular, analytics projects that focus on teaching and learning need to be approached deliberately. As these projects are developed, refined, and implemented to support academic success, several concerns must be addressed:

  • Big Brother: The notion that a person or institution can track the actions of individuals within a software application will be welcomed by some and will be threatening to others. Who determines which data is collected? What obligation does the institution have to inform faculty and/or students that their behavior within an application is being tracked? Does an individual need to provide formal consent before data can be collected and/or analyzed? Does an individual have an option to "opt out" of an analytics project?
  • Holistic View: Although analytics produces a prediction based on the data available, no prediction can take into account all the possible causes of success or lack of success (problems at home, financial difficulty, and so on). In addition, some will be skeptical of the ability of "a number" to account for the interpersonal relationships and personal growth that come from attending a college or university, irrespective of grades or graduation.
  • Faculty Involvement: As analytics enters the academic realm, ensuring faculty involvement is critical, both in the measures and in the actions that address at-risk students' needs. Faculty are key to "interventions" such as inviting students to office hours, providing additional practice quizzes, or encouraging participation in tutorial programs. For some faculty, analytics may provide valuable insight into which students are struggling or which instructional approaches are making the greatest impact.
  • Profiling: One potential use of analytics is to create a profile of successful, or unsuccessful, students. The profile may be used to prompt interventions or to predict student success. Does the profile bias people's expectations and behaviors? Should the institution even create profiles that lead to generalizations about students? Are there profile uses that should be prohibited?
  • Data Privacy: The data that is collected and analyzed may be protected by federal, state, and institutional privacy regulations. For example, the Family Educational Rights and Privacy Act (FERPA) of 1974 ensures privacy except in cases of "legitimate educational interests."21 Does the institution need approval before data is used? Who has access to the data during model development and implementation? Will the information be shared?
  • Data Stewardship: The data for any academic analytics project may derive from a wide range of sources. How is the data preserved, secured, and shared? Once a data warehouse has been established, can anyone use it for any purpose? If not, how are use decisions made?
  • Information Sharing: Initial academic analytic models have produced a probability of student success. Should the results be shared with the student, faculty, or other staff? Who makes the determination of what and how information is shared?
  • Obligation to Act: If the academic analytics model provides a probability of student success, what is the obligation of faculty, students, and institutions to act on that information? With whom does the obligation to act lie? How is the responsibility shared among different groups?
  • Distribution of Resources: With quantifiable prediction models, the distribution of resources to those who most need them may emerge as an issue. Will access to support services be limited to those with the greatest need, or will anyone who has interest be able to receive help? Who receives priority if resources are limited?

Analytics can be a powerful tool for higher education, but analytics can also magnify existing value conflicts and introduce new ones. Higher education will need to balance the expectations of faculty (are faculty required to intervene in all situations where a student is at risk?), federal privacy laws (who can view student information or have access to predictions of success?), and the institution's own philosophy of student development (should the institution endorse a sink-or-swim or a nurturing environment?).

Potential Impact

Richer data sets, new ways of extracting and organizing data, more sophisticated predictive models, and additional research will drive the evolution of analytics. As the practice of analytics is refined, colleges and universities can place more and better information into the hands of a greater number of people, enabling informed decision-making.

With the public demand for documented learning outcomes and increased retention, academic analytics can contribute to institutional action. Data from the CMS, e-portfolios, student response systems, course podcast downloads, and similar applications can be used in academic analytics. And the focus of future analytics efforts can shift from predicting who is going to be successful to customizing learning environments so that the most effective instructional approaches are used for each student. Eventually, institutions may be able to provide unique learning paths, matching instructional activities to a student's learning needs.

As higher education continues to implement systems that collect a wide range of data, IT units will be called on to support analytics efforts. As a result, IT leaders will find that new expectations are being placed on their units. Staff will be required to have more than the traditional IT skills. They will need to be adept at mining data, understanding the nature of the data, creating metadata to provide long-term data management, analyzing data from multiple sources, and developing models that can be used for decision-making and action. By sharing data that is collaboratively interpreted and acted on, IT can help institutions bridge academic affairs and student affairs. As colleges and universities respond to the demand for greater accountability in higher education, the emerging practice of academic analytics is likely to become a new, highly useful tool for a new, highly demanding era.


1. National Commission on Excellence in Education, A Nation at Risk: The Imperative for Educational Reform, a Report to the Nation and the Secretary of Education (U.S. Department of Education, April 1983).

2. "Did You Know?/Shift Happens," video created by Karl Fisch, modified by Scott McLeod.

3. Thomas L. Friedman, The World Is Flat: A Brief History of the Twenty-First Century, 1st updated and expanded ed. (New York: Farrar, Straus and Giroux, 2006).

4. "Education at a Glance, 2005."

5. Anne K. Walters, "Minority Education Should Be a Priority, Report Says," Chronicle of Higher Education, November 18, 2005.

6. Debra Humphreys, "Achieving Equity as Generation Y Goes to College: New Data," Diversity Digest (Spring/Summer 00).

7. G. Orfield, D. Losen, J. Wald, and C. Swanson, Losing Our Future: Our Minority Youth Are Being Left Behind by the Graduation Rate Crisis (Cambridge, Mass.: The Civil Rights Project at Harvard University, 2004), p. 2.

8. Kevin Carey, "One Step from the Finish Line: Higher College Graduation Rates Are Within Our Reach" (The Education Trust, January 2005).

9. Daniel E. Hecker, "Occupational Employment Projections to 2014," Monthly Labor Review (November 2005).

10. Patrick J. Kelly, "As America Becomes More Diverse: The Impact of State Higher Education Inequality" (National Center for Higher Education Management Systems, November 2005); June Kronholz, "Economic Time Bomb: U.S. Teens Are among the Worst at Math," Wall Street Journal Online, December 7, 2004.

11. "Fact Sheet: The National Survey of America's College Students" (American Institutes for Research, January 19, 2006).

12. "Education at a Glance, 2005."

13. Kathleen Morley and Tom Bohannon, "Applications of Data Mining in Institutional Research," presentation, Association for Institutional Research Annual Conference, Cincinnati, May 2000.

14. Cali M. Davis, J. Michael Hardin, Tom Bohannon, and Jerry Oglesby, "Data Mining Applications in Higher Education," unpublished book chapter (2007).

15. Awards include the 2004 EDUCAUSE Information Technology Solutions Award, the 2005 National Council for Student Development Exemplary Practice Award, and the 2007 Bellwether Award for Instructional Programs and Services.

16. "Student Success Plan and Early Alert System: Individual Learning Plan, Holistic Counseling, and Intervention Model," presentation, Sinclair Community College, March 5, 2007.

17. Sinclair Community College, Office of Research, Analytics, and Reporting, "Fall 2006 ILP Results," March 2007.

18. Rebecca Pollard Cole, Margot Saltonstall, and Paul A. Gore Jr., "Assessing Student Readiness to Promote Student Success: A Campus Collaboration," unpublished manuscript developed from a paper presented at the 25th Annual Conference on The First-Year Experience, Atlanta, Georgia, February 24–28, 2007.

19. Steven Robbins, Jeff Allen, Alex Casillas, Adaeze Akamigbo, Margot Saltonstall, Rebecca Cole, Eileen Mahoney, and Paul Gore, "Associations of Resource and Service Utilization, Risk Level, and College Outcomes," unpublished paper (2007).

20. Philip J. Goldstein, with Richard N. Katz, "Academic Analytics: The Uses of Management Information and Technology in Higher Education," EDUCAUSE Center for Applied Research (ECAR) Study, vol. 8 (2005); key findings are publicly available.

21. The specific FERPA provision (§99.31) under which a student's prior consent is not required to disclose information is as follows: "An educational agency or institution may disclose personally identifiable information from an education record of a student without the consent required by §99.30 if the disclosure meets one or more of the following conditions: (1) The disclosure is to other school officials, including teachers, within the agency or institution whom the agency or institution has determined to have legitimate educational interests." This would seem to clear the anticipated federal legislative hurdle. However, whether there is an institutional will and constituent buy-in to create and share "actionable information" from diverse sources is something else altogether. Because each institution is obligated to interpret and publicly declare what "legitimate educational interests" means, this aspect of FERPA has long challenged registrars and academic administrators as they align local information-sharing policies.