A Rubric Accompanying the Student Success Analytics Framework

Practitioners across campus can use a newly developed rubric to help create and sustain student success initiatives.

[Image: chart titled "Rubric" with checkmarks in various boxes. Credit: Vineyard Perspective / Shutterstock.com © 2023]

The EDUCAUSE Student Success Analytics Framework Rubric Working Group:
Tasha Almond-Dannenbring, Maureen Guarcello, Marcia Ham, Andy Miller, Shannon Mooney, and Ayla Moore

Download Rubric 

In May 2022, EDUCAUSE published "A Framework for Student Success Analytics," which was developed by a working group focused on the integration of data-informed practices that consider students and their diverse contexts.[1] The framework, which comprises the four interdependent components of Preparedness, Outcomes, Analysis, and Decisions, can be used by people in a broad range of roles on campus to influence decisions that affect student experiences and outcomes.

A subset of that larger working group then developed a rubric to support the use of the framework. The rubric, which is provided as a downloadable Excel file, is designed to guide practitioners interested in taking action to develop an impactful student success analytics initiative. A basic awareness of the framework will give users the context they need to understand each subsection of the rubric. Narratives for each subsection describe the essence of that area, and the Excel document provides detailed criteria for evaluation.

For practitioners managing initiatives that are currently under way, those who are facilitating the development process, and those looking for insight prior to starting, the rubric can serve as a means of evaluation or a roadmap of best practices. The rubric can also be used in conversations with cross-sections of stakeholders about successes and challenges, providing opportunities for productive cross-departmental collaboration in moving forward with, or beginning, the work. Multiple cycles of development and ongoing enhancement are expected when employing the rubric; resolving a challenge in one area may, for example, raise questions or uncertainties in others.

The term "diverse" in this narrative is used to describe the students for whom an initiative is intended to create an impact and the make-up of the coalition.[2] Both groups should reflect and embrace the vision of diversity defined by EDUCAUSE:

Inclusivity and diversity foster creativity, innovation, and a broader range of ideas and perspectives. Embracing diversity promotes equality, respect, and appreciation for the unique qualities and contributions of each individual or group involved.[3]

This rubric is part of a set of tools provided by EDUCAUSE to assist colleges and universities in various areas of institutional analytics. Each of these tools is intended to offer guidance in analytics depending on the institutional need.

  • "A Modern Framework for Institutional Analytics" provides high-level input on considerations for institutions as they evaluate their institutional analytics strategies.Footnote4
  • The "Analytics Institutional Self-Assessment" is a tool for institutions to use in evaluating their overall analytics strategies, providing resources and items for improvement to be implemented depending on where there is need.Footnote5
  • The Student Success Analytics Framework Rubric is intended to offer tactical guidance on a specific initiative, with the goal being successful, actionable work that positively impacts student outcomes. How an institution chooses to evaluate itself against the rubric will depend on its own situation.

Rubric Levels of Actionability

This rubric includes three levels, which are defined in terms of evidence provided and actionability. The levels are intended to offer guidance on where a coalition may strive to be; it is not imperative that the coalition be at level 3 in all areas. In evaluating institutional preparedness, capacity is also a consideration and may affect multiple areas as a result. As an initiative kicks off or is under way, coalitions can rate themselves on these levels according to the evidence they can show for the work done in each area; a sketch of one way to record such ratings follows the level definitions below.

Level 1: Plan is not actionable in current state, with serious concerns and limited evidence.

At the lowest level of actionability, little evidence or completed work exists, raising serious concerns about the coalition's ability to move forward. The efficacy of this part of the plan is in question, and until the gaps in evidence are addressed, actionability remains compromised.

Level 2: Plans for actionability are underdeveloped, with some minor concerns and reasonable evidence.

Level 2 indicates that some areas of the plan are underdeveloped. Evidence may be moderate, raising minor concerns about gaps that need to be addressed. Some gaps might require resolution before the initiative proceeds; others can be resolved along the way, as long as the problem has been identified and plans are in place to address it. In other instances, a pause might be necessary to close a gap before continuing. Left unaddressed, these gaps may eventually inhibit actionability.

Level 3: Plan is actionable with substantial evidence.

Level 3 indicates that gaps in evidence have been addressed and solutions are in place where evidence was needed. Plans are in place and on track for the initiative to make a positive impact on student outcomes, and the project is positioned for success and sustainability.
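
To make the self-rating concrete, below is a minimal sketch of one way a coalition might record its levels and supporting evidence alongside the downloadable Excel file. The area names follow the framework; the AreaRating structure, the ratings, and the evidence strings are purely hypothetical.

```python
# A hypothetical sketch of a coalition's rubric self-assessment.
# Area names follow the framework; levels and evidence are illustrative.

from dataclasses import dataclass, field

@dataclass
class AreaRating:
    area: str                     # framework component, e.g., "Preparedness"
    level: int                    # 1 = not actionable, 2 = underdeveloped, 3 = actionable
    evidence: list[str] = field(default_factory=list)

ratings = [
    AreaRating("Preparedness", 2, ["Student success definition drafted",
                                   "Data validation plan incomplete"]),
    AreaRating("Outcomes", 3, ["Persistence and belonging metrics chartered"]),
    AreaRating("Analysis", 1, ["No analysis plan documented yet"]),
    AreaRating("Decisions", 2, ["Student voice included; no accountability structure"]),
]

# Flag areas whose limited evidence warrants attention before moving forward.
for r in ratings:
    if r.level < 3:
        print(f"{r.area}: level {r.level} - revisit: {'; '.join(r.evidence)}")
```

A running list like this can feed the cross-departmental conversations described earlier, since each lower rating arrives paired with the evidence gap behind it.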

Scenario

The following scenario is used throughout explanations of each subcategory to provide examples of what the levels might look like in practice. Multiple advising models were evaluated to develop these examples; however, the scenario is not particular to any one institution.[6]

First-Generation Student, Undeclared in First Year: Student Initiative
Advisors are interested in how best to advise first-generation students who are undecided about a major in their first year. They would like to develop a tailored advising strategy that guides these students into majors, increasing their sense of belonging and persistence to graduation.

Areas of Evaluation

The Student Success Analytics Framework Rubric is a comprehensive tool that evaluates initiatives against the four framework components: Preparedness, Outcomes, Analysis, and Decisions. It serves as a rigorous mechanism to gauge an initiative's likely impact and sustainability in enhancing student success.

Preparedness

This section focuses on the preparedness of a coalition in undertaking an initiative, setting the foundation for success and actionability. The key elements that can maximize an institution's opportunity to facilitate an effective, data-informed initiative include institutional mission; student success definition; data preparation, literacy, and governance; technical capacity; and plans for actionability. The institutional mission, used as a frame, keeps the initiative and coalition focused. Further, a clear definition of student success provides context and alignment for the cross-functional and diverse coalition. Data readiness, including validation,[7] literacy, and technical capacity, is critical to an analytics initiative and should be carefully factored into plans prior to moving forward on analysis. Attention to actionability through appropriately aligned data processes is important for long-term success.
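
As a concrete illustration of data readiness, the sketch below shows the kind of validation pass described in note 7, applied to the advising scenario. The field names (first_gen, declared_major, entry_term) and the official cohort count are assumptions for illustration, not a prescribed schema.

```python
import pandas as pd

# Hypothetical student records for the scenario cohort; all field names
# and values are invented for illustration.
students = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "first_gen": [True, True, False, None],       # one missing flag, seeded
    "declared_major": [None, None, "BIOL", None],
    "entry_term": ["FA23", "FA23", "FA23", "FA22"],
})

issues = []

# 1. Completeness: fields used in the analysis should not be missing.
n_missing = students["first_gen"].isna().sum()
if n_missing:
    issues.append(f"{n_missing} record(s) lack a first-generation flag")

# 2. Domain check: term codes should come from an approved list.
valid_terms = {"FA22", "SP23", "FA23"}
bad_terms = set(students["entry_term"]) - valid_terms
if bad_terms:
    issues.append(f"unexpected term codes: {sorted(bad_terms)}")

# 3. Cross-check: cohort counts should reconcile with the figure approved
#    by the data steward (a placeholder value here).
OFFICIAL_FA23_COHORT = 3
if (students["entry_term"] == "FA23").sum() != OFFICIAL_FA23_COHORT:
    issues.append("FA23 cohort count does not match the official figure")

for issue in issues:
    print("VALIDATION:", issue)   # route findings to the data steward
```

In practice, the checks themselves would be agreed on with the data steward before any analysis begins.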

Outcomes

This section focuses on a coalition's outcomes work: selecting measures and planning, including roles and responsibilities. These key elements are critical for a coalition to reach actionable decisions. First, a student success initiative should account for students' unique contexts by treating metrics as multidimensional and viewing them through diverse lenses. For example, achievement of learning objectives might be assessed by a student's passing grade on an assignment (one metric) but ultimately confirmed by graduation or attainment of a certification (a separate metric).

Next, almost any individual metric has an inherent relationship with others, and those relationships should be considered in decision-making that impacts student outcomes. Commonly used metrics that are multidimensional in nature include academic achievement, attainment of learning objectives, acquisition of skills and competencies, student satisfaction, persistence, career success, and student wellness. The charter developed by the coalition should define the scope of the multidimensional data in the initiative; once it does, this aspect of the project is mature enough to move forward. Finally, the initiative includes a number of stakeholders, each with perspectives necessary for holistic decision-making around the needs addressed by the outcomes. What matters to students, faculty, staff, and administrators will differ, yet all voices should be included in decision-making based on the outcomes of the initiative.[8]
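
A brief sketch may help here: the mapping below pairs two outcomes from the advising scenario with several related metrics apiece. The outcome and metric names are illustrative assumptions, not prescribed measures.

```python
# A hypothetical mapping from outcomes to the several metrics that
# assess each one, reflecting the multidimensional view described above.
outcome_metrics = {
    "persistence_to_graduation": [
        "fall-to-spring retention rate",
        "credit-hour completion ratio",
        "six-year graduation rate",
    ],
    "sense_of_belonging": [
        "survey belonging-scale score",
        "advising appointment attendance",
        "co-curricular participation",
    ],
}

for outcome, metrics in outcome_metrics.items():
    print(outcome, "->", ", ".join(metrics))
```

Laying the metrics out side by side makes their inherent relationships (for example, between retention and credit completion) easier to discuss when scoping the charter.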

Analysis

This section focuses on the analysis choices made in undertaking an analytics initiative, including the type and complexity of analysis. Its purpose is to give coalitions the opportunity to improve the analysis of the outcomes in their initiative. Generally speaking, higher levels of analysis (level 2 or 3) are preceded by higher levels of preparedness, although highly sophisticated analysis can take place in isolation; more thorough preparation will both aid and amplify the analysis. Key considerations include appropriate data preparation, thoughtful selection of analytics and data types, and level of complexity. As with any analytics initiative involving student data, unintended consequences can surface, and coalitions are expected to address them throughout the process.
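
To illustrate how analysis complexity can scale with preparedness, here is a minimal sketch based on the advising scenario: a descriptive comparison first, then a simple predictive model as the next step up in sophistication. The records and field names are invented for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Invented records for the scenario; field names are assumptions.
df = pd.DataFrame({
    "first_gen":  [True, True, True, False, False, True],
    "undeclared": [True, True, False, True, False, True],
    "persisted":  [False, True, True, True, True, False],
})

# Descriptive (a reasonable starting point): compare persistence rates for
# first-generation, undeclared students versus everyone else.
cohort = df["first_gen"] & df["undeclared"]
print("Cohort persistence:", df.loc[cohort, "persisted"].mean())
print("All other students:", df.loc[~cohort, "persisted"].mean())

# More complex (pursue only when preparedness supports it): a simple model
# estimating how the two flags relate to persistence.
X = df[["first_gen", "undeclared"]].astype(int)
model = LogisticRegression().fit(X, df["persisted"])
print("Coefficients:", model.coef_)
```

Whatever the tier of complexity, the review for unintended consequences noted above applies to both.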

Decisions

This section focuses on asking coalitions undertaking an analytics initiative to consider how their decisions and use of the data can inadvertently harm certain groups. Key considerations should include the potential for unintended consequences of the work, identifying such consequences and developing strategies to mitigate them, the inclusion of student voice, privacy and equity, and accountability structures to ensure continuous assessment of the work and its impact.[9]

Conclusion

This rubric stems from a collective effort of the Student Success Analytics Practitioners Community Group to provide our community with tools and support. Practitioners prepared to invest in impactful analytics strategies are critical to supplying their institutions with useful insights for data-informed decisions. In an environment with an abundance of data and entities seeking to help students, this rubric provides a path forward toward addressing the dynamic and extraordinary circumstances that our students are experiencing today.

Notes

  1. Tasha Almond-Dannenbring et al., "A Framework for Student Success Analytics," May 25, 2022.
  2. The term "coalition" is used throughout this document. Generally speaking, a coalition in this context is a body of diverse individuals formed for the purpose of mobilizing an initiative intended to improve student success. Examples include a task force, project team, Academic Affairs Council or other council, steering committee, and working group.
  3. See "The EDUCAUSE Guide to Diversity, Equity, and Inclusion."
  4. Dave Weil, Casey Kendall, and Rob Snyder, "A Modern Framework for Institutional Analytics," EDUCAUSE Review, February 14, 2023.
  5. "Analytics Institutional Self-Assessment," July 29, 2021.
  6. "Every Student. Every Semester," University Advising Center, University of South Carolina, 2021; David B. Spight, "Undeclared Versus Declared: Who Is More Likely to Graduate?" Journal of College Student Retention: Research, Theory & Practice 23, no. 4 (2019); Joe Cuseo, "Advising Undecided Students: Research & Practice," in Improving the First Year of College: Research and Practice, ed. Robert S. Feldman (New York: Erlbaum, 2005), 27–50; Tracey A. Glaessgen, Cynthia J. MacGregor, Jeffrey H. D. Cornelius-White, Robert S. Hornberger, and Denise M. Baumann, "First-Generation Students with Undecided Majors: A Qualitative Study of University Reacculturation," NACADA Journal 38, no. 1 (2018): 22–35.
  7. For the purposes of this rubric, "data validation" refers to the process of quality assurance and cross-checking of data fields that are to be used in the analysis and outcomes of the initiative. A typical validation process in this context involves the data steward responsible for the fields of interest, as well as a team that reviews the fields to ensure that the data are accurate and reflect official or approved numbers. All members of the coalition are assumed to understand the metrics used and that they reflect a common understanding of student success.
  8. André De Waal, Michael Weaver, Tammy Day, and Beatrice Van der Heijden, "Silo-Busting: Overcoming the Greatest Threat to Organizational Performance," Sustainability 11, no. 23 (2019); Kate Marijolovic, "What Does a Healthy Campus Actually Look Like? A New Study Offers Ideas," Chronicle of Higher Education, March 13, 2023; Amelia Parnell, Darlena Jones, Alexis Wesaw, and D. Christopher Brooks, "Institutions' Use of Data and Analytics for Student Success: Results from a National Landscape Analysis" (Washington, DC: NASPA – Student Affairs Administrators in Higher Education, the Association for Institutional Research, and EDUCAUSE, 2018); Theresa Duong, "What Is Pedagogical Wellness?" UCI Division of Teaching Excellence and Innovation; "Trauma-Informed Pedagogy," CoLab Podcast, July 29, 2022.
  9. Definitions of the concepts Do No Harm; Respect for Students; Accountability; and Autonomy, Privacy, and Equity can be found in "A Framework for Student Success Analytics"; Sharon Slade and Alan Tait, "Global Guidelines: Ethics in Learning Analytics," International Council for Open and Distance Education, March 2019; and Eric Kelderman, "The New Accountability: How Accreditors Are Measuring Colleges' Diversity, Equity, and Inclusion Efforts," Chronicle of Higher Education, April 3, 2023.

© 2023 Tasha Almond-Dannenbring and the Student Success Analytics Framework Rubric Working Group. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.