Data Transformation at Salt Lake Community College

Case Study

Transforming the culture of data use can create a virtuous cycle in which more members of the campus community use more data, more often, and use data better.

Salt Lake Community College (SLCC) is Utah's largest public college, with the most diverse student body in the state. It serves more than 50,000 students on 10 campuses. More than half (56%) of SLCC's students are the first generation in their families to attend college, 32% of the student body is non-White, and more than 80% work while attending school.

The Challenge/Opportunity

A cover story in The Economist from May 2017 asserted that "the world's most valuable resource is no longer oil, but data." Companies in various industries have found success through the innovative use of data—consider, for example, how Netflix and similar services have changed the way people consume TV and movies. Higher education can learn from this approach. By viewing data as an asset across the institution, colleges and universities can improve student success and strengthen sustainability.

In 2018, Jeffrey Aird, vice president for Institutional Effectiveness at Salt Lake Community College, renamed Institutional Research as Data Science & Analytics, one of the first steps in changing the way members of the campus community view and use data. Institutional leaders had recognized a gap in how the college managed and used data: spreadsheets proliferated, multiple people tracked the same datasets with no consistent source of record, and reports being shared around the college contradicted one another. We recognized a need to improve access to trusted, validated, and consistent data and insights.

Part of achieving SLCC's strategic goals for 2023 is to become more data informed as an institution—to use more data, more often, and to use data better. Data literacy is the ability to use and understand data, ask questions, and challenge the data. People with low data literacy are more likely to distrust data. We do not expect everyone to have a high level of data literacy and be able to create advanced reports, but we want them to feel comfortable understanding and talking about data.

College leaders created a committee that developed a plan based on weaknesses they discovered through research and interviews. This group reviewed organizational data maturity on an assessment used by several consulting firms. This assessment found varying levels of maturity across different departments. The college established a goal of having more consistent levels of data maturity and data literacy across the institution.

The plan included several elements:

  • Increase the availability of trusted data
  • Prioritize the analytical projects with the highest return on investment for the institution, including key data products such as a faculty dashboard, an advising dashboard, and a tool for academic administrators
  • Align the organization on the use of data and data literacy
  • Improve definitions of standard data and data quality

This wasn't just a technology change but was really a cultural shift. Cabinet members and senior leadership championed the charge with consistent messaging about the value of data. They reiterated the need for getting the right data to the right people at the right time. They strengthened the Data Science & Analytics team and empowered the group to lead this transformation.


I was hired in May 2018 as the director of the Data Science & Analytics group, with the charge of getting away from conventional IR duties and moving instead to a mindset of meeting the college's needs for data with analytics. Not long before I started, the college implemented a data warehouse, which has been a vital part of the transformation of data maturity and data literacy. My team, including engineers, analysts, statisticians, and qualitative researchers, supports the entire college by managing the data warehouse environment as a trusted source of institutional data, building data products such as dashboards for the college overall and for individual departments, and conducting actionable research and assessments. My goal is for everyone at the college to know that they can come to my team for data or for help understanding data, and that they can trust the work that comes from the team.

One of our strategies over the past three years has been to improve student analytics to better use data to support student success. As part of that strategy, we work not only to improve data literacy but also to increase the availability of data and insights.

Goals and Indicators

Institutional goals are often articulated at a high level, and one of the first steps is to break down such goals into actionable metrics. The metric of completion is a lagging indicator, meaning that it has already happened and can't be changed. Leading indicators, on the other hand, are milestones that we expect to happen as a student works toward completion. An obvious leading indicator might be applying for graduation—as in, what percentage of students who are eligible for graduation have actually applied? But many other steps lead to that point. Are students taking courses toward their degree? Have they registered for the next term? Retention is a challenge at SLCC, and we are using data and analytics to help retain students. Students leave for many reasons: sometimes financial, but also factors such as mental health and the study habits of first-generation students, all of which can be exposed and addressed by using analytics to focus our staff efforts.

A challenge for higher education is that some metrics move slowly or only happen a few times a year, such as registering for the next term. In a business context such as e-commerce, we would expect a person to search a website and make a purchase that day. In education, with the years-long goal of graduation, outcomes from initiatives or changes are more difficult to see. To get over this hurdle, we can define leading indicators that are day-to-day events for the student. Are they logging in to Canvas? Turning in assignments on time? Participating in discussions? We provide faculty with individualized data about their students based on those leading metrics.
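As a minimal sketch of how such day-to-day leading indicators might be turned into a flag for follow-up, consider the example below. The record fields, thresholds, and data are all hypothetical illustrations, not SLCC's actual schema or products:

```python
from datetime import date, timedelta

# Hypothetical per-student activity records exported from the LMS;
# field names and values are illustrative only.
activity = [
    {"student": "A", "last_login": date(2022, 3, 10), "on_time_rate": 0.95},
    {"student": "B", "last_login": date(2022, 2, 20), "on_time_rate": 0.40},
]

def flag_at_risk(records, today, max_gap_days=7, min_on_time=0.6):
    """Flag students whose leading indicators suggest disengagement:
    no LMS login within `max_gap_days`, or a low on-time assignment rate."""
    flagged = []
    for r in records:
        stale = (today - r["last_login"]) > timedelta(days=max_gap_days)
        behind = r["on_time_rate"] < min_on_time
        if stale or behind:
            flagged.append(r["student"])
    return flagged

print(flag_at_risk(activity, today=date(2022, 3, 12)))  # ['B']
```

The value of framing indicators this way is that they update daily, so faculty and advisors can act during the term rather than after outcomes are already fixed.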

A Focus on Student Achievement

One of our current projects is an AI bot called Digital Assistant that uses Canvas data to help students understand patterns and behaviors that correlate with success. Part of that work is to ensure that the tools are not replicating biases hidden in historical data or creating new biases. The development work includes behavioral specialists who focus on such issues, and during the pilot of the program, we are surveying students to understand how people from different backgrounds interpret actions differently.

We have also begun using machine learning in our products, including a scheduling tool that predicts which courses will fill during an upcoming term and a tool called Behavior Alerts for student advisors. Behavior Alerts focuses on things students can change (behaviors) rather than on who they are, leaving demographics out of the model. Each week, the tool alerts advisors to the students who need their assistance, helping focus advisors' limited time. That said, we test against those demographic dimensions to ensure that the tool doesn't inadvertently create or reinforce equity gaps.
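The kind of after-the-fact audit described above could be sketched as follows: demographics are excluded from the model itself, but model output is joined back to demographic data to check whether alert rates diverge across groups. All records and field names here are hypothetical illustrations, not SLCC's actual tooling:

```python
from collections import defaultdict

# Hypothetical weekly alert output joined back to demographic data that
# the model never sees as a feature; used only to audit for equity gaps.
alerts = [
    {"student": "A", "flagged": True,  "group": "first_gen"},
    {"student": "B", "flagged": False, "group": "first_gen"},
    {"student": "C", "flagged": True,  "group": "continuing_gen"},
    {"student": "D", "flagged": False, "group": "continuing_gen"},
]

def alert_rates_by_group(records):
    """Compute the share of each demographic group that was flagged,
    so reviewers can spot disparities the model may have introduced."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        flagged[r["group"]] += r["flagged"]
    return {g: flagged[g] / totals[g] for g in totals}

print(alert_rates_by_group(alerts))  # equal rates here: 0.5 for each group
```

Roughly equal rates across groups are a reassuring sign; a large gap would prompt a closer look at the behavioral features driving the model.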

Data Products and Practices

We have implemented a range of practices and developed a growing list of data products that collectively have helped transform the culture of data at SLCC.

  • Data Products
    • Teaching Insights: A broad tool that gives faculty insights into which students are struggling, how former students have done in subsequent classes, how students are contributing to our SLCC strategic goals, and how well assignments are preparing students to succeed in the class.
    • Pre-Enrollment Dashboard: A look at the entire admissions and enrollment process to better understand the current picture and where we need to improve in the process to increase enrollment numbers.
    • Academic Insights: A comprehensive reporting tool for academic administrators to understand course-taking patterns of students in their programs and classes, scheduling insights (including first day fill-rate predictions), Canvas usage, and more.
    • COVID Reporting: A tool for tracking self-reported positive cases, COVID testing results, impacts to classes and rooms, and overall campus numbers.
  • Data Governance Council: A group focused on identifying the policies, practices, and roles to ensure accurate, consistent, trusted, and secured data across the institution.
  • Analytics Steering Committee: Senior leaders who meet monthly to discuss vision and strategy for data, guide current development and priorities, and oversee the release of new data products.
  • Data and Insights User Group: A group that meets monthly over the lunch hour to discuss data-related topics, with the goal of increasing data literacy. It is open to anyone at the college, including students, though student participation remains relatively low so far. A session might review key points from a book a member has read, discuss a newly released data product, or offer tips on data visualization or on using Excel or one of our reporting tools.
  • A focus on inclusive and equitable data: As part of our mission and values, we changed how we present data to the college. Whenever we look at a metric, we provide the ability to break it down by race, ethnicity, gender, age, and first-generation status. The goal is to surface inequities and to ensure that our initiatives are not unintentionally disadvantaging one group relative to another. Showing data this way has raised more questions across the institution about how we can better provide an equitable and inclusive experience for all of our students.
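Disaggregating a metric by a demographic dimension, as the last practice above describes, is conceptually simple. A minimal sketch, with entirely hypothetical records and field names:

```python
# Hypothetical course-outcome records; breaking a single pass-rate metric
# down by one demographic dimension. Field names are illustrative only.
records = [
    {"passed": True,  "first_gen": True},
    {"passed": False, "first_gen": True},
    {"passed": True,  "first_gen": False},
    {"passed": True,  "first_gen": False},
]

def disaggregate(rows, metric, dimension):
    """Return the metric's rate for each value of the demographic dimension."""
    groups = {}
    for row in rows:
        groups.setdefault(row[dimension], []).append(row[metric])
    return {value: sum(vals) / len(vals) for value, vals in groups.items()}

print(disaggregate(records, "passed", "first_gen"))
# {True: 0.5, False: 1.0}
```

A gap like the one in this toy output is exactly the kind of signal that prompts the follow-up questions described above about providing an equitable experience.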

The Data Science & Analytics team holds weekly office hours, and members of the group attend departmental staff meetings to discuss data products and the insights that can be gleaned from them. I meet with each new director at the college to show them where to find data, explaining the services my team provides and showing data products that can help them in their role. We emphasize helping people understand how to read and interpret data. "When you see this kind of chart, here's how you would read it" or "Here's what I look for when looking at this kind of data" or "Here are questions to ask yourself when interpreting this dashboard for this purpose, such as program reviews." Because people learn in different ways, we provide these tips in multiple formats: text boxes that pop up on a dashboard the first time you log in to any of the data products, help documents with videos embedded, and written research that walks readers through the data and how we reached certain conclusions.

In every data product that we deliver, we focus on actionable insights, not merely on providing information. We've seen an increase in our overall data maturity over the past few years, but there's still more work to do.

Outcomes and Lessons Learned

As units and individual users experience the benefits of the effective use of data products, word spreads about that value, creating a snowball effect. Faculty see what their peers are able to do, and the whole culture of data starts to change. The number of requests coming into the Data Science & Analytics team has increased, but, more importantly, the requests have shifted from, for example, simply providing lists of students to requesting actionable data. My team is being asked to attend many more committee meetings, where we listen to the needs and identify data products we can deliver to help the group answer questions or understand what's going on so they can act.

As Jason Pickavance, associate provost for Academic Operations, observed, "In effectively responding to our data and analysis needs (with reports, strong analysis, dashboards, and so on), our Data Science & Analytics team has created what we might call a virtuous cycle of data. We're consuming more data, we have a greater appetite for being data informed. And we're becoming better at being critical consumers of data at SLCC."

As we move through the process of transformation, we inevitably make mistakes and need to alter our course, and an important element of the culture of my group is to view all successes and pitfalls as occasions to learn something and make improvements.

  • Develop a vision and a strategy. Anticipate how data will be delivered, used, and accessed, but look for the low-hanging fruit that you can deliver quickly along the path to your bigger vision. In our quest for a new reporting tool, we have had to get clever to deliver small wins that fit into our budget (of $0) so that we could prove the value of what we are doing.
  • Get a cabinet-level sponsor to advocate for data. If possible, get your president on board. When the cabinet asks for and uses data, others see the importance of getting them the data they need.
  • Recognize that trust in data is the top priority. If my team rushes to fulfill a project and delivers incorrect results, trust in my team erodes. We built a project workflow that includes peer review and QA of data to ensure that accuracy is a priority. Data governance is also a key factor for trust. Our partnerships with data stewards across the college help ensure we have quality data to use. If the use of a field changes, we need to be alerted to this change via our data steward partnership.
  • Establish a foundation for the work. Building a data infrastructure is not high-visibility work but is critical to being able to deliver analytics quickly in the future. Invest in the time and talent to build a platform where data can move throughout the institution.
  • Engage students. As the subjects of analytics programs, students have concerns about the ways the systems might be "watching" them. What happens if certain groups of students stop trusting the use of data and opt out? Increasing not only data literacy but "analytics literacy" is an important part of this work. The institution has grappled with issues of transparency—who owns data, and who decides how the data are used? We have included student leaders in the development of data products, and this has been beneficial for all involved.
  • Prioritize patience as people increase their data literacy. Individuals are at their own points on the journey to data literacy. Help them learn the language of data.
  • Make strategic projects a priority. It's easy for my team to get caught up in every little request that we receive, but if we didn't prioritize the bigger, more strategic products that serve a wide audience, we would continue to have a goal with no outcomes.
  • Adapt products to different users in current processes. Adjuncts and faculty up for tenure, for example, can be left behind because they are too busy to invest the time to learn the data products. By developing processes that these users can easily follow, we are finding ways for them to integrate this work into current processes, such as portfolio development, rather than having to add it to their workloads.
  • Leverage top users as advocates. When our Teaching Insights tool was first released, I was doing all of the training. Then I found that a few faculty members were using it quite often, and I recruited one of them to do all trainings now. He is able to relate to his peers and their needs as he explains how he uses the tool. It has increased usage significantly.

Michelle Hardwick is Director of Data Science & Analytics at Salt Lake Community College.

© 2022 EDUCAUSE. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.