Three Corporate Voices: Analytics and Business Intelligence

The EDUCAUSE Enterprise IT Program asked three corporate members to share their advice on how integrating data and systems can advance institutions' analytics maturity.

[Image: wordle-like composite of BI icons. Credit: Undrey / Shutterstock.com © 2019]

Five times a year, the EDUCAUSE Enterprise IT Program publishes a set of materials, each focused on a particular challenge. For many institutions, analytics and business intelligence is a current concern. We therefore set out to gather expert insights into how institutions might better view administrative systems data as a strategic institutional asset that can help them

  • answer important organizational questions,
  • assess progress on institutional goals, and
  • improve their institution's ability to make information-based decisions.

We reached out to three corporate members—Deloitte, Jenzabar, and Oracle—and asked leaders at each for their guidance, advice, and ideas on how integrating data and systems can advance an institution's analytics maturity. Their answers to some key questions follow.

The Value Proposition

How can institutions best demonstrate the value of analytics to answer important institutional questions so that they can make better information-based decisions and better align their efforts with institutional strategies, goals, and objectives?

Rich Clayton: Higher education institutions face many challenges, from growing enrollment to lowering costs to improving student outcomes to deciding which programs to expand and which to retire. Analytics can provide answers to these questions, but few institutions have a strategy and a culture to use institutional data in the moments that matter.

I see three reasons why institutions fail to realize this potential. First, data and systems are siloed across the institution, creating tension between IT and administration. Second, analytical tools are limited in scope and not connected to daily work. Third, the technology is too difficult to use broadly across the institution.

Provosts, presidents, and senior leadership must ask difficult questions to break down silos. In my experience, most administrative leaders spend 90 percent of their time collecting facts about what happened and 10 percent understanding the causes. Leaders must stop looking for data to support a hypothesis or an agenda and instead let the data speak.

At Oracle, we call this AI for the Why—an opportunity to leverage embedded machine-learning techniques to de-bias data analysis to find explanations for pressing questions. When data are allowed to do the talking, profound change can occur. Important variables often go undiscovered because most analysts keep analyzing the same data when what's needed is a capability to find patterns automatically and bring those insights to the decision at hand. Gartner calls this capability Augmented Analytics, where embedded AI can help augment the decision-making of humans and deepen the rigor and viability of the analysis.
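
As a rough illustration of what such automated pattern finding can look like in practice, independent of any particular vendor's product, the sketch below uses the open-source pandas and scikit-learn libraries on a hypothetical student-retention extract; the file name and column names are invented for the example. Rather than testing a single favorite hypothesis, it fits a model and ranks which variables actually move the outcome.

```python
# Minimal sketch of "augmented analytics": let a model rank which variables
# explain an outcome (here, retention) instead of hand-picking a hypothesis.
# The CSV, column names, and outcome field are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv("student_term_data.csv")  # hypothetical term-level extract

features = ["credits_attempted", "gpa_prior_term", "aid_gap",
            "advising_visits", "distance_from_campus", "first_generation"]
X, y = df[features], df["retained_next_term"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance asks: how much does accuracy drop if one variable is
# scrambled? Large drops flag the variables that are "doing the talking."
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in sorted(zip(features, result.importances_mean),
                          key=lambda pair: pair[1], reverse=True):
    print(f"{name:>22}: {score:.3f}")
```

The point of a sketch like this is not the particular algorithm but the workflow: surface candidate drivers automatically, then bring them to the decision at hand for human judgment.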

Peter Fritz: First, it is critical to be clear about the value chain of analytics: analysis leads to action leads to impact. Many institutions struggle with two temptations:

  • They focus on the "wow" factor of new technologies or the ingenuity and complexity of analytical techniques themselves, rather than on the outputs of actual analysis.
  • They develop and socialize insights but stop short of using that knowledge to improve strategy or operations.

Always focus on the ultimate impact. For example, while integrating student data from across an institution and then analyzing and modeling that data to find patterns in student outcomes might be impressive, it has little impact unless it results in organizational changes that improve student outcomes.

Second, it is critical to talk about the financial ROI of analytical efforts. Many institutions understandably focus on creating a positive impact on their mission through analytics, but they do not clearly define the financial impact. The bottom line is that building analytics capabilities takes time and money, and executive leaders need to understand the "bang for their buck" to approve the necessary investments. Luckily, in many cases, analytics can have a significant positive impact on both the mission and finances—and institutions that embrace both often see the most progress and success.

Meghan Turjanica: For institutions that have only just begun to turn to data to answer tough questions, demonstrating that value is particularly difficult. Those institutions have two options: point to educational leaders who have already reaped the benefits of analytics, or choose a small project to measure, analyze, reform, and retest. The first method is suspect because the audience often says, "We just can't do that here," while the second can be difficult because it requires commitment and work and may be outside the institution's comfort zone.

For those institutions that have done the analysis but haven't done a good job of sharing the benefits, the reason is often that shaping findings into a compelling data story is challenging. Institutional silos that segment the institution and break down communication—or the lack of an appropriate sponsor to promote the information—are additional impediments.

In summary, institutions must place more emphasis on talking about what was learned, openly sharing what was successful, and reinforcing that any negative findings are an opportunity to make improvements. Finding something that isn't working is not failure; it's an opportunity to do better. Institutions cannot improve without first knowing their true reality—and that takes analytics.

Keys to Successful Collaboration

With the increasing need to integrate data, systems, and processes from a multitude of environments, higher ed institutions and solution providers need to collaborate effectively throughout the process of technology discovery, implementation, and continued support. What makes a great collaboration?

Clayton: The rapid rate of technology change requires a different collaboration model between solution providers and higher education leaders. The greatest collaboration opportunity is in the area of academic program development. The demand for data-savvy business professionals is profound, and the supply from higher education continues to fall short.

Every solution provider should, at a minimum, offer to participate in the classroom on analytic best practices—and ideally partner with the academic leadership team to guide the development or evolution of analytic programs. The solution provider should also provide real-world industry use cases and data for students to practice their new skills. At Oracle, we partnered with Cal Poly to advance its program, and the best collaboration was in its data science competition. A recent Forbes article offers more details on how collaboration can help close the talent gap.

Above and beyond the academic collaboration, great partnerships focus on best-practice sharing and community development to promote excellence in analytics. Workshops, special-interest groups, and online forums are a great way to advance your experience with analytic technologies.

Fritz: Although a successful collaboration has many elements—scope clarity, strong program management, sufficient resourcing, etc.—it really boils down to one word: trust. Universities need to trust that their solution providers have the institution's best interests in mind and are not just "out to make a buck." Providers, in turn, need to trust that their university partners are committed to the effort and have a clear path for creating the desired impact.

Building trust is obviously a complicated calculus, but successful collaborations do have particular hallmarks. Universities should focus on three provider capabilities. First, find solution providers that can build institutional capacity and enhance existing capabilities rather than replace them with a "black box." If stakeholders do not understand how and why a solution works, it will be difficult to build organizational trust around it.

Second, seek providers that understand what it takes to succeed across the analysis-action-impact value chain. Partners who deliver analytical insights and help you successfully socialize those findings and build an action plan to address them will ultimately deliver much more value.

Finally, look for providers with broad experience in higher education. The very best analytics programs bridge gaps and break down silos across the institution; working with providers experienced in dealing with those challenges is important for creating real impact.

Turjanica: Many institutions now have upwards of sixty different on-campus software solutions that collect and maintain data. This presents a number of challenges, including many opportunities for data to carry the same label but mean something quite different across platforms. Even within the same platform, definitions and processes can get quite hazy across the institution, leading to suspect analysis.

A key element to improving collaboration on all sides of the table is data governance. Having industry-standard definitions for data that can be used and shared across platforms is essential. For example, when we do predictive modeling for student success, we often find that data is used, documented, or stored in rather nonstandard ways. It's critically important that all parties understand and share a definition of what a particular data point does mean or will mean. This requires a great deal of openness on the part of all and a willingness to be flexible.

This may mean that an institution needs to refine its definitions or practices for data collection and that a provider will need to help find workarounds for institution-specific data that doesn't quite fit the mold. All parties can benefit from a willingness to subscribe to standardized data definitions.
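
One lightweight way to make such shared definitions concrete is a data dictionary that every integration consults. The sketch below is a hypothetical Python example (the systems, field names, and codes are invented, not an actual standard): it maps each platform's local labels for enrollment status onto one agreed vocabulary, so downstream analysis compares like with like.

```python
# Hypothetical data-governance building block: a shared dictionary mapping each
# platform's local field names and codes to one agreed definition, so data with
# "the same label" really does mean the same thing across systems.

STANDARD_FIELDS = {
    "enrollment_status": {"enrolled", "withdrawn", "leave_of_absence"},
}

# How each (invented) source system expresses the standard field.
FIELD_MAP = {
    "sis":      {"field": "enrl_stat",  "codes": {"E": "enrolled", "W": "withdrawn", "L": "leave_of_absence"}},
    "lms":      {"field": "status",     "codes": {"active": "enrolled", "dropped": "withdrawn"}},
    "advising": {"field": "std_status", "codes": {"ENR": "enrolled", "WD": "withdrawn", "LOA": "leave_of_absence"}},
}

def normalize(record: dict, source: str) -> dict:
    """Translate a raw record from one platform into the shared vocabulary."""
    spec = FIELD_MAP[source]
    raw = record.get(spec["field"])
    value = spec["codes"].get(raw)
    if value not in STANDARD_FIELDS["enrollment_status"]:
        raise ValueError(f"{source}: unmapped enrollment status {raw!r}")
    return {"student_id": record["student_id"], "enrollment_status": value}

# The same student looks different in each system until normalized.
print(normalize({"student_id": "A123", "enrl_stat": "E"}, "sis"))
print(normalize({"student_id": "A123", "status": "active"}, "lms"))
```

In practice the dictionary itself would live in the governance process, with owners and change control, rather than in any one integration's code.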

Communicating about Analytics

What key communication elements or approaches make cross-institutional communication successful?

Clayton: Leading organizations make analytics fun, agile, and outcome-based. Analytics is a team sport, and it requires many unique skills and coaches. To develop a data culture, it's vital that when teams or individuals win with analytics, their success is promoted internally. One way to do this is by building a virtual "data club." The role of data club members is to be advocates for success in analytics. Data clubs are made up of cross-functional analytic leaders, and the members are positioned as the thought leaders in their field. Data clubs aren't like "old skool" competency centers—that is, they aren't focused on technology standards but rather on promoting outcomes reached through the strategic use of data. When analytic initiatives are linked to the mission of the institution, it becomes more apparent to non-data-savvy managers that they need to modernize their skills and explore more data.

Fritz: Every institution must communicate baseline expectations as part of a successful analytics strategy to build excitement and buy-in and, frankly, to quell nerves.

At the outset, you must proactively address any data security and confidentiality concerns:

  • Get in front of fears about "black box" analytics by committing to explain methodology and process to interested stakeholders, and ensure that your solution providers make the same commitment.
  • Openly share your institution's analytics priorities and roadmap across the analysis-action-impact value chain so that stakeholders understand when and how their operational units will see benefits.
  • Use case studies or early-win examples to demonstrate success and to help individuals understand that the analytics strategy does not stop at dashboards or reports but follows through into true impact for faculty, students, and staff.

Next, to reduce anxiety across campus, inform impacted stakeholders that context will continue to matter and analytics will not result in witch hunts or blame games. If, for example, a regional admissions counselor underperforms against expected enrollment funnel targets, make it clear that this information will be paired with unanticipated contextual factors (such as a regional competitor introducing larger aid packages) and that any resulting action will be supportive (such as increased marketing support) rather than punitive.

Turjanica: There are several key elements that make cross-institutional communication successful. First, begin with a project sponsor who has authority. A great example is a president or CAO who sees the opportunities in implementing analytics and understands its benefits.

Second, get project sponsorship from members of various departments. There should be direction not only from the top but also from those on the front lines. This can be tricky, however, as the champion for a new initiative may not necessarily be a department head.

Third, provide frequent communication about what has been learned, what will be researched, and any benefits garnered. Removing what feels like finger-pointing in any initiative will pay dividends. It is crucial to indicate that if something is not working, it is not a person's or a department's failure. Data analysis can help find what is not working and give us an opportunity to address the issue.

Finally, be an advocate for this process. Stress how data analytics can help radically improve the way things are done. Often, analytics can highlight how ineffective traditional activities might be, enabling teams to remove less-popular and time-consuming tasks.

Aiming for Analytics Maturity

What additional recommendations or advice do you have for institutions working to increase their analytics maturity?

Clayton: In working with many institutions over the years, I have found at least five common attributes among high analytic performers. First, great analytic leaders embrace new technologies and let AI reduce bias in decision-making. When embedded in analytic processes, AI can bring many insights that would otherwise go unnoticed. Second, they create a data lab [https://blogs.oracle.com/bigdata/what-is-a-data-lab] for experimentation and discovery, which is essential to growing the analytical capabilities of the institution.

Third, they use storytelling and visualization; beautifying insights for greater comprehension is a core competency in leading institutions. Fourth, they connect plans with shared assumptions. Too often, analysis and planning are separate and not connected; great institutions link these processes to improve operational efficiency. Finally, and most importantly, they create a talent-development plan for data literacy. Analytic skills in the workforce need continuous nurturing, and those who have a plan to build their competencies in analytics will create better student and institutional outcomes.

Fritz: Rather than starting with a focus on infrastructure—which can result in daunting investments, murky value propositions, and challenges with executive buy-in—focus first on the questions you seek to answer and the value those answers will provide your institution. Questions might include: How do we better understand our prospects? How do we help students succeed? How do we deepen alumni connections? How do we more efficiently operate? These questions—and the ultimate mission-focused and financial outcomes—will help create momentum for change.

Further, while some institutions immediately invest in broad analytics transformations, an incremental approach lets institutions assess various functional areas to see which are likely to offer the greatest benefit and the most enthusiastic partners; they can then design and implement pilot projects and use the success of those efforts to increase interest across the institution.

It is also important to embrace the ethical debate about analytics and define the line between Can we? and Should we? Such a line is crucial in higher education given the age of many students and the overwhelming wealth of personalized data available across campus.

Finally, there is no "right" way to develop more mature analytics; you should learn from peers and solution providers, but ultimately set a course that is best for your own institution.

Turjanica: It can be difficult to get support from across an institution. Often there is a perception that it will be a herculean effort or that departments will lose control over "their" data. There is also a fear that findings may adversely impact individuals. Rather than focusing on the obstacles, teams should concentrate on the benefits and how much easier analytics has become over the past couple of years.

Cloud efficiencies have made it possible for even small institutions to get a big impact from their data. No longer does everything have to be on-premises and managed by in-house resources. Further, it no longer takes hours to run jobs to correctly format data for analysis; it can now be done in minutes.

Finally, it's important for institutions to assign champions and increase communications and training opportunities on campus. Institutions can benefit from putting someone in charge of analytics—not reporting, not IT, but analytics. By creating a position or truly carving out time from a person's commitments to make this a priority, an institution can have an enormous impact on increasing analytics maturity.


Andrew Clark is Enterprise IT Program Manager at EDUCAUSE.

Rich Clayton is Vice President of Analytics Product Management at Oracle.

Peter Fritz is Senior Manager, Higher Education, at Deloitte Consulting.

Meghan Turjanica is Product Manager, Analytics and Student Success Solutions, at Jenzabar.

© 2019 Andrew Clark, Rich Clayton, Peter Fritz, and Meghan Turjanica.