Empowering Institutions with Data

EDUCAUSE Shop Talk | Season 2, Episode 6

In this episode, hosts Sophie and Jenay are joined by guests Chad Marchong and Ben Hellar to discuss cultural strategies and practical steps for transforming higher education institutions into data-empowered organizations.

 


Takeaways from this episode:

  • Data-empowered decision-making goes beyond data-informed and data-driven decision-making to consider the full context of a situation.
  • Collaboration and communication across departments are key to maximizing the impact of data and analytics in student success, learning outcomes, and more, especially at large institutions with siloed communications.
  • Artificial intelligence (AI) is changing the conversation around data and analytics, and there are differing opinions on what the future of data and AI will bring and how effective AI will be in combining disparate datasets for empowering decisions.

Transcript

Sophie White: Hello everyone and welcome to EDUCAUSE Shop Talk. I'm Sophie White. I am a content marketing and program manager with EDUCAUSE and one of the hosts for today's show.

Jenay Robert: And I'm Jenay Robert. I'm a senior researcher at EDUCAUSE, and I'm the other host for today's show.

Sophie White: Great. So today we'll be talking about data-empowered institutions. We're really excited to have two special guests on the show today, Ben Hellar and Chad Marchong. I'll introduce them and then we will kick off our discussion. Chad Marchong is the Associate Director of Learning Analytics and Research at Georgia State University. His team supports academic initiatives that use analytics to inform effective teaching and learning practices, and also supports institutional researchers with projects focused on teaching and learning. His team has transformed, analyzed, and visualized data, resulting in actionable initiatives in teaching and learning. He's interested in how data can enhance student learning and in its ethical use in education. That's so important. I'm excited to dive into that more. And Benjamin Hellar is our other guest. He is the manager of the Data Empowered Learning team, a mission-focused data science and learning analytics unit of central IT at Penn State University. Ben provides strategic and operational leadership of learning analytics insights that inform and empower the institution's student success initiatives. So thanks, Ben and Chad, for being on the show today.

Ben Hellar: Thanks for having us.

Chad Marchong: Thank you for having us.

Sophie White: Great. So data is a big topic in higher education, and really in everything related to technology these days. So we're talking about different ways that data can empower decision-making. The data-empowered institution was actually the number one issue in the annual Top 10 report that EDUCAUSE puts out, which identifies the top 10 IT issues for higher education institutions. So data is a big deal as it relates to institutional missions and the direction we're going in higher education. I'm curious: can you two just talk about why you're excited to work with data at your institutions and what kind of projects you're working on? Just a little bit about yourselves and your interest in data and analytics.

Chad Marchong: Sure. I guess I can go ahead and kick that off for us. So I've been working with data in this space probably about five to six years, maybe going on seven. I'm originally from an IT background, working in application development and doing developer sort of work. I got into the teaching and learning space to support a transition from our previous learning management system to our current one, which is Brightspace by D2L. In doing that work, I learned a lot about how the back end of a learning management system works, how to provision that system, and how to start getting some data out of it. And then I was tagged to lead up the new learning analytics team, based on the background that I had and the limited exposure that I'd had to data.

So there was a learning curve for me around the use of data and how to coordinate that with the work we're doing with our learning management system. When I first started the work, our institution was already known for student success and the use of data in that student success process: getting students to graduate, increasing our retention rates, increasing grades, those sorts of things. So it just felt like a complement to the work we were already doing around student success to start digging into the data that exists in our learning management system, so that we can surface that data, provide it to our student success folks, and they can use it to be more informed about how they're supporting students moving forward. But also, I work within our Center for Teaching and Learning.

So we also surface a lot of that data to help inform our faculty in their teaching. They can use some of that data to think about how they'd like to redesign their courses, identify assessments that may or may not be working well and redesign those assessments, and look at courses from a longitudinal perspective to understand whether things are actually working inside their course and what needs to change. And that started to shift for us as well. While we were looking at some of that data, faculty also started to ask us questions around their research. They started to embed research and experiments into their courses, and they needed us to surface a lot of that data too. So our team created a process that allows us to go in and help surface that data, and we also created some wraparound services where we're initiating research with our faculty and helping them all the way to the dissemination point: identifying their research questions, setting up their research, extracting the data, doing an analysis, and then helping them get to dissemination so they can publish that work.

Sophie White: Thanks so much for sharing, and I love that kind of two-way communication: you giving data to the faculty to improve their teaching and learning practices, and then them coming to you to help analyze the data that they're working with in the classroom too. So thanks for sharing. Ben, do you want to chime in at all about Penn State?

Ben Hellar: Yeah, sure. So my story is that before I joined Penn State around 2018, I was coming from a user experience design and engineering background. I have a lot of experience working with analysts and subject matter experts in a variety of different fields to model and design interfaces that help drive them to decision-making. When I started at Penn State, one of the things I started doing was working with our data scientist at the time. Back then, long before AI was the thing it is now, we were just working with machine learning models, trying to figure out how we could mine all the various data insights that are available in all of these institutional sources and deliver them to the people making decisions that impact student success.

So one of my interests, given my background, is that a lot of institutional research is about pushing dashboards and reports upward to executives and leadership, while what we're trying to do is help student success, help retain students. Retention happens now, and it happens on the front lines with academic advisors and retention specialists and support staff and faculty. Those are the people making lots of decisions every day that impact how a student feels in terms of belonging and how well they're progressing in their career. So I took that user experience background and started talking with faculty and academic advisors about the types of decisions they're making, then started working with a data scientist to help model data that fits their specific use cases. One of our original success stories was during the pandemic: when the university shut down and everyone went online, there was a question about how we know which students are still engaging with their materials.

Every faculty member had to put their course on Canvas, but how do we actually measure who's still participating? That becomes really challenging when you start to think about the amount of engagement data at the scale of Penn State. We have 44,000 people at the University Park campus alone; double that when you add in the Commonwealth Campuses and the World Campus. Then add in all of those interactions, and we're talking about a huge dataset. It really demanded a data science perspective to figure out how we could sift through all of that data. That's not something you're going to load into a Power BI dashboard or an Excel spreadsheet. So how do you go through that data and create a model that helps answer the question of who is engaging in a class, when no two Canvas courses look alike?

The diversity in how courses are delivered at an institution is massive, especially at one like Penn State that is very much about academic freedom, which it should be. That's just to say there are a lot of different ways in which a course can be set up, without the hundred-percent consistency of a K-12 environment where there are going to be these specific metrics, et cetera. So, to step forward just a little bit, we ended up creating an analytic that serves as a comparative metric: how much activity is a student doing in a class in comparison to their peers? Then we're able to shape that and put it in front of academic advisors and faculty to help inform those decisions.
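
Neither guest walks through implementation details on the show, but as a rough illustration of the kind of peer-comparative metric Ben describes, here is a minimal sketch in Python with pandas. The column names, the event counts, and the use of a per-course z-score are illustrative assumptions, not Penn State's actual method.

```python
import pandas as pd

# One row per (course_id, student_id): count of LMS events for the week.
# Values and identifiers are invented for illustration.
events = pd.DataFrame({
    "course_id": ["BIO101", "BIO101", "BIO101", "HIST210", "HIST210"],
    "student_id": ["s1", "s2", "s3", "s4", "s5"],
    "activity_count": [42, 7, 55, 12, 15],
})

# Compare each student's activity to classmates' via a per-course z-score,
# so the metric adapts to however each course happens to be set up.
grouped = events.groupby("course_id")["activity_count"]
events["peer_mean"] = grouped.transform("mean")
events["peer_std"] = grouped.transform("std").fillna(1.0).replace(0.0, 1.0)
events["engagement_z"] = (events["activity_count"] - events["peer_mean"]) / events["peer_std"]

# Surface students well below their course peers for advisor follow-up.
flagged = events.loc[events["engagement_z"] < -1.0,
                     ["course_id", "student_id", "engagement_z"]]
print(flagged)
```

The appeal of a relative metric like this is exactly what Ben notes: raw activity counts mean different things in differently built courses, whereas "compared to classmates" travels across course designs.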

That's one of the places where our team started, and one of the challenges we've had has been getting through the data literacy piece: the training, helping people better understand data. We've come a long way in the four years since we released that tool. There was a lot of apprehension, a lot of uncertainty, especially introducing this new data point and having to teach people how to read it and how to operationalize it. And then also filling in all the other questions they have that help provide context to that data. It's not just what a student's activity level is; we also have to wrap around it, well, what's the course? What's the mode of the course? Is it a remote course? Is it in person? What are all the other factors and data points that provide insight and context to a faculty member or an advisor who needs to make a decision about a specific student's situation? So I'll just pause there before I keep rambling.

Jenay Robert: I'm familiar with that problem. As a researcher, I always tell people, be careful about asking me questions; I may or may not ramble, as a researcher does. Well, something I'd like to chat about a little bit is that people aspect. You both brought it up in describing what you do and where you sit at the institution. Ben, you are located in IT, so you report up to the CIO, right? And Chad, did you say you're located in a teaching and learning center?

Chad Marchong: I am located in the Center for Teaching and Learning. It's called the Center for Excellence in Teaching, Learning and Online Education.

Jenay Robert: And so then do you report up to the provost, or what's that reporting line?

Chad Marchong: Yeah, our department, our center reports up to the provost.

Jenay Robert: Okay. Yeah, that's really interesting, actually, just in terms of the dynamics of where you sit at the institution and the types of people you work with. To inform our audience a little bit, I used to work with Ben; I used to work at Teaching and Learning with Technology at Penn State, which is where Ben started off. I know there have been some shifts in structures since I left, but we used to be office buddies. So I kind of get that dynamic around where you report, where you sit at the institution, who you get to work with, and how it's maybe harder or easier to work with some stakeholders than others. So I'm curious about both of your experiences around that piece.

Chad Marchong: Got it. I think it comes down to relationships, if I can put it as simply as possible. I think there are always going to be wins; you're always going to have moments of celebration when you're working with folks. And I also think you're always going to be faced with challenges. Ben mentioned apprehensiveness, and when we work with faculty and with departments that see a lot of initiatives come through, a lot of "this is the new thing, this is the new hype," they tend to be apprehensive, particularly if it impacts metrics they're concerned about. For our faculty, if they're thinking about how students are going to rate them or evaluate them, they're a little more apprehensive about introducing something different into their workflow, into the way they're teaching.

If it's a department that's doing a lot of different initiatives, being asked to do one thing and then another, they may be apprehensive because they're looking at a growing list of projects, some more important than others. So I think it comes down to establishing a really good relationship with folks around campus. Our team has done that fairly well because, sitting in the Center for Teaching and Learning, we have to go out and engage with our IT department, our Office of Institutional Effectiveness, and different departments that are stakeholders in housing data as well as granting access to data. Our job is really to be peers with them, to be partners with them. If they have a question that we can help answer, we want to make sure we go and try to answer it for them.

We actually have more questions for them, because they are the creators of the data. We're always asking: Where does this data exist? Can you tell me a little more about this table? Why is this named this way? This table is missing a set of data; should we be looking at another one? Why does this data warehouse table look the way it does? So we do a lot more of the questioning than they do, but we always have to be mindful of creating relationships that can exist beyond these types of questions, so we can continue to have these conversations. I always look at it as, they're friends of our team, friends of the learning analytics team. So how do we keep our circle of friends from fragmenting while also growing it at the same time?

Ben Hellar: What you said resonated a lot with me. I often describe the work of our team as applied forensics, because a lot of times when you're trying to combine different data sources, there are a lot of business rules. Some are documented, and some are implicit, based on the way something was implemented ten years ago by a person who may not even be here anymore. So there's a lot of sussing out: Well, why does the business process work this way? And the way you get answers to those questions is exactly what Chad said: it's through relationships, through a lot of social capital, in terms of engineering and communicating with stakeholders. From an IT perspective, IT is our own stakeholder, which does help streamline a little bit of the data governance and access management, because we sit in IT and IT sort of owns a lot of the data. But still, I misspoke earlier when I said we own governance.

Governance is actually distributed across the enterprise. So one of the challenges, even if we have access to the data, is: Who is governing this data? Who controls the business logic about how this data is populated and how it's run, and how do we understand downstream impacts? We've learned so much about just the way the institution works from something that seems as mundane as cumulative GPA. When does that get calculated? When does it get updated? Is there historical tracking of cumulative GPA, and who can see that, who has access to it? As Chad said, there's not always a manual on some of this stuff. Sometimes you have to go back and forensically discover how it works so that you can communicate the insight downstream to an end user, so they can have some trust and understanding of what you're actually trying to present to them.

Chad Marchong: And I wanted to pick up on what Ben just mentioned. I'm sure his institution is very similar to mine, though some institutions aren't built the way we are. We have a lot of silos, so we have a lot of different departments that are creators of data and users of data as well. The cumulative GPA example Ben mentioned: that could be defined, not created but defined, in a particular department. Maybe someone in the provost's office, or the registrar, is the one defining these things. But there are others: there's housing, graduate services, undergraduate services, and these folks are all populating and creating data that then streams all the way up.

And sometimes it's our job to go and find out what this data is doing, or someone says, hey, I need you to analyze this data for us. Our responsibility then is to go back to the folks managing this data and say, all right, I see something here; can you please define what this particular data point means, so I can do a better analysis of what's being requested of me? That requires you to establish and build these relationships across campus, and not just within your department or the departments you visit or talk to frequently. You also want to do it with the departments you don't have conversations with as often.

Jenay Robert: Do you two have any concrete advice for people trying to develop some of those relationships? I think that can sometimes be a challenging aspect of working on a team like this. And again, thinking back to the days when I worked at an institution, you don't always know who's the person to talk to in a department, or even which department you need to talk to to solve a problem. You don't know what you don't know. So I'm curious; I feel like it would be really valuable insight if you have any tips for developing that network at the institution.

Chad Marchong: Before the pandemic, when my team, the learning analytics team, first started and we were beginning to accumulate relationships across campus, we used to have a Friday lunch and learn every month, and we would bring our new friends in from across campus. We would ask them if they wanted to contribute food; we would have drinks and we would have food. We'd spend about an hour or so with everyone just talking: What work do you do? What are you all responsible for? And you learn so much from these folks. You think they're good for one thing, that they know one thing, but they really know a lot more. They open the door to things you may not even know they work on on a day-to-day basis. But not only that, they can also connect you to other people they're working with on campus.

So a question can come up where you're like, oh, I'm trying to figure out how many students currently living in housing are Pell-eligible. And they will say, well, we don't know who lives on campus, but I can connect you with the housing person and the financial aid person so they can help you find the data you're looking for. And we house that in our data warehouse, but we don't know what the definitions are, so you can go and speak with them. I'll give you their contacts; in fact, I'll do the introduction for you. That helps with building those relationships with people in a way that allows you to have those conversations. What I've learned is that not everyone wants to hoard this work. Sometimes it may feel like everyone wants to keep their secrets to themselves, but a lot of them don't, because they're so overwhelmed with the amount of work they're doing that they're willing to let you in. Maybe you can help them with a process, help them find an answer, help them move their work forward. I've found that most people I work with who work with data are so open to having conversations and letting us in that they're happy to build these relationships with us.
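
To make the mechanics of that housing question concrete, here is a hypothetical sketch in Python. The table and column names are invented; the point is that the join itself is trivial once the data owners have told you where the data lives and what the fields mean.

```python
import pandas as pd

# Data owned by the housing office (names invented for illustration).
housing = pd.DataFrame({
    "student_id": ["s1", "s2", "s3"],
    "residence_hall": ["East", "East", "North"],
})

# Data owned by the financial aid office (names invented for illustration).
financial_aid = pd.DataFrame({
    "student_id": ["s1", "s3", "s4"],
    "pell_eligible": [True, False, True],
})

# The mechanics are a simple left join; the hard part, as Chad describes,
# is knowing which office owns each table and how each field is defined.
merged = housing.merge(financial_aid, on="student_id", how="left")
count = merged["pell_eligible"].eq(True).sum()
print(f"Pell-eligible students currently in housing: {count}")
```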

Ben Hellar: That's a great answer. I was going to say beer and pizza, but that's a longer way of saying that. The other thing I would tack on is that one of the things that's helped us is identifying champions for our work, both from a stakeholder standpoint and from an end-user standpoint. Let me tell you a little about what I mean. I mentioned before that we have a tool we developed for academic advising that helps identify student engagement. One of the critical growth points for our team, and for that analytic to be successful, was when the executive stakeholder, the dean of advising at Penn State, Dr. David Smith, saw our work during the pandemic, and he was the one who put forth: I want this in front of all advisors by the start of the fall semester.

Having executive sponsorship also helps the institution realize that you can drive strategy through data, through analytics. Being able to put data in front of people and suggest ways that data can be used can help shift the culture, and it makes data literacy almost a byproduct. The business goal of being able to support students becomes number one, and the fact that we're improving data literacy is a byproduct of people getting benefits out of working with the data. So my first piece of advice is finding a good executive stakeholder who can help push things forward. As we tackle new initiatives and projects, we do the same thing. We created another tool, Course Insights at Penn State, which is for instructors. It started as a pilot, and eventually we were able to get adoption for it through the Center for Faculty Development at Penn State, which is under the provost's office.

That gives us a connection, but also a way to support how faculty can use data to understand the populations of students in their course. So, as I said, one technique is definitely finding the executives. The other technique is finding the end users who can really articulate the problem. One of the things we've discovered, and just in my history of being in UX, is that people can often articulate the symptom, and sometimes they'll have very pointed opinions, like, I need this data point, but they don't always understand why. So it's about being able to ask why they need something, or to have them articulate what problem they're trying to solve: finding those key advisors or key faculty who really can articulate the problem. If we do that, then we can build a solution that not only helps them but helps others as well. Those key champions may not be anywhere in the hierarchy of leadership. They may just be a faculty member or an advisor, but they are some of the key people who can help break open a problem and help you innovate and solve real needs at the institution.

Sophie White: Yeah, go ahead Chad.

Chad Marchong: Sorry. Yeah, one thing Ben mentioned: I know I talked about the front end, and Ben complemented that with the executive stakeholder side of it. One thing we've found success with in working with our stakeholders is some of what Ben just mentioned, being able to answer their questions but also continuing to ask why. What do you need this for? What problem is this going to solve? If we presented you with the data you're asking for today, what will it help you do tomorrow? Will there be more data? There's always going to be more data they're looking for, but if we can get in front of that while we're having these conversations, we can go ahead and help them answer those questions. One thing I also talk about a lot with my team, when we work on data, do an analysis, and share it with others, is how we can anticipate their needs.

How can we listen to what they're saying, try to uncover what they're not saying to us, anticipate that, and then provide that data for them? That has worked for us on so many occasions where maybe we're not working with the executive, but because we've anticipated not only the stakeholder requester's questions but also what the executive is thinking, the person who asked us the question will go present it to the executive, and the executive's like, all right, this is great, and I need you to now do this for us. And then they're the ones in meetings saying, hey, we got this pretty cool thing from this team that's helping us answer these questions, and we'd like all of you to see it. That's how our work flourishes around the institution. So I think it's important not only to answer questions for folks but also to try to anticipate their needs before they can think about what they need. Because, and Ben can attest to this, there are always going to be more questions. It's like, hey, I need to know how many students are enrolled this semester. Can you break it down by campus? Can you tell me who's under blah, blah, blah? If you understand what they're all looking for and try to anticipate it, you can put that data in front of them before they ask the question, and that can help build the relationship, the rapport, and help get your work to other stakeholders.

Ben Hellar: When we were piloting some of our tools, Course Insights before it got released, my colleague and I were manually training everyone in a one-hour session on the tool before it got out the door. Eventually, from that, we created some asynchronous training. But I think the point is that we purposely went out and did talking tours, and we still do, for advising shops, because there's always some turnover in advisors and always new faculty coming in. So we do regular talking tours, especially at the beginning of the fall or spring semester, for various units that request them, to say, hey, here are the tools, here's how you can use them. Explaining it and providing context, not just dropping off an analytic or a dashboard and walking away.

We want to make sure people understand not just how it works but the value in what it can do. That both allows us to build those relationships and helps us learn a lot. We've learned so much and have added so many capabilities and enhancements because we'll go to some meeting and someone will tell us, you know what, it'd be really helpful if we could know the student's favorite flavor of ice cream so that we could reward them with it at the end of the semester. Just a completely hypothetical example. But again, the idea is that if there's a need for something, we often hear it and then wrap it into the next iteration of our tool, so we can say, hey, yeah, this is something that serves this need. And then we also have a documented use case that goes along with it.

Because one part of this learning analytics space we haven't touched on is that we do a lot of work with our privacy office and our FERPA office, because every time we're unveiling a new fact, a new piece of data, it's often a fact that hasn't necessarily been unearthed or put into the context in which we're delivering it to our audience. So we always have to go and validate with both our privacy office and our FERPA officer, who is the registrar. And again, there's relationship building there too, making sure we're taking stuff to them ahead of time so that we don't end up on a news site somewhere saying Penn State did something awful. Our goal is to avoid being a headline, or getting dragged by the faculty senate. We don't want that either.

Sophie White: Yes, one of the resources in the showcase associated with this issue is a new cybersecurity and privacy Horizon action plan that Jenay worked on, which has to do with building strong foundations for cybersecurity and privacy. It's also very tied into this. I also wanted to say I have never been to Penn State's campus, I lived in Philly for four years, but people talked about the ice cream there all the time. So I feel like that is a relevant data point.

Ben Hellar: Yes, yes it is. But it gives you the idea that if people are requesting that, that's the type of data point where, later on, executives could apply a strategy that says, you know what, we've decided we want to reward X population of students with ice cream if they do well on these tests. That's a way you can turn a data point into strategy, as long as leadership is willing to provide resources for it. I think a more practical example would be identifying first-generation students. We have the ability to identify the students at Penn State whose parents have never graduated with a bachelor's degree, so this is their first time, and they may need additional understanding of how to be successful in the college environment. By pushing those facts to advisors and instructors, instructors can start to diversify their instruction, and advisors can consider resources that help students get acclimated to a college environment.
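
As a concrete illustration of deriving a fact like that, here is a minimal Python sketch. The schema and the education-level coding are assumptions for illustration; a real definition of "first generation" would come from the registrar or financial aid office, not the analytics team.

```python
import pandas as pd

# Hypothetical parent-education records; field names and coding are
# illustrative assumptions, not an actual institutional schema.
EDUCATION_RANK = {"high_school": 1, "associate": 2, "bachelor": 3, "graduate": 4}

parents = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s3"],
    "parent_education": ["high_school", "associate", "bachelor", "graduate"],
})

# Highest level of education completed by any parent of each student.
rank = parents["parent_education"].map(EDUCATION_RANK)
highest = rank.groupby(parents["student_id"]).max()

# First generation: no parent has completed a bachelor's degree.
first_gen = (highest < EDUCATION_RANK["bachelor"]).rename("first_generation")
print(first_gen)  # s1: True, s2: False, s3: False
```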

Sophie White: I love that. And I also love what you were saying earlier about data literacy among faculty and staff and students even being a byproduct of this relationship-building work. As you work with people, you can give these data points to the advisors, but making sure you have the relationships to explain how to use them effectively, to actually support the mission of the institution and the students, is so impactful too.

Jenay Robert: That data literacy piece was part of that action plan as well. The panelists identified it as a really important component: any data initiative really needs to be thinking about the data literacies of the stakeholders involved. And I think it's so poignant because, even though Ben was joking about the ice cream thing, you do get interesting, I'll say interesting, requests from end users. They may not be as silly as ice cream, but there are sometimes things people ask for where they don't quite know what they're trying to get at. And that was Chad's point about what are you saying, what are you not saying. That's where the expertise of the data professional is so important: not just, I know how to crunch numbers, but, I know how to translate that into human language. Really important.

Chad Marchong: Yeah, we're actually working on a data literacy initiative on campus right now, and what's interesting about data literacy is that it's defined differently by different people. I didn't realize that until I started talking to different people on campus about the initiative. Some of it can be as simple as doing some work within Excel: What's an average? What's a mean? What's a grand mean? Those sorts of things. It ranges up to, how do I visualize continuous variables or categorical variables, or how do I show data over time? We did a consolidation about eight years ago with a two-year institution, and a lot of that data is mangled, and all the people who worked on it have retired. What does all of this stuff mean, and why did we do it that way eight years ago? And it goes to some of the points Ben was talking about: if you create a product, let's say a data dashboard showing how students are performing over time, it could have come in as a request from someone saying it would be great if we just had a dashboard that showed how students performed in their course.

And then you finally get it out to people and they're like, what am I supposed to do with this? Right? That's another form of data literacy, where it's like, all right, now I have to define all the variables you see in this dashboard. But not only that, I also need to give you some example use cases that you can take with you as you look at this data and think about how to apply it to the way you're teaching your course. Or if you're getting some data over to an advisor, and we've definitely gotten data over to advisors, and there are hundreds of them, they're just like, I don't know, this is just another data point I'm looking at. So it's: How do you get them up to speed on not only how to use the data, but what the data means?

They can find creative ways of using it, but if they don't know what it means and what its purpose is, then it's just another data point sitting there that they will probably never access. And we've spent a lot of time putting that in front of them and developing something we thought could be useful. So I think that's another piece of data literacy: all right, maybe we need to understand what two plus two is, but we also need to understand, once you get that result, what are some of the options? What are some of the things available to you that you could then begin to apply?

Jenay Robert: I love that such an important piece of the conversation is how we're going to use these insights. We say this so often in educational technology: just because we can do something doesn't mean we should, and that is so relevant to data and analytics. Just because we can collect a certain data point doesn't mean we should. Just because we can predict a certain outcome doesn't mean we should. What's the value we're actually getting from doing that? What's the value to the end user? What's the value the student is actually getting from it? You have to start from that, I think.

Chad Marchong: Sorry, Ben. We've spent a lot of time developing dashboards, and we spend a lot of time creating dashboards that never get looked at. A stakeholder would come in and say, my entire department is going to use this, and we need you to give access to everyone in our department. And then two months later you look, and the only person who looked at it is the stakeholder, and they looked at it the day you gave it to them. So we have a lot of those projects, and before we take on a project now, we always ask ourselves first: if this stakeholder is asking for something, can it be relevant beyond this stakeholder? Is this something that more than one person or more than one department can use? And if it's not, how can we create something that can go beyond the stakeholder?

Maybe it answers all the stakeholder's questions, but can it also go beyond what they're doing, and can we release it to a larger audience than just the stakeholder? That has been our constant question. I mean, we're still trying to figure out the perfect recipe, because there are definitely dashboards and reports we create that get used maybe once or twice a semester. But we also need to be honest with ourselves: Is it only supposed to be used once or twice a semester? Has this report or dashboard saved this person a copious amount of time answering their questions? Or maybe they never had this question answered in the first place and they've always wondered about it, and now they can access it once or twice a semester in a way that helps them do their job better.

Sophie White: I'm curious on that note, how do you decide whether a project or a dashboard is worth the effort? Do you have key performance indicators that you're measuring or some kind of criteria or checklist that you go through before fulfilling a stakeholder request for something like that?

Chad Marchong: I think for us it's been, what is the best output for our stakeholder? We start by thinking about whether we should build a dashboard or just get them a static report. If we know they're going to look at the data constantly, then we move more toward a dashboard. But if we know it's a very specific question they're answering, then we build something that simply answers those questions, so we don't spend a lot of cycles on it. If we know it's going to answer more questions and go beyond that, then we start thinking about how to build a more intricate dashboard that brings in data from different places. For our team, it's hard to say no to someone who comes asking, particularly a person within the institution, so it becomes, how can we create something that's going to be very, very relevant for them? And we look at our work as an iterative process as well. It isn't just, okay, we're done and moving on. It's, all right, let's put the lowest workable version out there that answers their questions, see if they have more questions for us, and then continue to iterate and build on top of that.

Ben Hellar: At Penn State, we're a little different just because we have a separate institutional research group. That group is focused on institutional reporting and creating dashboards for public news stories and for accreditation, while our team is focused on learning analytics applications that scale to the entire institution. So we delineate it this way: if the question is historical in nature or is for research purposes, that's something we hand over to the institutional research side of the house. But if it's a question about a student or a course that helps people understand the current state of what's happening at the institution right now, not at the end of the semester, not before the semester starts, but what's happening today, those are the analytics we bear down on and engage with, to understand the types of questions where we can help faculty and students, basically anyone below the C level of the institution.

Because, as you said, the executives and the stakeholders are often already satisfied by a lot of the reporting that's already being done. And what we find, as Chad found, is that some of those reports don't make it down even one rung lower, let alone to people at the bottom rung. That's why we became the Data Empowered Learning team: How do we empower those at the various levels of the institution with data so that they can be informed and make, with confidence, the decisions they need to make to support students?

Sophie White: Awesome. Thanks for sharing that.

Ben Hellar: Yeah. I will say one other thing that's funny, because we're talking about data empowered. I always got the question of the difference between data-driven, data-informed, and data-empowered. The way I've always described it, and I don't know if this resonates, but I wanted to try it out here, is that data-driven is looking at that indicator in your car that says you are low on gas. The data-driven response says, oh, I see the low fuel indicator turn on; I need to go to a gas station right now. A data-informed approach is, I see that fuel indicator and I think, you know what, given my experience driving this car, I know I have 25 miles before I need to stop. But a data-empowered approach takes not just the data but all the other things you may know: I see that light come on, but you know what, this is a rental car, so that's not going to be my problem. I'm just going to turn it in before I have to refuel, and I'll pay the fuel cost to Enterprise or wherever I'm renting the car from. It's taking the extra knowledge you have about the situation that isn't in any database or data system, and applying it to make an informed decision.

Chad Marchong: I think that's a great analogy, Ben, because when I think about data-informed and I think about our institution, I think about all the resources we have available to us and how often they get access to the data my team is creating and producing. Too often I'm having conversations with a stakeholder and telling them, maybe we should include this particular department in this conversation. Maybe we should bring this other department in. Maybe we need this sponsor to come in so they can increase the visibility of the work we're doing. And I feel that once you start to bring in those partners and build that collaboration, you get a better sense of how to move forward with the results of your data and how others can benefit from the data points you're sharing with them. They can be part of the process of developing that strategy. Instead of working by yourself, you can bring in those partners who say, well, we just introduced this particular technology, a tool that can help students do this; why don't we include that in the work you're doing? And then we can measure it. That becomes a point of measurement for you as well, to see whether this particular intervention is going to work, or whether it's not and we need to go in a different direction.

Ben Hellar: We apply that approach too. It's something we emphasize when training advisors and faculty on our tools: we're going to give you a variety of data points, and this is going to paint a picture of what's happening with the student in a course, but it may not always be the absolute truth. You have to take into account what else you know about a student that's not in a system. The student may have told you that their grandmother passed away, so you're going to see a variety of leading indicators of performance and activity decrease, and that's probably natural given what's happening. But it also means you can approach the student and suggest resources for counseling, or as an instructor, you can sometimes offer easements on grading or defer grades to different points in order to help the student. So you can see the indicator, but you also take what you know that's not going to be in any system to figure out the best approach.

Chad Marchong: I think that's a great point, Ben, because I always say that every data point has a story; every data point tells a story. I remember showing data to faculty around how students were performing over a number of semesters, and there was a particular data point for one faculty member whose students weren't performing as well. Before we shared that data, I was actually showing it to some administrators, and they were like, what is going on with this data? Why are students not performing as well in that semester? That's just strange. And then when you finally show it to that person's peers and colleagues, they immediately say, well, that faculty member had some challenges that semester. So it's like, all right, we need to figure out what these stories are before we go and start making decisions. It could simply be a blip in what's going on in that person's time at the institution, or it could be something we need to begin to address. But before we start thinking about solutions, we really need to drill into what's happening from the people perspective. And I like to think about it from the quantitative side: our team does a lot of quantitative work. We do qualitative work around surveys and interviews and things like that, but it's mostly quantitative. So how do we get to the qualitative side of what's happening inside that data?

Jenay Robert: My qualitative researcher heart is singing; thank you for the shout-out to the qualitative, and it's all linked together too. Going back to our discussion about data literacies, that is also about bringing in the story and the qualitative and the meaning. So, a really good circle-back there. But we're coming close to the end of our time, and I would fail at my job if I didn't start talking a little bit about AI here. I know we have somehow managed to not focus on AI for almost 52 minutes, and that might be a record for Shop Talk or for any professional meeting I've had in the last three years, but we should touch on it a little bit. I know part of this showcase is the AI landscape study, which I coauthored with Mark McCormack, and it is a little bit of an obsession for me. I'm not sorry about that; it is what it is. So I just briefly wanted to touch on this and ask: What are some of the developments happening in the way stakeholders are interacting with you? Are there new expectations? Are there exciting new capabilities that people are now on board with? Is there even more of that data literacy work, like, okay, we need to talk about what AI is and what it isn't? So what's going on for you all?

Ben Hellar: Oh man. Okay. Yeah, I mean, I think some of this isn't necessarily my story to tell, because there's definitely been a lot of work at Penn State on AI governance, but that work hasn't necessarily been through our shop. I will say that in my mind there are actually two types of AI governance. There's AI governance that looks at the use cases, the applications: What are the ethical standards for how we're using it in teaching and learning, how we're using it to monitor or detect cheating, et cetera? That's governance around the applications of it, and I think that's vitally important; I know there's a lot of good, meaningful work happening in concert with our provost's office. But the other part, which gets into my role, is more like AI data governance.

If we have these various models, how do we determine what data they should have access to at the institution? Some of that begs a more basic problem. I was actually telling Chad before we even started this podcast, when we were introducing ourselves, that I'm not worried about AI taking my job anytime soon, because, just like Chad, we don't have connected data systems. There are lots of little silos of databases all across the enterprise, and if those systems can't talk to each other with a person involved, there's no hope that AI can magically connect things together.

We still have to get data ready to be fed into AI, and then provide a governance layer on top of it where we can say, what are the use cases these LLMs and chatbots and other things are being used for? Then we can determine the scope of data they should have access to. That's the AI data governance piece. Obviously there's a link between the data governance and the application and the use case, but I think they're two components, like a Venn diagram that overlaps in the middle.
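
The show doesn't describe Penn State's actual implementation, but the "governance layer" idea Ben mentions can be sketched as a deny-by-default allow-list mapping each approved AI use case to the data domains it may read. A hypothetical Python sketch, with all names invented:

```python
from dataclasses import dataclass, field

@dataclass
class AIUseCase:
    """An approved AI application and the data domains it is governed for."""
    name: str
    allowed_domains: set = field(default_factory=set)

# Hypothetical registry of approved use cases (names invented).
REGISTRY = {
    "advising_chatbot": AIUseCase("advising_chatbot",
                                  {"course_catalog", "degree_requirements"}),
    "engagement_summary": AIUseCase("engagement_summary", {"lms_activity"}),
}

def authorize(use_case_name: str, requested_domains: set) -> set:
    """Let a model read only the domains its use case is scoped to."""
    case = REGISTRY.get(use_case_name)
    if case is None:
        raise PermissionError(f"Unregistered AI use case: {use_case_name}")
    denied = requested_domains - case.allowed_domains
    if denied:
        # Deny by default: scope creep is refused, never silently granted.
        raise PermissionError(f"{use_case_name} may not access: {sorted(denied)}")
    return requested_domains

authorize("advising_chatbot", {"course_catalog"})           # permitted
# authorize("advising_chatbot", {"financial_aid_records"})  # raises PermissionError
```

The design choice here mirrors Ben's point: the question is not whether a model could read a table, but whether the use case has been governed to read it.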

Chad Marchong: So I have agreed with everything Ben has said 99%, 99.9% of the time since we've been on this Shop Talk, except for the part about AI figuring things out. I am under the belief that if you put two datasets in front of AI that are just opposites of one another, it will figure out a way to make them relatable and be able to answer your questions. The tools right now may not be advanced enough to do it the way you need, but I see the rapid growth in AI, the expansion of its capabilities, and I can potentially see that happening sometime soon. Maybe there's an AI that will bring all of those dispersed datasets together one day, Ben, and save our lives.

I'm still waiting for the day when we have a dashboard or a dataset and, with AI, you can just ask it a question, and it will start to give you responses, or maybe feed the response onto a dashboard so folks can have their everyday KPIs show up exactly when they need them. I think that would be a true advancement of where we're going with AI: the exact, specific data they're looking for is in front of them when they need it. But I think we need to back up, and Ben mentioned a lot of this already: What's the governance around where this data is going? Who has access to it? And what is the right data to feed into AI?

You talk to some folks and they'd say, let's give them all the data, but we still have FERPA to consider around student privacy, and we have to make sure these AI systems and technologies don't take the data and feed it to other models, or to their own models, to build on top of them. We're seeing some models that can exist within your environment only, where nothing goes out or comes back in. Those are some opportunities for us to start looking at, but we certainly want to make sure they're secure enough for us to start building some good AI models on top of. And I think once we get to that point, the institution can become even more data-empowered. Rather than us spending time building a dashboard, hopefully a lot of that data exists within the institution, hopefully secured, so that rather than looking at bars and charts, we're just asking questions and getting responses about the specific data we're looking at. But it's going to require a lot of work. And Ben mentioned that: making sure these systems start to talk to each other, that the data governance is there, that the technology is there, and that it's not giving us false information about what the data is telling us.

Ben Hellar: And to me, that's why, as I said, I'm not worried about it, right? There will probably be some evolutions in how data is able to connect, but to me it's a question of accountability. I'm coming from the perspective of the student success space, and having an AI try to recommend a course for a student gives me heart palpitations: thinking about who would be accountable if it suggests a course that's not part of the student's career path, or misses prerequisite data because it didn't exist in a system, or doesn't know the history of how this professor taught the course, because, again, that's not in any dataset. Right now, if an advisor or a staff member at an institution makes a horrific mistake, there are penalties; there are disciplinary actions that can be taken if something goes horribly wrong. If an AI does that and a student goes down a horribly wrong path, are we going to hold Microsoft accountable for Copilot? That's not going to happen. So how do we govern to make sure those use cases are protected and scoped appropriately, so that AI is only getting the questions we actually want it to answer, and, importantly, will answer them correctly and never incorrectly? That's the challenge.

Sophie White: Truly the challenge. I think all of the use cases of AI we're looking at need guidelines in place to keep them within boundaries, and we're still establishing that within teaching and learning, but also across the institution, as we look at how to use it, what type of data it has access to, all of these really important considerations. So thank you all for indulging us on that question; this has been a fascinating conversation. I think we're at the end of our time, but I learned so much from you all about how you're looking at data at your institutions, working with stakeholders, and then this AI conversation at the end. We'll have to follow up in a year and see if we're all on the same page with that. But thank you for your time, Chad and Ben, and thanks, Jenay, for being a fantastic co-host, as always.

Jenay Robert: You too, Sophie. Thanks Ben and Chad.

This episode features:

Chad Marchong
Associate Director of Learning Analytics and Research
Georgia State University

Benjamin Hellar
Manager of Data Empowered Learning
Penn State University

Jenay Robert
Senior Researcher
EDUCAUSE

Sophie White
Content Marketing and Program Manager
EDUCAUSE