Smaller institutions are finding creative, values-driven ways to explore artificial intelligence tools. This conversation unpacks how grassroots experimentation can guide responsible innovation.
Jack Suess: Welcome to the EDUCAUSE Integrative CIO Podcast. I'm Jack Suess, vice president of IT and CIO at the University of Maryland Baltimore County.
Cynthia Golden: And I'm Cynthia Golden. Each episode, we welcome a guest from in or around higher education technology as we talk about repositioning or reinforcing the role of IT leadership as an integral strategic partner in support of the institutional mission.
Cynthia Golden: Hey everyone. Thank you for joining us today on the Integrative CIO. I'm Cynthia Golden, and I'm here with my colleague Jack Suess. Hi Jack.
Jack Suess: Hi Cynthia.
Cynthia Golden: And we'd like to welcome you to the podcast. Today we're talking to our colleagues: Dave Weil, vice president for IT and Analytics at Ithaca College, and Jill Forrester, VP for Information and Technology Services and CIO at Dickinson College. Hi everyone. Welcome, Dave and Jill.
Jack Suess: Hi.
David Weil: Hi Cynthia. Hi Jack.
Jack Suess: Hi Dave. Yeah, and Jill, we saw each other last week.
Cynthia Golden: So today we've invited Dave and Jill, who are two prominent leaders in higher ed IT, and they've been doing some thinking and writing and implementing of AI, artificial intelligence, on their campuses. So we have a lot to talk about, but let's do some introductions first. Jill, would you share a little bit about your background and how you came to your current role?
Jill Forrester: Yeah, I'd be happy to. So I have been at Dickinson since 2002, so that is 23 years now, which is really hard to believe. But I've been really fortunate to grow my career at Dickinson, which I know is not something that everybody gets to do, but it is something that Dave and I both have in common. But before I even came to Dickinson, my background, I studied math and computer science and have a master's in mathematics from Clemson University and right after graduate school, started as a software developer and loved it. I loved building software. I actually worked for Clemson for a few years and then moved to Duke Energy where I worked at a nuclear power plant designing software for the nuclear plant operators and then shifted to higher ed. So talk about a big shift going from a lot of controls in place for anything you develop to a higher ed environment where things are a little looser. But I started in our admissions office as a systems analyst, moved into enterprise systems where I was the director for quite a long time, and then moved into an interim CIO role and then in 2022 moved permanently into the CIO position.
Cynthia Golden: And Dave, how about sharing a little bit about your background?
David Weil: Sure. So, like Jill, I've been here for a long time. I actually started as a student back in the late eighties. My second semester freshman year I joined the academic computing services unit as a student consultant. And now 30-plus years later, I'm the vice president for IT and analytics. Obviously it's not that simple; there were a lot of little things along the way, but I've been very fortunate to be at an institution that has grown as I've grown. I started before the internet and before a lot of these technologies were there. And I've always been fascinated by the intersection of technology and work: how can technology really help people do their jobs and their learning better? And I think at Ithaca it's a really good fit because we are all about the student experience. Same at Dickinson. So it's really, how do we leverage the tools that we have at our disposal to help our communities do a much better job at what they do?
Cynthia Golden: Well, thank you.
Jack Suess: Well, I'm thrilled to be talking to both of you. And Jill, I share with you that I was a math and computer science major. And Dave, I share with you that I was a student employee at my current institution. So wow, we're going to have a lot of fun together talking as we go through this. So last fall you published a terrific article called A Roadmap for Leveraging AI at a Smaller Institution. And this is a really great article, and I encourage all of our listeners who are getting ready to be thinking about AI to take a look at it, because you really walk people through a lot of the thinking that you need to do. But can you talk a little bit about what prompted you to write the article and how your thinking has emerged? And would you do anything different now that you're six or nine months past the article? This is a field that's changing almost weekly, even daily. Dave?
David Weil: Sure. So Jill and I connected a couple of years ago and just started talking on a regular basis about the things that are happening at each of our institutions. And during one of those regular calls, we started really talking about AI and the things that we were doing at our institutions, and we were learning from one another. And we said, we think there are probably others at similar-size institutions. Both of us are from institutions of less than 5,000 students, and I think that actually makes a difference. I really think the large institutions have additional resources and more formal avenues for exploring AI. And Jill and I were really trying to say, well, how can we do it at institutions of our size? And that sort of led to developing a roadmap: hey, we could try this, we could try that. And one thing led to another and we said, well, hey, let's write an article about that.
Jill Forrester: And I'll add to that, certainly we were seeing similar challenges at our institutions. I think too, as we were talking to other CIOs at small schools, we were hearing similar themes from them. But I think too, selfishly, Dave and I kind of wanted to build this roadmap so that we could use it as well. We don't have all the answers. And I think that's the interesting thing about how AI has really come to the forefront so rapidly: many of us who are leading IT organizations are having to run alongside our campus partners instead of a few steps ahead of them. And so it's a bit of a challenge. And so selfishly, I think we both wanted to create this roadmap to also help us guide our institutions along this journey.
Jack Suess: Yeah. Well, I'm going to say that I think that you're overestimating how far ahead larger institutions may very well be. And from reading your article, I know it's useful certainly to mid-size institutions like UMBC, which is about 15,000 and I think even for larger institutions, because I think all of us are struggling and we're making an assumption that everyone else is so far ahead, but when you really sort of get down to it, we're all sort of trying to work through this ourselves back and forth. Anyway, it's a great article.
Cynthia Golden: I agree with your assessment, Jack. I really enjoyed reading it. And in the article you guys talked about some activities that can help institutions better understand AI and kind of lay a foundation for them to be able to use it effectively. Could you describe a few of those? I think our audience would be interested in hearing that.
David Weil: Yeah, I think Jill and I took slightly different paths, and that's what was so great; we learned from one another. Jill started with a presidential working group, and I was focused a little bit more on some of the development stuff. So Jill, why don't you talk about the presidential working group.
Jill Forrester: Right, yeah. We wanted to bring together a group of individuals that really represented all areas of the college. So we wanted faculty involved, students involved, and the different areas of our administration, where we could come together and really think about how we could help Dickinson move forward with AI initiatives. We could help the community learn about the value of using AI, but also recognize that we weren't approaching AI as a replacement, but as a partner. And we wanted to make sure anything that we were doing was in line with the mission and values that Dickinson held. And so having that group has really helped us, and we divided into little subgroups so we could tackle different issues, such as reviewing policies and making a recommendation to the campus on, one, do we need a policy around AI, and if we do, what should be included in that policy?
Jill Forrester: The working group has also thought about how we engage our campus partners in learning about AI and beginning to use it. And so one of the things, and that's actually one of our foundational activities, is how do you create learning opportunities for the campus community? And that's nothing new. Those of us who've been in technology, we always have to be cultivating our campus community and helping them learn about new technologies. And it's often hard. It's hard to get people to show up to a workshop or to training. And one of the things the working group came up with that's been so successful, and I love it, is instead of doing pop-ups where you say, hey, we're going to have a workshop, come and join us, we do pop-ins. So we show up, with the division head's approval, at a full division meeting, and we do an AI workshop.
Jill Forrester: Our president invited me to do an AI workshop with our cabinet, and we spent an hour and a half working through hands-on activities with AI. And then we've done that with several divisions on campus. And it's been great because it's been eye-opening for members of the community to see the potential of what you can do. And it also is an environment in which people feel safe expressing concerns that they might have about bias or ethical use of AI and when they should use it, data privacy concerns. And so doing the pop-ins has been wildly successful here at Dickinson. So Dave, I'll let you talk about some of the stuff you've been doing at Ithaca.
David Weil: Right. So we learned a lot from what Jill was doing, and here we really were leveraging the fact that we are a merged organization, we're IT and analytics, and that gives us insights into the data, and AI feeds on data. And so I looked to my analytics team and I said, experiment, play with the data and the questions that are being asked. And so they took that approach and started creating some interesting pilot projects where they were using institutional data to help answer problems or respond to needs that were out there. And that really allowed us to get a better understanding of how the tools worked and where the potential successes could be. And that led to one of our first AI applications that we deployed, which we called Nebula, which helps with some of our students who are struggling a little bit here. So really, at the end of the day, I think Jill and I came at it slightly differently, but we saw that our approaches were complementary to one another, and I think that really helped form the basis for the article that we wrote.
Cynthia Golden: I love the pop-ins idea. I think that's great.
Jill Forrester: But I think our different approaches matter too, and you'll see this in the article: we don't just give one direct approach, because every institution has its own culture, and you have to find the right way to work with the culture of your institution in order to move really any technology initiative forward. And that's our job as leaders in this space, to really understand that culture and think of avenues that will work best at our own institutions.
Jack Suess: So we're talking right now in mid-March, and as we think about this, Google recently released, if you're a Google campus, a whole slew of new functionality in their tool set. And you see this across different products; I'm just mentioning Google because it happened last week. How are you dealing with the fact that the vendors have a release timeframe that is much faster than the way we normally think about this? And how are you beginning to introduce, or are you introducing, some of that functionality that's available for free, so to speak, into your offerings and into your trainings for people to be using? Whether that's Microsoft with its Copilot suite, which I think is what you're using, Dave, or whether it's Google or other things that may be there. Jill, maybe you go first on this one and then we'll go to Dave.
Jill Forrester: Well, Jack, that's a really, I mean, wow, that's a tough question, and I think we'd all love a good solid answer on that, right? The vendors are releasing AI functionality in the software, so whether we like it or not, it's going to be there. And so our approach is really, again, I'll go back to the pop-ins, continuing to have those to show the embedded AI functionality. We're a Microsoft institution as well, so right, Copilot is there, but we cannot afford to give everyone the full Copilot license. And so we've had to do it on a case-by-case basis and in a really limited way. We did do a pilot just so people could explore, so we had a very limited number of licenses, and we learned from that. But it's also showing up in other solutions that we have, like Slate, the solution that our admissions office uses.
Jill Forrester: They're rolling out technology. So these offices are all really having to talk about not just how they're going to use it but, you're right, whether they should be using it. And if they are, are there some constraints or guardrails that they want to put up as they're using it? And part of the work of our presidential working group, as well as me and my leadership team, is to help our campus community have those discussions in a productive way, because I don't feel we can put our heads in the sand and say, well, we're just not going to use it, because it's showing up in all of the software that we're using. And so it's important, when we think about it, to ask: does our use of it as a partner align with our mission and values? And if it does, then you can move forward.
David Weil: So building off of that, I see there are really two parts to this question, Jack. One is providing frameworks or rubrics or guidelines to help the institution make decisions about whether or not a particular use case is appropriate. So one of the things that our presidential working group did is come up with a list of guiding principles, and I believe Dickinson did as well. These are six or seven high-level principles that we want to apply or think about when an application for AI comes to the table. For example, we don't want technology to replace humans, and that's one of the real core guiding principles that we have there. So you have guidelines. We're also developing a rubric for the actual assessment, which again is based on those guiding principles. So that's on the policy, or should-we, side of the equation. Then there's your point about the technology.
David Weil: It constantly changes. Well, we created an AI exploration lab, which is a room where anyone on campus can come, and we have a number of different licenses there and different tools. So people can come in small groups, and they can experiment and explore and understand how things work. We have mini-grants that we give to faculty. They're really not a huge dollar amount, but they provide some structure for faculty to go out and experiment, and then there's the expectation that they report back about how they're using that tool in the classroom. So there's a conversation, and people are learning from one another. So it really is about being intentional about how you approach this and giving space for people to explore or play, and then you have those guardrails or guidelines that help guide it for the institution.
Cynthia Golden: I'm curious, Dave, who's coming into the exploration lab? Is it students, faculty, administrators?
David Weil: Great question, Cynthia. It's a mix. At first, we opened the door and expected there to be this huge line of people dropping by. It didn't quite work that way, but what we are seeing is that people are interested, and we say, hey, let's set up a time for you to come in and meet with some of the students to explore that further. And so yeah, it's staff members, it's some students who want to be able to use one of the more advanced tools that they may not have access to. But the most effective part of it is really when we say, hey, let's bring in two or three people from a department, we're going to sit around this table and work on a problem and go from there.
Jack Suess: So Dave, just as a follow-up, you're using students, it sounds like, in some of this. And it's been interesting, because we've been doing what I would call a skunkworks project. It was working with some faculty in biology, where a handful of faculty answer all the arcane advising questions for the department, so to speak, and they had a mailing list. And so we started to set up an AI advisor that could be doing this for them, and we've been using students to help build this and demo it and things like that. And we were just testing it out this last week before they go live and start using it. But this idea of leveraging students: are you using students as well, Jill, as you're thinking about it? To me, this is a great way of connecting faculty and students and others in ways where they're sharing in their learning journey with this new element.
Jill Forrester: Yeah, we are absolutely using students. A great example is actually from a few years ago, really before we saw the advent of ChatGPT and some of the large language models. We had a student-faculty project to develop a Japanese language chatbot. This was to help language learners just have a conversation, but it was the old-school chatbot. So think of the Excel spreadsheet where you have the questions and the answers. It was really rigid. That works well for your introductory classes, where you know very little and you're just learning to converse in the language and you do need that structure, but as soon as you move beyond that, it just wasn't effective. So then when OpenAI released its APIs, we again worked with students. This was actually a collaboration between our East Asian studies department, our Japanese language instructors, and our language technologists.
Jill Forrester: And then we had a student in our computer science program who took that and built out a chatbot that now takes on the right persona that's needed for our language students. So it started in Japanese, but now they've generalized it, and any language faculty member can add this feature into Moodle, which is our learning management system, so that students who are taking any of our world language classes can use this chatbot to, again, help develop those conversational skills. So I think it was a great interdisciplinary application, but also with a really emerging technology.
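For readers who want to picture what a persona-driven language chatbot like this might look like under the hood, here is a minimal sketch using the OpenAI Python SDK. The persona text, model name, and function names are illustrative assumptions, not Dickinson's actual Moodle integration.

```python
# Minimal sketch of a persona-driven language-practice chatbot.
# Assumes the OpenAI Python SDK (`pip install openai`) and an OPENAI_API_KEY
# in the environment. Persona text and model choice are hypothetical.
from openai import OpenAI

client = OpenAI()

def make_persona(language: str, level: str) -> str:
    """Build a system prompt that keeps the model in a tutor persona."""
    return (
        f"You are a friendly conversation partner for a {level} student of "
        f"{language}. Reply in {language}, keep sentences short, and gently "
        "correct grammar mistakes in parentheses after your reply."
    )

def chat(persona: str) -> None:
    """Run a simple terminal conversation loop with the persona."""
    history = [{"role": "system", "content": persona}]
    while True:
        user_text = input("You: ")
        if user_text.strip().lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user_text})
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # hypothetical model choice
            messages=history,
        )
        reply = response.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        print("Tutor:", reply)

if __name__ == "__main__":
    chat(make_persona("Japanese", "first-year"))
```

Generalizing beyond Japanese, as the guests describe, amounts to swapping the arguments passed to the persona builder rather than rewriting the bot.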
Cynthia Golden: So Jill, earlier you mentioned ethical concerns. How do you balance innovation with AI against things like data privacy or bias that we're worried about?
Jill Forrester: Yeah, well, I think that's where having your guiding principles is important. I think too, Dave and I both work at liberal arts institutions, and so we do take a very human-centered approach to the work that we do. We teach critical thinking in our classes, and we as employees and faculty members want to be critical thinkers as well. And so when we're talking about AI, part of that conversation is always about that ethical use. And it's nice to create good safe spaces to have those conversations, but you also need to balance that, because you don't want to completely freeze yourself from moving forward, right? Because you're stuck always asking the what-if questions. So we tend to always come back to: does it align with the values of the institution? We're not using AI as a replacement, but as a partner. And so how can we use AI to innovate and to move the institution further along in its mission and its goals? How can we help our students cultivate a knowledge base around AI so that when they enter the workforce, they are prepared and don't feel left behind?
David Weil: So in thinking about use cases, we have to, I think, deconstruct artificial intelligence a little bit, because really there are all these different ways that we use it on our campuses. There are the productivity tools, like your ChatGPTs, your Copilot, and I think that's one bucket. And as an institution, we try to provide tools where we know that the data won't be used for training and won't be intermixed with others', and we release those to the campus. But then when we are developing applications, like something that might help our students or help with advising, we are very intentionally de-identifying any data that we send out to the models. We maintain all the identity information here in our own systems, and then we connect it with the responses that come back. And even though we know that they're not going to be using our data for training, we just think it's another layer of privacy and protection that we can put on there. So again, thinking intentionally about it, but also looking at some of the specific use cases and how you might want to approach each one.
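As an illustration of the de-identification layer described here, below is a minimal sketch of the pattern: swap identifying fields for placeholder tokens before the prompt leaves campus, keep the mapping locally, and restore the real values in the model's response. The field names, token format, and the stand-in model call are hypothetical, not Ithaca College's actual pipeline.

```python
# Minimal sketch of the "de-identify before you send" pattern.
# Field names, token format, and the stand-in model call are illustrative.
from typing import Callable

def deidentify(record: dict, fields: tuple) -> tuple:
    """Swap identifying fields for placeholder tokens; keep the map locally."""
    mapping, cleaned = {}, dict(record)
    for i, field_name in enumerate(fields):
        if field_name in cleaned:
            token = f"[[PERSON_{i}]]"
            mapping[token] = cleaned[field_name]
            cleaned[field_name] = token
    return cleaned, mapping

def reidentify(text: str, mapping: dict) -> str:
    """Restore real values in the model's response, on the local side."""
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

def advise(record: dict, model_call: Callable[[str], str]) -> str:
    """De-identify, call the model, then re-identify the reply locally."""
    cleaned, mapping = deidentify(record, ("name", "email"))
    prompt = f"Draft a short outreach note for this student: {cleaned}"
    return reidentify(model_call(prompt), mapping)

if __name__ == "__main__":
    # Stand-in for the actual model call, for demonstration only.
    def fake_model(prompt: str) -> str:
        return "Hi [[PERSON_0]], let's meet to review your plan."

    student = {"name": "Jordan", "email": "jordan@example.edu", "flag": "advising"}
    print(advise(student, fake_model))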
Cynthia Golden: That makes sense.
Jack Suess: So what advice would you give to other CIOs who are right now starting to look at how they begin this journey, leveraging where you are right now, having started this a few months ago? What's your advice, Jill?
Jill Forrester: Yeah, my advice would be that I think you need to be agile. Often when we're implementing technology, we want to make sure it's working really well, that it's robust, and as we talked about at the beginning of the hour, this is moving so rapidly that by the time you were to work out all the kinks, so to speak, it's moved on. So I think you have to be agile. If you have a, we'll call it a V1, a version one ready to go, maybe frame it as a pilot. A lot of times people are more receptive, and their expectation, if you say that it's a pilot, is that, yeah, things might not work perfectly. Framing it as a pilot also means you are encouraging people to give you feedback, to say, yeah, this isn't working the way we wanted, maybe we need to tweak it. And so I think framing it as a pilot can unfreeze you and help you to move forward more quickly.
David Weil: For me, it's three words, and with the first one I'm going to double down on what Jill said. Absolutely, pilot is a magic word because it frees people up. So pilot. Explore, that's my second word: you want to give people permission to just explore, play with it, learn it, experiment. And then the third word is data. It's all about the data, because that's what the AI tools are feeding off of in order to do their magic. And so getting your data in order, having a robust data environment, whether it's a data lakehouse or some other tool, understanding that data, clean data, is so critical. So: pilot, explore, data.
Jack Suess: I think pilot is the key word. I love explore. And the other thing I would just say for anyone listening is that if you're starting right now, you'll catch up so much faster that by next fall you'll be with everyone else.
David Weil: Agreed.
Cynthia Golden: Probably true. Switching gears just a little bit, Dave, you and I talked a few weeks ago, and when we were talking, you brought up agentic AI, thinking about AI that can act with some degree of autonomy or make some decisions. Would you talk about that a little bit and what potential you think might be there for this agentic AI?
David Weil: So on all of our campuses, I think one of our goals, one of the reasons why we got into IT, is to make people's lives easier. And if you look at the student experience, at least at Ithaca, and I'm sure at other campuses, students experience this friction. They may have to tell their story multiple times, or they have to click into multiple systems, and things like that. And we've had a lot of tools over the years that help reduce that friction, and I think AI is another tool that has a lot of potential for doing that. But a lot of those processes, if you deconstruct them, you might need to access data that's in the student information system, or you might need to understand a student's degree requirements or the course catalog, et cetera. Agentic AI, at least the way we are looking at it, is the ability to create these small agents that you can collectively string together, or, a better way to describe it, have them actually work as a team to solve a problem.
David Weil: So these are reusable agents. Each one in and of itself has a responsibility, but then they work together. The best mental model that I have for this is that you have a team sitting in a conference room, and each member of the team has their area of specialization, and yet they're all there hearing what the problem is. And then you have the orchestrator at the head of the table, who might turn to Charlie and say, Charlie, do this, go look at the course catalog and understand what's happening there. But meanwhile, Sally is listening to that conversation, so that when the orchestrator turns to Sally and says, hey Sally, I need you to do this part of it, Sally already knows what else has happened and can just pick up from there. And so we see this as having great potential for streamlining a number of the processes that our students go through today, whether it's choosing electives or having a housing issue or other things. These agents will be able to act more proactively on a student's behalf, at the student's control and request. And we think it has great potential from that perspective.
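To make the "team around a conference table" mental model concrete, here is a minimal sketch of an orchestrator that delegates sub-tasks to specialized agents while keeping a shared set of notes they can all see. The agent names (Charlie, Sally) follow the example above; everything else, including the stand-in for an LLM call, is a hypothetical illustration rather than Ithaca's implementation.

```python
# Minimal sketch of orchestrator-plus-specialized-agents with shared context.
# A real system would back each agent with an LLM and live institutional data.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    specialty: str

    def handle(self, task: str, shared_notes: list) -> str:
        # Stand-in for an LLM call that would also read shared_notes
        # so the agent knows what the rest of the "team" has already done.
        return f"{self.name} ({self.specialty}) handled: {task}"

@dataclass
class Orchestrator:
    agents: dict
    notes: list = field(default_factory=list)

    def delegate(self, agent_key: str, task: str) -> str:
        result = self.agents[agent_key].handle(task, self.notes)
        self.notes.append(result)  # every later agent sees what came before
        return result

if __name__ == "__main__":
    team = Orchestrator(agents={
        "catalog": Agent("Charlie", "course catalog"),
        "degree": Agent("Sally", "degree requirements"),
    })
    team.delegate("catalog", "find art history electives offered this fall")
    team.delegate("degree", "check which of those fit the student's requirements")
    print("\n".join(team.notes))
```

The key design point the sketch tries to capture is the shared notes list: each agent has one responsibility, but all of them work from the same running record of the conversation.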
Cynthia Golden: Jill, do you have anything? Any thoughts on this?
Jill Forrester: I am following closely what Dave is doing. I find it really fascinating. We have not yet explored any agentic AI, but now you see why I talk to Dave on a regular basis. He's full of great ideas, and his team is doing amazing things.
Cynthia Golden: Thanks. I've learned a lot from him about this topic myself.
David Weil: So we have a pilot that we created for a student who needs to choose an elective. Right now they have to go to the course catalog, they have to look at their degree requirements, they have to look at the course schedule, they're clicking in different places, and then they have to share that with their advisor. And the agent just says, hi, I can help you select an elective, what are you interested in? The student could say art history. And then the agent will say, well, here are these courses; these three will fit in your schedule the way you have it today. Is there one there that is interesting to you? And the student could say yes, and then it would put the schedule together, and they could go back and forth, and then the agent would send it off to the advisor. So one of the core values here, or core tenets, is that we do not want to replace the human-to-human interaction. We want to make it richer, we want to provide better insights, and we want to take care of the mundane. And again, we're excited about where this is going to head, and next year I'll come back and we'll show you some examples.
Jack Suess: I hope so. So what's really interesting is that our crosscutting theme throughout this podcast has really been leadership, and you two are just incredible examples of leadership in action over roughly the last 30 years, in how you're using leadership to bring this new technology to bear. I was wondering, could you talk about how you think about leadership, and also a little bit about the fact that both of you were in interim roles and what that means? Because it's a unique experience, and I think you could provide some perspective that may be helpful for listeners who may find themselves in that role at some point in their career. So Dave, I'll have you go first and then Jill.
David Weil: Thank you. Yeah, being interim is probably one of the hardest things that you could possibly do, especially if you aspire to the ongoing gig. I was fortunate; it worked out positively for me, and I know it also worked out positively for Jill, but it's a year-long interview. The best piece of advice that someone gave me, and that I try to share with others, is that one of the things that limits people from moving up in an organization is that others don't see them in the new role. They see people as, oh, well, that's the middle manager Mark, and that's what Mark does, and they cannot get this mental picture of Mark being in an elevated role. Well, being an interim, that's a gift, because now you can be in that role and you can act the way that others would expect to see from the CIO. So my advice is to ignore the interim, for the most part, not entirely, and allow people to see you in that role, so that when you actually apply for it, they can say, of course Jill's the right person for that, she's been doing it for the last year, why wouldn't we give it to her? But yeah, it's hard.
Jill Forrester: And I'll follow up. Actually, Dave gave me that advice when I was on a call with him right after I had moved into the interim role, and Dave said, I had that experience too. And that was actually the start of our friendship, and of when we really started collaborating and sharing experiences from our two institutions as leaders of technology at both of our colleges. I will say, yeah, it is a hands-on job interview; you are doing the job. And I agree wholeheartedly with what Dave said: it's best to embrace that, move into the role, and act as if, yes, you are the permanent person in the role. I think, Jack, when we were talking last week, you had a great observation, and it's so true. A challenge, though, is that often your previous role is not backfilled. You can find yourself straddling both positions, which is a challenge to navigate.
Jill Forrester: There's just no way around that. But I will say, from my experience and when I reflect back on being an interim, yes, it gave campus an opportunity to see the type of leader I would be in this position, but it also allowed me to think about the type of leader I wanted to be in the role. And I was able to see some of the challenges, and the partners that I'd be working with, from a new perspective. And with that I was able to think: do I want to be a leader who just maintains the status quo? Do I want to be a leader who's constantly pushing innovation and creating a really disruptive environment? Or do I need to find that challenging space, which is trying to balance keeping the lights on with the innovation? And to me, that's really the sweet spot, and that's what I have aspired to be. But stepping into the interim role does give you that opportunity to have campus see how you would be as a leader in that role, and also to really take some time to think about the type of leader you want to be.
Cynthia Golden: Well, you brought up a good point about innovation, because I think as leaders we always have that push-pull between the pressure to innovate and keeping the trains running successfully. So how do you foster innovation in your teams, from each of your leadership perspectives? Maybe Dave and then Jill.
David Weil: Sure. Well, going back to that, experimentation I think is important. I really believe in connecting the work we do with those we serve. And so we constantly are talking about our why, our students. We have pictures in the hallway of our graduates who worked in IT and what they're doing now. We bring students in to talk at our division meetings. We have joint staff-student meetings, because I think you have to make that connection. And I hope people are here working at Ithaca College because they want to be at a college and believe in the mission that we're here to do. So I think if you use those as the raw ingredients, then trust and permission to experiment can lead to great innovation, and I think we're doing some interesting things that way.
Jill Forrester: Yeah, I will say I think Dave is a phenomenal leader in this space, creating a framework within his team for them to innovate and experiment, and I always enjoy hearing about the new things that they're doing at Ithaca in that way. I will say, innovation is a challenge at a small school, where you typically have a smaller number of individuals to operationally provide that stable environment and then also innovate. But I think it is critical for those of us who are leaders in this space to think of ways we can give our staff time to experiment, to give them the freedom to fail fast; if something's not going to work, let's realize that quickly so we can move on. Again, we'll come back to the word pilot. It is a magic word, because it gives you some freedom to explore new things and kind of sets the expectation that things might not be completely perfect, you're exploring. But also, I think anytime we as leaders can set up a way to reward our employees for innovation and innovative ideas, that also works wonders for fostering that environment. I don't necessarily mean a monetary reward; it could be recognition. And so building within your organization the fact that, yeah, you're going to recognize innovative ideas, successful or not, that's the other important thing, right? Sometimes it's important to recognize things that people tried that maybe weren't as successful as they wanted, because it shows you are creating a culture of innovation, and it can help people have the freedom to think, yeah, I want to bring this idea forward and I want to try it, and my institution or my organization is going to support me in that.
Jack Suess: So we've had a great conversation. One of the questions, as you think about leadership and all the work that you're doing with innovation, is how do you step back, reset yourself, and have fun? So how do you both look at having fun and stepping away from technology to reset yourselves? Jill, I'll go with you first and then Dave.
Jill Forrester: Oh, I love this question. I am actually an avid knitter. I love to knit and crochet, and I design things myself, and I find it to be a really creative and fun outlet. In a job that can sometimes feel like you have a lot of pressure on you, and that can be pretty technical at times, it is to me very relaxing to just enjoy being creative. Often people don't think of individuals in technology as being creative individuals. I am here to say that is not my experience at all. I think technologists are some of the most creative people I've ever met. And so yeah, I just like to sit back and relax with my knitting needles.
David Weil: For me, it's being near water, on water, in water. There's something for me that's so restorative and just energizing about water. And it's one of the great things about being in Ithaca, New York, most of the year, maybe not January or February, but we have beautiful Cayuga Lake right here. And so that really does restore me, and I think that's where I can do a lot of thinking and just reenergize.
Cynthia Golden: Do you have a boat?
David Weil: Yes, yes, a boat. There's a boat out there.
Cynthia Golden: I'm with you on the water.
David Weil: So people will say, where do you go? What do you do when you go out on the boat? And it's like, nothing. I read, or I might just sit and float. And for me, that's perfect. It clears my mind and centers me.
Cynthia Golden: Well, this has been a great conversation. As Jack said, we often end with the question: what does the term integrative CIO, the name of our podcast, mean to you? Jill, and then Dave.
Jill Forrester: Yeah. Well, I will say I have really enjoyed the conversation as well, and with you asking this question of all the CIOs you have on this podcast, you're going to have to create a book of all of the responses you've gotten. But for me, it's all about partnership. I think an integrative CIO is one who knows that technology is a strategic enabler and that our job is to build those partnerships across campus so that we're at the table, working with our colleagues across campus to advance the mission of the college. We don't do technology for technology's sake. As Dave mentioned, we do it to create a superior student experience, to help students learn and become leaders in our society. And so I think the more we can cultivate those partnerships and build trust with our colleagues across campus, the more we can use technology to, again, advance the mission of our colleges and universities.
David Weil: My answer's going to be right along the same lines, but it's being part of the fabric of the institution, really being the strategic partner and looking at the tools we have at our disposal to help the institution move forward. I firmly believe that our role is in some ways less about the technology these days, although we have to make sure that it's running and that it's well architected and secure and all of that. It's more about the services now, the services that help our students, that help people be more effective, have less friction, and be able to get the insights needed to make the important decisions that the institution needs to make. And so I think an integrative CIO is someone who is there and able to bring all of that to bear. So thank you.
Jack Suess: I think that's a wrap, Cynthia.
Cynthia Golden: I think so too. Thank you both for joining us. This has been terrific. And thank you, listeners, for joining us.
Jack Suess: Thank you very much.
This episode features:
Jill Forrester
Vice President, Information & Technology Services and CIO
Dickinson College
David Weil
Vice President, IT & Analytics
Ithaca College
Cynthia Golden
Executive Strategic Consultant
Vantage Technology Consulting Group
Jack Suess
Vice President of IT & CIO
University of Maryland, Baltimore County