AI ... Friend or Foe is an honorary issue in the 2024 EDUCAUSE Top 10.
"Let's wait and see what AI and ChatGPT will bring for the next years. This competition from OpenAI or Google or whatever probably is going to lead to some drastic changes in our traditional courses and assessment."
—Nick Bassiliades, President of Aristotle University IT Center and Professor at the School of Informatics, Aristotle University of Thessaloniki, Greece
OpenAI stunned the world in November 2022 when it released ChatGPT, its web-based interface to the GPT-3.5 model. Suddenly, a generative AI model trained on massive amounts of internet content was accessible to many people and organizations across the globe. Since then, we've all experimented with use cases and entertained and frightened one another with stories about the results of creative queries, sinister or silly responses, and some powerful ways to use generative AI to be more productive and more innovative.
Things have been different since that date. Generative AI is remarkably unlike the technologies of the status quo. It is technology that invades the mind, so to speak, taking some of the tacit knowledge in our collective mind and making it available to all. Although artificial neural networks, the technology that has led to large language models and generative AI, are decades old, and although we have been using AI in chatbots and other products for years, ChatGPT brought this technology to the masses.Footnote1
Suddenly, anyone can have direct access to generative AI. It is tangible to and controllable by us all. We can now much more easily understand the many ways in which AI can and will change our lives, through transformations that will surpass the impact of other 21st-century innovations, from smartphones to social media to cloud computing.
But is AI good or bad, friend or foe? Most probably yes. The questions we should be asking are nonbinary: How do we make the best use of AI to benefit the higher education mission and its constituents? How do we transform higher education to adapt to an AI-infused future?
AI makes knowledge and expertise available in ways that they weren't in the past. It has the potential to help people "skill up" rapidly, including those who have traditionally lacked access to effective educational opportunities and resources. In higher education, AI can potentially help reduce administrative costs if applied to administrative processes, job descriptions, project charters, meeting summaries and follow-up, coding, onboarding, and training. Academic applications might include conducting assessment reform, developing course materials for introductory-level courses, and tutoring. All these are early ideas, and we will almost certainly create more powerful use cases in the coming months and years. AI may not have made the 2024 EDUCAUSE Top 10 list, but 30 percent of Top 10 survey respondents gave AI ... Friend or Foe: Developing an Institutional Approach an importance rating of 9 or 10 out of 10.
Get smart and stay smart. What you knew last month about AI is not what you'll need to know next month. Create the time to build and maintain your understanding of AI. Create or use existing forums to share knowledge and ideas within and beyond the institution.
Earn your seat at the table. Boards, presidents, faculty, and others are looking to IT leadership to advise them on institutional AI strategy. This is an opportunity to be that integrative CIO.Footnote2
Dare to dream big. AI is more than a collection of tools to make incremental improvements. Work with institutional leaders and external experts to re-envision the institution and our sector.
Learn from your students. Students are using and thinking about AI in ways that are more elastic, innovative, and relevant to learning and student success than the rest of us are. Involve them in AI strategy and experiments.
Be wary of the hype. There's money to be made on AI, and many companies and consultants will be prone to overselling their "AI-driven" products and services.
The Key to Progress
Institutions with "organizational capital"—those that have already been investing in the cultural, workforce, and technological capabilities that are essential to effective institutional digital transformation—will be best positioned to adapt to, mitigate the risks of, and take advantage of AI. Some will have the resources to lead these efforts. Other institutions—smaller or less well-resourced but nevertheless agile, with digital transformation underway—can benefit from those early adopters and become fast followers.
From Strategy to Practice
What You're Saying
"So far, we are taking a cautionary approach overall but are interested in opportunities to experiment."
"AI is both a friend and a foe. It can be a useful tool at work and for both teaching and learning, and at the same time, it could be misused by students to cheat for their schoolwork."
"AI is really new for us. We are still trying to figure it out as friend and as foe as well."
"For the creative environment, AI will be a collaborative tool that will fit like a brush or a pencil or a canvas with which users will create more art."
"It's too early."
"We do not have a unified approach. And we need one!"
"Florida State University's strategy for AI focuses on applicable use cases, emphasizing its role of enabling learning and spearheading impactful initiatives that truly move the needle. Two examples underway measure employee engagement and workforce readiness. We're adjusting our tactics in real time through the power of AI to better reach our goals. Another example of our focus involves a pilot project around data analytics. We put our AI model up against several data scientists. The traditional methods produced the analysis of the data after two weeks. The AI used the same dataset and produced even more detailed results in just minutes. How could we use this additional time saved? How could that impact our university and community even more positively? This is where AI has the potential to be a beneficial resource in maximizing our ability to speed up decision-making and analysis and even reduce the mundane tasks so we can focus on what really matters."
"Wesleyan University is participating in a two-year AI engagement with Ithaka S+R. Additionally, the Consortium of Liberal Arts Colleges (CLAC) is in the early stages of developing something in this space to help our institutions move away from a place of fear and resistance to adoption."
What You're Working On
Comments provided by Top 10 survey respondents who rated this issue as important
Academic guidelines and planning
- Guidelines for instructors for syllabi regarding generative AI; suggestions for inclusion in courses.
- A teaching and learning AI task force was created last semester, with a final report pending. An administrative AI task force will likely be commissioned this fall (to consider updates to policies and standards, to be cognizant of data security issues, etc.).
- Developing academic principles to raise awareness about classroom use for faculty and students alike.
- We are actively working as a campus community to educate ourselves and carefully consider what our academic dishonesty policy should look like in this new world.
- We have created an institution-wide working group on AI for teaching and learning, which meets monthly. I co-facilitate with two others: an IT director in our distributed IT staff and a professor from the School of Education. We have broad representation and are working on recommendations and guidance for the institution.
- We have held a series of panels and discussions about generative AI, and they have been well attended. The associate provost for academic success assembled a committee to draft three different options for syllabus language regarding generative AI. In addition, we will have "playgrounds" and a symposium for returning faculty in the fall.
- The state system has started a faculty-led workgroup to examine AI's role in the classroom and institution. A report is due later in the year.
- Like many other institutions, we realized that AI is not going away. We (ITS) also determined that it is not one of those things that IT can make go away. So there is a campus group, led by faculty, working on AI use and its implications for teaching and learning.
- This has been included in our plagiarism policy, but we are working toward using AI as a friend to redesign authentic assessments.
- Constructing a generative AI lab and developing eight courses for students.
- Integrated AI across our curriculum through a partnership with a local university.
- Moving to include AI in all aspects of education.
- Investigating tools to measure similarity results for student-submitted material. Allowing a 20 percent level of AI-developed material to help students improve their writing capability. Purchasing a tool that will provide faculty and students with insight into the use of AI.
- Our contracts office just put out some formal guidance on ChatGPT and will do so soon for Bard.
- Our organization is taking steps to ensure the responsible use of AI by establishing guidelines and principles. These will cover areas such as protecting data privacy, ensuring transparency in algorithmic decision-making, and setting up review mechanisms for potentially biased AI systems. By adhering to these guidelines, the college can demonstrate its commitment to using AI for the benefit of all.
- We are developing guidelines in partnerships with various campus areas (CTL, cybersecurity, legal/privacy, institute communication). At the same time, some of our faculty are also developing their own policies for their classes.
Strategy and planning
- We are developing a strategy to catalog AI use on campus and enable research and administrative use cases.
- We have an "AI.Humanities" initiative that is establishing our approach as an institution to consider AI from multiple perspectives.
- The university has formed a committee on generative AI in research and education, which is a combined initiative of the offices of the provost and the senior vice chancellor for research. It will produce a report, by the end of the fall 2023 semester, that will contain specific recommendations for how to move forward in this area.
- A university-wide committee structure developed the university's position on AI, aiming to balance the opportunities and risks.
- We have implemented a cross-functional working group to look at the broad implications of AI across the institution (not just teaching and learning), and this group is intended to align policies, standards, and activities more broadly. The current stance is that we will not ban AI, but as our institution adopts it, we will ensure that it does so responsibly.
- Establishing an AI campus framework and tool to trial opportunities.
- We are in the early stages of formulating a "whole of system" approach to AI. An AI working group will include representatives from legal, IT, cybersecurity, data governance, privacy, HR, and compliance and risk management.
- Creation of working group to research, benchmark, and evaluate what top-tier universities are doing.
Training and awareness
- Just did a major workshop with academics on T&L and embracing AI. I think we are in front of this as much as we can be.
- We are collaborating with faculty and campus instructional designers to provide training and information on how to positively engage with AI technologies in the classroom.
- We have launched a webinar series focused on AI, with weekly talks from our faculty and industry on various aspects of AI including science, research, healthcare, teaching, administrative improvement, and bias and ethics.
- Coming together on a chat resource for campus.
- We have established an AI community of practice to share ideas on how different campus groups are engaging with AI.
- Researching how to implement AI protection and defenses against AI threats.
- Brian Basgen, "A Generative AI Primer," EDUCAUSE Review, August 15, 2023.
- See "Issue #10: The Integrative CIO," in Susan Grajek and the 2019–2020 EDUCAUSE IT Issues Panel, "Top 10 IT Issues, 2020: The Drive to Digital Transformation Begins," EDUCAUSE Review, January 27, 2020.
Vince Kellen is CIO, University of California San Diego.
Jim Russell is CIO and Vice President for Digital Strategy and Planning, Manhattanville College.
© 2023 Susan Grajek and the 2023–2024 EDUCAUSE Top 10 Panel. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.