Exploring the Opportunities and Challenges with Generative AI


Effectively integrating generative AI into higher education requires policy development, cross-functional engagement, ethical principles, risk assessments, collaboration with other institutions, and an exploration of diverse use cases.

[Article image: many open doors of different colors. Credit: Dmitry Rukhlenko / Shutterstock.com © 2024]

In October 2023, EDUCAUSE hosted a two-day discussion called "ChatGPT and Generative AI: Navigating Leadership Opportunities and Challenges" with eight higher education expert panelists and more than one hundred higher education professionals. Together, participants identified and discussed various areas of interest: cost, funding, and experimentation; pedagogical opportunities; policy strategies; AI use cases; and leadership strategies and ethical considerations. As the dialogue unfolded, a central theme emerged concerning how higher education institutions can adapt to the rapid pace of technological change, emphasizing the need for responsibility to guide the conversation around AI. Because the AI space is complex and evolving, participants stressed that ongoing guidance is needed. The event generated insights and recommendations that can help the higher education community navigate the rapidly changing and consequential developments around AI.

Cost, Funding, and Experimentation

As generative AI becomes more ubiquitous, and as AI functions are increasingly being added to various technology tools and systems, colleges and universities will need to address issues of cost and funding:

  • Currently, there is little visibility into the costs associated with AI implementation, especially for seemingly free services.
  • Much more transparency and awareness are needed when budgeting for AI resources.
  • As AI technologies mature, users will require a better understanding of the direct and indirect costs involved.

The increasing incorporation of AI features into existing software and products raises various concerns for higher education, among them procurement challenges and the importance of staying informed about AI features introduced through updates in widely used software. One approach to balancing those concerns against the opportunities of exploring and experimenting with AI technologies is for institutions to procure and enable local, controlled AI environments that are carefully vetted and established for specific uses, especially learning/student uses. Doing so while responsible use and practice are still developing could go a long way toward supporting successful applications.
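To make the idea of a local, controlled environment more concrete, here is a minimal sketch of how a campus team might route student-facing prompts to an institution-hosted model rather than a public cloud service. It assumes an OpenAI-compatible server (such as Ollama or vLLM) running on institutional infrastructure; the endpoint URL, model name, and helper function are hypothetical illustrations, not recommendations from the discussion.

```python
# Minimal sketch: querying a locally hosted, institution-controlled model.
# Assumes an OpenAI-compatible server (e.g., Ollama or vLLM) is running at
# BASE_URL and serving MODEL; both values are illustrative placeholders.
from openai import OpenAI

BASE_URL = "http://localhost:11434/v1"  # hypothetical campus-hosted endpoint
MODEL = "llama3"                        # hypothetical locally served model

client = OpenAI(base_url=BASE_URL, api_key="local-key-not-required")

def ask_campus_model(prompt: str) -> str:
    """Send a prompt to the locally controlled model and return its reply."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "You are a vetted campus AI assistant for coursework support."},
            {"role": "user", "content": prompt},
        ],
        temperature=0.2,  # conservative setting for instructional use
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_campus_model("Explain how to cite AI assistance in a research paper."))
```

Because the model runs inside the institution's environment, prompts and outputs stay on infrastructure the institution controls, which is what makes the vetting and cost accounting described above feasible.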

Pedagogical Opportunities

A key challenge that generative AI tools pose is how to use them effectively for faculty and student development. Faculty members need to understand and critically evaluate AI outputs, and innovation must be balanced with digital literacy. Various approaches, such as integrating AI into coursework and fostering a culture of critique, can help bridge the gap between faculty and technology.

The issue of attribution in the context of AI-generated content raises various ethical considerations. Users—whether they be students, faculty, institutional staff, or others—have a responsibility to acknowledge AI contributions in the creation of products, and using AI in the creative process carries potential impacts on intellectual property and the inclusion of diverse voices.

Policy Strategies

The breadth and extent of the logistical and ethical challenges that generative AI poses for higher education highlight the need for proactive and agile policy development, ongoing dialogue, and collaboration across disciplines. As with many technologies, a particular challenge with policy development around AI is that it's a moving target. The technology is evolving rapidly, and our understanding of its implications is often lagging. Therefore, a practical approach might be to create a set of guiding principles or ethical frameworks that can adapt to changes in technology and emerging ethical concerns.

An exploration of the policy context for AI could include some or all of the following:

  • Conduct a thorough policy audit. Policy review and development should involve not only crafting new policies but also examining and updating the existing policy library.
  • Review existing institutional policies that could apply to various uses of AI. Consider the potential risks faced by students and faculty due to the misapplication of policies that were not originally designed with AI in mind, such as a plagiarism policy that predates the ways AI can be used in the writing process.
  • Establish and engage in cross-functional conversations on campus. Involve students in meaningful ways, and stay abreast of evolving AI technologies (especially those embedded in publicly available tools) to ensure responsible use and equitable integration.
  • Create task forces or agile groups to explore and track use cases and navigate the complex landscape of AI. This approach parallels the collaborative efforts needed during events such as the COVID-19 pandemic.

Another innovative approach to introducing AI policies could involve integrating AI discussions into broader topics of interest (e.g., academic research and human resources), fostering trust and understanding before delving into the specifics of policy development. This approach aims to engage stakeholders in a meaningful conversation about AI, its implications, and the institutional values guiding its use.

Other steps that institutions could consider when developing AI-related policies or frameworks include the following:

  • Assess the current landscape (local and beyond). Understand the current uses of AI within the institution. Identify where AI is being used, what types of data are involved, and the impact on students and faculty.
  • Engage stakeholders. Involve various stakeholders—including faculty, students, technology professionals, staff, and legal experts—in the conversation. Understanding different perspectives is crucial for creating comprehensive policies and guidelines.
  • Clarify existing policies. Review and clarify existing policies related to data privacy, academic integrity, and ethical conduct. Ensure that AI-related considerations are integrated where necessary. Especially consider crafting an AI policy that is in addition to and different from a plagiarism policy, and develop a handbook to help educate and encourage effective practice.
  • Understand AI-provider policies. Consider how providers' terms and policies govern the AI features coming from or embedded in third-party tools.
  • Educate the community. Provide education and training on AI technologies for faculty, students, and staff. Help them understand the potential benefits, risks, and ethical considerations associated with AI.
  • Define ethical principles. Develop a set of ethical principles that guide the use of AI within the institution. Consider dimensions such as transparency, fairness, accountability, and privacy.
  • Undertake risk assessments. Conduct risk assessments to identify potential risks associated with AI use, and proactively consider how to respond to the risks. For example, some AI contract language allows data provided by students to be used in the development of new products. Policy development should include evaluating the impact on students' well-being, academic integrity, research, and data security.
  • Plan for iteration. Recognize that AI technologies and ethical considerations are evolving. Establish a framework that allows for continuous iteration and adaptation as technology and ethical norms develop.
  • Communicate about policies. Clearly and regularly communicate the AI-related policies to the entire community. Ensure that everyone is aware of the guidelines, expectations, and consequences associated with AI use.
  • Remember to monitor and audit. Implement mechanisms for monitoring and auditing AI applications within the institution. Regularly assess their impact and their adherence to ethical principles.
  • Collaborate with other institutions. Collaborate with other educational institutions and organizations to share insights, best practices, and collective wisdom while navigating the ethical landscape of AI in education.
  • Establish protocols for AI tool selection and use. Develop policies and/or guidelines addressing tools that may be used in instruction with regard to a centralized review of privacy, security, and quality.

A recent article in EDUCAUSE Review called "Cross-Campus Approaches to Building a Generative AI Policy" provides an in-depth look at the work of creating policies for generative AI.[1] Meanwhile, check out the community-supported Syllabi Policies for AI Generative Tools for a growing collection of AI policies related to teaching and learning.

AI Use Cases

An EDUCAUSE QuickPoll conducted in April 2023 identified a long list of AI uses, which can be organized into four categories: Dreaming (helping you think), Drudgery (lightening your load), Design (building your content), and Development (advancing your work).[2] AI use cases cover a wide and expanding set of users and functions, such as the following:

  • Student Advising Assistant: Implement AI to assist with student advising, streamlining the process and potentially enhancing personalized guidance.
  • Clinical Treatment and Research: Explore AI applications in clinical treatment and research, investigating potential advancements in health care and medical fields.
  • Literature Assignment: Assign literature students to hold conversations with ChatGPT, embodying characters from books and discussing personal anticipations, thereby demonstrating creative and educational use of AI.
  • Virtual Teaching Assistant: Leverage AI as a virtual teaching assistant, contributing to educational support and potentially reducing the burden on educators.
  • Coding and Script Generation: Use AI to generate scripts and code, indicating applications in programming and content creation.
  • Video Production and Analysis: Involve AI in video production, including cloning professors, analyzing course survey results, generating visuals for online courses, providing multilingual audio, and testing assignment instructions and grading rubrics.
  • Topic-Specific Virtual Tutors: Test various language models for efficacy as topic-specific virtual tutors, assessing their pedagogical impact on student learning.
  • Complex Learning Design Workflows: Manage complex learning design and development workflows using AI, suggesting efficiency improvements in educational content creation.
  • Accessibility Remediation: Use AI workflows and tools to address holistic accessibility remediation in learning media, ensuring inclusivity.
  • Online Course Generation: Generate online courses by leveraging AI to compile content from various institutions, indicating potential collaboration in education.
  • Student Success Analytics: Use AI to generate analytics related to student success, providing insights for educational improvements.
  • Communication and Change Management Plans: Develop communication and change management plans around upgrades, maintenance, and service launches, emphasizing the importance of strategic implementation.
  • IT Responsiveness Research: Conduct research and validation on IT responsiveness, disaster recovery, and ransomware preparedness, demonstrating AI's role in enhancing IT security and recovery.

This list of use cases, along with others collected from AI articles, can and should be augmented with a local community's input, organized into categories, and assigned to working groups that explore the cases and track outcomes.
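As one illustration of that tracking work, the sketch below shows a simple way a campus group might record use cases under the four QuickPoll categories, assign each to a working group, and log outcomes over time. It is a hypothetical example; the structure, group names, and category assignments are illustrative rather than drawn from the discussion.

```python
# Minimal sketch: recording AI use cases by category, assigning them to
# working groups, and tracking outcomes. All names and assignments are
# illustrative placeholders.
from dataclasses import dataclass, field
from enum import Enum

class Category(Enum):
    DREAMING = "Dreaming (helping you think)"
    DRUDGERY = "Drudgery (lightening your load)"
    DESIGN = "Design (building your content)"
    DEVELOPMENT = "Development (advancing your work)"

@dataclass
class UseCase:
    name: str
    category: Category
    working_group: str
    outcomes: list[str] = field(default_factory=list)

    def log_outcome(self, note: str) -> None:
        """Record an observation or result from the working group's exploration."""
        self.outcomes.append(note)

# Example entries drawn from the list above; group assignments are hypothetical.
registry = [
    UseCase("Virtual teaching assistant", Category.DRUDGERY, "Teaching & Learning group"),
    UseCase("Topic-specific virtual tutors", Category.DESIGN, "Academic Technology group"),
    UseCase("Student success analytics", Category.DEVELOPMENT, "Institutional Research group"),
]

registry[0].log_outcome("Piloted in two course sections; collecting student feedback.")

for case in registry:
    print(f"{case.name} [{case.category.name}] -> {case.working_group}: "
          f"{len(case.outcomes)} outcome(s) logged")
```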

Leadership Strategies and Ethical Considerations

Higher education institutions have an important role to play in shaping the future landscape of AI, and a blog post from Tyton Partners aims to demystify the hype surrounding AI and help institutional leaders know where to focus their energy and actions.[3] One dimension of higher education's role is the potential for collaboration between academic institutions and AI-focused companies, emphasizing the need for tools designed for and aligned with ethical standards. Many AI tools have not been developed with built-in guardrails or warnings concerning the protocols for effective use, and they have the potential to produce misleading or inaccurate content, underscoring the importance of responsible development and use. A partnership between higher education and AI service providers could support responsible integration, ensuring that AI aligns with the values and goals of an academic community.

Keep in mind that the goal is not merely to set rules but to foster a culture of responsible AI use within higher education and to promote learning. Ethical considerations should be woven into the fabric of education and technology, emphasizing the importance of thoughtful, purposeful, and human-centric AI integration. The focus should be on fostering a mindset that encourages questioning and critical evaluation, ensuring that AI serves as a tool for empowerment rather than as a replacement for human intellect.

What's Next

EDUCAUSE will continue to create resources and convene groups to understand and, in some cases, influence the evolution of generative AI products and services for higher education. The EDUCAUSE Library includes an Artificial Intelligence (AI) topic page that is regularly updated with resources, including the recently developed "7 Things You Should Know About Generative AI."[4] Anyone can join the EDUCAUSE AI Community Group to learn and collaborate with others on AI and AI-related topics. In December 2023, EDUCAUSE held a Member QuickTalk | Drafting an Institutional AI Policy; the resources from the event and a recording of it are available to anyone at an EDUCAUSE member institution. As with any new, potentially transformative technology, the best outcomes for AI will be achieved when community members come together to share information, ideas, concerns, and solutions in an evolving, uncertain space.

Notes

  1. Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini, "Cross-Campus Approaches to Building a Generative AI Policy," EDUCAUSE Review, December 12, 2023.
  2. Mark McCormack, "EDUCAUSE QuickPoll Results: Adopting and Adapting to Generative AI in Higher Ed Tech," EDUCAUSE Review, April 17, 2023.
  3. Kristen Fox and Catherine Shaw, "Artificial Intelligence in Higher Education: Trick or Treat?" Tyton Partners blog, October 31, 2023.
  4. "7 Things You Should Know About Generative AI," EDUCAUSE Review, December 6, 2023.

Veronica Diaz is Senior Director of Professional Learning and Development at EDUCAUSE.

© 2024 Veronica Diaz. The content of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.