Leveraging Generative AI for Inclusive Excellence in Higher Education


Drawing from three lenses of inclusion, this article considers how to leverage generative AI as part of a constellation of mission-centered inclusive practices in higher education.

Many people chatting via an AI helper bot assistant.
Credit: ProStockStudio / Shutterstock.com © 2024

The hype and hesitation about generative artificial intelligence (AI) diffusion have led some colleges and universities to take a wait-and-see approach.Footnote1 However, AI integration does not need to be an either/or proposition in which use is either embraced or restricted and adoption either replaces or outright rejects existing institutional functions and practices. Educators, educational leaders, and others considering academic applications for emerging technologies should consider ways in which generative AI can complement or augment mission-focused practices, such as those aimed at accessibility, diversity, equity, and inclusion. Drawing from three lenses of inclusion—accessibility, identity, and epistemology—this article offers practical suggestions and considerations that educators can deploy now. It also presents an imperative for higher education leaders to partner toward an infrastructure that enables inclusive practices in light of AI diffusion.

Accessibility

Inclusion through accessibility means removing barriers to access for people with disabilities and differing learning preferences—visible and invisible, disclosed and undisclosed. This type of inclusion can be developed reactively in response to an articulated need, such as making reasonable accommodations for students with disabilities, or intentionally as part of the learning design process.Footnote2 Inclusion through accessibility in physical classroom spaces might manifest as flexible seating that allows speakers to be visible from any point in the room, or it might involve the speaker repeating a student's question before answering it. In digital learning spaces, creating accessible, inclusive content might involve providing captioned or transcribed videos, appropriately formatted text, meaningful hyperlinks, alt text for images and graphs, and transparent content that "communicates information that may be otherwise implicit."Footnote3

How to Leverage AI for Inclusive Access

Following are a few of the ways that generative AI tools can help educators create more accessible content.

  • Generate alt text or long descriptions for images. The generated text can then be refined or used as a starting point (see the sketch below).
  • Take notes and generate summaries of meetings or class sessions (adhering to privacy and security guidelines).
  • Create representative images of challenging concepts.
  • Generate outlines of lectures or course materials. Provide students with example prompts so they can do the same.

Many emerging AI tools can help people complete overwhelming or complex tasks. Educators should be willing to allow students to leverage AI in ways they may not have considered.Footnote4
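
For educators comfortable with a little scripting, the first item in the list above can even be drafted in batches. What follows is a minimal sketch rather than a recommended implementation: it assumes the OpenAI Python SDK, an API key configured in the environment, and a vision-capable model, and the model name, prompt wording, and file name are illustrative assumptions. Any generated description still needs human review before it reaches students.

```python
# Minimal sketch: draft alt text for a course image so an instructor can refine it.
# Assumes the OpenAI Python SDK (pip install openai), an OPENAI_API_KEY in the
# environment, and a vision-capable model; all names here are illustrative.
import base64
from openai import OpenAI

client = OpenAI()

def draft_alt_text(image_path: str) -> str:
    """Return a short draft alt-text description for the image at image_path."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name; substitute your institution's approved tool
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Write concise alt text (under 125 characters) describing this "
                         "image for a screen-reader user in an educational context."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},  # assumes a PNG file
            ],
        }],
    )
    return response.choices[0].message.content

# Example: print a draft to review and edit before adding it to course materials.
print(draft_alt_text("lecture_slide_03.png"))  # hypothetical file name
```

The point is not automation for its own sake; the output is a starting point that the educator refines for accuracy and context, just as the list above suggests.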

Questions for Further Consideration

The following questions can help educators deepen their understanding of how to use AI for inclusive access.

  • How can AI offer multiple perspectives and access points to learning?
  • Who may be left behind in learning environments where AI is required, limited, or banned?
  • To what extent are perceptions about AI based on assumptions or fears?
  • For what purposes might learners leverage AI for benefits you may not know about?
  • How might AI be used to empower people with disabilities and create a more inclusive world?

Example in Practice

The 2024 Microsoft Ability Summit convened people interested in the intersection of disabilities, creativity, emerging technologies, and the future of work. Panelists shared how they are leveraging AI to fill gaps in the workplace (e.g., memory loss resulting from an accident), improve efficiency (e.g., completing tasks relatively quickly considering specialty computer mouse options), and make content accessible (e.g., prompts for adding descriptions of low-contrast images). One panelist explained how they had come to rely on the dark mode feature offered by many operating systems, apps, and websites to reduce eye strain and increase the time they could devote to a task. They leveraged AI to compose an email requesting that colleagues share screenshots or screen captures in dark mode when possible. The AI tool helped them with sentence construction so they could focus on crafting a message with an appropriate tone for their sensitive request to change a workflow. Imagine the possibilities if more educators knew how to use AI tools to benefit themselves and their students.

Identities

Colleges and universities that espouse the value of diversity, equity, and inclusion invite their communities to bring their "whole selves" to the educational experience (e.g., the "Bring your whole self to Berkeley" campaign from the University of California, Berkeley). In this context, inclusion means understanding and affirming the many, often overlapping personal and social identities that influence how individuals navigate the world.Footnote5 In The Cambridge Handbook of Social Theory, sociologist Peter Burke describes social identities as "sets of meanings that define who we are in terms of the roles we have, the groups or social categories to which we belong, or the unique characteristics that make us different from others."Footnote6 Specific characteristics—such as age, faith, race, ethnic heritage, nationality or birthplace, economic background, gender expression, language, and abilities—might lie at the core of a person's identity. Secondary dimensions of identity might include educational background (e.g., first-generation college students), citizenship, relationship status, caregiver status, service affiliation (e.g., military veterans), housing status, hobbies, or other interests. By carefully applying AI technologies, educators can find ways to embrace diversity, build community, and foster belonging within existing value systems.

How to Leverage AI for Identity Inclusion

Educators can use the following strategies to intentionally design instructional content with identity inclusion in mind.

  • Provide a GPT or AI assistant with upcoming lesson content (e.g., lecture materials or assignment instructions) and ask it to provide feedback (e.g., troublesome vocabulary, difficult concepts, or complementary activities) from certain perspectives. Begin with a single perspective (e.g., first-time, first-year student), but layer in more to build complexity as you interact with the GPT output.
  • Use a GPT or custom AI assistant to generate alternative explanations for difficult concepts using examples or analogies that students are familiar with. In your prompt, specify the audience that will receive the explanation. For example, "Explain [concept] to [audience] using examples from a family gathering."
  • Use custom GPTs or AI assistants to build a sense of community and foster engagement by connecting students with clubs, organizations, and events that match their interests.
  • Create a custom GPT or AI assistant that is trained on the course syllabus and content. Add a heuristic to configure how the assistant interacts with students. Refine the GPT configuration by adding resources to support the experiences of veterans or first-generation college students, for example (see the sketch after this list).
  • Create a space to discuss GPTs and invite students to share their perspectives, personal experiences, and concerns. Clarify your expectations about how students should use GPTs in your courses, including how to cite or acknowledge GPT use in their coursework.
  • Reflect on areas where students typically struggle (e.g., using academic language, structuring their writing, and visualizing end products), and demonstrate how students can use a GPT as a collaborative, self-reflective learning assistant to help them improve their work or navigate assignments.
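
As a rough illustration of the fourth strategy above, the sketch below grounds an assistant in syllabus text through a system prompt that encodes a simple interaction heuristic. It is a hedged example, not a production design: the course name, file name, model, and resource list are assumptions, and a real deployment would also need privacy review, accessibility checks, and whatever tooling your campus has approved.

```python
# Minimal sketch: a syllabus-grounded course assistant configured with a simple
# interaction heuristic. The course name, file name, and model are illustrative
# assumptions; a real deployment needs privacy review and campus-approved tooling.
from openai import OpenAI

client = OpenAI()

with open("syllabus.txt", encoding="utf-8") as f:  # hypothetical exported syllabus text
    syllabus = f.read()

SYSTEM_PROMPT = f"""You are a course assistant for ENG 105 (a hypothetical course).
Ground every answer in the syllabus below; if the answer is not there, say so and
point the student to office hours. Use plain, encouraging language, define academic
terms the first time you use them, and mention relevant campus resources (for example,
veteran services or first-generation student programs) when they apply.

SYLLABUS:
{syllabus}
"""

def ask_course_assistant(question: str) -> str:
    """Answer a student question using the syllabus-grounded system prompt."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name; use whatever your campus has licensed
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_course_assistant("When is the first summary draft due, and how will it be graded?"))
```

Refining the configuration, for example by adding resources for veterans or first-generation college students, then amounts to editing the system prompt or the documents the assistant can draw on.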

Questions for Further Consideration

The following questions can help educators deepen their understanding of how AI can be used for identity inclusion.

  • How can you teach students to use GPTs to demystify the "hidden rules" of their discipline?
  • How might AI outputs reflect implicit biases in your discipline?
  • How can you critically evaluate AI outputs (e.g., text and images) to ensure that AI-generated lesson content affirms students' identities without perpetuating stereotypes?
  • How can you use examples of biased outputs to help students become critical consumers and ethical users of AI?
  • How can you use AI to roleplay and consider different perspectives or generate content based on different perspectives?

Example in Practice

Nancy Park is an English professor at a four-year public Hispanic-serving institution (HSI) where 63 percent of students are from historically underrepresented groups (60 percent are Latino/Latina/Latine, 49 percent are Pell-eligible, and 59 percent are first-generation college students). Park has intentionally integrated GPTs into her instructional practice. Rather than assuming students' familiarity with these tools, she begins each term with an open conversation, inviting students to share their perspectives and experiences. She has been surprised by some of the informal ways students have leveraged GPTs. Specifically, she noted an apology letter and a thank-you note as two unexpected applications.

Park introduced students to GPTs to support their summary writing, a skill she has found many students lack. Reimagining this assignment with GPTs began with self-reflection. Park is an expert writer and a reflective educator. As she thought through the mental steps involved in summarizing, she realized she had forgotten what it is like to be a beginning writer and how cognitively demanding writing a summary can be.Footnote7

She also considered her students' identities, noting that many are non-native English speakers or first-generation college students and come from a variety of educational backgrounds. She recognized how GPTs could support students individually as writers in her course and beyond. As a result, Park shows students how to work with GPTs, demonstrating ways to engage in a self-reflective writing process, evaluate the benefits and limitations of AI tools, and cite their use correctly. Beyond improvements in students' summary writing skills, Park said GPTs have helped her students improve their sentence structure and heightened their understanding of academic language. Reflecting on this growth led Park to an instructional revelation that GPTs can demystify the "hidden rules" of academic English that are difficult for beginners—especially non-native English speakers. Today, Park considers other hidden rules that GPTs can clarify (e.g., rules for cover letters).

Coupled with feedback from former students, AI tools can provide educators with insight into their current students' worldview. These insights can inform the design of instructional approaches to help students leverage GPTs to navigate hidden academic structures.

Epistemology

Epistemological inclusion involves the ways of knowing, thinking, and doing in academic disciplines, such as what counts as evidence or what methods are appropriate for scholarly study. Scientific fields tend to rely on precise measurements, mathematical models, and observable phenomena. Historians, however, draw from oral histories, documents, artifacts, and other cultural products to analyze and interpret past events and subjective human experiences. Inclusion requires educators to model and reveal what a disciplinary expert does or thinks about so learners can view a situation or phenomenon from the expert's perspective.

How to Leverage AI for Epistemological Inclusion

Following are a few ways that educators can use AI to help students understand what experts know and how they think.

  • Model effective AI use in your disciplines. What tools are you using and for what purposes (in English, environmental science, or civil engineering, for example)? If you use AI to be more effective in your domain, reveal these practices to your students. They are in your class to learn disciplinary skills and ways of knowing from you.
  • Provide (or co-construct with your students) prompts for generative AI or generative AI searches in the same way that you might provide examples of search engine queries or library database searches for academic research (a small example follows this list).
  • Demonstrate and teach expert practices in information verification, analytical questioning, and academic integrity.
  • Consider how course and program-level outcomes might require revisions to account for discipline-specific AI applications.
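
One low-effort way to act on the prompt-sharing item above is to treat prompts the way librarians treat example database queries: curate them, annotate them, and invite students to extend them. The snippet below is a small, hypothetical prompt bank; the disciplines, wording, and placeholders are illustrative assumptions rather than a prescribed set.

```python
# Minimal sketch: a shared "prompt bank" that instructors and students co-construct,
# analogous to example library database queries. Every entry here is illustrative.
PROMPT_BANK = {
    "history": (
        "Act as an archivist. Given this primary-source excerpt: {excerpt}\n"
        "List the claims it can support, the claims it cannot support, and the "
        "corroborating sources a historian would look for."
    ),
    "environmental_science": (
        "Summarize the measurement methods in this field report: {excerpt}\n"
        "Flag any results reported without units or uncertainty."
    ),
    "civil_engineering": (
        "Review this design memo: {excerpt}\n"
        "Identify assumptions that would need verification against the relevant code or standard."
    ),
}

def build_prompt(discipline: str, excerpt: str) -> str:
    """Fill a discipline-specific template so students start from a vetted prompt."""
    return PROMPT_BANK[discipline].format(excerpt=excerpt)

print(build_prompt("history", "Letter from a railroad surveyor, 1868..."))
```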

Questions for Further Consideration

The questions below can help educators deepen their understanding of how AI can be used for epistemological inclusion.

  • How can AI be leveraged to reveal foundational, implicit ways of thinking and doing in disciplines?
  • What applications of generative searches, generative AI, and AI models are acceptable in your discipline and under what conditions? Who decides, and how do they decide?
  • What guidelines or recommendations do authoritative disciplinary bodies, such as the Modern Language Association, offer concerning generative AI text, citations, and other practices?
  • In addition to discipline-specific AI literacy, what skills will be critical to participate successfully in disciplinary practices? How should educators teach students to prompt AI, verify generative AI outputs, deal with ambiguity, and engage critically and ethically?

Example in Practice

As part of a 2022 seminar class offered at UC San Diego, Professor Jon Shurin incorporated iNaturalist, an online species identification tool with AI support, to help students identify and observe species in Costa Rica, an area celebrated for its biodiversity.Footnote8

Species identification is a fundamental skill that is often required for ecological research and monitoring. Species observations are used to monitor population trends, evaluate conservation actions, assess the health of an ecosystem, and more. iNaturalist allows users to upload photos of plants and animals and get suggested species names. Because iNaturalist data is public, experts can view and weigh in on species identifications. This tool provides aspiring early career ecologists and citizen scientists access to the ways that biologists know, think, and do, supporting learners in their disciplinary study.
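
For instructors who want students to explore that public record programmatically, a small query against iNaturalist's public REST API can surface research-grade observations for a species of interest. The sketch below is a hedged illustration: it relies on the documented api.inaturalist.org/v1/observations endpoint and common parameter names as we understand them, and the species shown is simply an example, so verify the fields against the current API documentation before classroom use.

```python
# Minimal sketch: pull a few research-grade iNaturalist observations for a species
# so students can see how public identifications accumulate. Parameter and field
# names follow the public API docs as we understand them; verify before class use.
import requests

def recent_observations(taxon_name: str, count: int = 5) -> list[dict]:
    """Return a few research-grade observations for the named taxon."""
    response = requests.get(
        "https://api.inaturalist.org/v1/observations",
        params={
            "taxon_name": taxon_name,      # e.g., a species students photographed
            "quality_grade": "research",   # identifications confirmed by the community
            "per_page": count,
            "order_by": "observed_on",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]

# Example species chosen for illustration (a butterfly commonly observed in Costa Rica).
for obs in recent_observations("Morpho peleides"):
    taxon = obs.get("taxon") or {}
    print(obs.get("observed_on"), taxon.get("name"), "-", obs.get("place_guess"))
```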

During the five-week study-abroad course, Shurin's students identified 746 species. Identifying that many species would not have been possible without the help of AI.

Leadership Imperative

To confidently integrate emerging technologies such as generative AI into their practices, educators need leaders to promote conditions for success through infrastructure and partnerships. Frameworks for institutional AI strategies and policy development offer structured starting points for this work.Footnote9 Some infrastructural elements include adequate access to AI tools and effective coordination between the various teams and experts involved in innovation. This coordination helps avoid duplication of services and tools and promotes a flexible approach that prioritizes the best options for institutional stakeholders. Leaders should also know how to use these tools so they understand the cascading effects of their decisions. For example, the president and chief information officer at San Diego State University issued a call to action and a moral imperative for the educational community to prevent an AI divide. Acknowledging the urgency of AI diffusion is the first step in preparing students, educators, and stakeholders for a future that is already here.Footnote10 Once the inevitability of AI is embraced, educators can leverage AI tools and literacies to advance inclusive educational practices.

Questions for Further Consideration

Navigating the integration of AI in higher education institutions is a complex endeavor. While leadership plays a crucial role, executing on that role can be challenging. Though the ideal approach may be unclear, exploring the following essential questions may provide some guidance.

  • Who is involved in the decision-making process for tool adoption and use? How does AI complicate the adoption of existing academic technology tools? Whose perspectives should be included (if they are not already)?
  • How can emerging technologies, such as generative AI, be leveraged to evaluate existing systems and practices and the resources required to make improvements? Bold yet ethical and responsible approaches are essential for successful integration.
  • How is your institution addressing the myriad instances in which AI features are plugged into tools that have already been adopted, some of which may not have the option to be turned off?
  • How can AI-related learning outcomes be incorporated into course curricula?
  • Who is likely to benefit the most from AI tool adoption or the lack thereof? Who is likely to be disadvantaged?
  • How can you partner with other institutions to share resources and findings? Many institutions are working simultaneously on systematic tool review, policy development, documentation, upskilling, etc.

Conclusion

The "adopt or be left behind" paradigm may be a fallacy. However, by taking a few practical steps to expand their awareness and understanding of AI technologies and tools, educators and leaders will be better positioned to leverage AI to advance inclusive excellence in higher education.

Notes

  1. Sean Burns and Nicole Muscanell, "EDUCAUSE QuickPoll Results: A Growing Need for Generative AI Strategy," EDUCAUSE Review, April 15, 2024.
  2. Amanda Heidt, "'Without These Tools, I'd Be Lost': How Generative AI Aids in Accessibility," Nature, April 28, 2024; Lorna Gonzalez and Kristi O'Neil, "A Taxonomy of Inclusive Design: On Disclosure, Accessibility, and Inclusion," EDUCAUSE Review, November 15, 2019.
  3. Megan Eberhardt-Alstot, "Small Teaching: Transparent Content," Carefully Curated (blog), CSU Channel Islands Teaching & Learning Innovations Knowledge Base, April 23, 2024.
  4. Heidt, "'Without These Tools, I'd Be Lost'"; "AI & Accessibility," Center for Teaching Innovation, Cornell University, accessed June 5, 2024.
  5. "Bring Your Whole Self to Berkeley," Berkeley People & Culture, UC Berkeley (website), accessed June 22, 2024; Jackson Bartlett, "Navigating Social Identity in the Classroom," Center for the Advancement of Teaching Excellence, University of Illinois Chicago, August 8, 2022.
  6. Peter Burke, "Identity," in The Cambridge Handbook of Social Theory, ed. Peter Kivisto (Cambridge: Cambridge University Press, 2020), 63.
  7. Nancy Park, interview by Megan Eberhardt-Alstot, Zoom video recording, May 31, 2024.
  8. Mario Aguilera, "Artificial Intelligence Drives New Frontiers in Biology," news release, UC San Diego School of Biological Sciences, March 3, 2023.
  9. Angela Gunder, OLC Framework for the Comprehensive Design, Equitable Implementation, and Continuous Improvement of AI Strategy, Online Learning Consortium, March 2024; Mastering AI Policies: A Framework for Institutional Alignment, Anthology, November 23, 2023.
  10. Adela de la Torre and James Frazee, "Bridging the AI Divide: A Call to Action," Inside Higher Ed, April 4, 2024; Charles Hodges and Ceren Ocak, "Integrating Generative AI into Higher Education: Considerations," EDUCAUSE Review, August 30, 2023.

Lorna Gonzalez is Assistant Vice President for Digital Learning at California State University Channel Islands.

Kristi O'Neil-Gonzalez is Instructional Technologist, Accessibility Lead, at California State University Channel Islands.

Megan Eberhardt-Alstot is Director of the Learning Resources Center at California State University Channel Islands.

Michael McGarry is Academic Technology Lead at California State University Channel Islands.

Georgia Van Tyne is a Learning Designer at California State University Channel Islands.

© 2024 Lorna Gonzalez, Kristi O'Neil-Gonzalez, Megan Eberhardt-Alstot, Michael McGarry, and Georgia Van Tyne. The content of this work is licensed under a Creative Commons BY 4.0 International License.