In the Room Where It Happens: Generative AI Policy Creation in Higher Education


To develop a robust policy for generative artificial intelligence use in higher education, institutional leaders must first create "a room" where diverse perspectives are welcome and included in the process.


In December 2023, we explored what it would take to develop a robust and meaningful policy for generative artificial intelligence (GenAI) use in higher education.Footnote1 Since then, institutional AI policy development has fallen short of our expectations. We have found (through direct observation and discussions with others) that colleges and universities generally are not prioritizing a diverse range of stakeholders in the decision-making process. This article provides additional guidance and direction to those who may want to get involved in—and invite others to join—conversations about the role of GenAI at their institutions.

Building the "Room Where It Happens"

If your institution has not created a space to explore GenAI use cases, opportunities, and implications—the metaphorical "room where it happens"—our 2023 EDUCAUSE Review article on building a policy serves as a nice primer. Here are some additional considerations to help you—as a decision-maker or committee member leading AI policy development—set up a room at your institution.

Intersectionality

All people have multiple identities and viewpoints, many of which are not entirely evident or expressed in the workplace. However, those perspectives can provide valuable insights about the community members on your campus. Research shows that diverse teams deliver more creative and higher-quality work than homogeneous ones.Footnote2 Perspectives from IT security staff are just as important as those of staff members with disabilities or staff members who are non-native English speakers.

Each person on a diverse team offers specialized knowledge and a personal perspective that can inform how generative AI-powered tools will impact all members of the institutional community. Involving people with a wide range of expertise is crucial for exploring use cases and institutional needs. For example, AI facial recognition technology cannot currently identify darker-skinned individuals as accurately as it does lighter-skinned individuals. AI voice recognition may not accurately capture the speech of people with speech disorders or heavily accented pronunciation. These limitations can lead to inequities and undermine the usefulness of the technology.Footnote3 Although "future proofing" an AI policy may not be possible, a diverse and creative team can help limit and pre-empt the concerns and challenges that will arise with increasing generative AI use.

Belonging

Writing policies for technology adoption can be both exciting and scary, especially for people in marginalized groups whose voices may not have been considered during the software or hardware development stage.Footnote4 In higher education, we have had rich discussions about belonging for students, but we struggle when considering these concepts for staff and faculty. Mindfully constructing the AI policy room to include people who have been excluded from past conversations will foster a sense of ownership in the process. Such an approach can enhance participants' sense of belonging, "leading to richer and more effective policies."Footnote5

Bringing people together may not be enough to generate a truly inclusive environment, though.  Belonging is tied to one's sense of being valued personally and professionally. When a person's ideas are taken seriously and they are credited publicly for their contributions, they are more likely to develop a sense of belonging. Feeling psychologically safe leads to heightened engagement, increased knowledge sharing, and, ultimately, more creative solutions in the AI policy-development room.Footnote6 Informal dialog and interactions with committee members outside of prescribed meeting times and spaces can also result in increased feelings of inclusion. These types of interactions often allow people to present themselves more fully and authentically to colleagues.

Although a sense of belonging is often influenced by the culture of an institution, individuals who participate in AI policy discussions can serve as the change agents a campus needs to foster that sense of belonging. The rise of GenAI can build upon existing inclusion and belonging efforts or be the impetus for building a community where everyone is included from the beginning! This may sound idealistic, but if you are in the room, you can contribute to its norms and expectations.

Finding the Room (or Rooms)

Since the advent of ChatGPT and other GenAI tools, there has been increased discussion on campuses about how to respond to this increasingly ubiquitous technology and define its place in higher education. However, the 2025 EDUCAUSE AI Landscape Study revealed that fewer than 40 percent of the higher education institutions surveyed have AI acceptable use policies.Footnote7 And it is not clear how those policies are constructed or who gets to be part of the policy creation process.

Conversations about AI have been ongoing on some campuses since late 2022; at others, discussions began during the 2023–2024 academic year. Some colleges and universities are just beginning to talk about it. Our collective experience consulting and speaking at dozens of institutions supports the findings of the 2025 EDUCAUSE AI Landscape Study: many institutions have yet to begin formal discussion or policy development, even as of spring 2025.Footnote8

Moreover, multiple conversations may be happening simultaneously at an institution, and those conversations may not be connected. Individuals from various working groups may not even be aware of one another—something that happens at larger, decentralized campuses and smaller campuses alike. These conversations primarily occur at institutional intersections—particularly those involving technology. One prominent example is IT services. Those who provide these services are often focused on security, privacy, technology deployment, and vendor agreements. Consequently, these team members often determine or are deeply involved in institutional policy. Other places where campus AI policy discussions are happening include centers for teaching and learning, instructional design divisions, libraries, and accessibility services departments. Each of these areas has overlapping and distinct reasons for considering the role of GenAI in education and may be creating guidance, piloting tools, or conducting its own research.

Because GenAI raises many issues for campuses to address, institutional groups often struggle to fully grasp and respond to its implications. As an individual, you can be most impactful by identifying where conversations are happening and which aspects interest you most:

  • Students' use (in their academic programs, roles at the institution, or in general)
  • Faculty members' use (in their teaching, researching, or institutional and administrative duties)
  • Staff members' use
  • Administrators' use
  • Institutional AI tool procurement and implementation (including activating AI in already purchased tools)
  • Security and privacy considerations

Once you determine the type of AI work you'd like to advance, consider what your role should be within that scope. Here are some ideas to help you decide how best to participate and contribute your expertise and ideas. Each of the following actions may look different based on your preferred focus or angle.

  • Align: Contribute to building policies or practices that align with the larger goals of the group and the institution.
  • Advocate: Ensure that specific ideas, issues, or people are included in the conversation.
  • Communicate: Clarify the issues, implications, terminology, and considerations in a way that makes sense to those who are not "in the room."
  • Connect: Facilitate conversations across two or more campus groups.
  • Execute: Implement the policy and practices across the institution.
  • Process: Facilitate effective ways of developing and implementing policies and practices.
  • Question: Raise critical questions to surface assumptions and identify unintended consequences.

Although this list is not comprehensive, it should provide some ideas about what will best serve you and your institution.

Entering the Room

Although higher education presents itself as the gateway to upward mobility in a supposedly meritocratic society, many areas of academia are rigidly hierarchical. Consequently, power and politics may dictate who can enter the room where AI policies take shape, which means that not all people who should be included in conversations are included. Being aware of this reality is important.

Although getting into the room is easy for some people, others may struggle to get the door to budge.  Gaining access might require a range of tactics, including advocating for yourself to your supervisor; requesting formal, informal, and off-the-record conversations with institutional colleagues to learn more about the politics of the room; and determining what the "cost" of access would be (and whether it is worth it).

If you were invited to the room late in the process, stepping into that space can be tricky. Being brought in later than others may cause resentment or doubt. It can be easy to focus on that, but your goal should be to build a foundation for your contributions. Here are a few recommendations to prioritize:

  • Get up to speed—quickly. Request access to documentation and try to get briefed on the project by someone who has been involved since the beginning. What was the catalyst for the project? Who is already involved? Find out if there is an existing project charter or task outline that you can consult to ensure you are clear about the shared goals and deliverables.Footnote9
  • Check your ego at the door. Whether you're surprised that you weren't asked to participate from the get-go or disappointed that you weren't viewed as an asset to the conversation sooner, the temptation is to interpret a slow or missed invitation as a political decision or a reflection on your perceived value. However, that isn't necessarily the case. Take a moment to reflect on whether the "fundamental attribution error" is in effect.Footnote10 Although there may be bad actors in a situation, it's more likely that your skills were either not required earlier in the process or unknown to decision-makers. In many cases, people are simply overwhelmed.
  • Look forward. Although it's natural to wonder what would have happened if you had been brought on earlier, looking in the rearview mirror won't necessarily resolve your feelings. As difficult as situations like these can be, viewing them as learning opportunities can be productive. By being conscious of the circumstances surrounding occasions when you have been overlooked in the past, you can better advocate for yourself now and in the future.

Regardless of when you join the room, you can have a threefold impact: first, you are representing your department and colleagues (or a particular stakeholder demographic); second, you are innovating on behalf of your institution and bringing a fresh and dynamic perspective to the deliberation process; and third, you are laying the foundation for why you should be consulted early and often when future projects are being conceived. This is your opportunity to showcase the value of your perspective and skills—and to demonstrate why your involvement is essential to achieving a strong return on investment.

Making Room for Others

Once you have a seat at the table, aside from focusing on what you have to offer and how best to advocate for your constituency, you also have an opportunity to look for stakeholder gaps—such as a lack of representation from certain departments (e.g., Human Resources) or an imbalance in representation (e.g., too many or too few staff and faculty members compared to upper-level administrators and business-community members). Apart from unequal numbers, certain voices may simply be louder or heard more clearly than others. In each of these cases, you have an opportunity to share your knowledge and experience and advocate for marginalized individuals or groups. However, before you begin advocating for additional voices in the room, be sure to find out how the room was created and who was involved previously.

In our 2023 EDUCAUSE Review article, "Cross-Campus Approaches to Building a Generative AI Policy," we discussed strategies for identifying stakeholders, especially those who are traditionally marginalized. Before you attend your first GenAI policy meeting, think about who you believe should be involved in the decision-making process by considering the groups with which you routinely interact. The following are examples of groups that are frequently overlooked:

  • ADA/Accessibility departments. The ADA/Accessibility department should always be included in decision-making opportunities, yet they are often left out of or consulted far too late in the process. For example, in July 2023, the U.S. Equal Employment Opportunity Commission issued updated guidelines on how the Americans with Disabilities Act pertains to candidates and employees with visual disabilities to ensure AI tools are accessible. Ginger Christ, an editor for HR Dive, wrote, "The guidance specifies that employers need to provide reasonable accommodations for any decision-making tools using algorithms or AI, such as for hiring." She adds, "Companies also should share information about how the technology evaluates applicants or employees and provide instructions on how to seek an accommodation, according to the guidance."Footnote11
  • IT and Counseling departments. Perhaps your team has watched too many episodes of The IT Crowd and assumes the CIO and other members of the IT department will scold and not listen to them. If so, you have missed an opportunity. Invite members of your IT department to be on hand to answer questions and troubleshoot pie-in-the-sky ideas. Conversely, IT staff members should tap faculty who specialize in communication to help create messaging around a policy. In addition, members of the counseling services department can provide a list of common terms and phrases used by the community to ensure that an appropriate sentiment analysis is constructed.
  • Other opportunity groups. Campus groups, such as those that support minoritized students, may have a unique perspective on the role various AI tools and programs could play in learning environments. In a 2023 Forbes article, Jeff Raikes wrote, "If we can diversify both the researchers who are creating AI systems and the datasets these algorithms use to learn, we can help teach them better habits and ensure more equitable outcomes."Footnote12 Even if committee members aren't developing a large language model, ensuring that decision-making spaces reflect diversity will help limit the impact of bias and discrimination in AI tools. As a campus leader, you can help ensure representatives of all stakeholder groups are involved as early in the process as possible.

Task groups commonly suffer from a representation fallacy, where inviting a single individual from a stakeholder group is treated as equivalent to fully representing that group. In some cases, parity may be difficult to assess. An unbalanced group may also result from stakeholders being unable to attend meetings. For example, if meetings are held during hours when staff from student-facing offices can't leave their desks or during months when faculty members aren't required to be on campus, those stakeholders will be there in name only. Ensuring that space is made for all individuals to contribute fully and that there is balanced representation is paramount. Attaching a person's name to a meeting roster does not guarantee their voice will be heard.

If you notice a gap in stakeholder voices, several strategies can help you address that imbalance. You could recommend to the committee's leadership that specific people be invited. You could also address the committee to see if other members notice the absence of any stakeholders. Or you could persuade stakeholders who are not in the room to advocate for themselves, which affords them more agency in the process. The context and your experience will help you determine which option is appropriate.

Learning more about your fellow stakeholders' needs and concerns will help you to be a good steward of your institution. Listening closely to the input of others will enable you to gain a deeper, more nuanced understanding of the situation. As a result, you may even reevaluate your position on various issues surrounding GenAI use. Take advantage of the diverse group you helped to assemble to expand your knowledge toolkit.

Hitting Room Capacity

Expanding AI policy committees endlessly to include every stakeholder may not be feasible or effective. Because these committees often advance recommendations and policies for other decision-makers to approve, having all parties in the room may not be ideal or necessary. However, making sure all voices are heard is crucial. Even if the AI policy committee is full, there are other ways to include additional voices from across the institution.

Be an Advocate

If getting particular stakeholders into the room is not possible, you can still serve as their advocate and represent their voices. Meet regularly with them to share committee updates and get feedback to share with the room. For example, if the director of your ADA/Accessibility office can't attend committee meetings due to a standing scheduling conflict, task yourself with understanding and sharing the director's point of view on GenAI policy with the committee and the leadership team. Emphasize to other committee members that being a go-between is far less effective than having the director at the meetings and encourage inclusive scheduling so all committee members can participate.

Establish Working Groups

Forming working groups within the room is a great way to distribute leadership responsibilities and provide leadership opportunities. These groups can be organized according to topic or policy type or subdivided by school or program, as needed. Working groups can, for example, tackle sections of a policy or focus on how to apply the institutional policy to a particular department or program. Because they have narrower goals, working groups can manage scope creep more effectively. Committee members can then advocate for the specific concerns and perspectives of their working groups. These groups can also include stakeholders who are not committee members but still have valuable insights to add to the conversation.

Send a Survey or Host a Listening Session

If your committee is under time or resource constraints, drafting a survey or scheduling a listening session is another way to solicit input. Likert scales and open-ended questions provide ways to gather input from non-committee members and incorporate their feedback into policy decisions. Soliciting input in these ways also prevents making false assumptions about groups and allows for nuanced feedback. We recommend being intentional when asking others for their time and input: many people already experience survey fatigue, and AI fatigue is on the rise as well.Footnote13

Conclusion: The Room Where It Happens

Although the recommendations in this article were inspired by the need for inclusive policy development related to GenAI, they can and should be applied to any future disruptions in higher education. Finding the room, entering the room, and making room for others can become the new normal for building effective policy structures that leverage important considerations from key stakeholders. As with any type of advocacy work, evolving technologies and ways of thinking require higher education professionals (and institutions) to change and adapt in ways that best serve their students and communities. The most thoughtful way to do so is with policy committees that are diverse and represent as many perspectives as possible.

Let's continue this work by uplifting and supporting each other. Together, we can reimagine the room where AI policy work happens and create a better path for institutional governance.

Notes

  1. Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini, "Cross-Campus Approaches to Building a Generative AI Policy," EDUCAUSE Review, December 12, 2023. Jump back to footnote 1 in the text.
  2. Vivian Hunt, Lareina Yee, Sara Prince, and Sundiatu Dixon-Fyle, Delivering Through Diversity (McKinsey & Company, January 2018). Jump back to footnote 2 in the text.
  3. Joy Buolamwini and Timnit Gebru, "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification," Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81 (2018): 77–91; Eileen O'Grady, "Why AI Fairness Conversations Must Include Disabled People," Harvard Gazette, April 3, 2024. Jump back to footnote 3 in the text.
  4. For an excellent exploration of this topic, check out Joy Buolamwini, Unmasking AI: My Mission to Protect What Is Human in a World of Machines (Random House, 2023). Jump back to footnote 4 in the text.
  5. Gina Kennedy, "Policy Matters," Policy Matters Blog, The Association of College and University Policy Administrators, July 15, 2024. Jump back to footnote 5 in the text.
  6. Amy C. Edmondson and Zhike Lei, "Psychological Safety: The History, Renaissance, and Future of an Interpersonal Construct," Annual Review of Organizational Psychology and Organizational Behavior, March 21, 2014. Jump back to footnote 6 in the text.
  7. Jenay Robert and Mark McCormack, 2025 EDUCAUSE AI Landscape Study: Into the Digital AI Divide (EDUCAUSE, February 2025). Jump back to footnote 7 in the text.
  8. Ibid. Jump back to footnote 8 in the text.
  9. Anna Baluch, "What Is a Project Charter? Everything You Need to Know," Forbes Advisor, May 29, 2024; "Project Charters," Minnesota State Community and Technical College, January 18, 2022. Jump back to footnote 9 in the text.
  10. Patrick Healy, "Fundamental Attribution Error: What It Is & How to Avoid It," Business Insights Blog, Harvard Business School Online, June 8, 2017. Jump back to footnote 10 in the text.
  11. Ginger Christ, "EEOC: Employers Must Ensure AI Tools Are Accessible for Workers with Visual Disabilities," HR Dive, July 27, 2023. Jump back to footnote 11 in the text.
  12. Jeff Raikes, "AI Can Be Racist: Let's Make Sure It Works for Everyone," Forbes, April 21, 2023. Jump back to footnote 12 in the text.
  13. Jack Davies, "Sending Too Many Surveys? How to Avoid Survey Fatigue," Qualtrics (blog), June 25, 2019; Sherzod Odilov, "Here's How Leaders Can Manage AI Fatigue," Forbes, February 14, 2024. Jump back to footnote 13 in the text.

Esther Brandon is the Associate Director for Teaching and Learning Technologies at Harvard Medical School.

Lance Eaton is an educator, writer, and public speaker. He was recently appointed Senior Associate Director of AI in Teaching and Learning at Northeastern University.

Dana Gavin is the Director of the Writing Center at Dutchess Community College.

Allison Papini is Assistant Director / Manager of Research and Instruction Services at Bryant University.

© 2025 Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini. The content of this work is licensed under a Creative Commons BY-NC-SA 4.0 International License.