The use of generative AI tools on campus is an excellent opportunity for technology and other leaders to provide guidance to students, faculty, and staff about how to navigate these new technological waters.
In April 2023, we were involved in a panel with students at College Unbound. The conversation—"Generative AI and Higher Education: Disruption, Opportunities, and Challenges"—offered many different highlights, and the students brought rich thoughts, provocative considerations, and smart ideas, reinforcing the fact that discussions around what to do about generative AI (or about anything else, for that matter) are enhanced when students are involved.
Toward the end of the panel conversation, Stan asked the students what they thought could be done to help faculty, students, and staff navigate the rise of AI. Essentially, he was curious to hear about the roles that technology and other leaders could fulfill. After thinking about their answers and engaging in further reflection, we came up with ten suggestions for how to step up and into the generative AI discussion in higher education.
1. Offer Short Primers on Generative AI
Short primers on the topic of generative AI would ideally include both text and video and could be harvested from what is already available on different websites and video platforms (e.g., YouTube). LinkedIn Learning would be another useful place to find content if your institution has access. The goal here is to provide short but clear content, ideally focused on the following questions:
- What is generative AI?
- Why does generative AI feel new and/or different (in general and in higher education, specifically)?
- Who are the current providers of generative AI tools?
- How should these tools be used?
- What is the role of prompting in better understanding the usefulness and limitations of generative AI, and what are some strategies for using prompting?
Pro Tip: Use ChatGPT or other generative AI tools, and record what you do. This recording can serve as the basis for your own videos as an example of how you've integrated generative AI.
2. Explain How to Get Started
Providing clear guidance for faculty about how to get started using generative AI and what to try first can be helpful. Maybe the guidance is a "top 10 things to try" list, with each suggestion becoming progressively more dynamic or complex. This is your opportunity to help faculty and staff see the power of generative AI in action. Consider providing prompts that can help them understand how these tools would be useful for them individually, in relation to their roles at the institution, and in their work with students.
3. Suggest Best Practices for Engaging with Generative AI
The next step is to target those who may already be familiar with generative AI tools but need help improving their skills. Guidance here might include a "Prompt Cookbook" with up/down voting or feedback from users who have found these tools useful or who want to add recommendations.[1] These best practices might also identify limitations of the tools, along with workarounds. Linking to resources for the proper citation of generative AI tools would also be helpful.[2]
4. Give Recommendations for Different Groups
It might be tricky to identify and appropriately represent the different groups, but the goal of this step is to highlight the several kinds of people who are navigating generative AI on campus and provide each with some insight. For instance, many people are skeptical about these tools (and given what we know about AI, these skeptics will be important voices to include), and many others are excited about them. Still others don't yet know how they feel about generative AI. A resource can provide a mixture of considerations for each type of person. For the skeptic, a framework could validate that skepticism while also noting the importance of learning more about the generative AI tools that many industries are already using or will be using soon. For the tech evangelist, a framework could provide a healthy dose of skepticism and concern about these tools, particularly around biases, the environmental impact, and the human exploitation involved in the creation and maintenance of generative AI. Finally, for the person who hasn't formed an opinion, the framework could provide an anchor with which to begin exploring and testing generative AI tools with a bit of clarity and confidence.
The overall goal isn't to persuade people to feel better about whatever their stance is but, rather, to help all of them leverage a more balanced and informed perspective.
5. Recommend Tools
Every day, it seems, a dozen new AI tools are promised, previewed, or released. Many on campus are a bit lost, and this could be a great opportunity for technology and other teams to recommend some of the best AI tools to use. These teams could also suggest relevant considerations: What do the generative AI tools cost? What do we know about data privacy and these tools? What should and shouldn't the tools be used for? Is there an institutional account or discount for any of these tools? New options include anticipated announcements and beta deployments from major productivity suite producers such as Microsoft and Google. Generative AI search is now available on both platforms, and productivity suite enhancements are being added every day.
6. Explain the Closed vs. Open-Source Divide
As open-source versions gain steam, tracing out the differences and relative value among them can be invaluable for users on campus. The open-source models are placing capable tools in the hands of hobbyists and professional organizations alike. This will lead to a proliferation of finely tuned AI models contextualized to specific institutions. Colleges and universities will increasingly begin to host their own large language models, which will provide options for localized training and use with less risk and more utility, thanks to potential personas and access to organizational data.
Currently, most of these open-source models come with a performance penalty and do not measure up to the corporate models. However, who knows how long this advantage will remain intact? Engineers at Google, for example, are famously rumored to have stated that they have no "secret sauce" and no real competitive moat separating them from other providers and open-source models.
Priming people on your campus about the differences between the models could help them be ready to embrace (or at least understand) the importance of whichever model your institution may ultimately choose. This could also help institutional leaders navigate issues such as how to allow access to the large language models for training purposes, how to decide who should maintain custody and update the derivative datasets, and how to determine what should and shouldn't be shared in the interest of academic openness and freedom of expression.
7. Avoid Pitfalls
While the use of generative AI tools is skyrocketing, this is often being done without organizational contracts or formal risk analysis. Institutional technology and other leaders need to understand the risks and opportunities associated with new AI technologies. Offices of general counsel will want to review the term sheets for the "free to use" options and to negotiate favorable terms for the for-fee and site licenses their institutions commit to. CIOs and CISOs should weigh in on what is acceptable use for their institutions. Clear written guidance needs to be quickly developed and socialized. When introducing these technologies to members of your campus community, be sure to advise them to steer clear of placing sensitive or restricted data into the tools and to limit identifiable information.
8. Conduct Workshops and Events
Holding workshops, drop-in sessions, and campus events can be another means of bringing people together to help them understand the complexity of generative AI. These trainings might be more effectively targeted toward different populations (e.g., faculty, staff, students) and different levels of users (e.g., novice, intermediate, and expert). Additionally, this could be a great way to recruit people for working groups or communities of practice as the generative AI landscape continues to change.
9. Spot the Fake
Providing tips about how to avoid fake or lower-quality tools could also be extremely helpful. Many quickly emerging AI platforms offer very little privacy information or even pricing (sometimes being framed as in "beta" mode) but will ask people for access to private data (e.g., personal information, email). Where and when should granting that access even be considered? Additionally, with all the hype currently surrounding AI, many of these tools and platforms will not pan out as viable in the long run. Experimentation is encouraged, but institutional technology and other leaders should engage with their eyes wide open and take measured risks when investing funds in capability exploration. Think back to the early days of the internet and the numerous websites that went the way of the dinosaur in the dot-com bust.
10. Provide Proper Guidance on the Limitations of AI Detectors
Many people and companies are likely getting rich right now by creating tools that detect student work created with generative AI. The reality is that these tools are not going to work in any significant or long-term way; too many other tools can trick AI-detection tools. There is also increasing evidence that AI plagiarism detection not only creates false positives but does so disproportionately for students for whom English is not their primary language.[3] A primer that explains the underlying problems and realistic abilities of AI-detection tools could save colleges and universities a lot of money.
The institutional IT organization is often the hub of digital safety, security, and privacy considerations, and all of those manifest in the use of generative AI tools. That makes this an excellent opportunity for technology and other leaders to guide students, faculty, and staff through these new technological waters. If this can be done with good resources, accessible language, and thoughtful support, these leaders can provide a life raft to many.
[1] For example, see the "Prompt Guide" provided by Lance Eaton to attendees at the session "Ready or Not, Here AI Come: Exploring the Role of Generative AI in Higher Education," Massachusetts Colleges Online Conference on eLearning, June 15, 2023.
[2] See, for example, "How Do I Cite Generative AI in MLA Style?" MLA Style Center (website), accessed September 20, 2023; Timothy McAdoo, "How to Cite ChatGPT," APA Style Blog, April 7, 2023; and "Citation, Documentation of Sources," Chicago Manual of Style Online (website), accessed September 20, 2023.
[3] Tara García Mathewson, "AI Detection Tools Falsely Accuse International Students of Cheating," The Markup, August 14, 2023.
Lance Eaton is Director of Digital Pedagogy at College Unbound in Providence, Rhode Island.
Stan Waddell is Vice President and CIO of Carnegie Mellon University.
© 2023 Lance Eaton and Stan Waddell. The text of this work is licensed under a Creative Commons BY 4.0 International License.