Using Student Data to Bridge the AI Divide


Knowing your students is essential for bridging the AI divide and paving the way for a more inclusive and equitable future.

[Image: A bridge made of gray cogs; the center piece joining the two halves is red. Credit: melitas / Shutterstock.com © 2024]

Artificial Intelligence (AI) has become, at what feels like warp speed, an inescapable element in the dynamic landscape of higher education. Yet, as educators embrace the potential of AI to revolutionize teaching and learning, we must confront the stark reality of inequitable access and the ethical considerations surrounding its implementation. San Diego State University (SDSU), under the leadership of President Adela de la Torre, has undertaken a groundbreaking initiative to address these challenges. Through a comprehensive October 2023 survey of SDSU undergraduate and graduate students, the university shed light on the urgent need for data-driven strategies to ensure equitable access, foster responsible use, and empower faculty, staff, and students in navigating the AI landscape.[1]

Moving to Data-Driven Decision-Making

The SDSU survey revealed much about how students experience AI, including comments voicing concerns about a gap in access to AI tools and a broader digital divide.[2] Further, students who reported greater access to technological resources, as measured by the number of smart devices owned, were more inclined to feel at ease with AI (see table 1). This suggests a critical need for higher education institutions to forge partnerships with AI vendors to negotiate fair pricing models that prioritize accessibility for all students. To this end, SDSU has established the Equitable AI Alliance (EAIA) as a collaborative, multi-institutional platform and consortium advocating for accessible AI solutions and sharing best practices.[3]

Table 1. Device Ownership and AI Complexity
Percentage of students, by number of smart devices owned, who agreed with the statement "AI is too complex for me to grasp" (N = 7,025)

Number of Smart Devices Owned    Count of Students    Percentage Who Agreed
1 or fewer                       312                  38.1%
2                                3,231                30.2%
3                                2,725                26.5%
4 or more                        757                  19.7%
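The kind of within-group breakdown shown in Table 1 is straightforward to reproduce from raw survey records. The sketch below uses hypothetical record counts (not SDSU's raw data) and an illustrative helper name, `agreement_rate_by_group`, to show the computation for two of the device-ownership groups:

```python
# Hypothetical survey records: (device-ownership group, agreed with statement).
# Counts are illustrative only and do not come from SDSU's raw data.
records = (
    [("1 or fewer", True)] * 119 + [("1 or fewer", False)] * 193
    + [("2", True)] * 976 + [("2", False)] * 2255
)

def agreement_rate_by_group(rows):
    """Return the percentage of respondents in each group who agreed."""
    totals, agreed = {}, {}
    for group, did_agree in rows:
        totals[group] = totals.get(group, 0) + 1
        if did_agree:
            agreed[group] = agreed.get(group, 0) + 1
    return {g: round(100 * agreed.get(g, 0) / n, 1) for g, n in totals.items()}

print(agreement_rate_by_group(records))
# → {'1 or fewer': 38.1, '2': 30.2}
```

For real survey data loaded into a DataFrame, `pandas.crosstab(df["devices"], df["agreed"], normalize="index")` produces the same row-wise breakdown in a single call.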

Central to addressing these challenges is the adoption of a data-driven approach,[4] one that takes all stakeholders, students included, into account. The insights gleaned from the survey of students provide a valuable foundation for informed decision-making. By collecting data on students' perspectives, expectations, and usage patterns regarding AI tools, institutions can tailor their strategies to meet the needs of diverse student populations. Moreover, the longitudinal tracking of student opinions allows for the continuous refinement and adaptation of AI initiatives over time. In fact, 20.6% of the 2,749 respondents to an optional question suggested further use of surveys, questionnaires, and polls to continue collecting input from the campus community.

Promoting Responsible and Ethical Use

One of the key barriers to the adoption of AI among faculty is the perception of AI tools as potential liabilities, particularly in relation to plagiarism and academic misconduct. While the majority of students (71%) recognize the importance of AI in their future careers, many face barriers due to a palpable lack of encouragement from professors.

To overcome this challenge, institutions must provide educators with training on the beneficial and responsible uses of AI. Likewise, educators must provide students with clear guidelines and training on the ethical, safe, and effective uses of AI. Students are aware of the need for tool competence. The survey showed consistent references to students’ desire for AI mastery both to be successful students and to be prepared for a career market that requires such skills. Responses included the following:

  • “There’s a need for educational resources that demystify AI.”
  • “Teachers should be equipped to integrate AI into their teaching.”
  • “Training in AI should be part of our curriculum.”
  • “AI literacy should be a focus for future professionals.”
  • “AI will require us to continually adapt and learn new skills.”

When AI use is reframed as a skill rather than a threat, faculty and students can be empowered to leverage AI tools to enhance teaching, research, and learning, thereby gaining a competitive advantage in the digital age.

In addition to affordability and access concerns, ethical considerations surrounding AI implementation loom large. The use of AI tools that are not approved by an institution poses significant risks to both students and institutions. For example, campus community members may unknowingly feed protected data into free or otherwise unvetted AI tools, exposing potentially confidential or sensitive information. To safeguard privacy and mitigate other risks, institutions must establish clear guidelines on the ethical, transparent, and safe use of AI. These guidelines should outline a procurement and review process for acquiring AI tools, including protocols for vetting and approving standard AI tools, along with ongoing training and support to promote responsible AI practices.

Leveraging Students’ Insights for Faculty Development

The Academic Senate of the California State University (CSU), the nation's largest four-year public university system, passed a resolution in 2023 calling for, among other items, "professional development opportunities for faculty to learn about generative AI and its applications to ensure they are prepared to effectively integrate it into their teaching."[5] Further, various U.S. legislative and governance bodies have called for widespread AI literacy training. For example, H.R. 6791 (Artificial Intelligence Literacy Act of 2023) was introduced in the U.S. House of Representatives in December 2023. In light of this call to action for comprehensive AI literacy, institutions must develop and implement structured training initiatives that encompass both technical proficiencies and pedagogical strategies pertinent to AI. These programs should offer a diverse array of practical and theoretical approaches, equipping faculty with the requisite skills to engage with AI effectively while instilling a thorough understanding of the ethical considerations required to guide students in responsible AI utilization.

At SDSU, one of the 23 CSU campuses, the Information Technology Division and the Office of the CIO collaborated with AI Faculty and Student Fellows to embrace a shared governance approach when designing both the AI student survey and the Academic Applications of AI (AAAI) microcredential program. The AAAI training aims to equip faculty with the practical, ethical, and interdisciplinary know-how necessary to leverage AI technologies across various academic domains. Delivered through the university's learning management system, the AAAI program comprises modules covering practical skills and theoretical foundations, offering participants a pathway to earn a digital badge on completion of all required activities.

This model of faculty- and student-driven resource development, informed by insights from the AI student survey, exemplifies a commitment to ongoing learning and growth. By engaging students and faculty in the design, development, and facilitation of the microcredential, this inclusive approach not only created a sense of shared ownership but also galvanized institutional support for AI resources campus-wide. Furthermore, it facilitated the establishment of clear ethical guidelines governing AI use, fostering coherence in students' understanding of faculty members' expectations.

Currently, these guidelines are being considered by the SDSU University Senate, underscoring the impact of the AAAI program in shaping institutional policies and practices surrounding the use of AI in higher education. Importantly, these guidelines are just that: guidelines. By avoiding the impulse to issue firm directives or “policy” requirements, which tend to ossify in large organizations such as higher education institutions, the IT Committee of the SDSU University Senate left the door open for making easy updates when the AI ecosphere shifts again—as it surely will do, and soon. We also demonstrated a commitment to honoring academic freedom for our faculty stakeholders, increasing buy-in and follow-through on the front lines.

A Call to Action

As AI continues to reshape the landscape of higher education, the need for data-driven strategies has never been more pressing. By harnessing the power of students’ insights, institutions can quantify structural disadvantages—for example, how some students will have easy access to a range of sophisticated and specialized AI tools while others will struggle to capitalize on free and less-powerful versions of the same AI. Gathering student data keeps students at the core of the decision-making process. Knowing your students is essential for bridging the AI divide and paving the way for a more inclusive and equitable future. Let us seize this opportunity to empower faculty, staff, and students in embracing the transformative potential of AI in higher education.

SDSU extends an invitation to higher education leaders to join us in this critical endeavor by leveraging our AI student survey and sharing data on the evolution of the AI student experience. The SDSU AI Student Survey Instrument is freely available for community members to use or adapt. Together, we can chart a course toward a future in which AI serves as a catalyst for innovation, equity, and excellence in higher education and the workforce.

Notes

  1. The survey was sent to 37,728 undergraduate and graduate students, with a 20.7% (7,811) raw response rate. After data cleaning, our final sample size was 7,025. As far as we have found, this is the largest higher education student survey focusing exclusively on AI.
  2. For more details on the survey results, see David Goldberg, Elisa Sobo, James Frazee, and Sean Hauze, "Generative AI in Higher Education: Insights from a Campus-Wide Student Survey at a Large Public University," in Proceedings of the Society for Information Technology and Teacher Education (SITE) 2024 (Association for the Advancement of Computing in Education, 2024, in press).
  3. Adela de la Torre and James Frazee, "Bridging the AI Divide: A Call to Action," Inside Higher Ed, April 4, 2024.
  4. Veronica Diaz, "Exploring the Opportunities and Challenges with Generative AI," EDUCAUSE Review, February 6, 2024.
  5. EDUCAUSE, Council on Governmental Relations, and the Association of American Universities, Comments in response to FAR Case 2021-017, "Federal Acquisition Regulation: Cyber Threat and Incident Reporting and Information Sharing," February 2, 2024.

James P. Frazee is Interim Chief Information Officer and Vice President for the Information Technology Division at San Diego State University.

© 2024 James P. Frazee