Artificial Intelligence and Libraries

This content is drawn from a report authored by the AU Library's Artificial Intelligence Exploratory Working Group. You can read the group's full report, which covers the current state of AI and makes recommendations to library leadership, in the American University Research Archive.

Current State within Higher Education

Since the launch of ChatGPT in late 2022, concerns about generative AI tools and their impacts on higher education have intensified. Generative AI tools have clear implications for all types of academic, administrative, and technical work, and this breeds unease and apprehension for many people. Top concerns appearing in publications like Inside Higher Ed focus on academic integrity, skills development, and the future of work and learning. Media stories abound of students passing off a chatbot's writing as their own or citing hallucinated sources that do not exist. Adding to the confusion, many tech companies have moved quickly since ChatGPT's launch to offer AI detection software and AI tools that can be integrated into their products.

The prevailing attitude from leaders in education seems to be that faculty and staff of higher education institutions will need to adapt to the realities of what this tool can do and what that might mean for education. Creativity will be needed as higher education institutions rethink what the possibilities might be for students to learn using AI tools, while teaching students how to use these tools ethically and responsibly.

As an example, the EDUCAUSE Horizon Report recommends that the best strategy for addressing AI is to teach it, not prohibit it (Pelletier et al., 2023). Experts predict that current and future college students will likely be expected to be proficient in AI skills before entering the workforce. In an interview with Karim Lakhani, a professor at Harvard Business School, Harvard Business Review editor in chief Adi Ignatius said he believes AI skills will soon become "table stakes" for getting hired (HBR, 2023). Lakhani, for his part, stated that "AI is not going to replace humans, but humans with AI are going to replace humans without AI" (HBR, 2023). The EDUCAUSE Report authors stress the importance of preparing college students for these realities, as well as the responsibility institutions have to teach students the limitations and ethical implications of using generative AI tools (Robert & Muscanell, 2023). Seah suggests that the goal of higher education institutions should be to create literate individuals who can be critical consumers of information and competent contributors to shared knowledge by teaching students how to think critically about this technology (Seah, 2023).

We do not yet have a complete picture of how students currently use generative AI tools, whether for coursework, part-time jobs, or their lives outside school and work. However, a few studies have begun looking into student usage and the situations in which students use AI. A survey of 389 American and Australian students indicated that fewer than 25% of students reported using generative AI for coursework (Smolansky et al., 2023). This study did not ask students to clarify how they used these tools, or for what types of tasks.

To add to the literature, two Stanford researchers, Denise Pope and Victor Lee, have been conducting research on high school students for 15 years, gathering survey and focus group data about different aspects of students' academic lives. They have found consistently high rates of cheating in self-reported data: between 60% and 70% of students in their surveys have reported engaging in at least one cheating behavior during the previous month. That percentage stayed the same in their 2023 surveys, which included questions specific to ChatGPT and other new generative AI tools (Spector, 2023). These data suggest that access to AI tools is not yet increasing the frequency of cheating. The researchers propose that students may simply not yet be familiar enough with the technology to exploit it successfully, and that this number may change in future years (Spector, 2023).

The Stanford surveys also asked students how an AI chatbot like ChatGPT should be allowed to be used for school-related tasks. Most students responded that it should be acceptable to use a chatbot to get started on a school-related task, such as summarizing difficult concepts, studying, and generating ideas for a paper. Most students also reported that using a chatbot to write a whole paper should never be allowed, and said they want to learn how to use AI tools for the same kinds of support a parent or tutor would offer (Spector, 2023).

As cheating is generally a symptom of deeper problems with the learning process, the researchers suggest that introducing strategies that work to prevent and reduce general academic integrity violations can work to reduce AI-related academic integrity violations. “Strategies to help students feel more engaged and valued are likely to be more effective than taking a hard line on AI, especially since we know AI is here to stay and can actually be a great tool to promote deeper engagement with learning” (Spector, 2023).

More information and research are needed on the current state of student usage of and opinions about generative AI, but from the information at hand, higher education institutions appear poised at an opportune moment to begin educating students on using generative AI tools authentically, critically, and ethically. Most students do not yet use these tools for coursework, they currently believe these tools should not fully replace their academic output, and there is not yet a high prevalence of using AI to cheat. Higher education institutions therefore have a considerable opportunity to step in and introduce high-quality educational resources on using AI responsibly and strategically (Spector, 2023).