Since the launch of ChatGPT in 2022, higher education as a sector has grappled with the role large language models and generative artificial intelligence tools can and should play in students’ lives.
A recent survey by Inside Higher Ed and Generation Lab found that nearly all college students say they know how and when to use AI for their coursework, which they attribute largely to faculty instruction or syllabus language.
Eighty-seven percent of respondents said they know when to use AI; the share saying they don't shrank from 31 percent in spring 2024 to 13 percent in August 2025.
The greatest share of respondents (41 percent) said they know when to use AI because their professors include statements in their syllabi explaining appropriate and inappropriate AI use. An additional 35 percent said they know because their instructors have addressed it in class.
“It’s good news that students feel like they understand the basic ground rules for when AI is appropriate,” said Dylan Ruediger, principal for the research enterprise at Ithaka S+R. “It suggests that there are some real benefits to having faculty be the primary point of contact for information about what practices around AI should look like.”
The data points to a trend in higher education away from top-down AI policymaking and toward a more decentralized approach that trusts faculty as the experts in their disciplines.
“I think that faculty should have wide latitudes to teach their courses how they see fit. Trusting them to understand what’s pedagogically appropriate for their ways of teaching and within their discipline” is a smart place to start, Ruediger said.
The challenge becomes how to create campuswide priorities for workforce development that ensure all students, regardless of major, can engage with AI as a career tool and understand academic integrity expectations.
Student Perspectives
While the survey points to institutional efforts to integrate AI into the curriculum, some students remain unaware or unsure of when they can use AI tools. Only 17 percent of students said they are aware of appropriate AI use cases because their institution has published a policy on the subject, whereas 25 percent said they know when to use AI because they’ve researched the topic themselves.
Ruediger hypothesizes that some students learn about AI tools and their uses from peers in addition to their own research.
Some demographic groups were less likely than others to be aware of appropriate AI use on campus, signaling disparities in who's receiving this information. Nearly one-quarter of adult learners (aged 25 or older) said they don't know how or when to use AI for coursework, compared to 10 percent of their traditional-aged peers. Similarly, two-year college students were more likely to say they don't know when to use AI (20 percent) than their four-year peers (10 percent).
Students working full-time (19 percent) or those who had dropped out for a semester (20 percent) were also more likely to say they don’t know when to use AI.
While decentralizing AI policies and giving faculty members autonomy can better serve academic freedom and discipline-specific uses of AI, clearly outlined and widely available policies also benefit students.
“There is a scenario here where [AI] rules are left somewhat informal and inconsistent that ends up giving an advantage to students who have more cultural capital or are better positioned to understand hidden curricular issues,” Ruediger said.
In a survey of provosts and chief academic officers this fall, Inside Higher Ed found that one in five provosts said their institution is taking an intentionally hands-off approach to regulating AI use, with no formal governance or policies about AI. Fourteen percent of respondents indicated their institution has established a comprehensive AI governance policy or institutional strategy, but the greatest share said they are still developing policies.
A handful of students also indicated they have no interest in ever using AI.
In 2024, 2 percent of Student Voice survey respondents (n=93) wrote in “other” responses to the question, “Do you have a clear sense of when, how or whether to use generative artificial intelligence to help with your coursework?” More than half of those responses—55—expressed distrust, disdain or disagreement with the use of generative AI. That view appears to be growing; this year, 3 percent of respondents (n=138) wrote free responses, and 113 comments opposed AI use in college for ethical or personal reasons.
“I hate AI we should never ever ever use it,” wrote one second-year student at a community college in Wyoming. “It’s terrible for the environment. People who use AI lack critical thinking skills and just use AI as a cop out.”
The Institutional Perspective
A separate survey fielded by Inside Higher Ed and Generation Lab found that more than half of student success administrators (55 percent) reported that their institution is “somewhat effective” at helping students understand how, when and whether to use generative AI tools in academic settings. (“Somewhat effective” is defined as “there being some structured efforts, but guidance is not consistent or comprehensive.”)
More than one-third (36 percent) reported their institution is not very effective—meaning they offer limited guidance and many students rely on informal or independent learning—and 2 percent said their institution is “very effective,” or that students receive clear guidance across multiple channels.
Ithaka S+R published its own study this spring, which found that the typical instructor had at least experimented with using AI in classroom activities. According to Inside Higher Ed's most recent survey of provosts, two-thirds of respondents said their institution offers professional development for faculty on AI or integrating AI into the curriculum.
Engaging Students in AI
Some colleges and universities have taken measures to ensure all students are aware of ethical AI use cases.
Indiana University created an online course, GenAI 101, for anyone with a campus login to earn a certificate denoting they’ve learned about practical applications for AI tools, ethical considerations of using those tools and how to fact-check content produced by AI.
This year the University of Mary Washington offered students a one-credit online summer course on how to use generative AI tools, which covered academic integrity, professional development applications and how to evaluate AI output.
The State University of New York system identified AI as a core competency to be included in all general education courses for undergraduates. All classes that fulfill the information literacy competency requirement will include a lesson on AI ethics and literacy starting fall 2026.
Touro University is requiring all faculty members to include an AI statement in their syllabi by next spring, Shlomo Argamon, associate provost for artificial intelligence, told Inside Higher Ed in a podcast episode. The university also maintains an official AI policy that serves as the default for faculty who do not set their own more or less restrictive policies.