Data Shows AI “Disconnect” in Higher Ed Workforce

New data shows that while 94 percent of higher education workers use AI tools, only 54 percent are aware of their institution’s AI use policies and guidelines. And even when colleges and universities have transparent policies in place, only about half of employees feel confident about using AI tools for work.

“[That disconnect] could have implications for things like data privacy and security and other data governance issues that protect the institution and [its] data users,” Jenay Robert, senior researcher at Educause and author of “The Impact of AI on Work in Higher Education,” said in a recorded video message about the report. Educause published the findings Monday in partnership with the National Association of College and University Business Officers, the College and University Professional Association for Human Resources and the Association for Institutional Research.

In the fall, roughly three years after generative artificial intelligence tools went mainstream and some higher education institutions began partnering with tech companies, researchers surveyed 1,960 staff, administrators and faculty across more than 1,800 public and private institutions about AI’s relationship to their work. Ninety-two percent of respondents said their institution has a work-related AI strategy—which includes piloting AI tools, evaluating both opportunities and risks and encouraging use of AI tools. And while the vast majority of respondents (89 percent) said they aren’t required to use AI tools for work, 86 percent said they want to or will continue to use AI tools in the future.

But the report also reveals concerns about AI’s integration into the campus workplace, and shows that not every worker is on the same page regarding which tools to implement and how.

For example, 56 percent of respondents reported using AI tools that are not provided by their institutions for work-related tasks. Additionally, 38 percent of executive leaders, 43 percent of managers and directors, 35 percent of technology professionals and 30 percent of cybersecurity and privacy professionals reported that they are not aware of policies designed to guide their work-related use of AI tools.

“Given that institutional leaders and IT professionals are the two groups of stakeholders most likely to have decision-making authority for work-related AI policies/guidelines, the data suggest that many institutions may simply lack formal policies/guidelines, rather than indicating insufficient communication about policies,” Robert wrote in an email to Inside Higher Ed.

And even if they are aware of AI use policies, most workers still don’t know whether to fear or embrace AI.

The majority of respondents (81 percent) expressed at least some enthusiasm about AI, with 33 percent reporting that they were “very enthusiastic/enthusiastic” and 48 percent reporting a mix of “caution and enthusiasm.” Meanwhile, 17 percent said they were “very cautious/cautious” about it.

The survey yielded a similar breakdown of responses to questions about impressions of institutional leaders’ attitudes toward AI: 38 percent said they thought their leaders were “very enthusiastic/enthusiastic,” 15 percent said their leaders were “very cautious/cautious,” and 36 percent said their leaders expressed a mix of “caution and enthusiasm.”

But Kevin McClure, chair of the department of educational leadership at the University of North Carolina at Wilmington, told Inside Higher Ed that the enthusiasm for AI reflected in the survey may be skewed. That’s because only 12 percent of the survey’s respondents were faculty, whereas the rest held staff, management or executive roles.

“This survey was also sent to institutional researchers and people affiliated with human resources,” he said. “Those people are working in the realm of technology, processing forms, paperwork, data analysis and filing reports.”

And the framing of the report’s questions about workers’ levels of caution and enthusiasm may have contributed to the elevated excitement about AI captured in the report, McClure added.

So many people said they share a mix of caution and enthusiasm “because that was one of the choices,” he said. “To me, it reads like people are feeling it out—they can see the use cases for AI but also have concerns. That gets washed out by combining it with enthusiasm.”

Risks and Rewards

Nonetheless, that mix of caution and enthusiasm stems from the risks and benefits higher education workers associate with AI.

Sixty-seven percent of respondents identified six or more “urgent” AI-related risks, including an increase in misinformation, the use of data without consent, loss of fundamental skills requiring independent thought, student AI use outpacing faculty and staff AI skills, and job loss. Some of those concerns align with the findings of Inside Higher Ed’s own surveys of provosts and chief technology officers, which found that the majority of both groups believe AI is a moderate or serious risk to academic integrity.

“Almost more important than the specific risks that people are pointing out is the number of risks that people are pointing out,” Robert, the report’s author, said. “This really validates the feeling that we’re all having about AI when it comes to this feeling of overwhelm that there really are a lot of things to pay attention to.”

At the same time, 67 percent of respondents to the Educause survey identified five or more AI-related opportunities as “most promising,” including automating repetitive processes, offloading administrative burdens and mundane tasks, and analyzing large datasets.

“A lot of people want tools that will simplify the [administrative burden] of higher ed. Not a lot of that is going to save a ton of time or money. It’s just going to be less of an annoyance for the average worker,” McClure said. “That suggests that people aren’t looking for something that’s going to transform the workplace; they just want some assistance with the more annoying tasks.”

And according to the report, most colleges don’t know how efficient those tools are: Just 13 percent of respondents said their institution is measuring the return on investment (ROI) for work-related AI tools.

“Measuring the ROI of specific technologies is challenging, and this is likely one of the biggest reasons we see this gap between adoption and measurement,” Robert said. “As higher education technology leaders consider longer-term investments, ROI is becoming a more pressing issue.”
