
  • How AI Can Smooth College Credit Transfer


    Upward transfer is viewed as a mechanism to provide college students with an accessible and affordable on-ramp to higher education through two-year colleges, but breakdowns in the credit-transfer process can hinder a student’s progress toward their degree.

    A recent survey by Sova and the Beyond Transfer Policy Advisory Board found that college students, on average, lose credits when transferring between institutions and have to repeat courses they’ve already completed. Some students stop out of higher education altogether because transfer is too challenging.

    CourseWise is a new tool that seeks to mitigate some of these challenges by deploying AI to identify and predict transfer equivalencies using existing articulation agreements between institutions. So far, the tool, part of the AI Transfer and Articulation Infrastructure Network, has been adopted at over 120 colleges and universities, helping to provide a centralized database for credit-transfer processes and automate course matching.

    In the most recent episode of Voices of Student Success, host Ashley Mowreader speaks with Zachary Pardos, an associate professor at the University of California, Berkeley, about how CourseWise works, the human elements of credit transfer and the need for reliable data in transfer.

    An edited version of the podcast appears below.

    Q: As someone who’s been in the education-technology space for some time, can you talk about this boom of ed-tech applications for AI? It seems like it popped up overnight, but you and your colleagues are a testament to the fact that it’s been around for decades.

    Zach Pardos, associate professor at UC Berkeley and the developer of CourseWise

    A: As soon as a chat interface to AI became popularized, feasible, plausible and useful, it opened up the space to a lot of people, including those who don’t necessarily have a computer science background. So in a way, it’s great. You get a lot more accessibility to this kind of application and work. But there have also been precepts—things that the field has learned, things that people have learned who’ve been working in this space for a while—and you don’t want to have to repeat all those same errors. And in many ways, even though the current generation of AI is different in character, a lot of those same precepts and missteps still apply here.

    Q: What is your tool CourseWise and why is it necessary in the ed-tech space?

    A: CourseWise is a spinoff of our higher education and AI work from UC Berkeley. It is meant to be a credit-mobility accelerator for students and institutions. It’s needed because the greatest credit-mobility machine in America, the thing that gets families up in socioeconomic status, is education. And it’s the two-year–to–four-year transition often that does that, where you can start at a more affordable school that gives two-year associate’s degrees and then transition to a four-year school.

    But that pathway often breaks down. It’s often too expensive to maintain, so creating as many legitimate pathways as possible between institutions and learning experiences, basically acknowledging what a student has learned and not making them do it again, requires us to embrace technology.

    Q: Can you talk more about the challenges with transfer and where course equivalency and transfer pipelines can break down in the transition between the two- and four-year institutions?

    A: Oftentimes, when a student applies to transfer, they’ll have their transcript evaluated [by the receiving institution], and it’ll be evaluated against existing rules.

    Sometimes, when it’s between institutions that have made an effort to establish robust agreements, the student will get most of their credit accepted. But where there aren’t such strong ties, a lot of credit gets missed. And if the rules don’t exist, and the institution goes through the extra effort, or the student requests it, to consider credit that hasn’t been considered before, this can be a very lengthy process.

    Sometimes that decision doesn’t get made until after the student’s first or second semester, semesters in which they maybe had to decide whether or not to take such a course. So it really is a matter of not enough acknowledgment of existing courses and then that process to acknowledge the equivalency of past learning being a bit too slow to best serve a learner.

    Q: Yeah. Attending a two-year college with the hopes of earning a bachelor’s degree is designed to help students save time and money. So it’s frustrating to hear that some of these students are not getting their transfer equivalencies semesters into their progress at the four-year, because that’s time and energy lost.

    A: Absolutely. Unfortunately, in many cases it’s a false promise that this is the cheaper way to go, and it ends up being more expensive.

    Q: We can talk about the transfer pipeline a lot, but I’ll say one more thing: The free marketplace of higher education and the idea that a student can transfer anywhere is also broken down by a lack of transfer-articulation agreements, where the student’s credits aren’t recognized or they’re only recognized in part. That really hinders the student’s ability to say, “This is where I want to go to college,” because they’re subject to the whims of the institutions and their agreements between each other.

    A: That’s right, and it’s not really an intentional [outcome]. However, systems that have a power dynamic often have a tendency not to change, and that resistance to change, kind of implicitly, is a commitment not to serve students correctly.

    Accreditors Weigh In

    The Council of Regional Accrediting Commissions (C-RAC) supports the exploration and application of AI solutions within learning evaluation and credit transfer, according to a statement from the group to be released Oct. 6. That same day, three accrediting commissions, MSCHE, SACSCOC and WSCUC, are holding a public webinar to discuss transfer and learning mobility, with a focus on AI and credit transfer.

    So what you do need is a real type of intervention. Because it’s not in any one spot, you could argue, and you could also make the argument that every institution is so idiosyncratic in its processes that you would have to do a separate study at every institution to figure out, “OK, how do we fix things here?” But what our research is showing on the Berkeley end is that there are regularities. There are patterns in which credit is evaluated, and where you could modify that workflow to both better serve the institution, so it’s not spending so many resources on manually considering equivalencies, and serve the student better by elevating opportunities for credit acceptance in a more efficient way.

    That’s basically what CourseWise is. It’s meant to be an intervention that serves the institution and serves the student by recognizing these common patterns in credit acceptance and leveraging AI to alleviate the stress and friction that currently exist in granting that credit.

    Q: Can you walk us through where CourseWise fits into the workflow? How does it work practically?

    A: CourseWise is evolving in its feature set and has a number of exciting features ahead, which maybe we’ll get to later. But right now, the concrete features are that on the administrator side, on the staff or admissions department side, you upload an institution’s existing articulation agreements—so if you’re a four-year school, it’s your agreements to accept credit from two-year schools.

    So then, when you receive transcripts from prospective transfer students, the system will evaluate that transcript to tell you which courses match existing rules of yours, where you’ve guaranteed credit, and then it’ll also surface courses that don’t already have an agreement.

    If there’s a high-confidence AI match, it’ll bring that to the administrator’s attention and say, “You should consider this, and here’s why.” It’ll also bring to their attention, “Here’s peer institutions of yours that have already accepted that course as course-to-course credit.”
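    The evaluation pass Pardos describes, first checking incoming courses against existing articulation rules and only then flagging high-confidence AI matches for human review, can be sketched in miniature. The sketch below is purely illustrative and is not CourseWise’s actual implementation: it stands in for the AI matching with a crude word-overlap score, and the course names, catalog descriptions and 0.6 threshold are all invented for the example.

    ```python
    # Illustrative sketch (not CourseWise's implementation) of a
    # transcript-evaluation pass: guaranteed credit where an articulation
    # rule exists, otherwise a similarity score over catalog descriptions,
    # surfacing only high-confidence candidates for administrator review.

    def jaccard(a: str, b: str) -> float:
        """Crude text similarity: overlap of description word sets."""
        wa, wb = set(a.lower().split()), set(b.lower().split())
        return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

    def evaluate_transcript(courses, rules, catalog, threshold=0.6):
        """Split incoming courses into guaranteed matches and AI candidates.

        courses:  {course_id: description} from the sending transcript
        rules:    {sending_course_id: receiving_course_id} agreements
        catalog:  {receiving_course_id: description} at the receiving school
        """
        guaranteed, candidates = {}, []
        for cid, desc in courses.items():
            if cid in rules:  # existing agreement: credit is guaranteed
                guaranteed[cid] = rules[cid]
                continue
            # No rule: score against the receiving catalog and surface the
            # best match only if it clears the confidence threshold.
            best = max(catalog, key=lambda rc: jaccard(desc, catalog[rc]))
            score = jaccard(desc, catalog[best])
            if score >= threshold:
                candidates.append((cid, best, round(score, 2)))
        return guaranteed, candidates
    ```

    In a real system the similarity score would come from a trained model rather than word overlap, and the flagged candidates would route into the approval workflow described below rather than being accepted automatically.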

    A screenshot of the CourseWise software, showing a query course, Math 270: Linear Algebra, and how it compares to the equivalent courses on Linear Algebra.

    CourseWise compares classes in institutions’ catalogs to identify existing agreements for credit transfer and possible course-to-course transfers to improve student outcomes.

    Q: Where are you getting that peer-to-peer information from?

    A: We think of CourseWise as a network, and that information on what peer institutions are doing is present in it. We have a considerable number of institutions from the same system. California is one; we have 13 California institutions, and we’re working on more. The other is the State University of New York, SUNY. We have SUNY’s central system office participating in a pilot, and it’ll be up to the individual institutions to adopt it. But because we have that centralized, system-level data, for every SUNY institution that’s considering one of the AI credit-acceptance requests, we’re able to give the context: “Here are other four-year peer institutions within your system that already accept this course, not just as generic elective credit, but perhaps as degree-satisfying, or at least course-to-course, credit.”

    Q: That’s awesome; I’m sure it’s a time saver. But where do the faculty or staff members come back into the equation, to review what the AI produced or to make sure that those matches are appropriate?

    A: Faculty are a critical part of the governance of credit equivalency in different systems, and they have different roles. Often it’s assumed that faculty approve individual courses, and that’s true in most cases. Sometimes it’s committees: different departments will have a committee of faculty, or there may even be a standing campuswide curricular committee that makes those decisions.

    But what CourseWise is doing right now to incorporate faculty appropriately is allowing the institution to define that approval workflow and the rules around it. If it’s a lower-division statistics class, can your admissions staff make the decision on acceptability, even if it’s not covered by a current agreement?

    Under what circumstances does it need to be routed to a faculty member to approve? What kind of information should be provided to that faculty member if they don’t have it, making it easy to request information, like requesting a syllabus be uploaded by the sending institution or something to that effect?

    Oftentimes, this kind of approval workflow is done through a series of emails, and so we’re trying to internalize that and increase the transparency. You have different cases that get resolved with respect to pairs of courses, and you can see that case. You can justify why a decision was made, and it can be revisited if there’s a rebuttal to that decision.

    Now, over time, what we hope the field can see as a potential is that for certain students, let’s say those coming from out of state, it’s more that a faculty committee gives feedback to a kind of acceptance algorithm, which is then able to make a call that they can veto. But it creates a default. It’s like ChatGPT, where an alignment committee helps give feedback on answers so they’re better in line with what most users find to be a high-quality response. Because there’s no way we can proactively, manually evaluate every pair of institutions in the United States against one another; there’s just no FTE count that would allow for that, which means that prospective students from out of state can’t get any guarantee if we keep that approach.

    Faculty absolutely have control. We’re setting up the whole workflow so an institution can define that. But one of the options we want to give institutions is the option to say, “Well, if the student is coming from out of state or coming from this or that system, you can default to a kind of faculty-curated AI policy.”

    Q: That’s cool. I’ve heard from some colleges that they have full teams of staff who just review transcripts every single day. Having a centralized database where you can see past experiences of which courses have been accepted or rejected—that can save so much time and energy. And that’s not even half of what CourseWise is doing.

    A: Absolutely, and we work closely with leadership at these institutions to get feedback. One of the people involved in that early feedback is Isaiah Vance at the Texas A&M University system, and he’s given us similar feedback: if a new registrar or new leadership comes in and wants to know how good the data is, they want that kind of transparency about how decisions were made. If the organization has that transparency to look over, it can really help an institution get comfortable with those past decisions or decide how they should change in the future.

    Q: What are some of the outcomes you’ve seen or the feedback you’ve heard from institutions that are using the tool?

    A: We’re about to embark on a study to measure before-and-after changes in how institutions are doing business and how much time it’s saving, or not, versus a control of making these decisions without the system.

    We don’t have the results of that yet. We do have a paper out on where articulation officers, for example, are spending their time. They’re spending a lot of time on looking for the right course that might articulate. So we definitely have identified there is a problem. It’s an open question to what degree CourseWise is remedying that. We certainly are working nonstop to remedy it, but we’re going to measure that rigorously over the next year.

    Some early feedback is positive, but it’s also interesting that many institutions are spending a lot of time getting that initial data uploaded, catalog descriptions and articulations, and ensuring the rigor and validity of that data. Maybe it’s spread across a number of Excel spreadsheets at some institutions; that problem is real, and so I think it’s going to take a field-level or industry-level effort to make sure everyone can get on board with that data-wrangling stage.

    Q: That was my hypothesis, that the tool has a lot of benefits once everything’s all set up and institutions have done the labor of love to hunt down and upload all these documents and find out which offices they were hiding in.

    A: There are a number of private foundations, funders who are invested in that particular area. So I’m optimistic that there’s a solution out there and that we’ll be a part of that.

    Q: I wonder if we can talk about how this tool can improve the student experience with transfer and what it means to have these efficiencies and this database to lean on.

    A: Right now, most of the activity is with the four-year schools, because they’re the ones uploading the articulations. They’re the ones evaluating transcripts. But in the next four months, we’re releasing a student-facing planner, which will directly affect students at the sending institutions.

    This planner will allow a student who’s at a community college to choose what destination school and major they’re interested in that’s part of the CourseWise network. Then [CourseWise provides] what courses they need to take, or options of courses to take that will transfer into the degree program that they’re seeking, such that when they transfer, they would only have to do the equivalent of two full years of academic work at that receiving school.

    It would also let them know what other majors at other institutions they may want to consider, based on how much of the credit they’ve already earned is accepted into the degree programs there. So the student may be 20 percent of the way through their initially intended destination program, but maybe they’re 60 percent of the way to another program they didn’t realize they were so close to.
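    The cross-program comparison behind that 20-percent-versus-60-percent example reduces to a simple calculation: the share of each program’s requirements already satisfied by the student’s accepted credits. The sketch below is a toy illustration, not CourseWise’s planner logic; the program names and requirement codes are invented.

    ```python
    # Toy version of the planner's cross-program comparison: for each
    # candidate program, the fraction of its required courses already
    # satisfied by the student's accepted credits.

    def program_progress(earned: set, programs: dict) -> dict:
        """earned:   requirement codes the student has satisfied
        programs: {program_name: set of required codes}
        Returns {program_name: fraction of requirements complete}."""
        return {name: len(earned & required) / len(required)
                for name, required in programs.items()}
    ```

    A student whose accepted credits satisfy two of ten requirements in one program but three of five in another would see 20 percent versus 60 percent, mirroring the example above.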

    Q: What’s next for CourseWise?

    A: So the student part is the navigation; the administrator part, articulation expansion and the policy around it, is creating the pathways. You need a GPS in order to know what the paths are and how to traverse them as a learner. But there are also states. I mentioned regularities; there are commonalities in how these processes take place, but there are also very specific state-level concerns and structures, like common course numbering, credit for prior learning, an emphasis on community colleges accepting professional certificate programs and so forth.

    I think the future is both increasing that student-facing value, helping with achievement from the student point of view. But then also leveraging the fundamental AI equivalency engine and research to bring in these other ways of acknowledging credit, whether it’s AP credit or job-training credit or certificates or cross-walking between all these different ways in which higher education chooses to speak about learning, right?

    If you have a requirement satisfied in general education in California, how do you bring that to New York, given New York’s general education requirements? Are there crosswalks that can be suggested and established with the aid of AI? And I’m excited about connecting these different sorts of dialects of education using technology.

    Source link

  • Missouri President Wants Local Officials to Address Crime


    University of Missouri president Mun Choi is pressing local officials about crime rates near the Columbia campus after a student from neighboring Stephens College died Sunday following a downtown shooting, KCUR and the Columbia Missourian reported. 

    The president’s demand to address the city’s “rampant crime rate” has gathered some support, but critics say his characterization of the local climate is exaggerated, pointing to data from the local police department.

    The shooting, which also resulted in serious injuries to two others, took place early Saturday morning on the college town’s main street. One individual, not from the city, got into a verbal dispute and then opened fire on the people he was confronting. The three people he hit, however, were bystanders.

    In a letter sent the same day as the shooting, Choi called on city and county leaders to bolster the police presence and prosecute crimes to the fullest extent of the law. He also urged them to take down encampments of unhoused individuals, pass a loitering ordinance and repeal policies that “attract criminals to the region.”

    But when asked during a press conference Monday what policies and practices he believes “attract criminals,” the MU president said he had none to cite. Neither the shooter in the Saturday incident nor any of the victims have been identified as unhoused, according to local reporting.

    “That is why I am asking [local leaders] to evaluate the processes that we have and the practices,” he explained. “Are we giving the impression to potential criminals that this is a region that doesn’t take crime enforcement as well as the punishment that comes with it seriously?”

    Choi later added that students and local business owners have been raising safety concerns about the city’s unhoused population. According to university data, the number of arrests and trespassing violations issued to the unhoused has “gone up dramatically” since 2019, he said.

    That is different, however, from what some local police department data shows.

    In a Facebook post Monday, the city’s mayor, Barbara Buffaloe, said there have been 58 gunshot incidents since the beginning of the year. That’s down from 105 in the first nine months of 2024.

    Columbia Police Department chief Jill Schlude did note in a separate letter, however, that since 2019 more crimes have been concentrated downtown, occurring between midnight and 3 a.m. 

    “The connection between late-night social activity and violence is clear, and that is where we continue to focus our efforts,” Schlude said.

    Regardless of any disputes over the data, multiple government officials—including Gov. Mike Kehoe, several members of the Columbia City Council and Mayor Buffaloe—have voiced support for Choi’s general call to improve safety. Buffaloe has also committed to forming a task force on the matter, and the CPD has outlined plans to increase the police presence downtown. 

    “Statistics cannot be used solely as a reason for us to move away from what needs to be done in the city of Columbia,” Choi said.


  • Lane Community College Board Apologizes to President


    The Lane Community College Board of Education apologized to President Stephanie Bulger at its Tuesday meeting for how members disrespected her on the basis of her race and sex, Lookout Eugene-Springfield reported.

    The board’s apology follows the findings of an investigative report released in August that determined board members were frequently dismissive of Bulger—a Black woman—and often deferred questions to male staff members. The report found that former board chair Zach Mulholland was frequently hostile toward Bulger and often cut her off in their interactions. (He was also found to have physically intimidated a student at a board meeting.) Although Mulholland was censured by the board last month, he has resisted calls to step down.

    Much of the report focused on Mulholland, but other members were also implicated.

    “The board recognizes and is accountable for the harm caused to you, President Bulger,” said Austin Fölnagy, the current board chair, who was also accused of dismissive behavior. “We are deeply sorry for the negative impact our behavior has had on you and the college community at large. President Bulger, please accept the board’s apology for treating you badly.” 

    He added that the board is “committed to learning from our shortcomings” and will take “remedial actions including training in bias, discrimination and harassment” this fiscal year.

    Bulger has been president of the Oregon community college since July 2022.


  • UNC Professor on Leave After Alleged Advocacy of Political Violence


    Eros Hoagland/Getty Images

    Officials at the University of North Carolina at Chapel Hill placed Professor Dwayne Dixon on leave Monday while the university investigates his “alleged advocacy of politically motivated violence,” said Dean Stoyer, UNC Chapel Hill’s vice chancellor for communications and marketing.

    Dixon, an associate professor of Asian and Middle Eastern studies, used to be a member of Silver Valley Redneck Revolt, a chapter of the antifascist, antiracist, anticapitalist political group Redneck Revolt. The group was formed in 2016, and some members, including Dixon, were present at the 2017 Unite the Right rally in Charlottesville, Va., to provide armed security and medical assistance to counterprotesters. Redneck Revolt disbanded in 2019 and has no active chapters, according to its website.

    In a 2018 interview with The Chronicle of Higher Education, Dixon described himself as an “anarchist,” and he is no stranger to blowback for his political activism and support for gun rights. He was arrested for bringing a semiautomatic rifle to a Ku Klux Klan counterprotest in Durham, N.C., in 2018; the case was later dismissed on the grounds that the charges violated Dixon’s First and Second Amendment rights. He was also among 20 people who protected counterprotesters in Durham when white supremacists protested the removal of a Confederate statue in 2017. Through all these events, Dixon remained employed at UNC Chapel Hill.

    Why is Dixon in the hot seat now? The answer is convoluted, but it begins with fliers on the Georgetown University campus.

    On Sept. 24, Andrew Kolvet, a spokesperson for the late Charlie Kirk’s Turning Point USA, posted on X a photo of a flier on the Georgetown campus in Washington, D.C., that read, “Hey Fascist! Catch!”—a nod to engraving on the casing of bullets left behind by Kirk’s suspected killer—and “The only political group that celebrates when Nazis die.” The flier also included a QR code to a Google form for a potential Georgetown chapter of the John Brown Gun Club, a Redneck Revolt affiliate organization known as a “leftist gun-rights group” with multiple independent chapters, including one in the D.C. area, according to the Counter Extremism Project. It “arms itself to defend against far-right violence and often appears as a security force at protests to protect against expected far-right violence,” the CEP wrote. Google has since removed the form for violating its terms of service.

    University officials removed the fliers and reported them to the FBI. Education Secretary Linda McMahon also weighed in: “At a moment like this, Georgetown has to determine what it stands for as an institution … Allowing violent rhetoric to fester on our nation’s campuses without consequences is dangerous. It must be condemned by institutional leaders,” she wrote on X. “I am grateful to those who spoke out against this and made noise about the posters on campus—you made a difference. There is power in speaking up to reveal these hateful ideologies that have incited deadly violence.”

    Kolvet posted again, this time linking to a recent Fox News article that cited Dixon’s involvement in Redneck Revolt based on an old blog post that has since been taken down. “I posted this flyer our team spotted at Georgetown University, and now we find out professors at ‘elite’ schools are members of this group and its offshoots,” Kolvet wrote. “This professor must be immediately fired and the group/network investigated.”

    Dixon was placed on leave Monday, which will “allow the University to investigate these allegations in a manner that protects the integrity of its assessment,” UNC’s Stoyer said in his statement. “Depending upon the nature and circumstances of this activity, this conduct could be grounds for disciplinary action up to and including potential termination of employment.”

    UNC Chapel Hill officials declined to answer any other questions about Dixon and did not say whether Kolvet’s post or the Fox News article led to the investigation. Dixon did not reply to a request for comment but told the student newspaper The Daily Tar Heel that he left the Silver Valley Redneck Revolt in 2018.

    A Change.org petition to reinstate Dixon is circulating and as of Wednesday evening had more than 900 signatures. In a statement Wednesday, the North Carolina chapter of the American Association of University Professors, as well as UNC Chapel Hill’s AAUP president, condemned the university’s actions and demanded Dixon be reinstated.

    “Right-wing activists are attacking Dixon for prior membership in a group that has been inactive since 2019, and are baselessly connecting him to flyers allegedly posted by a different group on a different campus outside of North Carolina. Fox News picked up the story on September 27, 2025, without verifying the existence of the flyers, and apparently this was enough for UNC’s administration to remove a professor from the classroom in the middle of the semester and bar him from campus,” the statement read. “Let’s call this what it is: UNC administrators are capitulating to a call from a right-wing group, infamous for attacking faculty, to fire a professor based on an unsubstantiated rumor.”

    Dixon joins the ranks of dozens of college and university faculty members who have been placed on leave, disciplined or fired in the weeks since Kirk was shot and killed. All of these professors have been investigated after right-wing personalities identified them on social media. Two of them—Michael Hook, who was placed on leave for social media comments he made about Kirk’s death, and Thomas Alter, who was terminated after being accused of inciting violence during a speech—have been reinstated by court orders.


  • How Credit for Prior Learning Strengthens Workforce Ties


    In today’s rapidly evolving workforce landscape, higher education institutions face mounting pressure to demonstrate value, relevance and return on investment. Amid this challenge lies an underutilized strategy with remarkable potential: credit for prior learning.

    We’ve long recognized CPL’s benefits for students. Learners who receive CPL credits are more likely to complete their degrees (49 percent vs. 27 percent for those without) and, on average, they earn 17.6 additional credits, finish nine to 14 months sooner and save between $1,500 and $10,200 in tuition costs (CAEL). But what’s often overlooked is CPL’s power to transform relationships between educational institutions and employers—creating a win-win-win for students, institutions and industry.

    Beyond a Student Benefit

    The traditional narrative around CPL emphasizes student advantages: increased enrollment, improved completion rates and reduced time to graduation. These metrics matter tremendously, but they tell only part of the story.

    CPL can serve as a bridge between academia and industry, creating powerful new partnerships. When colleges and universities embrace robust CPL programs, they send a clear message to employers: We value the training and development you provide. Recognizing corporate training as creditworthy learning demonstrates respect for workplace knowledge and underscores higher education’s commitment to real-world relevance.

    Employer and Workforce Gains

    For employers, CPL validates that their internal training programs have academic merit. This recognition strengthens recruitment and retention efforts, as workers see clear pathways to advance their education without duplicating learning they’ve already mastered. Companies that invest in employee development also gain educational partners who understand industry needs and value the attributes that drive employee success.

    The benefits extend further: Organizations with tuition remission or reimbursement programs can reduce costs while enhancing employee motivation and persistence.

    Deeper Collaboration Between Higher Ed and Industry

    As institutions evaluate workplace training for credit equivalency, they gain invaluable insights into industry practices and skill needs. This exchange allows colleges to refine curricula to better meet market demand, ensuring graduates possess the competencies employers seek—not just those defined within academic silos.

    The hard but necessary conversations—between faculty and corporate training leaders—help ensure CPL evaluations are rigorous and relevant. Key questions include: Why include certain topics but not others? How do we know participants can demonstrate knowledge? Does the training align with broader disciplinary or leadership needs, or is it niche? These discussions strengthen both educational and workplace outcomes.

    Reimagining CPL

    The future of higher education lies in breaking down artificial barriers between academic and workplace learning. By embracing CPL as a cornerstone strategy—not only for student success but also for employer partnerships—institutions can position themselves at the nexus of education and employment.

    This approach doesn’t diminish academic rigor; it expands our understanding of where and how meaningful learning occurs. Done well, CPL creates pathways that honor all learning, regardless of where it happens. And for learners, the message is clear: Your hard work counts.

  • Contextualizing Completion Gaps for First-Gen Students

First-generation students are twice as likely as their peers to leave college without completing a bachelor’s degree, even if they come from higher-income backgrounds and arrive academically prepared, according to a new report from the Common App. The findings suggest these factors do make a difference for student success outcomes but don’t erase other barriers first-generation students might face.

    The report, released Thursday and the fourth in a series on first-generation students, used data from the National Student Clearinghouse Research Center to track enrollment, persistence and completion rates for 785,300 Common App applicants in the 2016–17 application cycle. (Students whose parents didn’t complete bachelor’s degrees made up 32 percent of the sample.) The report also took into account how a range of factors could affect student outcomes, including students’ incomes, their levels of academic preparation and how well-resourced their colleges are.

    Previous studies have shown that “first-generation students are certainly not a monolith,” said Sarah Nolan, lead author of the report and a research scientist at Common App. “We were hoping to give readers a sense for … which first-generation students might in particular need more support.”

    The good news is the report found first-generation applicants enroll in college at rates on par with their peers. Over 90 percent of Common App applicants, first-generation and otherwise, enrolled in college within six years of applying.

But first-generation students were slightly more likely not to enroll immediately (17 percent) or to enroll at a two-year college (12 percent) compared to other applicants (14 percent and 4 percent, respectively). That gap mostly closed when comparing students with strong academic records, defined as having SAT or ACT scores or GPAs in the top quartile. According to the report, that may be because a higher share of first-generation students need extra coursework before enrolling in a four-year institution.

    Students might also work to save up for college first or opt for community colleges’ more affordable tuition rates, the report suggested. Lower-income first-generation students, who qualified for application fee waivers, were also less likely to immediately enroll at four-year institutions and more likely to first enroll at a community college compared to similar students not from first-generation backgrounds.

    Over all, “we are really heartened to see that there’s really not very strong differences in college enrollment,” Nolan said.

    Completion rates, however, are another story. While about 70 percent of first-generation students do complete a bachelor’s degree within six years of enrolling, the report found stark disparities between them and their peers.

    About half of first-generation students completed a bachelor’s degree within four years, compared to 68 percent of continuing-generation students, a gap of 18 percentage points. And that disparity persisted when looking at six-year graduation rates. About 69 percent of first-generation students graduated within six years, compared to 86 percent of continuing-generation students, a 17-percentage-point difference.

These gaps shrank but didn’t disappear for first-generation students with strong academic records and higher incomes. Academically prepared first-generation students were twice as likely as their continuing-generation counterparts to disenroll with no degree, 14 percent versus 6 percent. In a similar vein, 24 percent of higher-income first-generation students left college without a degree within six years, compared to 12 percent of their continuing-generation counterparts. Even for first-generation students who were both academically prepared and relatively well-off, these gaps remained.

Differences in the institutions first-generation and continuing-generation students attend—and the levels of support those institutions offer—didn’t account for completion-rate gaps, either.

    Even when attending the exact same institutions, first-generation students were 10 percentage points less likely to earn a bachelor’s degree within six years than continuing-generation students.

However, higher per-student expenditures did seem to contribute to better student success outcomes. At institutions that spent at least $20,000 per student, 84 percent of first-generation students graduated within six years, compared to 94 percent of continuing-generation students. The gap between first-generation and continuing-generation students’ completion rates widened to 15 percentage points at colleges that spent more moderately, $10,000 to $15,000 per student, and to 17 percentage points at colleges with low per-student expenditures, less than $7,500.
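The gaps cited in this report are simple percentage-point differences, not relative changes. A quick sketch using the rates as reported (with “about half” taken as 50 percent):

```python
# A percentage-point gap is a plain subtraction of two rates, not a ratio.
# Rates below are the figures reported in the Common App analysis, as
# (first-generation, continuing-generation) completion percentages.
completion_rates = {
    "four-year completion": (50, 68),
    "six-year completion": (69, 86),
    "six-year, colleges spending $20K+ per student": (84, 94),
}

for label, (first_gen, cont_gen) in completion_rates.items():
    gap = cont_gen - first_gen  # measured in percentage points
    print(f"{label}: {gap}-point gap")
```

Running this reproduces the article’s 18-, 17- and 10-point gaps.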

These findings suggest that, while first-generation students disproportionately face financial constraints and barriers to college prep, those disadvantages don’t explain away their graduation-rate gaps. Attending less-resourced institutions isn’t a full explanation, either. Other obstacles must be at play.

    What those barriers are may be “best answered by speaking with first-generation students themselves and unpacking what’s happening at the individual level,” Nolan said. But first-generation students likely struggle with limited access to information about higher ed and its “hidden curriculum” of expectations, regardless of income, high school performance or which college they attend.

    “Having the right resources at the right time on the pathway—that’s really critical for student success,” Nolan added.

    The stakes of success are high—the report found many first-generation students spent considerable time and money on college with no degree to show for it. Almost a third of first-generation students who didn’t earn a degree were enrolled for at least four years.

    But a hopeful finding is that “additional investment can be quite positive for helping these students really actualize their potential,” Nolan said.

  • Academic Staff Need Academic Freedom, Too (opinion)

    Late last spring, something disturbing happened in my classroom. For the first time in 15 years of teaching, I opened by telling my students I wasn’t sure if I was allowed to speak. The class was an introduction to the philosophy of education, and months earlier I’d scheduled this day for our opening discussion on critical pedagogy. But in light of charged campus climates and broader legal threats facing institutions nationwide, I realized that as an academic staff member who engages in teaching and research, I was particularly vulnerable.

    What followed was one of the more important classes I’ve taught, though not about the subject I’d planned. We spent the hour investigating our institution’s academic freedom policies, asking questions of whom those policies included and excluded. We discovered the troubling reality: Although I was expected to facilitate complex educational discussions, I lacked clear protections to do so safely.

    My situation reflects a growing crisis in higher education that has received little attention. While much has been written about the vulnerabilities of contingent faculty, there has been almost no discussion of the academic freedom needs of one of higher education’s most rapidly growing workforces: third-space professionals.

    The Rise of the Third Space

    Over the past two decades, universities have dramatically expanded what researcher Celia Whitchurch terms “third-space” professionals: staff who blend academic and administrative functions but operate in the ambiguous territory between traditional faculty and staff roles.

    These roles aren’t new or unprecedented. The American Association of University Professors has long recognized that librarians, despite often holding staff status, require academic freedom protections given their integral role in teaching and research. What’s new is the scale and diversity of academic work now performed by nonfaculty academic professionals.

    This growth represents the contemporary evolution of a workforce shift that began in the 1970s, when academic support roles developed in response to diverse students entering colleges through open admissions policies. The 1990s brought expansion into new fields like faculty development and community-based learning, as colleges recognized these roles could enhance teaching practices institutionwide. Most recently, colleges have seen explosive growth in data-driven student success and enrollment management roles.

    What unites these professionals is their expertise in designing and delivering on the academic mission of the university, with special emphasis on student success. They lead pedagogical and curricular initiatives, make decisions about learning interventions, analyze data that reveals uncomfortable truths about institutional performance, and advocate for evidence-based policy revisions. They also regularly teach college courses, write and receive major grants, and publish in peer-reviewed journals. In essence, they do academic work, but without academic protections.

    Why Academic Freedom Matters for Third-Space Work

    The problem is easy to name but difficult to address. Institutions have radically restructured how academic work gets done based on the shifting needs of students and priorities of institutions, without a reciprocal restructuring of how academic work gets supported or protected. Third-space professionals need academic freedom protections for four key reasons.

    1. Educational decision-making: These professionals make pedagogical and curricular choices about student learning interventions, program design and educational strategies. Without academic freedom, they face pressure to implement approaches based on administrative convenience, pressure from faculty or donor preferences, rather than evidence-based best practices. What happens, for instance, when a faculty member feels the writing center’s approach to writing pedagogy conflicts with their own vision for writing in their classroom?
    2. Data interpretation and reporting: Student success professionals analyze retention, graduation and achievement data that may reveal uncomfortable truths about institutional performance or equity gaps. They need protection when their findings challenge institutional narratives or suggest costly reforms. What happens when an institutional researcher’s analysis shows that a flagship retention program isn’t working, but the administration has just featured it in a major donor presentation?
    3. Policy advocacy: Their direct work with students gives them insights into institutional policies and processes that harm student success. They should be able to advocate for necessary changes without fear of retaliation, even when those changes conflict with administrative priorities or departmental preferences. What happens when an academic adviser discovers that the prerequisite structure in a major is creating unnecessary barriers for students, but changing it would require difficult conversations with powerful department heads?
    4. Research and assessment: Many third-space professionals conduct and publish research on student success interventions, learning outcomes and institutional effectiveness. This scholarship requires the same protections as traditional academic research. What happens when assessment reveals the ineffectiveness of first-year seminar teaching, but presenting findings could damage relationships with faculty colleagues?

    The Problem of Selective Recognition

    Universities have already recognized that faculty work has diversified and requires differentiated policy structures. Many institutions now distinguish between research professors (focused on scholarship and grant acquisition), teaching professors (emphasizing teaching practice) and professors of practice (bringing professional expertise into academic settings). Each category receives tailored policies for promotion, performance evaluation and professional development that align with their distinct contributions.

    Yet on the staff side, institutions continue to operate as if all nonfaculty work is identical. A writing center director publishing on linguistic justice, an assistant dean of students developing crisis-intervention protocols for student mental health emergencies and a facilities director managing building maintenance are all governed by the same generic “staff” policies. This isn’t just administratively awkward: It’s a fundamental misalignment between how work actually happens and how institutions recognize and protect that work.

    Applying Consistent Logic

    The way forward isn’t revolutionary, but simply the application of the same logic that most universities already use for faculty. Rather than the outdated single “staff” category, colleges and universities need at least three distinct categories that reflect how staff work actually happens.

    1. Academic staff: Professionals engaged in teaching, research, curriculum design and educational assessment, including learning center directors, faculty developers, institutional researchers, professional academic advisers and academic program directors. These roles require academic freedom protections, scholarly review processes and governance representation.
    2. Student life staff: Professionals focused on co-curricular support, belonging and student life, including residence life coordinators, activities directors and counseling staff. These roles need specialized professional development and advancement pathways that recognize and support their expertise in student development.
    3. Operational staff: Professionals handling business functions, facilities and administrative operations. These roles can continue with traditional staff policies and support structures.

    This framework enables differentiated policy environments and support structures across multiple areas. Critically, academic freedom policies can be tailored to protect inquiry for staff who engage in this kind of work, while recognizing that other staff have different professional needs.

    The expansion of third-space/academic staff roles represents higher education’s recognition that effective student success requires diverse forms of expertise working collaboratively. But without policy frameworks that acknowledge and protect this academic work, institutions risk undermining the very innovations they’ve created. When the professionals responsible for student success cannot engage in free inquiry, challenge ineffective practices or advocate for evidence-based approaches, everyone loses—especially students.

    Aaron Stoller is associate vice president for student success and a lecturer in education at Colorado College.

  • Toward a Trauma-Informed Writing Process (opinion)

    “Your writing isn’t academic enough.”

    A single sentence from a faculty mentor cut deeper than I expected—because it wasn’t the first time my voice had been questioned. I spent decades believing I was not good enough to become a writer. Not because I lacked skill or insight, but because I was writing through a deep wound I didn’t yet understand.

    That statement was a flashpoint, but the wound began long before:

    • When I, as a shy Guatemalan immigrant child, felt I was lacking academically and learned to shrink my voice.
    • When I was told that my ways of knowing—grounded in culture, emotion, embodiment—didn’t belong in academic writing.
    • When I absorbed the perfectionism and shame that academia breeds.

    For years, I edited myself into invisibility—performing an academic voice that was praised for its polish and precision but stripped of everything that made it mine.

    And I am not alone.

    The Invisible Wounds We Carry

    In my work as a writing consultant and developmental editor, I hear the same story over and over: Brilliant scholars—often from historically excluded communities—are convinced they are bad writers when, in reality, they are carrying unprocessed writing trauma.

    We rarely name it as such. But that is what it is:

    • The trauma of repeatedly being told your voice is wrong or not “rigorous.”
    • The trauma of navigating academic culture that rewards conformity over authenticity.
    • The trauma of absorbing deficit narratives about your language, identity or intellectual worth.

    Academic spaces can be punishing, performative and isolating. Add in past wounds—whether from classrooms, reviewers, supervisors or broader systems—and writing becomes more than putting words on a page. It becomes a battleground.

    I once had a client who burst into tears during a one-on-one session with me. She opened the document she had avoided for weeks. The moment her fingers hovered over the keyboard, she said, her chest tightened. She felt dizzy, like the room was closing in.

    “I can’t do this,” she whispered.

    What was she working on? A simple literature review. But there was nothing simple about it.

    Her body remembered: her first-year doctoral seminar, where she was told her writing wasn’t academic enough. Being cut off in class. Watching her white male peer echo her words and be praised for his “insight.”

    Writing didn’t feel liberating. It felt like re-enactment.

    Her tears weren’t a breakdown. They were a breakthrough. Her nervous system was doing exactly what it was designed to do: keep her safe.

    I’ve experienced that spiral, too. Sitting in front of a blank screen, begging my brain to write something!—only to be met with my inner chorus:

    • I teach people how to write—what’s my problem?
    • I’m not going to say anything that hasn’t already been said.
    • This is going to take forever—and I’d rather not disappoint myself.
    • I’m not really a good writer. I’m just faking it.

    Even after years of writing—journals, academic papers, dissertation, books—it still doesn’t feel easy. I have to work at it each day. Writing, for me, is like a relationship. At first, it’s exciting. Words flow; ideas spark. But eventually, the doubts creep in. You start to ghost your own document.

    But real relationships, and real writing, require showing up. Even when you’re tired. Even when it’s hard. Even when it feels like your worst critic lives inside your own head.

    This Isn’t All in Your Head—It’s All in Your Body

    These blocks that haunt you as you imagine writing aren’t signs that you shouldn’t write the thing. These are survival strategies your nervous system uses to protect you. And yes—they show up at your desk.

    This is all to say that, in my experience, writing blocks tend to be trauma responses—not character flaws or technical writing issues. Now, are there times when folks are challenged by things like time management? Of course. But to me, that is just a symptom of something deep-seated.

We’re told to “just sit down and write,” as if our struggle were simply a matter of discipline, time management or motivation. But often, it’s not that we don’t want to write. We actually really want to write. It’s that our body—our entire nervous system—is sounding an alarm.

    Not safe. Not ready. Not now.

    The response varies. It’s not one-size-fits-all. But it’s always trying to protect us.

    Let’s break these responses down.

    1. Fight: You argue with your work. Nothing sounds good enough. Every sentence feels off. You rewrite the same paragraph 10 times and still hate it. You pick fights with your draft like it owes you money. You hover over the “delete” key like a weapon. You get lost in perfectionist loops, convinced that your argument is weak, your evidence lacking, your phrasing too soft, too bold, too elementary, too you.

    This is the part of you that learned, somewhere along the way, that the best defense is a good offense. If you criticize your writing first, no one else can beat you to it.

    It’s a form of protection dressed as hypervigilance.

    It’s exhausting. And it’s not your fault.

    2. Flight: You avoid it completely. The minute you open the document, your chest tightens. So instead, you check your email, clean the kitchen, research grants for a project you haven’t even started, reformat your CV for the fifth time or suddenly become very concerned about the state of your inbox folders. Every task feels urgent—except the one you actually need to do.

    It doesn’t mean you don’t care. It means your system is trying to escape danger. And in academia, writing often is danger, because of what it represents—exposure, judgment, potential rejection—and what it can lead to: excommunication, cancellation, even deportation.

    Flight says, “If I don’t go near the source of pain, I won’t have to feel it.” But avoidance doesn’t erase fear. It buries it. And that buried fear just grows heavier.

    3. Freeze: You stare at the screen, paralyzed. You’ve carved out time, made the tea, lit the candle—and still, nothing happens. The cursor blinks like it’s mocking you. You reread the same sentence 30 times. You open a new tab, then another. You scroll, refresh, skim, click—but you’re not absorbing anything.

    Your body might go still, but inside, it’s chaos: looping thoughts, spiraling doubts, blankness that feels like suffocation.

    This is shutdown. Your brain says, “Too much.” So it hits pause.

    It might look like laziness, but it’s actually self-preservation.

    4. Fawn: You overfocus on pleasing others.

    This one’s sneaky. You’re writing. You’re producing. But you’re doing it in someone else’s voice. You try to imagine what your adviser would say. You filter every word through Reviewer 2’s past critiques. You write with a white, cis-hetero-masculine ghost looking over your shoulder.

    You say what you think you should say. You cite whom you think you have to cite. You mute your own voice to keep the peace.

    You’re not writing to be heard. You’re writing to be accepted.

    Fawning isn’t about submission. It’s about safety. It’s about staying small so you don’t become a target. But in doing so, you slowly disappear from your own work.

    What if your block isn’t failure?

    What if it’s your body’s way of saying:

    “This way of writing doesn’t feel safe.”

    “These expectations aren’t sustainable.”

    “You are not a machine. You are a whole human.”

    Writing as a Site of Healing, Not Harm

    If we understand writing blocks as trauma responses, then the answer isn’t more pressure or productivity hacks.

    The answer is care.

    A trauma-informed writing practice prompts us to shift our questions:

    • Instead of “Why am I procrastinating?” ask, “What am I protecting myself from?”
    • Instead of “How can I write more?” ask, “What would make this feel safer?”
    • Instead of “Why can’t I just get it done?” ask, “What do I need to feel supported right now?”

    This practice is about making room for your whole self at the writing table.

    It includes:

    • Slowing down to listen to your resistance. What is it trying to tell you? What stories or fears are surfacing?
    • Creating emotional safety before expecting output. That might mean grounding rituals, community check-ins or simply naming your fear out loud.
    • Reframing writing as healing, not harm. What if writing wasn’t about proving your worth but about reclaiming your voice? What if it became a place to process, reflect, resist—and even rest?

    Because here’s the truth: You can’t punish yourself into productivity.

    You can’t shame your voice into clarity.

    But you can write your way into wholeness—slowly, gently, in your own time.

    Resistance Is Wisdom

    Let’s stop treating our writing resistance as evidence of failure. What if it’s an invitation to listen? A clue to your next move? A doorway into a new way of knowing? Let’s not avoid resistance but lean into it, face it and treat it with compassion.

    Ask yourself,

    • What if my block isn’t a wall, but a mirror?
    • What if my voice needs tenderness, not toughness?
    • What if my writing can be a place where I feel more like myself, not less?

    Maybe the goal isn’t to “push through” your writing block.

    Maybe it’s to create the conditions where it feels safe enough to speak your voice.

    You don’t need to force yourself to write like someone you’re not.

    You don’t need to perform brilliance to be taken seriously.

    You don’t need to sacrifice your health on the altar of productivity.

    You need practices that restore your voice, not erase it.

    You need writing that nourishes, not punishes.

A trauma-informed writing practice invites your whole self to the page. It makes room for imperfection, reflection and vulnerability, and challenges you to lean into them. It reframes writing not as punishment but as possibility.

    Toward a More Human Academy

    In this political moment—where academic freedom is under attack, DEI initiatives are being dismantled and scholars are being silenced for telling the truth—we can’t afford to ignore how trauma shapes whose voices get heard, cited or erased.

    Trauma-informed writing is a form of resistance.

    It’s how we push back against systems that demand performance over presence, conformity over courage.

    It’s how we cultivate an academy where all voices—especially those long excluded—can write with power, truth and unapologetic authenticity.

    I’m still healing my own writing wounds. Maybe you are, too.

    But here’s what I know now: Writing wounds don’t heal overnight.

    They heal when we meet them with compassion—every time we dare to put words on a page.

    Aurora Chang is the founder of Aurora Chang Consulting LLC, where she provides developmental editing, holistic faculty support and writing consulting rooted in compassion and authenticity. A former professor and faculty developer, she now partners with academics to reclaim their voices, sustain their careers and write with purpose.

  • First-Year Survey Data Prompts New Partnerships, Comms Strategy

    Participation in extracurricular activities and campus events is tied to student retention, but a significant number of students don’t get plugged in. A 2024 survey by Inside Higher Ed and Generation Lab found that 26 percent of respondents had never attended a campus event and 35 percent weren’t involved in activities outside the classroom.

    About three in 10 college students said they’d be more involved on campus if they were more aware of the available extracurricular opportunities. Staff at the University of Arizona recognized this awareness gap and, for the past decade, have surveyed incoming students to identify their interests and provide them with tailored resources.

    The New Student Information Form provides campus leaders with actionable data and information about the students joining the Wildcat community.

    Survey Says

    According to a 2024 survey of student success leaders by Inside Higher Ed and Hanover Research, 44 percent of respondents said their institution is highly effective at collecting data for student success and 40 percent said their college effectively uses student success data to inform decisions and initiatives. One-third of respondents said their institution has built a culture of data around student success.

    How it works: The University of Arizona first launched the NSIF in 2012 to understand student interests and their previous experiences with extracurricular activities in high school, said Jenny Nirh, director of collaboration, communications and outreach of student success at Arizona. In 2021, the form was revamped by the student success office with a focus on anticipated student resource needs and other personal information that could be relevant to academic success, such as caregiver status.

In a typical year, between 85 and 90 percent of incoming first-year and transfer students complete the form, representing as many as 8,000 students, said senior analyst Laura Andrews, who oversees the NSIF. In 2024, 6,500 first-year students responded to the NSIF, an 88 percent response rate.

    When asked where they wanted to get plugged in during their first year, two-thirds of respondents indicated they were interested in internships, while 58 percent selected student employment and the same share chose academic clubs.

    Over the past five years, staff have seen students report the same anticipated needs. In 2024, the greatest share of students said they expect to need at least some help accessing and managing financial aid (63 percent) and academic supports (56 percent).

    In addition to focusing on their interests, the survey asked students about their perceptions of college and the campus community. While a majority of respondents expressed excitement about being a student at Arizona, one in five indicated they were unsure about their ability to fit in and a similar number said they were uncertain whether their peers would assist them if they needed help.

    A question about caretaking responsibilities was added in 2023 to identify those students and connect them with childcare or caregiving resources available on campus, Nirh said.

    Data in action: Using NSIF data, staff have been able to respond to individuals’ needs and create strategic initiatives within various departments and offices that ensure no student is left behind. Now, the student success department tailors communications to students based on responses and promotes relevant support services.

    Each college at the university is given a breakdown of survey results for their incoming students, including their interests and expectations. The report is often distributed to department heads and faculty or used for student outreach purposes.

Using the data, staff found that Pell Grant recipients were more likely than the general campus population to say they wanted help navigating financial aid (19 percentage points higher) and student employment (12 percentage points higher). In addition, first-generation students were 10 percentage points more likely than the average student to say they needed support asking for help. The Thrive Center within the student success division uses this data in its first-generation support initiative, First Cats, and in its efforts to boost financial wellness.

    The Housing and Residence Life Division conducts an end-of-year survey of residents about their support needs and whether students sought help; staff received similar responses to those articulated by incoming students in the NSIF months earlier. However, students said they were less likely to seek help for their personal development—including mental health, time management and socialization—than for navigating campus life.

    If your student success program has a unique feature or twist, we’d like to know about it. Click here to submit.

  • AI Hallucinations May Soon Be History

    We all are witness to the incredibly frenetic race to develop AI tools, which publicly kicked off on Nov. 30, 2022, with the release of ChatGPT by OpenAI. While the race was well underway prior to the first public release, we have been able to follow, version by version, the updates from OpenAI (GPT), Google (Gemini) and Anthropic (Claude), which are among the many versions of AI for personal and commercial use. This competition has been fast and furious. One of the most disturbing issues that has surfaced along the way has been the proclivity of generative AI to hallucinate.

    In 2023, IBM dug into the hallucination issue:

    “Generally, if a user makes a request of a GenAI tool, they desire an output that appropriately addresses the prompt (that is, a correct answer to a question). However, sometimes AI algorithms produce outputs that are not based on training data, are incorrectly decoded by the transformer or do not follow any identifiable pattern. In other words, it ‘hallucinates’ the response. The term may seem paradoxical, given that hallucinations are typically associated with human or animal brains, not machines. But from a metaphorical standpoint, hallucination accurately describes these outputs, especially in the case of image and pattern recognition (where outputs can be truly surreal in appearance).”

    Roland Moore-Colyer reported in Live Science earlier this year that hallucinations seem to become more difficult to detect as models become more advanced:

    “However, AI hallucinations present a problem when it comes to delivering accurate and correct information, especially if users take the information at face value without any checks or oversight. ‘This is especially problematic in domains where decisions depend on factual precision, like medicine, law or finance,’ Watson said. ‘While more advanced models may reduce the frequency of obvious factual mistakes, the issue persists in more subtle forms. Over time, confabulation erodes the perception of AI systems as trustworthy instruments and can produce material harms when unverified content is acted upon.’ And this problem looks to be exacerbated as AI advances. ‘As model capabilities improve, errors often become less overt but more difficult to detect.’”

    A variety of attempts have been made to correct the hallucination issue. Most notably, many AI systems added retrieval-augmented generation (RAG), an architecture designed to optimize and enhance AI responses. IBM explains in this article:

    “RAG allows GenAI models to access additional external knowledge bases, such as internal organizational data, scholarly journals and specialized datasets. By integrating relevant information into the generation process, chatbots and other natural language processing (NLP) tools can create more accurate domain-specific content without needing further training.”
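    The pattern IBM describes can be sketched in a few lines: retrieve the most relevant documents from an external knowledge base, then prepend them to the prompt so the model generates from grounded context. The knowledge base, keyword-overlap retriever and prompt template below are hypothetical stand-ins for illustration, not any vendor's actual pipeline; a production system would use vector embeddings and a real model call.

```python
# Minimal sketch of the RAG pattern: retrieve relevant documents from
# an external knowledge base, then build an augmented prompt so the
# generator works from grounded context rather than memory alone.

KNOWLEDGE_BASE = [
    "Transfer credits are evaluated using articulation agreements.",
    "RAG grounds model output in retrieved external documents.",
    "Pell Grant recipients often request financial aid guidance.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query (toy scorer)."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the augmented prompt that would be sent to the model."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How does RAG reduce hallucination?", KNOWLEDGE_BASE)
print(prompt)
```

    Because the model is instructed to answer from the retrieved passages, its output can be checked against identifiable sources, which is what makes domain-specific answers more accurate without retraining.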

    On Sept. 14, OpenAI researchers published a not-yet-peer-reviewed paper, “Why Language Models Hallucinate,” on arXiv. Gemini 2.5 Flash summarized the findings of the paper:

    Key Findings from the Paper

    Systemic Problem: Hallucinations are not simply bugs but a systemic consequence of how AI models are trained and evaluated.

    Evaluation Incentives: Standard evaluation methods, particularly binary grading systems, reward models for generating an answer, even if it’s incorrect, and punish them for admitting uncertainty.

    Pressure to Guess: This creates a statistical pressure for large language models (LLMs) to guess rather than say “I don’t know,” as guessing can improve test scores even with the risk of being wrong.

    Fundamental Flaw: The paper highlights a fundamental flaw in the industry’s approach to evaluating AI progress, prioritizing raw accuracy over reliability and factual correctness.

    This is much like the test-taking strategy for certain standardized tests, in which test takers are advised to guess rather than leave blank a question they cannot reliably answer. Hence, to achieve the best score, models invent answers that sound plausible but for which they have no solid evidence. That, of course, undermines the validity of the response and the credibility of the tool.
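    The incentive the paper identifies follows from simple arithmetic: under binary grading with no penalty for wrong answers, a guess that is right with probability p earns p expected points, while "I don't know" earns zero, so guessing always wins. A toy sketch (the scoring function is illustrative, not the paper's actual benchmark code):

```python
# Toy illustration of the incentive described in the OpenAI paper:
# binary grading awards 1 point for a correct answer and 0 otherwise,
# with no penalty for a wrong answer and no credit for abstaining.

def expected_score(p_correct: float, abstain: bool) -> float:
    """Expected points per question under binary grading."""
    if abstain:
        return 0.0        # "I don't know" earns nothing
    return p_correct      # a guess earns its probability of being right

# Even a 10%-confident guess beats abstaining on this metric,
# so a model trained to maximize the score learns to guess.
for p in (0.1, 0.5, 0.9):
    print(p, expected_score(p, abstain=False), expected_score(p, abstain=True))
```

    Any evaluation built this way statistically favors confident fabrication over honest uncertainty, which is why the paper calls hallucination systemic rather than a bug.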

    This, clearly, is unsatisfactory. With the realization of this shortcoming in the training protocol, the companies have begun tweaking their models to say when they are unable to answer. At the same time, we are seeing the increasing prevalence of “deep research”–capable models such as GPT-5 and Gemini 2.5 Pro, which can dig deeper, uncovering more sources that may hold the desired answer.

    Here’s an example prompted by my lifelong enthusiasm for baseball. In previous “Online: Trending Now” columns, I have advocated that we all provide more context and greater detail in our prompts. In this case, however, I presented a single-sentence prompt without context to three leading frontier generative AI apps: “Who called a players’-only meeting during a rain delay on a Wednesday night in Cleveland?” I included no mention of the sport, the team, the season and so on. Claude Sonnet 4 responded,

    “I’d need more context to answer this question accurately. There have been many baseball games with rain delays on Wednesday nights in Cleveland over the years, and players’-only meetings during such delays aren’t always widely reported or memorable events.

    Could you provide more details about:

    • What year or approximate timeframe this occurred?
    • Which team(s) were involved?
    • Any other context about the situation or why this meeting was significant? With more specific information, I’d be better able to help identify who called the meeting and the circumstances surrounding it.”

    Note that Claude, without deep research, did not hallucinate. It recognized this was likely about baseball and said it needed more information. That is certainly better than making up an answer, as we might have seen in previous months and years.

    GPT-5 thought for 11 seconds and came up with the right answer in a succinct response.

    Gemini 2.5 Pro thought for about the same length of time, dug through multiple sources, found the answer and confirming sources, and then responded in more detail,

    Jason Heyward Rallied Cubs with Players’-Only Meeting During Historic World Series Rain Delay

    During a tense rain delay in the decisive Game 7 of the 2016 World Series, Chicago Cubs right fielder Jason Heyward called a players’-only meeting that has since become a legendary moment in the franchise’s storied history. The brief stoppage in play occurred on a Wednesday night in Cleveland after the ninth inning, with the Cubs and the then–Cleveland Indians locked in a tie. With the weight of a 108-year championship drought on their shoulders, Heyward gathered his teammates in the weight room behind the visitor’s [sic] dugout at Progressive Field … Following the delay and Heyward’s impromptu meeting, the Cubs came out in the top of the tenth inning and scored two runs. The Indians would rally for one run in the bottom of the frame, but the Cubs ultimately held on to win 8-7, securing their first World Series title since 1908.

    Hallucinations are fading from generative AI. We can be more confident in the recently released deep-research models, which provide the kind of reliable research that we depend upon in higher education.
