Tag: college

  • What the saga of Oxford Business College tells us about regulation and franchising

    What the saga of Oxford Business College tells us about regulation and franchising

    One of the basic expectations of a system of regulation is consistency.

    It shouldn’t matter how prestigious you are, how rich you are, or how long you’ve been operating: if you are active in a regulated market then the same rules should apply to all.

    Regulatory overreach can happen when there is public outrage over elements of what is happening in that particular market. The pressure a government feels to “do something” can override processes and requirements – attempting to reach the “right” (political or PR) answer rather than the “correct” (according to the rules) one.

    So when courses at Oxford Business College were de-designated by the Secretary of State for Education, there’s more to the tale than a provider where legitimate questions had been raised about the student experience getting its just deserts. It is a cautionary tale, involving a fascinating High Court judgement and some interesting arguments about the limits of ministerial power, of what happens when political will gets ahead of regulatory processes.

    Business matters

    A splash in The Sunday Times back in the spring concerned the quality of franchised provision from – as it turned out – four Office for Students registered providers taught at Oxford Business College. The story came alongside tough language from Secretary of State for Education Bridget Phillipson:

    I know people across this country, across the world, feel a fierce pride for our universities. I do too. That’s why I am so outraged by these reports, and why I am acting so swiftly and so strongly today to put this right.

    And she was in no way alone in feeling that way. Let’s remind ourselves, the allegations made in The Sunday Times were dreadful. Four million pounds in fraudulent loans. Fake students, and students with no apparent interest in studying. Non-existent entry criteria. And, as we shall see, that’s not even as bad as the allegations got.

    De-designation – removing the eligibility of students at a provider to apply for SLC fee or maintenance loans – is one of the few levers government has to address “low quality” provision at an unregistered provider. Designation comes automatically when a course is franchised from a registered provider: a loophole in the regulatory framework that has caused concern over a number of years. Technically an awarding provider is responsible for maintaining academic quality and standards for its students studying elsewhere.

    The Office for Students didn’t have any regulatory jurisdiction other than pursuing the awarding institutions. OBC had, in fact, tried to register with OfS – withdrawing the application in the teeth of the media firestorm at the end of March.

    So everything depended on the Department for Education overturning precedent.

    Ministering

    It is “one of the biggest financial scandals universities have faced.” That’s what Bridget Phillipson said when presented with The Sunday Times’ findings. She announced that the Public Sector Fraud Authority would coordinate immediate action, and promised to empower the Office for Students to act in such cases.

    In fact, OBC was already under investigation by the Government Internal Audit Agency (GIAA) and had been since 2024. DfE had been notified by the Student Loans Company about trends in the data and other information that might indicate fraud at various points between November 2023 and February 2024 – notifications that we now know were summarised as a report detailing the concerns which was sent to DfE in January 2024. The eventual High Court judgement (the details of which we will get to shortly) outlined just a few of these allegations, which I take from the court documents:

    • Students enrolled in the Business Management BA (Hons) course did not have basic English language skills.
    • Fewer than 50 per cent of students enrolled at the London campus participate; the remainder instead pay staff to record them as in attendance.
    • Students have had bank details altered or new bank accounts opened in their name, to which their maintenance payments were redirected.
    • Staff are encouraging fraud through fake documents sent to SLC, fake diplomas, and fake references. Staff are charging students to draft their UCAS applications and personal statements. Senior staff are aware of this and are uninterested.
    • Students attending OBC do not live in the country. In one instance, a dead student was kept on the attendance list.
    • Students were receiving threats from agents demanding money and, if the students complained, their complaints were often dealt with by those same agents threatening the students.
    • Remote-access utilities were being used during English language tests, with computers controlled remotely to answer the questions on behalf of prospective students.
    • At the Nottingham campus, employees and others were demanding money from students for assignments and to mark their attendance to avoid being kicked off their course.

    At the instigation of DfE, and with the cooperation of OBC, GIAA started its investigation on 19 September 2024, continuing to request information from and correspond with the college until 17 January 2025. An “interim report” detailing emerging findings went to DfE on 17 December 2024; the final report arrived on 30 January 2025. The final report made numerous recommendations about OBC processes and policies, but did not recommend de-designation. That recommendation came in a ministerial submission, prepared by civil servants, dated 18 March 2025.

    Process story

    OBC didn’t get sight of these reports until 20 March 2025, after the decisions were made. It got summaries of both the interim and final reports in a letter from DfE notifying it that Phillipson was “minded to” de-designate. The documentation tells us that GIAA reported that OBC had:

    • recruited students without the required experience and qualifications to successfully complete their courses
    • failed to ensure students met the English language proficiency as set out in OBC and lead provider policies
    • failed to ensure attendance is managed effectively
    • failed to withdraw or suspend students that fell below the required thresholds for performance and/or engagement
    • failed to provide evidence that immigration documents, where required, are being adequately verified.

    The college had 14 days to respond to the summary and provide factual comment for consideration, during which period The Sunday Times published its story. OBC asked DfE for the underlying material that informed the findings and the subsequent decision, and for an extension (it didn’t get all the material, but it got a further five days) – and it submitted 68 pages of argument and evidence to DfE on 7 April 2025. Another departmental ministerial submission (on 16 April 2025) recommended that the Secretary of State confirm the decision to de-designate.

    According to the OBC legal team, these emerging findings were not backed up by the full GIAA reports, and there were concerns about the way a small student sample had been used to generalise across an entire college. Most concerningly, the reports as eventually shared with the college did not support de-designation (though they supported a number of other concerns about OBC and its admission process). This was supported by a note from GIAA regarding OBC’s submission, which – although conceding that aspects of the report could have been expressed more clearly – concluded:

    The majority of the issues raised relate to interpretation rather than factual accuracy. Crucially, we are satisfied that none of the concerns identified have a material impact on our findings, conclusions or overall assessment.

    Phillipson’s decision to de-designate was sent to the college on 17 April 2025, and it was published as a Written Ministerial Statement. Importantly, in her letter, she noted that:

    The Secretary of State’s decisions have not been made solely on the basis of whether or not fraud has been detected. She has also addressed the issue of whether, on the balance of probabilities, the College has delivered these courses, particularly as regards the recruitment of students and the management of attendance, in such a way that gives her adequate assurance that the substantial amounts of public money it has received in respect of student fees, via its partners, have been managed to the standards she is entitled to expect.

    Appeal

    Oxford Business College appealed the Secretary of State’s decision. Four grounds of challenge were pursued:

    • Ground 3: the Secretary of State had stepped beyond her powers in prohibiting OBC from receiving public funds for providing new franchised courses in the future
    • Ground 1: the decision was procedurally unfair, with key materials used by the Secretary of State in making the decision not provided to the college, and the college never being told the criteria it was being assessed against
    • Ground 4: by de-designating courses, DfE breached OBC’s rights under Article 1 of the First Protocol to the European Convention on Human Rights (to peaceful enjoyment of its possessions – in this case the courses themselves)
    • Ground 7: the decision by the Secretary of State had breached the public sector equality duty

    Of these, ground 3 was not determined, as the Secretary of State had clarified that no decision had been taken regarding future courses delivered by OBC. Ground 4 was deemed to be a “controversial” point of law regarding whether a course and its designation status could be a “possession” under ECHR, but could be proceeded with at a later date. Ground 7 was not decided.

    Ground 1 succeeded. The court found that OBC had been subject to an unfair process, where:

    OBC was prejudiced in its ability to understand and respond to the matters of the subject of investigation, including as to the appropriate sanction, and to understand the reasons for the decision.

    Judgement

    OBC itself, or the lawyers it engaged, has perhaps unwisely decided to put the judgement into the public domain – it has yet to be formally published. I say unwisely, because this also puts the initial allegations into the public domain without detailing any meaningful rebuttal from the college – though The Telegraph has reported that the college now plans to sue the Secretary of State for “tens of millions of pounds.”

    The win, such as it is, was entirely procedural. The Secretary of State should have shared more detail of the findings of the GIAA investigation (at both “emerging” and “final” stages) in order that the college could make its own investigations and dispute any points of fact.

    Much of the judgement deals with the criteria by which a sample of 200 students was selected – OBC was not made aware that this was a sample comprising those “giving the greatest cause for suspicion” rather than a random sample – and with OBC’s inability to identify the students whose circumstances or behaviour were mentioned in the report. These were omissions, but nowhere does OBC argue that these were not real students with real experiences.

    Where allegations are made that students might be being threatened by agents and institutional staff, it is perhaps understandable that identifying details might be redacted – though DfE cited the “pressure resulting from the attenuated timetable following the order for expedition, the evidence having been filed within 11 days of that order” for difficulties faced in redacting the report properly. On this point, DfE noted that OBC, using the materials provided, “had been able to make detailed representations running to 68 pages, which it had described as ‘comprehensive’ and which had been duly considered by the Secretary of State”.

    The Secretary of State, in evidence, rolled back from the idea that she could automatically de-designate future courses without specific reason, but this does not change the decisions she has made about the five existing courses delivered in partnership. Neither does it change the fact that OBC, having had five courses forcibly de-designated, and seen the specifics of the allegations underpinning this exceptional decision put into the public domain without any meaningful rebuttal, may struggle to find willing academic partners.

    The other chink of legal light came with an argument that a contract (or subcontract) could be deemed a “possession” under certain circumstances, and that Article 1 of the First Protocol to the European Convention on Human Rights protects the peaceful enjoyment of possessions. The judgement admits that there could be grounds for debate here, but that debate has not yet happened.

    Rules

    Whatever your feelings about OBC, or franchising in general, the way in which DfE appears to have used a carefully redacted and summarised report to remove an institution from the sector is concerning. If the rules of the market permit behaviour that ministers do not like, then these rules need to be re-written. DfE can’t just regulate based on what it thinks the rules should be.

    The college issued a statement on 25 August, three days after the judgement was published. It claims to be engaging with “partner institutions” (named as Buckinghamshire New University, University of West London, Ravensbourne University London, and New College Durham – though all four had already ended their partnerships, with the remaining students being “taught out”) about the future of the students affected by the de-designation decision – many had already transferred to other courses at other providers.

    In fact, the judgement tells us that of 5,000 students registered at OBC on 17 April 2025, around 4,700 had either withdrawn or transferred out of OBC to be taught out. We also learn that 1,500 new students, who had planned to start an OBC-delivered course after 2025, would no longer be doing so. Four lead providers had given notice to terminate franchise agreements between April 2024 and May 2025. Franchise discussions with another provider – Southampton Solent University – underway shortly before the decision to de-designate, had ended.

    OBC currently offers one course itself (no partnership offers are listed) – a foundation programme covering academic skills and English language, including specialisms in law, engineering, and business – which is designed to prepare students for the first year of an undergraduate degree course. It is not clear what award this course leads to, or how it is regulated. It is also expensive – a six-month version (requiring IELTS 5.5 or above) costs an eye-watering £17,500. And there is no information as to how students might enrol on this course.

    OBC’s statement about the court case indicates that it “rigorously adheres to all regulatory requirements”, but it is not clear which (if any) regulator has jurisdiction over the one course it currently advertises.

    If there are concerns about the quality of teaching, or about academic standards, in any provider in receipt of public funds they clearly need to be addressed – and this is as true for Oxford Business College as it is for the University of Oxford. This should start with a clear plan for quality assurance (ideally one that reflects the current concerns of students) and a watertight process that can be used both to drive compliance and take action against those who don’t measure up. Ministerial legal innovation, it seems, doesn’t quite cut it.

    Source link

  • How AI Can Smooth College Credit Transfer

    How AI Can Smooth College Credit Transfer

    Upward transfer is viewed as a mechanism to provide college students with an accessible and affordable on-ramp to higher education through two-year colleges, but breakdowns in the credit-transfer process can hinder a student’s progress toward their degree.

    A recent survey by Sova and the Beyond Transfer Policy Advisory Board found the average college student loses credits transferring between institutions and has to repeat courses they’ve already completed. Some students stop out of higher education altogether because transfer is too challenging.

    CourseWise is a new tool that seeks to mitigate some of these challenges by deploying AI to identify and predict transfer equivalencies using existing articulation agreements between institutions. So far, the tool, part of the AI Transfer and Articulation Infrastructure Network, has been adopted at over 120 colleges and universities, helping to provide a centralized database for credit-transfer processes and automate course matching.

    In the most recent episode of Voices of Student Success, host Ashley Mowreader speaks with Zachary Pardos, an associate professor at the University of California, Berkeley, about how CourseWise works, the human elements of credit transfer and the need for reliable data in transfer.

    An edited version of the podcast appears below.

    Q: As someone who’s been in the education-technology space for some time, can you talk about this boom of ed-tech applications for AI? It seems like it popped up overnight, but you and your colleagues are a testament to the fact that it’s been around for decades.

    Zach Pardos, associate professor at UC Berkeley and the developer of CourseWise

    A: As soon as a chat interface to AI became popularized, feasible, plausible and useful, it opened up the space to a lot of people, including those who don’t necessarily have a computer science background. So in a way, it’s great. You get a lot more accessibility to this kind of application and work. But there have also been precepts—things that the field has learned, things that people have learned who’ve been working in this space for a while—and you don’t want to have to repeat all those same errors. And in many ways, even though the current generation of AI is different in character, a lot of those same precepts and missteps still apply here.

    Q: What is your tool CourseWise and why is it necessary in the ed-tech space?

    A: CourseWise is a spinoff of our higher education and AI work from UC Berkeley. It is meant to be a credit-mobility accelerator for students and institutions. It’s needed because the greatest upward-mobility machine in America, the thing that gets families up in socioeconomic status, is education. And it’s often the two-year–to–four-year transition that does that, where you can start at a more affordable school that grants two-year associate degrees and then transition to a four-year school.

    But that pathway often breaks down. It’s often too expensive to maintain, and so for there to be as many pathways as possible that are legitimate between institutions, between learning experiences, basically acknowledging what a student has learned and not making them do it again, requires us to embrace technology.

    Q: Can you talk more about the challenges with transfer and where course equivalency and transfer pipelines can break down in the transition between the two- and four-year institutions?

    A: Oftentimes, when a student applies to transfer, they’ll have their transcript evaluated [by the receiving institution], and it’ll be evaluated against existing rules.

    Sometimes, when it’s between institutions that have made an effort to establish robust agreements, the student will get most of their credit accepted. But in instances where there aren’t such strong ties, there’s going to be a lot of credit that gets missed. And if the rules don’t exist – if the institution goes through the extra effort, or the student requests extra effort, to consider credit that hasn’t been considered before – this can be a very lengthy process.

    Sometimes that decision doesn’t get made until after the student’s first or second semester, semesters in which they maybe had to decide whether or not to take such a course. So it really is a matter of not enough acknowledgment of existing courses and then that process to acknowledge the equivalency of past learning being a bit too slow to best serve a learner.

    Q: Yeah. Attending a two-year college with the hopes of earning a bachelor’s degree is designed to help students save time and money. So it’s frustrating to hear that some of these students are not getting their transfer equivalencies semesters into their progress at the four-year, because that’s time and energy lost.

    A: Absolutely. It’s unfortunately, in many cases, a false promise that this is the cheaper way to go, and it ends up, in many cases, being more expensive.

    Q: We can talk about the transfer pipeline a lot, but I’ll say one more thing: The free marketplace of higher education and the idea that a student can transfer anywhere is also broken down by a lack of transfer-articulation agreements, where the student’s credits aren’t recognized or they’re only recognized in part. That really hinders the student’s ability to say, “This is where I want to go to college,” because they’re subject to the whims of the institutions and their agreements between each other.

    A: That’s right, and it’s not really an intentional [outcome]. However, systems that have a power dynamic often have a tendency not to change, and that resistance to change, kind of implicitly, is a commitment not to serve students correctly.

    Accreditors Weigh In

    The Council of Regional Accrediting Commissions (C-RAC) supports the exploration and application of AI solutions in learning evaluation and credit transfer, according to a forthcoming statement from the group to be released Oct. 6. The same day, three accrediting commissions – MSCHE, SACSCOC and WSCUC – are holding a public webinar on transfer and learning mobility, with a focus on AI and credit transfer.

    So what you do need is a real type of intervention. Because it’s not in any one spot, you could argue, and you could also make the argument that every institution is so idiosyncratic in its processes that you would have to do a separate study at every institution to figure out, “OK, how do we fix things here?” But what our research is showing on the Berkeley end is that there are regularities. There are patterns in which credit is evaluated, and where you could modify that workflow to both better serve the institution, so it’s not spending so many resources on manually considering equivalencies, and serve the student better by elevating opportunities for credit acceptance in a more efficient way.

    That’s basically what CourseWise is. It’s meant to be an intervention that serves the institution and serves the student by recognizing these common patterns to credit acceptance and leveraging AI to alleviate the stress and friction that currently exists in affording that credit.

    Q: Can you walk us through where CourseWise fits into the workflow? How does it work practically?

    A: CourseWise is evolving in its feature set and has a number of exciting features ahead, which maybe we’ll get to later. But right now, the concrete features are that on the administrator side, on the staff or admissions department side, you upload an institution’s existing articulation agreements—so if you’re a four-year school, it’s your agreements to accept credit from two-year schools.

    So then, when you receive transcripts from prospective transfer students, the system will evaluate that transcript to tell you which courses match existing rules of yours, where you’ve guaranteed credit, and then it’ll also surface courses that don’t already have an agreement.

    If there’s a high-confidence AI match, it’ll bring that to the administrator’s attention and say, “You should consider this, and here’s why.” It’ll also bring to their attention, “Here’s peer institutions of yours that have already accepted that course as course-to-course credit.”
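    The two-stage evaluation Pardos describes – check existing articulation rules first, then surface AI-suggested matches for human review – can be sketched roughly as follows. This is a minimal illustration only: every course code, rule, and the plain-text similarity heuristic standing in for the AI model is invented; the interview does not describe CourseWise’s actual matching internals.

    ```python
    # Hypothetical sketch of rule-based matching with an AI-style fallback.
    # Stand-in similarity: difflib's SequenceMatcher over catalog descriptions.
    from difflib import SequenceMatcher

    # Articulation rules the receiving institution has uploaded,
    # keyed by (sending institution, course code). All entries invented.
    ARTICULATION_RULES = {
        ("City CC", "MATH 270"): "MATH 110 (Linear Algebra)",
        ("City CC", "ENGL 101"): "WRIT 100 (College Writing)",
    }

    # Receiving institution's catalog descriptions, used when no rule exists.
    RECEIVING_CATALOG = {
        "STAT 131": "Introductory statistics: probability, inference, regression",
        "CS 101": "Introduction to programming and computational thinking",
    }

    def evaluate_transcript(sending, courses):
        """Split a transcript into guaranteed matches (existing rules),
        high-confidence suggestions flagged for staff/faculty review,
        and courses with no plausible equivalent."""
        guaranteed, suggested, unmatched = [], [], []
        for code, description in courses:
            rule = ARTICULATION_RULES.get((sending, code))
            if rule:
                # Stage 1: an existing agreement guarantees the credit.
                guaranteed.append((code, rule))
                continue
            # Stage 2: no rule exists - score against the receiving catalog
            # and surface only high-confidence candidates for human review.
            best_course, best_desc = max(
                RECEIVING_CATALOG.items(),
                key=lambda kv: SequenceMatcher(None, description, kv[1]).ratio(),
            )
            score = SequenceMatcher(None, description, best_desc).ratio()
            if score > 0.5:
                suggested.append((code, best_course, round(score, 2)))
            else:
                unmatched.append(code)
        return guaranteed, suggested, unmatched

    transcript = [
        ("MATH 270", "Linear algebra: vector spaces, matrices, eigenvalues"),
        ("STAT 50", "Introductory statistics: probability, inference, regression basics"),
        ("ART 12", "Studio ceramics"),
    ]
    g, s, u = evaluate_transcript("City CC", transcript)
    ```

    The key design point from the interview survives even in this toy version: the AI fallback never grants credit on its own – it only routes a scored suggestion into a human approval workflow.
    
    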

    A screenshot of the CourseWise software, showing a query course, Math 270: Linear Algebra, and how it compares to equivalent linear algebra courses.

    CourseWise compares classes in institutions’ catalogs to identify existing agreements for credit transfer and possible course-to-course transfers to improve student outcomes.

    Q: Where are you getting that peer-to-peer information from?

    A: We think of CourseWise as a network, and that information on what peer institutions are doing is present. We have a considerable number of institutions from the same system. California is one – we have 13 California institutions, and we’re working on more. The other is the State University of New York, SUNY. We have the SUNY system’s central office participating in a pilot. It’ll be up to the individual institutions to adopt the usage. But we have data at the system level, and because of that centralized data, we are able to say, for every SUNY institution that’s considering one of the AI credit acceptance requests, give that context of, “Here are other four-year peer institutions within your system that already accept this – not just as generic elective credit, but accept it as perhaps degree-satisfying, or at least course-to-course credit.”

    Q: That’s awesome; I’m sure it’s a time saver. But where do the faculty or staff members come back into the equation, to review what the AI produced or to make sure that those matches are appropriate?

    A: Faculty are a critical part of the governance of credit equivalency in different systems. They have different roles; often it’s assumed that faculty approve individual courses, and that’s true in most cases. Sometimes it’s committees: different departments will have a committee of faculty, or there may even be a campus standing curricular committee that makes those decisions.

    But what CourseWise is doing right now to incorporate faculty appropriately is allowing the institution to define what that approval workflow is and the rules around it. If it’s a lower-division statistics class, can your admissions staff make that decision on acceptability, even if it doesn’t exist in a current agreement?

    Under what circumstances does it need to be routed to a faculty member to approve? What kind of information should be provided to that faculty member if they don’t have it, making it easy to request information, like requesting a syllabus be uploaded by the sending institution or something to that effect?

    Oftentimes, this kind of approval workflow is done through a series of emails, and so we’re trying to internalize that and increase the transparency. You have different cases that get resolved with respect to pairs of courses, and you can see that case. You can justify why a decision was made, and it can be revisited if there’s a rebuttal to that decision.

    Now, over time, what we hope the field can see as a potential is that, perhaps for certain students – say, those coming from out of state – it’s more a faculty committee that gives feedback to a kind of acceptance algorithm, which is then able to make a call, and they can veto that call. But it creates a default. Like with ChatGPT, there’s an alignment committee that helps give feedback on answers so that they are better in line with what most users find to be a high-quality response. Because there’s no way that we can proactively, manually evaluate every pair of institutions against one another in the United States – there’s just no FTE count that would allow for that – which means that prospective students from out of state can’t get any guarantee if we keep that approach.

    Faculty absolutely have control. We’re setting up the whole workflow so an institution can define that. But one of the options we want to give institutions is the option to say, “Well, if the student is coming from out of state or coming from this or that system, you can default to a kind of faculty-curated AI policy.”

    Q: That’s cool. I’ve heard from some colleges that they have full teams of staff who just review transcripts every single day. Having a centralized database where you can see past experiences of which courses have been accepted or rejected—that can save so much time and energy. And that’s not even half of what CourseWise is doing.

    A: Absolutely, and we work closely with leadership at these institutions to get feedback. One of the people involved in that early feedback is Isaiah Vance at the Texas A&M University system. He’s given us similar feedback: if a new registrar or new leadership comes in and wants to know how good the data is – how decisions were made – having that transparency in the organization can really help an institution get comfortable with those past decisions, or decide how they should change in the future.

    Q: What are some of the outcomes you’ve seen or the feedback you’ve heard from institutions that are using the tool?

    A: We have a study that we’re about to embark upon to measure a before-and-after change in how institutions are doing business and how much it’s saving time or not, versus a control of not having the system when making these decisions.

    We don’t have the results of that yet. We do have a paper out on where articulation officers, for example, are spending their time. They’re spending a lot of time looking for the right course that might articulate. So we definitely have identified there is a problem. It’s an open question to what degree CourseWise is remedying that. We certainly are working nonstop to remedy it, but we’re going to measure that rigorously over the next year.

    Some early feedback is positive, but it’s also interesting that many institutions are spending a lot of time getting that initial data uploaded – catalog descriptions, articulations – and ensuring the rigor and validity of that data. Maybe it’s spread across a number of Excel spreadsheets at some institutions – that problem is real – and so I think it’s going to take a field-level or industry-level effort to make sure that everyone can be on board with that data-wrangling stage.

    Q: That was my hypothesis, that the tool has a lot of benefits once everything’s all set up and they’ve done the labor of love to hunt down and upload all these documents, find out which offices they’re hiding behind.

    A: There are a number of private foundations, funders who are invested in that particular area. So I’m optimistic that there’s a solution out there and that we’ll be a part of that.

    Q: I wonder if we can talk about how this tool can improve the student experience with transfer and what it means to have these efficiencies and database to lean back on.

    A: Right now, most of the activity is with the four-year schools, because they’re the ones uploading the articulations. They’re the ones evaluating transcripts. But in the next four months, we’re releasing a student-facing planner, which will directly affect students at the sending institutions.

    This planner will allow a student who’s at a community college to choose what destination school and major they’re interested in that’s part of the CourseWise network. Then [CourseWise provides] what courses they need to take, or options of courses to take that will transfer into the degree program that they’re seeking, such that when they transfer, they would only have to do the equivalent of two full years of academic work at that receiving school.

    It would also let them know what other majors at other institutions they may want to consider because of how much of the credit that they’ve already taken is accepted into the degree programs there. So the student may be 20 percent of the way in completing their initially intended destination program, but maybe they’re 60 percent of the way to another program that they didn’t realize.
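    The planner arithmetic described above – a student partway through one intended program may be much further along toward another – reduces to comparing completed, transferable courses against each program’s requirement list. A minimal sketch follows; all program names, course codes, and requirement lists are invented for illustration, not taken from CourseWise.

    ```python
    # Hypothetical sketch: rank candidate destination programs by the share of
    # their requirements a student's completed courses already satisfy.

    def program_progress(completed, programs):
        """Return each program's fraction of requirements already met,
        sorted from most to least complete."""
        results = {}
        for name, required in programs.items():
            satisfied = len(set(completed) & set(required))
            results[name] = satisfied / len(required)
        return dict(sorted(results.items(), key=lambda kv: -kv[1]))

    completed = ["MATH 1", "MATH 2", "STAT 1", "ENGL 1", "ECON 1", "ECON 2"]
    programs = {
        # Ten-course lower-division requirement lists (invented)
        "B.S. Computer Science": ["MATH 1", "MATH 2", "CS 1", "CS 2", "CS 3",
                                  "PHYS 1", "PHYS 2", "ENGL 1", "STAT 1", "CS 4"],
        "B.A. Economics": ["MATH 1", "MATH 2", "STAT 1", "ENGL 1", "ECON 1",
                           "ECON 2", "ECON 3", "ECON 4", "HIST 1", "POLI 1"],
    }
    progress = program_progress(completed, programs)
    # Here the student is further along toward Economics than toward their
    # originally intended Computer Science program.
    ```

    In practice the equivalency engine would first map each completed course to its accepted equivalent at the destination; this sketch assumes that mapping has already been done.
    
    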

    Q: What’s next for CourseWise?

    A: So the student part is the navigation; the administrator part, articulation expansion and policy for expansion, is creating the pathways. You need a GPS in order to know what the paths are and how to traverse them as a learner. But there are also states—I mentioned regularities, there are commonalities in how these processes take place, but there are also very specific state-level concerns and structures, like common course numbering, credit for prior learning, an emphasis on community colleges accepting professional certificate programs and so forth.

    I think the future is both increasing that student-facing value, helping with achievement from the student point of view. But then also leveraging the fundamental AI equivalency engine and research to bring in these other ways of acknowledging credit, whether it’s AP credit or job-training credit or certificates or cross-walking between all these different ways in which higher education chooses to speak about learning, right?

    If you have a requirement satisfied in general education in California, how do you bring that to New York, given New York’s general education requirements? Are there crosswalks that can be suggested and established with the aid of AI? And I’m excited about connecting these different sorts of dialects of education using technology.

    Source link

  • Lane Community College Board Apologizes to President

    Lane Community College Board Apologizes to President

    The Lane Community College Board of Education apologized to President Stephanie Bulger at its Tuesday meeting for how members disrespected her on the basis of her race and sex, Lookout Eugene-Springfield reported.

    The board’s apology follows the findings of an investigative report released in August that determined board members were frequently dismissive of Bulger—a Black woman—and often deferred questions to male staff members. The report found that former board chair Zach Mulholland was frequently hostile toward Bulger and often cut her off in their interactions. (He was also found to have physically intimidated a student at a board meeting.) Although Mulholland was censured by the board last month, he has resisted calls to step down.

    Much of the report focused on Mulholland, but other members were also implicated.

    “The board recognizes and is accountable for the harm caused to you, President Bulger,” said Austin Fölnagy, the current board chair, who was also accused of dismissive behavior. “We are deeply sorry for the negative impact our behavior has had on you and the college community at large. President Bulger, please accept the board’s apology for treating you badly.” 

    He added that the board is “committed to learning from our shortcomings” and will take “remedial actions including training in bias, discrimination and harassment” this fiscal year.

    Bulger has been president of the Oregon community college since July 2022.

    Source link

  • Spending Soars, Rankings Fall at New College of Florida

    Spending Soars, Rankings Fall at New College of Florida

    More than two years into a conservative takeover of New College of Florida, spending has soared and rankings have plummeted, raising questions about the efficacy of the overhaul.

    While state officials, including Republican governor Ron DeSantis, have celebrated the death of what they have described as “woke indoctrination” at the small liberal arts college, student outcomes are trending downward across the board: Both graduation and retention rates have fallen since the takeover in 2023.

    Those metrics are down even as New College spends more than 10 times per student what the other 11 members of the State University System spend, on average. While one estimate last year put the annual cost at about $10,000 per student at the system’s other institutions, New College is an outlier, with a head count under 900 and a $118.5 million budget, which adds up to roughly $134,000 per student.

    Now critics are raising new questions about NCF’s reputation, its worth and its future prospects as a public liberal arts college.

    A Spending Spree

    To support the overhaul, the state has largely issued a blank check for New College, with little pushback from officials.

    While some—like Florida Board of Governors member Eric Silagy—have questioned the spending and the state’s return on investment, money keeps flowing. Some critics say that’s because the college is essentially a personal project of the governor.

    “With DeSantis, I think his motivation for the takeover was that he was running for president and he needed some educational showcase. And he picked us because we were an easy target,” one New College of Florida faculty member said, speaking on the condition of anonymity.

    But now, two-plus years and one failed presidential run later, money continues to flow to the college to help establish new athletics programs and recruit larger classes each year. Part of the push behind such recruiting efforts, the faculty member said, is because of retention issues.

    “It’s kind of like a Ponzi scheme: Students keep leaving, so they have to recruit bigger and bigger cohorts of students, and then they say, ‘Biggest class ever’ because they have to backfill all the students who have left,” they said.

    Nathan Allen, a New College alum who served as vice president of strategy at NCF for almost a year and a half after the takeover but has since stepped down, echoed that sentiment, arguing that administrators are spending heavily with little return on investment and have failed to stabilize the institution. He also said they’ve lost favor with lawmakers, who have expressed skepticism in conversations—even though New College is led by former Speaker of the Florida House Richard Corcoran, a Republican.

    “I think that the Senate and the House are increasingly sensitive to the costs and the outcomes,” Allen said. “Academically, Richard’s running a Motel 6 on a Ritz-Carlton budget, and it makes no sense.”

    While New College’s critics have plenty to say, supporters are harder to find.

    Inside Higher Ed contacted three NCF trustees (one of whom is also a faculty member), New College’s communications office, two members of the Florida Board of Governors (including Silagy) and the governor’s press team for this article. None responded to requests for comment.

    A Rankings Spiral

    Since the takeover, NCF has dropped nearly 60 spots among national liberal arts colleges in the U.S. News & World Report Best Colleges rankings, from 76th in 2022 to 135th this year.

    Though critics have long argued that such rankings are flawed and various institutions have stopped providing data to U.S. News, the state of Florida has embraced the measurement. Officials, including DeSantis, regularly tout Florida’s decade-long streak as the top state for higher education, and some public universities have built rankings into their strategic plans. But as most other universities in the state are climbing in the rankings, New College is sliding, a fact unmentioned at a Monday press conference featuring DeSantis and multiple campus leaders.

    Corcoran, the former Republican lawmaker hired as president shortly after the takeover, did not directly address the rankings slide when he spoke at the briefing at the University of Florida. But in his short remarks, Corcoran quibbled over ranking metrics.

    “The criteria is not fair,” he said.

    Specifically, he took aim at peer assessment, which makes up 20 percent of the rankings criteria. Corcoran argued that Florida’s institutions, broadly, suffer from a negative reputation among their peers, whose leaders take issue with the conservative agenda DeSantis has imposed on colleges and universities.

    “This guy has changed the ideology of higher education to say, ‘We’re teaching how to think, not what to think,’ and we’re being peer reviewed by people who think that’s absolutely horrendous,” Corcoran said.

    An Uncertain Future

    As New College’s cost to the state continues to rise and rankings and student outcomes decline, some faculty members and alumni have expressed worry about what the future holds. While some believe DeSantis is happy to keep pumping money into New College, the governor is term limited.

    “It’s important to keep in mind that New College is not a House or Senate project; it’s not a GOP project. It’s a Ron DeSantis project. Richard Corcoran has a constituency of one, and that’s Ron,” Allen said.

    Critics also argue that changes driven by the college’s administration and the State University System—such as reinstating grades instead of relying on the narrative evaluations NCF has historically used and limiting course offerings, among other initiatives—are stripping away what makes New College special. They argue that as it loses traditions, it’s also losing differentiation.

    Rodrigo Diaz, a 1991 New College graduate, said that the Sarasota campus had long attracted quirky students who felt stifled by more rigid academic environments. Now the administration and state are imposing “uniformity,” he said, which he argued will be “the death of New College.”

    And some critics worry that death is exactly what lies ahead for NCF. The anonymous faculty member said they feel “an impending sense of doom” at New College and fear that it could close within the next two years. Allen said he has heard a similar timeline from lawmakers.

    Even Corcoran referenced possible closure at a recent Board of Governors meeting.

    In his remarks, the president emphasized that a liberal arts college should “produce something different.” And “if it doesn’t produce something different, then we should be closed down. But if we are closed down, I say this very respectfully, Chair—then this Board of Governors should be shut down, too,” Corcoran said, noting that many of its members have liberal arts degrees.

    To Allen, that remark was an unforced error that revealed private conversations about closure are likely happening behind closed doors.

    “I think Richard made the mistake of not realizing those conversations haven’t been public. He made them public, but the Board of Governors is very clearly talking to him about that,” he said.

    But Allen has floated an alternative to closure: privatization.

    Founded in 1960, New College was private until it was absorbed by the state in 1975. Allen envisions “the same deal in reverse” in a process that would be driven by the State Legislature.

    “I think that the option set here is not whether it goes private or stays public, I think it’s whether it goes private or closes,” Allen said. “And I think that that is increasingly an open conversation.”

    (Though NCF did not respond to media inquiries, Corcoran has voiced opposition to such a plan.)

    Allen has largely pushed his plan privately, meeting with lawmakers, faculty, alumni and others. Reactions are mixed, but the idea seems to be a growing topic of conversation on campus. The anonymous faculty member said they are increasingly warming to the idea as the only viable solution, given that they believe the other option is closure within the next one to three years.

    “I’m totally convinced this is the path forward, if there is a path forward at all,” they said.

    Diaz said the idea is also gaining momentum in conversations with fellow alumni. He called himself “skeptical but respectful” of the privatization plan and said he has “a lot of doubt and questions.” But Diaz said that he and other alumni should follow the lead of faculty members.

    “Now, if the faculty were to jump on board with the privatization plan, then I think that people like myself—alumni like myself, who are concerned for the future of the college—should support the faculty,” Diaz said. “But the contrary is also true. If the faculty sent up a signal that ‘We don’t like this, we have doubts about this,’ then, in good conscience, I don’t think I could back the plan.”

    Source link

  • Majority of California Community College Students Lack Basic Needs

    Majority of California Community College Students Lack Basic Needs

    Two in three community college students in California lack reliable access to food or housing, according to a new study.

    The 2025 Real College CA Student Survey, led by the Community College League of California, found that 46 percent of students are food insecure and 58 percent are housing insecure, which is higher than national estimates: The most recent study from the Hope Center at Temple University found that 41 percent of all college students are food insecure and 48 percent indicated housing insecurity.

    Community college students in California reported slightly lower rates of basic needs insecurity in this survey than in 2023, but the number of students needing help remains high.

    “It is important to highlight when trends are moving in the right direction, but also that there’s still a lot of work to do,” Katie Brohawn, director of research, evaluation and development at the Research and Planning Group for California Community Colleges, said in a Sept. 24 webinar.

    Methodology

    Over 76,000 community college students responded to the survey, 3,300 of whom completed it in Spanish. The respondents represented 102 of the 116 institutions in the California Community College system.

    The background: For many community college students, financial and mental health concerns can be among the top barriers to completion.

    “Before students can thrive academically, their basic needs must be met,” said Tammeil Gilkerson, chancellor of the Peralta Community College District in Oakland, during the webinar.

    A fall 2023 study from EdSights found that students at public two-year institutions report the highest levels of financial distress, even though those are among the most affordable institutions across sectors.

    One recent study from the Annenberg Institute at Brown University found that nearly 41 percent of community college students experienced food insecurity and 60 percent reported housing insecurity.

    Compared to their four-year peers, community college students are also more likely to be from low-income families, racially minoritized, first-generation, immigrant and adult learners. Each of these groups faces unique challenges in their persistence and retention in higher education.

    The previous Real College CA survey, administered in 2023, helped college leaders and others in the state identify the role basic needs insecurity plays in students’ academic progress and overall success, particularly as the state was recovering from the COVID-19 pandemic, Gilkerson said.

    “While we are no longer in the height of the pandemic, its ripple effects remain and they collide with record housing costs, persistent inflation in food and basic goods, and continued debates about the role of higher education, equity and access in our society,” Gilkerson said.

    The data: The latest survey found that only 38 percent of students had high food security, while 46 percent had low or very low food security. The most common concerns students identified were worrying about food running out before they can afford to purchase more (52 percent) or being unable to afford balanced meals (49 percent).

    Nearly three in five students said they experienced some level of housing insecurity, and one in five reported being homeless in the past 12 months. While only 8 percent of respondents self-identified as homeless, more said they were couch-surfing (16 percent) or staying at a hotel or motel without a permanent home to return to (6 percent).

    Basic needs insecurity also varied by region and institution across the state, with the highest reported rates of food and housing insecurity at 70 percent and 78 percent, respectively. The report did not identify which colleges had the highest and lowest rates of basic needs insecurity.

    Basic needs insecurities disproportionately impact African American and Black students as well as American Indian or Alaska Native students, compared to their peers. Older students (ages 26 to 30), LGBTQ+ students, independent students, Pell Grant recipients, single parents, former foster youth and those with a history of incarceration were also more likely to indicate food or housing insecurity.

    The data also points to a correlation between students’ grades and their rates of basic needs insecurity. While students at all levels had some degree of food or housing insecurity, those earning grades lower than B’s were much more likely to indicate they lacked essential resources.

    “If we really are dedicated to improving the academic success of students in our colleges, it’s the basic means that we need to meet. Because if we don’t do that, it doesn’t matter how wonderful a student you are, you’re not going to be able to succeed at the rate that you would otherwise,” Brohawn said.

    Not every student is aware of or utilizing campus resources that could address these challenges; over one-third of respondents said they were unaware of basic needs supports at their college, and only 25 percent had accessed the Basic Needs Center. Among students who used resources, most did so to obtain food.

    Identifying solutions: Over the past five years, California has made strides to better support learners with basic needs insecurity, recognizing housing challenges as a significant barrier to student success.

    The state launched a rapid rehousing program to support learners at public institutions including the CCC, California State University and University of California systems. A 2022 bill began requiring colleges to stock discounted health supplies, such as toiletries and birth control, addressing students’ basic needs in a new way.

    A pilot program also provides cash to financially vulnerable students at California colleges, including those who were formerly incarcerated, former foster youth and parents.

    The report’s authors recommended providing targeted interventions for vulnerable populations and enhancing accessibility and awareness of supports, as well as advocating for systemic changes, such as increased funding for basic needs initiatives or policies that provide living wages and affordable housing for students.

    Source link

  • NAEP scores for class of 2024 show major declines, with fewer students college ready

    NAEP scores for class of 2024 show major declines, with fewer students college ready

    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    Students from the class of 2024 had historically low scores on a major national test administered just months before they graduated.

    Results from the National Assessment of Educational Progress, or NAEP, released September 9, show that scores for 12th graders declined in math and reading for all but the highest-performing students, and that gaps between high and low performers in math widened. More than half of these students reported being accepted into a four-year college, but the test results indicate that many of them are not academically prepared for college, officials said.

    “This means these students are taking their next steps in life with fewer skills and less knowledge in core academics than their predecessors a decade ago, and this is happening at a time when rapid advancements in technology and society demand more of future workers and citizens, not less,” said Lesley Muldoon, executive director of the National Assessment Governing Board. “We have seen progress before on NAEP, including greater percentages of students meeting the NAEP proficient level. We cannot lose sight of what is possible when we use valuable data like NAEP to drive change and improve learning in U.S. schools.”

    These results reflect similar trends seen in fourth and eighth grade NAEP results released in January, as well as eighth grade science results also released Tuesday.

    In a statement, Education Secretary Linda McMahon said the results show that federal involvement has not improved education, and that states should take more control.

    “If America is going to remain globally competitive, students must be able to read proficiently, think critically, and graduate equipped to solve complex problems,” she said. “We owe it to them to do better.”

    The students who took this test were in eighth grade in March of 2020 and experienced a highly disrupted freshman year of high school because of the pandemic. Those who went to college would now be entering their sophomore year.

    Roughly 19,300 students took the math test and 24,300 students took the reading test between January and March of 2024.

    The math test measures students’ knowledge in four areas: number properties and operations; measurement and geometry; data analysis, statistics, and probability; and algebra. The average score was the lowest it has been since 2005, and 45% of students scored below the NAEP Basic level, while fewer students scored at NAEP Proficient or above.

    NAEP Proficient typically represents a higher bar than grade-level proficiency as measured on state- and district-level standardized tests. A student scoring in the proficient range might be able to pick the correct algebraic formula for a particular scenario or solve a two-dimensional geometric problem. A student scoring at the basic level likely would be able to determine probability from a simple table or find the population of an area when given the population density.

    Only students in the 90th percentile — the highest-achieving students — didn’t see a decline, and the gap between high- and low-performing students in math was wider than on any previous assessment.

    This gap between high and low performers appeared before the pandemic, but has widened in most grade levels and subject areas since. The causes are not entirely clear but might reflect changes in how schools approach teaching as well as challenges outside the classroom.

    Testing officials estimate that 33% of students from the class of 2024 were ready for college-level math, down from 37% in 2019, even as more students said they intended to go to college.

    In reading, students similarly posted lower average scores than on any previous assessment, with only the highest performing students not seeing a decline.

    The reading test measures students’ comprehension of both literary and informational texts and requires students to interpret texts and demonstrate critical thinking skills, as well as understand the plain meaning of the words.

    A student scoring at the basic level likely would understand the purpose of a persuasive essay, for example, or the reaction of a potential audience, while a student scoring at the proficient level would be able to describe why the author made certain rhetorical choices.

    Roughly 32% of students scored below NAEP Basic, 12 percentage points higher than in 1992, while fewer students scored above NAEP Proficient. An estimated 35% of students were ready for college-level work, down from 37% in 2019.

    In a survey attached to the test, students in 2024 were more likely to report having missed three or more days of school in the previous month than their counterparts in 2019. Students who miss more school typically score lower on NAEP and other tests. Higher performing students were more likely to say they missed no days of school in the previous month.

    Students in 2024 were less likely to report taking pre-calculus, though the rates of students taking both calculus and algebra II were similar in 2019 and 2024. Students reported less confidence in their math abilities than their 2019 counterparts, though students in 2024 were actually less likely to say they didn’t enjoy math.

    Students also reported lower confidence in their reading abilities. At the same time, higher percentages of students than in 2019 reported that their teachers asked them to do more sophisticated tasks, such as identifying evidence in a piece of persuasive writing, and fewer students reported a low interest in reading.

    Chalkbeat is a nonprofit news site covering educational change in public schools.



    Source link

  • College admissions in a rapidly evolving world

    College admissions in a rapidly evolving world

    The years ahead will be anything but boring for college admissions officers. From demographic changes and increasing college competition to budget cuts and evolving approaches for admissions requirements — not to mention tectonic federal policy shifts and the rapid proliferation of artificial intelligence — the field is as fluid as ever. 

    Those topics and more were under discussion at the Sept. 18-20 annual conference of the National Association for College Admission Counseling in Columbus, Ohio. 

    While there are many forces outside the control of the admissions office, attendees tuned into the internal challenges and opportunities they’re navigating. Panelists, for instance, dug into data on diversity in college enrollment, how to best prepare future students for college math classes and when to deploy AI in institutional operations. 

    Here’s an in-depth look at some of the most interesting conversations Higher Ed Dive heard at NACAC’s 2025 conference:

    Source link

  • Higher education postcard: Jesus College, Cambridge

    Higher education postcard: Jesus College, Cambridge

    In about 520CE, or so the story goes, Radegund was born, daughter of Bertachar, one of three brother kings of Thuringia.

    Uncle Hermanfrid, one of the other brothers, killed Bertachar; Radegund moved into his household. Hermanfrid allied with another king, Theuderic, to defeat Radegund’s other uncle, Baderic, and thus became sole King of Thuringia. And in so doing he reneged on an agreement with Theuderic.

    I hope you’re paying attention, because there’ll be a short test later.

    Now Theuderic was not the kind to forget a slight, and in 531, when Radegund was 11, he invaded Thuringia, with his brother Clothar. They defeated Hermanfrid, and Radegund was taken into Clothar’s household. She lived in Picardy until 540, when Clothar married her, bringing his total of wives to six. (The other wives were Guntheuca, Chunsina, Ingund, Aregund and Wuldetrada, just in case you think I’m making this up.)

    In 545 Clothar murdered Radegund’s last surviving brother, and that was clearly the last straw, as she fled. She sought the protection of the church, and Medardus, Bishop of Noyon, ordained her as deaconess. In about 560 she founded the abbey of Sainte-Croix near Poitiers, and she died in 587, having reputedly lived an austere, ascetic life, renowned for her healing powers. Or so the story goes.

    Now fast forward 600 years or so. Malcolm IV, King of Scotland and Earl of Huntingdon, visited Poitiers, the site of the cult of the now-sanctified Radegund. He gave ten acres of land to found a priory, dedicated to St Mary and St Radegund. And this land was in what would in time become central Cambridge.

    Now fast forward another 300 years. The priory now had a – ahem – reputation. John Alcock, Bishop of Ely, in whose see the priory sat, was given permission by Pope Alexander VI and King Henry VII to dissolve the priory. This was in 1496; the later description of the priory as a “community of spiritual harlots” may have been the cause; it may also, of course, have been a post facto justification. In any event, the priory was dissolved and a college founded in its place. The College of the Blessed Virgin Mary, Saint John the Evangelist and the glorious Virgin Saint Radegund, near Cambridge, which is now more commonly known as Jesus College, Cambridge, took over the priory buildings, and away it went.

    Bishop Alcock, by the way, gave the college its arms: the three cocks’ heads play on his surname. No sniggering at the back there.

    For hundreds of years Jesus was, in essence, a training college for clergy, and it stayed small. But in 1863 Henry Morgan was appointed tutor of the college, and set about his duties with energy. The railway boom at the time meant that some of the original priory lands could be sold, bringing in cash with which Morgan expanded the college: by 1871 there were four times as many students as ten years previously, and by 1881 the college had nearly doubled in size again. And these students were no longer confined to those seeking a career in the Church of England.

    Let’s have a look at some Jesus College people. (What’s the correct term? Jesuits is logical but it really does have a more specific meaning. Jesusites? Jesusians? I bet there’s a correct term, and I bet someone will comment to say.)

    A good place to start is Thomas Cranmer, Archbishop of Canterbury at the time of Henry VIII, and architect of the English reformation. Cranmer may have attended the college as a student (but probably didn’t); he was certainly reader in divinity at Jesus from 1517 to 1528. He didn’t keep strong connections to the college after moving into court circles as archbishop, but as he was ultimately executed as a traitor (he backed the wrong team in the post-Edward VI power struggle) this may have been no bad thing, for the college at least.

    Let’s then move on to Laurence Sterne, student at the college 1733–37. He became a clergyman, but no one remembers him for that: he arguably invented the English novel with The Life and Opinions of Tristram Shandy, Gentleman, published between 1759 and 1767. If you don’t know it, have a read; it is well worth it. It may change your opinions about just how modern modern writing is.

    Next in the roll of honour is Thomas Malthus, student of the college 1784–88 and fellow 1793–1804. As an economist Malthus was influential. In his Essay on the Principle of Population as it Affects the Future Improvement of Society he argued that population growth was unsustainable, because demand for food would inevitably outstrip supply. It is worth noting that the world population at that time was about 800 million; it is ten times that today. And while food is not fairly distributed across the world, the population crash Malthus argued was inevitable has not come to pass.

    And now let’s move on to Samuel Taylor Coleridge, romantic poet and opium addict, author of The Rime of the Ancient Mariner and Kubla Khan. Coleridge was a student at Jesus. He developed his opium habit while at college, and in his third year dropped out to join the army, under the assumed name of Silas Tomkyn Comberbache. His brother had to pay a bribe to get him out of the army, and although he returned to Cambridge thereafter he never quite graduated.

    In 2019 Jesus appointed its first female master, Sonita Alleyne, having first admitted women as students in 1979. Alleyne was also the first black head of an Oxbridge college, preceding Valerie Amos at University College, Oxford by a year. More generally, the college has a very good run through its history here.

    Jesus is a sporty college, and its boat club is very strong. It holds the most headships of the river in the May and the Lent bumps, across both men’s and women’s boats. (I tried to explain about Cambridge rowing a while ago – here’s the link in case you’re interested.)

    And here’s a jigsaw of the card – hope you enjoy it!

    Source link

  • 3 College Student Retention Strategies to Prioritize at This Time of Year

    3 College Student Retention Strategies to Prioritize at This Time of Year

    Retention is not what you do. It is the outcome of what you do.

    It’s that time of year when retention committees, student success professionals, and leadership teams across the country calculate the retention rate for the fall 2024 cohort and compare it with previous years’ outcomes. Some campuses’ rates have undoubtedly held steady, others have slipped, and some have improved, but the overall conversation is usually about how “it” can be done better for the fall 2025 class.

    Let’s talk about “it” for a minute. Many of you have heard the message that two of our founders, Lee Noel and Randi Levitz, and the student success professionals who have followed in their footsteps, have shared for several decades: Retention is not what you do. “It” is the outcome of what you do. “It” is the result of quality faculty, staff, programs, and services. As you consider improvements that will affect the fall 2025 entering class and beyond, keep in mind the following three student retention strategies and practices.

    1. Assess college student retention outcomes completely

    The first strategy RNL recommends is a comprehensive outcomes assessment. All colleges and universities compute a retention rate at this time of year because it has to be submitted via the IPEDS system as part of the federal requirements. But many schools go above and beyond what is required and compute other retention rates to inform planning. For example, at what rates did you retain special populations, or students enrolled in programs designed to improve student success? To best understand what contributed to the overall retention rate, other outcomes have to be assessed as well. For instance, how many students persisted (returned) but didn’t progress (successfully complete their courses)? Before you finalize the college student retention strategies for your fall 2025 students, be sure you know how your 2024 students persisted and progressed so that strategies can be developed for the year ahead.
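    The distinction between persisting and progressing can be made concrete with a small calculation. Here is a minimal, purely illustrative sketch (the field names and cohort data are hypothetical, not RNL's or IPEDS's actual schema) of computing a retention rate alongside a progression rate, so that students who persisted but didn't progress are visible in the numbers:

```python
# Illustrative sketch: retention vs. progression for a fall cohort.
# "returned" = re-enrolled the following fall; "completed_courses" =
# successfully passed the courses attempted. Field names are hypothetical.

def cohort_rates(students):
    """Compute retention, progression, and persisted-but-not-progressed rates."""
    n = len(students)
    retained = [s for s in students if s["returned"]]
    progressed = [s for s in retained if s["completed_courses"]]
    return {
        "retention_rate": len(retained) / n,
        "progression_rate": len(progressed) / n,
        "persisted_not_progressed": (len(retained) - len(progressed)) / n,
    }

# Toy fall 2024 cohort of four students.
fall_2024 = [
    {"returned": True,  "completed_courses": True},
    {"returned": True,  "completed_courses": False},  # persisted, didn't progress
    {"returned": False, "completed_courses": False},
    {"returned": True,  "completed_courses": True},
]
rates = cohort_rates(fall_2024)
print(rates)
```

    A headline retention rate of 75% here hides that a third of the retained students are not actually on track to complete, which is exactly the kind of gap a comprehensive outcomes assessment is meant to surface.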

    2. Know what worked and what didn’t

    The second strategy we recommend is to consider what worked well during the previous year and what didn’t. Many of us have been in situations where we continue to do the same thing and expect different results, a definition of insanity often attributed to Einstein (though he appears never to have said it). A common example is the academic advising model. RNL has many years of data showing that academic advising is one of the most important college student retention strategies. But doing what you have always done may no longer work with today’s college students; advising is an area that needs constant attention and appropriate improvement. Here are a few questions to consider: Do your academic advising model, its standards of practice, and your outcomes assessment show that students are academically progressing by taking the courses needed for completion? Can you identify an expected graduation date for each of your advisees (one of the expected outcomes of advising)? Establishing rich relationships between advisors and advisees, and providing a quality academic advising experience, can ultimately protect and improve the institution’s graduation rate.

    3. Don’t limit your scope of activity

    Once you have assessed the 2024 class outcomes and the quality of your programs and services, RNL encourages you to think differently about how you will develop college student retention strategies that will impact the 2025 class. Each college has an attrition curve: a distribution of students by their likelihood of being retained. Like any roughly normal distribution, it shows which students are least and most likely to be retained, with the majority of students in the middle of the curve. See the example below:

    [Figure: The retention attrition curve, showing that campuses should focus retention efforts on students who can be influenced to re-enroll.]

    As you consider your current activities, you may find that many of your programs are designed for the students at the tail end of the curve (section A above) or to further support the students who are already likely to persist (section B). Institutions set goals to increase retention rates but then limit the scope of students they are impacting. To have the best return on retention strategies, consider how you can target support to the largest group of students in the middle (section C) who are open to influence on whether they stay or leave, based on what you do or don’t do for them, especially during their first term and their first year at your school. 
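    One way to operationalize this targeting is to bucket students by a modeled probability of re-enrolling. The sketch below is a hypothetical illustration of that idea, not RNL's methodology; the probability thresholds and student data are invented for the example:

```python
# Hypothetical segmentation of an attrition curve into the three sections
# described above: A (very likely to leave regardless of intervention),
# B (very likely to stay regardless), and C (open to influence).
# Thresholds are illustrative only.

def segment(prob_retained, low=0.25, high=0.85):
    """Assign a student to attrition-curve section A, B, or C."""
    if prob_retained < low:
        return "A"   # tail: likely to leave no matter what
    if prob_retained > high:
        return "B"   # tail: likely to persist anyway
    return "C"       # middle: the priority group for outreach

# Toy cohort: student id -> modeled probability of re-enrolling.
cohort = {"s1": 0.10, "s2": 0.50, "s3": 0.92, "s4": 0.60}
priority = [sid for sid, p in cohort.items() if segment(p) == "C"]
print(priority)
```

    The design point is simply that outreach capacity is finite: spending it on section C, where behavior is actually movable, yields a better return than concentrating it on either tail.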

    Onward for the year ahead

    RNL congratulates those of you who have achieved your retention goals for the 2024 cohort. You certainly must have done some things right and must have had student retention strategies that were effective. For those of you who are looking for new directions in planning, consider the three practices outlined above. 

    And if you aren’t currently one of the hundreds of institutions already working with RNL, you may want to implement one or more of the RNL student success tools to support your efforts: the RNL motivational survey instruments, which identify the students who are most dropout-prone and most receptive to assistance; the RNL student retention data analytics, which identify the unique factors that contribute to persistence at your institution; and the RNL satisfaction-priorities surveys, which inform decision making and resource allocation across your campus population. RNL can provide support in all of these areas, along with ongoing consulting services to further direct and guide retention practices that can make a difference in your enrollment numbers and the success of both your students and your institution. Contact me to learn more in any of these areas.

    Note: Thanks to my former colleague Tim Culver for the original development of this content.

    Ask for a complimentary consultation with our student success experts

    What is your best approach to increasing student retention and completion? Our experts can help you identify roadblocks to student persistence and maximize student progression. Reach out to set up a time to talk.



  • The Widening Gap: Income, College, and Opportunity with Zachary Bleemer


    One of the great promises of higher education is that it acts as a social ladder—one that allows students from low-income backgrounds to climb up and reach a higher social and economic status. No one, I think, ever believed it was a guaranteed social leveler, or that children from wealthier families didn’t have an easier time succeeding after college because of their own, and their family’s, social and cultural capital. But most people, in America at least, believed that on the whole it played a positive role in increasing social mobility.

    Over the past couple of decades, though, particularly as student debt has increased, people have begun to wonder if this story about social mobility through college is actually true. That’s a hard question to answer definitively. Data sets that track both student origins and outcomes are few and far between, and it’s also difficult to work out what social mobility used to look like in a quantifiable sense.

    However, this summer economists Sarah Quincy of Vanderbilt University and Zach Bleemer of Princeton University released a paper called Changes in the College Mobility Pipeline Since 1900. The paper overcame some of those data limitations and took a look, spanning more than a century, at the relationship between social mobility and college attendance.

    What they found was sobering. Not only is higher education no longer helping poor students catch up with wealthier ones; the sector’s role as a social elevator actually stopped working back in the 1960s. This seemed like a perfect story for the podcast, and so we invited Zach Bleemer—who you may remember from an episode on race-conscious admissions about two years ago—to join us to discuss it.

    This discussion ranges from the methodological to the expositional. Where does the data come from? What does the data really mean? And are there alternative explanations for the paper’s surprising findings? But enough from me—let’s hear from Zach.


    The World of Higher Education Podcast
    Episode 4.4 | The Widening Gap: Income, College, and Opportunity with Zachary Bleemer

    Transcript

    Alex Usher (AU): Zach, you wrote, with Sarah Quincy, a paper called Changes in the College Mobility Pipeline Since 1900, which looks a long way back. And you argue that the relative premium received by lower-income Americans from higher education has fallen by half since 1960. Take us through what you found—give us the 90-second elevator pitch.

    Zachary Bleemer (ZB): Consider kids who were born in 1900 and were choosing whether or not to go to college in the late 1910s and early 1920s. What we were interested in was that choice, and in particular, following people for the next 20 years after they made it. Some people graduated high school but didn’t go to college, while others graduated high school and chose to go.

    We wanted to compare the differences in early 1930s wages between those two groups—both for kids from lower-income backgrounds and kids from upper-income backgrounds. Now, you might be surprised to learn that there were lower-income kids going to college in the U.S. in the early 1920s, but there were. About 5 to 10% of people from the bottom parental income tercile even then were attending college.

    What we found, when we linked together historical U.S. census records and followed kids forward, is that whether you were low-income or high-income, if you went to college your wages went up a lot. And the degree to which your wages went up was independent of whether you were low-income or high-income—everyone benefited similarly from going to college.

    If you compare that to kids born in the 1980s, who were choosing to go to college in the late 1990s and early 2000s, you see a very different story. Everyone still gains from going to college, but kids from rich backgrounds gain a lot more—more than twice as much as kids from poor backgrounds. And that’s despite the fact they’re making the same choice. They’re going to different universities and studying different things, but when it comes down to the 18-year-old making a decision, those from poor families are just getting less from American higher education now than they did in the past—or compared to kids from rich backgrounds.

    AU: I want to make sure I understand this, because it’s a crucial part of your argument. When you talk about relative premiums—premium compared to what, and relative compared to what?

    ZB: What we always have in mind is the value of college for rich kids, and then asking: how much of that value do poor kids get too? In the early 20th century, and as late as the 1960s, those values were very similar. Lower-income kids were getting somewhere between 80 and 100% of the value of going to college as higher-income kids.

    AU: And by “value,” you mean…

    ZB: That just means how much your wages go up. So, the wage bump for lower-income kids was very similar to that of higher-income kids. Today, though, it’s more like half—or even a little less than half—of the economic value of college-going that lower-income kids receive compared to higher-income kids.

    AU: So in effect, higher education is acting as an engine of greater inequality. That’s what you’re saying?

    ZB: I guess it’s worth saying that lower-income kids who go to college are still getting ahead. But it’s not as much of a pipeline as it used to be. Higher education used to accelerate lower-income kids—not to the same level of income as their higher-income peers; they were never going to catch up—but at least they got the same bump, just from a lower starting point.

    AU: So the gap widens now. But how do you make a claim like that over 120 years? I mean, I sometimes have a hard time getting data for just one year. How do you track college premiums across a period of 120 years? How sound is the empirical basis for this? You mentioned something about linking data to census records, which obviously go back quite a way. So tell us how you constructed the data for this.

    ZB: The first-order answer is that I called up and worked with an economic historian who had much more experience with historical data than I did. Like you said, it’s hard in any period to get high-quality data that links students in high school—especially with information on their parental income—to wage outcomes 10 or 15 years later.

    What we did was scan around for any academic or government group over the last 120 years that had conducted a retrospective or longitudinal survey—where you either follow kids for a while, or you find a bunch of 30-year-olds and ask them questions about their childhood. We combined all of these surveys into a comprehensive database.

    In the early 20th century, that meant linking kids in the 1920 census, when they were still living with their parents, to the same kids in the 1940 census, when they were in their early thirties and working in the labor market. That link has been well established by economic historians and used in a large series of papers.

    By the middle of the 20th century, sociologists were conducting very large-scale longitudinal surveys. The biggest of these was called Project Talent, put together by the American Institutes for Research in 1961. They randomly sampled over 400,000 American high school students, collected a ton of information, and then re-surveyed them between 1971 and 1974 to ask what had happened in their lives.

    In more recent years, there’s been a large set of governmental surveys, primarily conducted by the Departments of Labor and Education. Some of these will be familiar to education researchers—like the National Longitudinal Survey of Youth (NLSY). Others are less well known, but there are lots of them. All we did was combine them all together.

    AU: I noticed in one of the appendices you’ve got about nine or ten big surveys from across this period. I guess one methodological limitation is that they don’t all follow respondents for the same amount of time, and you’d also be limited to questions where the surveys provided relatively similar answers. You never get your dream data, but those would be the big limitations—you’ve got to look for the similarities, and that restricts you.

    ZB: I’d add another restriction. You’re right that, as we filtered down which datasets we could use, the key variables we needed were: parental income when the student was in high school, level of education by age 30, and how much money they made at some point between ages 30 and 35. All of our surveys had those variables.

    We also looked for information about what college they attended and what their college major was. Ideally, the surveys also included some kind of high school test—like the SAT or an IQ test—so we could see what kinds of students from what academic backgrounds were going to college.

    But there was another key limitation. In most of the data before 1950, it was really difficult to get a direct measure of parental income. Instead, we usually had proxies like parental occupation, industry, or level of education—variables that are highly predictive of income, but not income itself.

    So, a lot of the work of the paper was lining up these measures of varying quality from different surveys to make sure the results we report aren’t just noise from mismeasurement, but instead reflect real changes on the ground in American higher education.

    AU: So you ran the data and noticed there was a sharp inflection point—or maybe not sharp, but certainly things started to get worse after 1960. When you first saw that, what were your hypotheses? At that point, you’ve got to start looking at whatever variables you can to explain it. What did you think the answer was, and what did you think the confounding variables might be?

    ZB: My expectation was that two things would primarily explain the change. My background is in studying undergraduate admissions, so I thought the first explanation would be rising meritocracy in admissions. That might have made it harder for lower-income and lower-testing kids to get access to high-quality education. I also thought changes in affirmative action and in access to selective schools for kids from different backgrounds, along with rising tuition that made it harder for lower-income kids to afford those schools, could have played a big role. That was one possible story.

    The second possible story is that it had nothing to do with the causal effect of college at all. Instead, maybe the poor kids who go to college today aren’t as academically strong as they were in the past. Perhaps in the past only the brilliant poor kids went to college, while all the rich kids went regardless of ability. So it could have looked like poor kids were getting a big benefit from college, when in fact those few who made it would have done well anyway.

    It turns out neither of these explanations is the primary driver of rising regressivity. On the test score story, it’s always been the case that rich kids who go to college have relatively higher test scores than rich kids who just graduate high school—and that poor kids who go to college have relatively lower scores compared to their peers. That hasn’t changed since 1960.

    And on the access story, it’s always been the case that rich kids dominate the schools we now think of as “good”—the fancy private universities and the flagship public universities. But over the last 50 years, poor kids have actually slightly increased their representation at those schools, not the other way around. Rising meritocracy hasn’t pushed poor kids out. If anything, the variety of admissions programs universities have implemented to boost enrollment among racial minority and lower-income students has relatively increased their numbers compared to 1950 or 1960.

    AU: You were just making the case that this isn’t about compositional change in where poor students went. I heard you say there are more lower-income students at Harvard, Yale, and MIT than there were 50 or 60 years ago—and I have no doubt that’s true. But as a percentage of all poor students, surely that’s not true. The vast wave of lower-income students, often from minority backgrounds, are ending up in community colleges or non-flagship publics. Surely that has to be part of the story.

    ZB: Yes. It turns out there are three primary trends that explain this rising collegiate regressivity, and you just hit on two of them.

    The first is exactly your point: lower-income students primarily go to satellite public universities, basically all the non–R1 publics. Higher-income students, if they attend a public university, tend to go to the flagship, research-oriented universities.

    I’ll skip talking about Harvard, Yale, and Princeton—almost no one goes to those schools, and they’re irrelevant to the overall landscape.

    AU: Because they’re such a small piece of the pie, right?

    ZB: Exactly. Fewer than 1% of students attend an Ivy Plus school. They don’t matter when we’re talking about American higher education as a whole. The flagships, though, matter a lot. About a third of all four-year college students go to a research-oriented flagship public university.

    What’s happened since 1960 isn’t that poor kids lost access to those schools—it’s that they never really had access in the first place. Meanwhile, those schools have gotten much better over time. If you look at simple measures of university quality—student-to-faculty ratios, instructional expenditures per student, graduation rates—or even our own wage “value-added” measures (the degree to which each university boosts students’ wages), the gap between flagship and non-flagship publics has widened dramatically since the 1960s.

    The flagships have pulled away. They’ve gotten more money—both from higher tuition and from huge federal subsidies, in part for research—and they’ve used that money to provide much more value to the students who attend. And those students tend to be higher income.

    The second trend is what you mentioned: increasing diversion to community colleges. Interestingly, before 1980, community colleges were already well established in the U.S. and enrolled only slightly more lower-income than higher-income students. They actually enrolled a lot of high-income students, and the gap was small. Since the 1980s, though, that gap has grown substantially. There’s been a huge diversion of lower-income students toward community colleges—and those schools just provide lower-value education to the students who enroll.

    AU: At some level this is a sorting story, right? You see that in discussions about American economic geography—that people sort themselves into certain areas. Is that what you’re saying is happening here too?

    ZB: It’s not about sorting inside the four-year sector. It’s about sorting between the two- and four-year sectors. And on top of that, we think there’s fundamentally a story about American state governments choosing to invest much more heavily in their flagship publics—turning them into gem schools, amazing schools—while leaving the other universities in their states behind. Those flagships enroll far more higher-income than lower-income students.

    AU: When I was reading this paper, one thing that struck me was how hard it is to read about American higher education without also reading something about race. The last time you were on, we were talking about SCOTUS and the Students for Fair Admissions v. Harvard decision. But as far as I can tell, this paper doesn’t talk about race. I assume that goes back to our earlier discussion about data limitations—that race just wasn’t captured at some point. What’s the story there?

    ZB: No—we observe race throughout this entire period. In fact, you could basically rewrite our study and ask: how has the relative value of college for white kids compared to Black kids changed over the last hundred years? I suspect you’d see very similar patterns.

    The datasets we’re working with observe both parental income and race, but they aren’t large enough to separately analyze, for example, just white students and then compare lower- and higher-income groups over time. There’s a sense in which you could tell our story in terms of race, or you could tell it in terms of class—and both would be right. At a first-order level, both are happening. And within racial groups, the evidence we’ve been able to collect suggests that class gaps have substantially widened over time.

    Similarly, we show some evidence that even within the lower-income group there are substantial gaps between white and Black students. So in part, I saw this as an interesting complement to the work I’d already done on race. It points out that while race is part of the story, you can also reframe the entire conversation in terms of America’s higher education system leaving lower-income students behind—irrespective of race.

    AU: Right, because it strikes me that 1960 is only six years after Brown v. Board of Education. By the early to mid-1960s, you’d start to see a bigger push of Black students entering higher education, becoming a larger share of the lower-income sector. And a few years later, the same thing with Latino students.

    Suddenly lower-income students are not only starting from further behind, but also increasingly made up of groups who, irrespective of education, face discrimination in the labor market. Wouldn’t that pull things down? Wouldn’t that be part of the explanation?

    ZB: Keep in mind that when we measure wage premiums, we’re always comparing people who went to college with people who only finished high school. So there are Black students on both sides of that comparison, across both lower- and higher-income groups.

    That said, I think your point is well taken. We don’t do any work in the paper specifically looking at changes in the racial composition of students by parental income over this period. One thing we do show is that the test scores of lower-income students who go to college aren’t falling over time. But you’re probably right: while racial discrimination affects both college-goers and non-college-goers, it’s entirely plausible that part of what we’re picking up here is the changing racial dynamics in college-going.

    AU: What’s the range of policy solutions we can imagine here, other than, you know, taking money away from rich publics and giving it to community colleges? That’s the obvious one to me, but maybe there are others.

    ZB: And not just community colleges—satellite publics as well. I’ve spent the last five years of my life thinking about how to get more disadvantaged students into highly selective universities, and what happens when they get there. The main takeaway from that research is that it’s really hard to get lower-income students into highly selective universities. It’s also expensive, because of the financial aid required.

    But once they get into those schools, they tend not only to benefit in terms of long-run wage outcomes, they actually derive disproportionate value. Highly selective schools are more valuable for lower-income kids than for the higher-income kids who typically enroll there.

    What I’ve learned from this project, though, is that the closing of higher education’s mobility pipeline isn’t fundamentally about access. It’s about investments—by state governments, by students, by donors, by all the people and organizations that fund higher education. Over time, that funding has become increasingly centralized in schools that enroll a lot of wealthy students.

    So, the point you brought up—redirecting funds—is important. In California they call it “rebenching”: siphoning money away from high-funded schools and pushing it toward low-funded schools. There’s very little academic research on what happens when you do that, but our study suggests that this century-long trend of unequal investment has disadvantaged low-income students. Potentially moving in the other direction could make a real difference for them.

    AU: Zach, thanks so much for being with us today.

    ZB: My pleasure.

    AU: It just remains for me to thank our excellent producers, Tiffany MacLennan and Sam Pufek, and you, our listeners and readers, for joining us. If you have any questions or comments about today’s podcast, or suggestions for future editions, don’t hesitate to get in touch at [email protected].

    Join us next week when our guest will be Dmitry Dubrovsky, a research scholar and lecturer at Charles University in Prague. He’ll be talking to us about the slow-motion collapse of Russian higher education under Vladimir Putin. Bye for now.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.
