Tag: Universities

  • Universities want more money upfront. DfE wants proof students are really there

    When students get their student maintenance loans, they get the first instalment a lot earlier than their university gets the corresponding tuition fee payment.

    That might look like a curious disparity in the timing of the two payments – but there’s a sound theory to it. Students without savings could face a cashflow problem if it were any other way.

    It’s becoming a problem for universities too. The Office for Students’ (OfS) financial sustainability update report highlights low liquidity levels in the sector – especially during certain points in the annual cycle.

    That matters because universities have to meet minimum liquidity requirements in the registration conditions in England. A failure to maintain those levels can also impact “going concern” status and breach some lending covenants.

    In the past, cash flow imbalances tended to be offset by other income sources, borrowing, or cross-subsidies, such as from international student fees.

    But given how universities operate and the demands on cash before those SLC payments come in, some providers rely disproportionately on these in-arrears payments from SLC-funded students compared to other funding sources.

    For non-SLC funded students, universities typically charge fees upfront (or at least in front-loaded advance instalments) or get payments for stuff like government-funded apprenticeships monthly. Research funding streams also match payments to incurred costs.

    But the SLC’s payment profile for undergrads is 25:25:50 – so universities face significant upfront costs in the first two terms and then wait longer than standard 30-day payment terms to receive funds, forcing them to bridge the gap using other resources.

    So the University Alliance has a proposal – switch those payments to 40:40:20 to improve the sector’s funding position:

    Even if the move was phased first to 33:33:33 and then to 40:40:20 it would have an immediate impact on the current situation which has been adversely impacted by the previous administration’s approach to international student recruitment through restrictive visa policies.
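    To put rough numbers on the difference between those profiles, here’s a minimal sketch comparing cumulative fee receipts per term, assuming the £9,250 undergraduate fee cap – the structure and names here are purely illustrative, not taken from the UA paper:

    ```python
    # Cumulative tuition fee receipts per term under each payment profile,
    # for a single student at the £9,250 fee. Illustrative only.

    FEE = 9_250

    PROFILES = {
        "current (25:25:50)":  (0.25, 0.25, 0.50),
        "phased (33:33:33)":   (1 / 3, 1 / 3, 1 / 3),
        "proposed (40:40:20)": (0.40, 0.40, 0.20),
    }

    for name, shares in PROFILES.items():
        cumulative = 0.0
        by_term = []
        for share in shares:
            cumulative += share * FEE
            by_term.append(f"£{cumulative:,.0f}")
        print(f"{name}: {' -> '.join(by_term)}")
    ```

    By the end of term two, a provider would have banked 80 per cent of the fee under 40:40:20, against half under the current profile – which is exactly the cashflow relief UA is after.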

    The current system is going to have to undergo change anyway, given the potential implications of the LLE. I note in passing that one of the most common student leader manifesto goals this year is better, less front-loaded instalments – surely the principle (and the issue in terms of cashflow) cuts both ways.

    But UA’s proposal might not land in quite the way intended – partly because the Student Loans Company is under pressure to increase yield.

    Leakage

    DfE’s “Tailored Review” of the Student Loans Company back in July 2019 talked of the rapidly increasing size of the student loan book, and the increasing importance and value of having a robust, well-resourced and effective repayment strategy which actively seeks to maximise yield.

    It said that the SLC is hamstrung by IT systems which do not “adequately facilitate the use of smart diagnostics for effective modelling, proactive use of data analytics and more precise customer segmentation” to minimise repayment leakage:

    Indeed, unverified customers account for c. £7bn of uncollected repayments (although many of these would not be in a position to repay)

    September’s SLC board minutes noted that its CEO had been along to DfE’s Audit and Risk Committee, where the department led an item on the student finance loan book, with an emphasis on its “scale and yield potential”.

    And its newly published Business Plan for 2024-25 says it will work with partners in DfE to progress proposals to “improve repayment customer verification rates”, “improve data quality to increase verification and yield” and look at options to apply stronger sanctions to customers not adhering to the terms and conditions of their student finance repayments.

    Some of that is about the SLC’s systems – but one of the problems noted in the National Audit Office’s report into franchising is that there is often “insufficient evidence” that students are attending and engaging with their courses:

    In determining a student’s eligibility for loan payments, and before making payments, SLC uses lead providers’ data to confirm students’ attendance. Lead providers self-assure their own data… there is no effective standard against which to measure student engagement, which attendance helps demonstrate, and there is no legal or generally accepted definition of attendance. Providers themselves determine whether students are meaningfully engaged with their course.

    So in a set of circumstances where the NAO and the Public Accounts Committee (PAC) are already worried about attendance and engagement, and providers are worried about their own cashflow, it seems unlikely that DfE is going to be receptive to a proposal to give providers more of the money early – especially if, in the case of franchised provision, it can’t just claw it back from the lead provider when there’s a problem like the Applied Business Academy.

    As we noted back in October, the government’s response to the NAO and the PAC was that it published guidance on attendance management in May, against which providers can be held to account “in relation to the release of SLC tuition fee payments”.

    The guidance said that there is an “understanding and acceptance” across the sector that providers should have published attendance and engagement policies in place, so that students understand the commitment expected of them and the process a provider follows if attendance expectations are not met.

    It also said that in any circumstance where a provider does not have a published policy, the department “expects” that one will exist from the 2024-25 academic year – but it’s pretty clear from talking to people around the country that that goal hasn’t been meaningfully met in large parts of the sector, at least in terms of a policy that both covers home students and is “auditable”.

    And part of the difficulty there is what is or isn’t meant by “attendance”.

    Attending isn’t always in person

    The Attendance Management guidance says:

    Attendance means participation in a course by a student, including, but not limited to, teaching face-to-face or blended study, in line with a provider’s published attendance policy. A provider should communicate its policy to a student and have an auditable process in place to support the action it may take when a student does not meet attendance expectations.

    It goes on to say that providers have flexibility to ensure every student engages with a course, and that one student and/or course may require greater or lesser attendance than another due to circumstances or content.

    SLC told me that there is no difference between “attendance” and “engagement” – the definition of “attendance” for student finance purposes is active and ongoing engagement. Crucially, it said that “attendance” doesn’t have to mean “in person”, or “studying on campus”.

    But the conflation of “attendance” and “engagement” doesn’t seem to apply when a course is designed and designated. Noting that “blended learning” combines traditional classroom teaching with online learning and independent study, it says that there has been some confusion as to whether these courses should be coded as distance learning courses:

    Courses of any teaching method are distance learning if the students only attend occasionally, for example once a term. If students attend regularly, for example once a week, and follow a structured timetable, the course is not distance learning and you should not add it to CMS as such.

    That paragraph, and the two scenarios that follow it, appear to draw exactly the distinction between (physical) attendance and “engagement” that SLC told me doesn’t exist:

    • Scenario 1: Thomas is studying a BA Hons in sports coaching. His course hours are 30 weeks online study including lectures and tutorials, 2 days per week physical attendance at sports academy, 6 days per year attendance at university. As Thomas needs to attend the sports academy regularly rather than occasionally, this is an in-attendance course.
    • Scenario 2: Kate is studying an HND in Musical Theatre. Her course hours are 30 weeks online study including lectures and tutorials, 3 days per year (1 day per term) attendance at college. As Kate only needs to attend college occasionally rather than regularly, this is a distance learning course.

    The difference between Scenario 2 and the patterns of attendance being seen by many providers around the country this term is that in that scenario, the course is designed not to include regular physical attendance.

    A two-stage process

    SLC told me that whether it’s distance or in-person, engagement on a course is required and confirmation of that engagement is therefore required for SLC to make a fee loan payment on the student’s behalf.

    Ongoing engagement is not part of the definition of in-person or distance learning – that distinction relates to the attributes of the course the provider supplies: whether it has elements of in-person learning, or whether the student is not required to attend in person.

    But the obvious question is as follows. Notwithstanding codified exemptions for disabled students, if a course is designed as blended, would an acceptable “attendance management” policy for a course of that sort allow a student to engage all term, but only occasionally physically attend?

    If yes, and Kate’s HND wasn’t designed as blended, and her mate Kathy was on a course that was designed as blended, that would seem to mean that they could both have exactly the same attendance and engagement pattern, but Kathy would get a maintenance loan while Kate wouldn’t.

    If, on the other hand, a course was designed as blended and as requiring regular in-person attendance, and SLC would expect an attendance/engagement policy to enforce that regular in-person attendance, there are plenty of providers right now falling foul of those expectations.

    So you end up with three categories:

    1. Providers who’ve never really had a proper policy on any of this for home students – let alone enforced one – beyond noticing when a student fails to submit what is often end-of-year summative assessment.
    2. Providers who designed a course as blended where students are in reality engaging in a “distance learning” kind of way – which, while confirming engagement in accordance with the rules, seems hugely unjust to tens of thousands of OU students if nothing else.
    3. Providers who are heavily auditing and requiring physical attendance – partly to achieve parity with international students – at just the point that students are struggling to attend in-person given wider demands on their time.

    It may well be the case that SLC is stuck with the definitions it has – which in part date back to the Teaching and Higher Education Act 1998.

    But if it’s the case that it’s OK for an attendance policy not to actually require regular in-person attendance, it’s hard to believe that whatever size and shape of problem DfE and the SLC have with student loan fraud is going to get anything other than worse.

    And in the end, this all comes back to an old problem – not knowing what’s going on underneath headline non-continuation.

    How far in?

    Remember those risks that OfS identified in its insight brief on subcontracting:

    • Data of extremely poor quality has been submitted in relation to students at some subcontractual partnerships, leading to payments being made to, and on behalf of, students who are not genuinely entitled to them.
    • Delivery partners have lacked clear attendance policies, making it almost impossible for lead providers to submit accurate data to the OfS and the SLC in relation to these students.
    • Students have been encouraged to register for courses that they do not genuinely intend to study, to access public funding through maintenance loans. In some cases, students have withdrawn from courses shortly after receiving these funds; in others there are grounds to doubt that they are continuing to study, despite their termly attendance being confirmed.

    Whether we’re looking at the select group of partnerships that OfS published data on last week, or at directly taught provision, while we know what percentage of UG students don’t make it to the second year, we don’t know what proportion (the sketch after this list expresses the same categories as data):

    • Got instalment 1 of the maintenance loan but didn’t get as far as “engaging” enough for the provider to claim instalment 1 of the tuition fee loan (they don’t show up at all in non-continuation)
    • Engaged enough to enable the provider to claim instalment 1 but not enough to enable the provider to claim instalment 2 (and what proportion of them claimed maintenance instalment 2)
    • Engaged enough to enable the provider to claim instalment 2 but not enough to enable the provider to claim instalment 3 (and what proportion of them claimed maintenance instalment 3)
    • Engaged enough to enable the provider to claim instalment 3 but then failed and was withdrawn
    • Engaged enough to enable the provider to claim instalment 3 and was eligible to progress but then self-withdrew
    • And we don’t know any of the above for subsequent years of study.
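    To make the gap concrete, here’s a minimal sketch of those buckets as a classification function – the record fields and labels are hypothetical, not SLC’s actual data model:

    ```python
    # A sketch of the buckets above. Fields and labels are hypothetical -
    # this is not SLC's actual data model.

    def bucket(fee_instalments_confirmed: int, progressed: bool) -> str:
        """Classify a first-year student by how far engagement was
        confirmed against the three termly fee instalments."""
        if fee_instalments_confirmed == 0:
            return "maintenance instalment 1 only - invisible to non-continuation"
        if fee_instalments_confirmed == 1:
            return "confirmed for fee instalment 1, gone before instalment 2"
        if fee_instalments_confirmed == 2:
            return "confirmed for instalments 1 and 2, gone before instalment 3"
        if not progressed:
            return "confirmed all year, then failed or self-withdrew"
        return "confirmed all year and progressed"
    ```

    The first bucket never shows up in non-continuation figures at all, and the published data doesn’t distinguish between the rest – which is the point.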

    In many ways, what we have here is (yet) another iteration of the stretch involved in a single level playing field. There have been endless tales down the years of Russell Group alumni not really “engaging” at all for entire years, and in some cases entire degree courses – only to pull it out of the bag at the end. It’s an adult environment, after all.

    On the other hand, with another part of the sector now under close scrutiny over variously defined “ghost students” – much as the FE sector saw in the scandals of the 1990s – it doesn’t feel like that kind of legend will be tolerated.

    In terms of the cashflow thing, if DfE and the SLC are going to push more of the money upfront, they’re surely going to want to know the percentages and numbers in each of the above categories.

    And the accuracy of those percentages and numbers involves providers being sure about “enough” engagement – in an auditable way across the diversity of programmes and reasonable adjustments – to tick the box in the data return to SLC three times a year.

    It does feel like there’s some distance to go on all of that as it stands.

  • Twenty-six years of enrollment at Public Research 1 Universities

    A while ago, I made the claim that Oregon State University has the longest streak of consecutive years of fall-over-fall enrollment growth of any public, Research 1 university in America.  A few people have asked me for the source – not exactly doubting the claim, but thinking maybe I had made a mistake.

    This started as a curiosity: I knew from our own internal documentation that the last time OSU (the oldest OSU…not the one in Ohio or Oklahoma) had a fall-to-fall enrollment drop was 1996, and I was curious to see if any other institution could make that claim. So I went to the IPEDS Data Center and downloaded the data. 

    It’s below.  First, a few points: My comparison group is 108 Public, four-year, Research 1 Universities as designated by the Carnegie classification as of Fall 2022, the latest IPEDS data available.  The R1 designation is actually called “Doctoral Institutions: Very High Research Activity” but the nickname R1 is a holdover from prior years.  The category contains those institutions that produce the highest research activity and output among American universities.

    What you can’t see here is that 2023 showed an increase (it’s not yet in IPEDS, but trust me), and that 2024 will also show an increase once our census is final.  So OSU’s record is the 26 shown, plus last year, plus this coming year, for a total of 28 years.

    There are a couple of small anomalies in the data, as there always seem to be.  First, some institutions missed a year or two in their reporting.  Even if those missing years had shown an increase, those institutions’ streaks had already been broken by decreases in other years.  And Penn State has bounced around from being one institution to being several to being one again; this too does not seem to make a difference in the tally.
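    For anyone who wants to check the claim themselves, here’s a minimal sketch of the streak calculation, assuming an IPEDS fall-enrollment extract saved as a CSV with institution, year, and enrollment columns (the file and column names are mine, not IPEDS’s):

    ```python
    import csv
    from collections import defaultdict

    # Current streak of consecutive fall-over-fall enrollment increases,
    # per institution. File and column names are mine - adjust to match
    # your own IPEDS Data Center download.

    enrollment = defaultdict(dict)
    with open("ipeds_fall_enrollment.csv", newline="") as f:
        for row in csv.DictReader(f):
            enrollment[row["institution"]][int(row["year"])] = int(row["enrollment"])

    streaks = {}
    for inst, by_year in enrollment.items():
        years, streak = sorted(by_year), 0
        for prev, curr in zip(years, years[1:]):
            # A reporting gap breaks the streak just like a decrease does.
            if curr == prev + 1 and by_year[curr] > by_year[prev]:
                streak += 1
            else:
                streak = 0
        streaks[inst] = streak

    for inst, n in sorted(streaks.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{inst}: {n} consecutive years of growth")
    ```

    Note that, per the anomalies above, a missing reporting year breaks a streak in this sketch just as a decrease does.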

    The first chart here shows all years and all institutions (you’ll have to scroll down to see them all using the bar on the right.)  You’ll notice that every institution shown (other than OSU) has at least two years with a blue box after 1997, meaning a decrease.  Hover over the box for details.  Orange shows an increase from the prior year.

    The second chart shows individual enrollment data for any institution you select, using the filter at the top.  The bars are colored similarly: Orange for increase, and blue for decrease.

    If I’ve missed something or you think these data points are wrong, let me know.  If a university decided intentionally to shrink, for whatever reason, that’s interesting, but not the point of this visualization. If you want to look at just graduates or undergraduates or men or women or students of color or some other variable, I encourage you to read my posts here and here about how to download IPEDS data for yourself. 

    And as always, leave a comment below if you find something interesting.

  • Six-year graduation rates at four-year colleges and universities

    Graduation rates are always a hot topic in higher education, but often for the wrong reason.  To demonstrate, I offer my parents.  Here is a portrait of Agnes and Mark, married May 4, 1946.

    One night while I was talking to my brother, he asked, “Do you think mom was the way she was because dad was the way he was, or do you think dad was the way he was because mom was the way she was?”  To which I replied, “yes.”  My point, of course, is that in complex relationships, it’s always difficult–impossible, actually–to disentangle cause and effect.

    And, despite the Student Affairs perspective that graduation rates are a treatment effect, I maintain that they are actually a selection effect.  As I’ve written about before, it’s pretty easy to predict a college’s six-year graduation rate if you know one data point: The mean SAT score of the incoming class.  That’s because the SAT rolls a lot of predictive factors into one index number.  These include academic preparation, parental attainment, ethnicity, and wealth, on the student side, and selectivity, on the college side.

    When a college doesn’t have to–or chooses not to–take many risks in the admissions process, they tend to select those students who are more likely to graduate.  That skews the incoming class wealthier (Asian and Caucasian populations have the highest income levels in America), higher ability (the SAT is a good proxy for some measure of academic achievement, and often measures academic opportunity), and second generation.  And when you combine all those things–or you select so few poor students you can afford to fund them fully–guess what?  Graduation rates go up.

    If this doesn’t make any sense, read the Blueberry Speech.  Or ask yourself this question: If 100 MIT students enrolled at your local community college, what percentage would graduate? 
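    If you want to test the selection-effect claim yourself, here’s a minimal sketch of that one-number prediction, assuming you’ve assembled a CSV of institutions with mean SAT and six-year graduation rate columns (the file and column names are mine):

    ```python
    import csv
    import statistics

    # Fit a one-variable least-squares line predicting six-year graduation
    # rate from the mean SAT of the incoming class. Requires Python 3.10+.
    # File and column names are hypothetical - point this at your own
    # IPEDS-derived extract.

    sat, grad = [], []
    with open("institutions.csv", newline="") as f:
        for row in csv.DictReader(f):
            sat.append(float(row["mean_sat"]))
            grad.append(float(row["grad_rate_6yr"]))

    fit = statistics.linear_regression(sat, grad)
    r = statistics.correlation(sat, grad)

    print(f"predicted rate = {fit.intercept:.1f} + {fit.slope:.3f} * mean SAT")
    print(f"r = {r:.2f}; one number explains r^2 = {r * r:.2f} of the variance")
    ```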

    But graduation rates are still interesting to look at, once you have that context.  The visualization below contains three views, using the tabs across the top.  You’ll have to make a few clicks to get the information you need.

    The first view (Single Group) starts with a randomly selected institution, Oklahoma State.  Choose the institution you want by clicking on the box, typing any part of the name, and selecting it.

    The yellow bars show the entering cohorts, and the blue bars the number of graduating students.  Note: The blue bars show graduates in the year shown (4,755 in 2019, which you can see by hovering over the bar) while the yellow bar shows the entering class from six years prior (7,406, who entered in 2013) – a six-year graduation rate of about 64 percent.

    The top row shows graduation rates at all institutions nationally, and the second row shows percentages for the selected institution.  You can choose any single ethnicity at the top left, using the filter.

    The second view (Single Institution) shows all ethnicities at a single institution.  The randomly selected demonstration institution is Gustavus Adolphus College in Minnesota, but of course you can choose any institution in the data set.  Highlight a single ethnic group using the highlight function (I know some people are frightened of interacting with these visualizations… you can’t break anything).

    Note: I start with a minimum of 10 students in each year’s cohorts for the sake of clarity.  Small schools in the Northeast, for instance, might enroll one Asian/Pacific Islander in their incoming class, each year, so the graduation rate could swing wildly from 0% to 100%.  You can change this if you want to live dangerously, by pulling the slider downward.

    The final view (Sectors) shows aggregates of institutional types.  It starts with graduation rates for Hispanic/Latino students, but you can change it to any group you want.

    Have fun learning about graduation rates.  Just don’t assume they are mostly driven by what happens at the institution once the admissions office has its say.

  • Average Net Price at America’s Public Colleges and Universities

    Good news: We have new IPEDS data on average net price.  Bad news: Because IPEDS is IPEDS, it’s data from the 2021-22 Academic Year.

    This is pretty straightforward: Each dot represents a public institution, colored by region, showing the average net price for first-year students entering in that year.  IPEDS breaks out average net price by income bands, so you can see what a family with income of $30,000 to $48,000 pays, for instance, by using the filters at right.

    You can also limit the institutions displayed by using the top three filters: Doctoral institutions in the Far West, or in Illinois, for instance.  If you want to see a specific institution highlighted, use that control.  Just type part of the name of the institution and make your selection.

    Average net price starts with the total Cost of Attendance (COA) – tuition, room, board, books, transportation, and personal expenses – and subtracts all grant aid.  It does not include loans, but of course loans can be used to cover part of the net price, along with other family resources.
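    To make that concrete with made-up numbers: a student facing a $28,000 COA who receives $16,000 in grants shows up here with a net price of $12,000 – however that remaining $12,000 is actually financed.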

    This display is a box and whisker chart, and if you’re not familiar with the format, here is a quick primer: the box spans the 25th to 75th percentiles, the line inside it marks the median, and the whiskers typically extend to points within 1.5 times the interquartile range of the box; dots beyond the whiskers are outliers.

    For the sticklers, the median shown is unweighted.

    As always, let me know what you see here that you find interesting or surprising.

  • First-year student diversity in American colleges and universities, 2018-2022

    I started this visualization to show how first-year classes at the highly rejective colleges had changed since COVID-19 forced them all to go to a test-optional approach for the Fall of 2021.  But it sort of took on a life of its own after that, as big, beefy data sets often do.

    The original point was to help discount the conventional wisdom – propped up by one limited, old study of a small set of colleges – that test-optional policies don’t affect diversity.  I did this post last year, when just one year of data made it fairly clear that they did at the institutions that had the luxury of selecting and shaping their class.

    This year I took it a little further.  The views, using the tabs across the top, show the same trends (now going to 2022) for Public Land Grants, Public Flagships, and the Ivy and Ivy+ institutions.  In each case, choose one using the control.

    Note that I’ve colored the years by national trends: 2018 and 2019 are pre-test-optional, gray is COVID, and blue is post-test-optional.  This is not to say that any individual college either required tests or went test-optional in those years; the colors show the national trend.  And remember these show enrolling students, not admitted students, which is why gray is critical; we know COVID changed a lot of plans, and thus 2020 may be an anomalous year.

    The fourth view shows where students of any selected ethnicity enroll (again, use the dropdown box at the top to make a selection); the fifth view breaks out ethnicity by sector; and the final view allows you to look at diversity by sector and region (to avoid comparing diversity in Idaho, California, and Mississippi, for instance, three states with very different racial and ethnic makeups.)

    On all views, hovering over a data point explains what you’re seeing.

    If you work at a college or university, or for a private company that uses this data in your work, and want to support my time and effort, as well as software and web hosting costs, you can do that by buying me a coffee, here. Note that I won’t accept contributions from students, parents, or high school counselors, or from any company that wants to do business with my employer.

    And, as always, let me know what jumps out at you here. 

  • Tuition and Fees at Flagship and Land Grant Universities over time

    If you believe you can extract strategy from prior activities, I have something for you to try to make sense of here.  This is a long compilation of tuition and fees at America’s Flagship and Land Grant institutions.  If you are not quite sure about the distinction between those two types of institutions, you might want to read this first.  TLDR: Land Grants were created by an act of Congress, and for this purpose, flagships are whoever I say they are.  There doesn’t seem to be a clear definition.

    Further, for this visualization, I’ve only selected the first group of Land Grants, funded by the Morrill Act of 1862.  They tend to be the arch rival of the Flagship, unless, of course, they’re the same institution.

    Anyway, today I’m looking at tuition, something you’d think would be pretty simple.  But there are at least four ways to measure it: tuition alone, and tuition plus required fees, each of which differs for residents and nonresidents.  Additionally, you can use those variables to create all sorts of interesting derived measures, like the gap between resident and nonresident rates, the ratio of that gap to resident tuition, or several ways of looking at how “required fees” change the tuition equation.  All would be–in a perfect world–driven by strategy.  I’m not sure I’d agree that such is the case.
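    As a concrete version of those derived measures, here’s a minimal sketch computing the gap and gap ratio used in the first view (field names are mine, not IPEDS’s):

    ```python
    # Derived tuition measures from the four base variables. Field names
    # are mine, not IPEDS's.

    def measures(res_tuition: float, nonres_tuition: float,
                 res_fees: float, nonres_fees: float) -> dict:
        gap = nonres_tuition - res_tuition
        return {
            "gap": gap,
            # $10,000 resident vs $30,000 nonresident -> gap ratio of 2.0
            "gap_ratio": gap / res_tuition,
            # how much "required fees" add on top of tuition
            "res_fee_share": res_fees / (res_tuition + res_fees),
            "nonres_fee_share": nonres_fees / (nonres_tuition + nonres_fees),
        }

    print(measures(10_000, 30_000, 1_500, 1_500))
    ```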

    Take a look and see if you agree.

    There are five views here, each getting a little more complex.  I know people are afraid to interact with these visualizations, but I promise you can’t break anything.  So click away.

    The first view (using the tabs across the top) compares state resident full-time, first-time, undergraduate tuition and required fees (yellow) to those for nonresidents (red bar).  The black line shows the gap ratio.  For instance, if resident tuition is $10,000 and nonresident tuition is $30,000, the gap is $20,000, and that is 2x the resident rate.  The view defaults to the University of Michigan, but don’t cheat yourself: Use the filter at top left to pick any other school.  (If you’ve read this blog before, you know why Penn State is showing strange data.  It’s not you, it’s IPEDS, so don’t ask.)

    The second tab shows four data points explicitly, and more implicitly.  This view starts with the University of Montana, but the control lets you change that.  On top is resident tuition (purple) and resident tuition and fees (yellow). Notice how the gap between the two varies, suggesting the role of fees in the total cost of attendance.  The bottom shows those figures for nonresidents.

    The third view looks a little crazy.  Choose a value to display at top left, and the visualization will rank all 77 institutions from highest to lowest.  Use the control at top right to highlight an institution and put it in a national context.  Hover over the dots for details in a popup box.  If you want to look at a smaller set of institutions, you can do that, too, using the filters right above the chart.  The fourth view is exactly the same, but shows the actual values rather than the ranks.  As always, hover for details.

    Finally, the fifth view is a custom scatter plot: Choose the variable you want on the x-axis and the variable to plot it against on the y-axis.  Then use the filters to limit the included institutions. As always, let me know what you find that’s interesting.
