Category: Data

  • Student Aid in Canada: The Long View


Note: this is a short version of a paper which has just appeared in issue 72:4 of the Canadian Tax Journal. How short? I’m trying for under 1,000 words. Let’s see how I do.

Canadian student aid programs had existed in scattered forms since just after World War I, but they became a “national program” when the Dominion-Canadian Student Aid Program (DCSAP) was created in 1939. Under this program, the Government of Canada provided block cash grants to provinces, which administered their own scholarship programs providing aid based on some combination of need and merit. The actual details of the program varied significantly from one province to another; at the time, the Government of Canada did not place much importance on “national programs” with common elements.

In 1964, the DCSAP was replaced by the Canada Student Loans Program (CSLP)—recently renamed the Canada Student Financial Assistance Program (CSFAP). This has always been a joint federal-provincial enterprise. But where the earlier program was a block grant, this program would be a single national entity run more or less consistently across all provinces, albeit with provincial governments still in place as responsible administrative agencies able to supplement the plan as they wished. Some provinces opted out of this program and received compensation to run their own solo programs (Quebec at the program’s birth, the Northwest Territories in 1984 and Nunavut in 1999). The others, for the most part, built grant programs that kicked in once a student had exhausted their Canada Student Loan eligibility.

Meanwhile, a complementary student aid program grew up in the tax system, mainly because it was a way to give money to students that didn’t involve negotiations with provinces. Tuition fees plus a monthly education amount were made into a tax deduction in 1961 and then converted to a tax credit in 1987. Registered Education Savings Plans (RESPs), which are basically tax-free growth savings accounts, showed up in 1971.

Although the CSLP was made somewhat more generous over time in order to keep up with rising student costs, program rules went largely unchanged between 1964 and 1993. Then, during the extremely short Kim Campbell government, a new system came into being. The federal government decided to make loans much larger, but also to force participating provinces to start cost-sharing in a different manner—basically, they had to step up from a student’s first dollar of need instead of just taking students with high need. Since this was the era of stupidly high deficits, provinces responded to these additional responsibilities by cutting the generosity of their programs, transforming them from pure grants into forgivable loans. For the rest of the decade, student debt rose—in some cases quite quickly: in total, loans issued doubled between 1993 and 1997.

    And then, everything went into reverse.

In a series of federal budgets between 1996 and 2000, billions of dollars were thrown into grants, tax credits and a new program called “Canada Education Savings Grants,” a form of matching grant for contributions to RESPs. Grants and total aid rose; loans issued fell by a third, mainly between 1997 and 2001 (a recovering economy helped quite a bit). Tax expenditures soared; thanks to a rule change allowing tax credits to be carried forward, students could either keep more of their work income or reduce their taxes once they started working.

    Since this period of rapid change at the turn of the century, student aid has doubled in real terms. And nearly all of that has been an increase in non-repayable aid. Institutional scholarships? Tripled. Education scholarships? Quadrupled. Loans? They are up, too, but there the story is a bit more complicated.

    Figure 1: Student Aid by Source, Canada, 1993-94 to 2022-23, in thousands of constant $2022

    For the period from about 2000 to 2015, all forms of aid were increasing at about inflation plus 3%. Then, in 2016, we entered another period of rapid change. The Governments of Canada and Ontario eliminated a bunch of tax credits and re-invested the money into grants. Briefly, this led to targeted free tuition in Ontario, before the Ford government took an axe to the system. Then, COVID hit and the CSFAP doubled grants. Briefly, in 2020-21, total student aid exceeded $23 billion/year (the figure above does not include the $4 billion per year paid out through the Canada Emergency Student Benefit), with less than 30% of it made up of loans.

One important thing to understand about all this is that while the system became much larger and much less loan-based, something else was going on, too. It was becoming much more federal. Over the past three decades, provincial outlays have risen about 30% in real terms; meanwhile, federal ones have quadrupled. In the early 1990s, the system was about 45-55 federal-provincial; now, it’s about 70-30 federal. It’s a stunning example of “uploading” of responsibilities in an area of shared jurisdiction.

Figure 2: Government Student Aid by Source, Canada, 1993-94 to 2022-23, in thousands of constant $2022

    So there you go: a century of Canadian student aid in less than 850 words. Hope you enjoyed it.

    Source link

  • Institutions may be holding themselves back by not sharing enough data


    Wonkhe readers need little persuasion that information flows are vital to the higher education sector. But without properly considering those flows and how to minimise the risk of something going wrong, institutions can find themselves at risk of substantial fines, claims and reputational damage. These risks need organisational focus from the top down as well as regular review.

    Information flows in higher education occur not only in teaching and research but in every other area of activity such as accommodation arrangements, student support, alumni relations, fundraising, staff and student complaints and disciplinary matters. Sometimes these flows are within organisations, sometimes they involve sharing data externally.

    Universities hold both highly sensitive research information and personal data. Examples of the latter include information about individuals’ physical and mental health, family circumstances, care background, religion, financial information and a huge range of other personal information.

The public narrative on risks around data tends to focus on examples of inadvertently sharing protected information – such as the recent case of the Information Commissioner’s decision to fine the Police Service of Northern Ireland £750,000 in relation to the inadvertent disclosure of personal information about over 9,000 officers and staff in response to a freedom of information request. The same breach has also resulted in individuals bringing legal claims against the PSNI, with media reports suggesting a potential bill of up to £240m.

    There is also the issue of higher education institutions being a target for cyber attack by criminal and state actors. Loss of data through such attacks again has the potential to result in fines and other regulatory action as well as claims by those affected.

    Oversharing and undersharing

    But inadvertent sharing of information and cyberattacks are not the only areas of risk. In some circumstances a failure to ensure that information is properly collected and shared lawfully may also be a risk. And ensuring effective and appropriate flows of information to the governing body is key to it being able to fulfil its oversight function.

    One aspect of the tragic circumstances mentioned in the High Court appeal ruling in the case concerning Natasha Abrahart is the finding that there had been a failure to pass on information about a suicide attempt to key members of staff, which might have enabled action to be taken to remove pressure on Natasha.

Another area of focus concerns sharing of information related to complaints of sexual harassment and misconduct and subsequent investigations. OfS Condition E6 and its accompanying guidance, which come fully into effect on 1 August 2025, include measures on matters such as reporting potential complaints and the sensitive handling and fair use of information. The condition and guidance require the provider to set out comprehensively, and in an easy-to-understand manner, how it ensures that those “directly affected” by decisions are directly informed about those decisions and the reasons for them.

    There are also potential information flows concerning measures intended to protect students from any actual or potential abuse of power or conflict of interest in respect of what the condition refers to as “intimate personal relationships” between “relevant staff members” and students.

    All of these data flows are highly sensitive and institutions will need to ensure that appropriate thought is given to policies, procedures and systems security as well as identifying the legal basis for collecting, holding and sharing information, taking appropriate account of individual rights.

    A blanket approach will not serve

    Whilst there are some important broad principles in data protection law that should be applied when determining the legal basis for processing personal data, in sensitive cases like allegations of sexual harassment the question of exactly what information can be shared with another person involved in the process often needs to be considered against the particular circumstances.

Broadly speaking, in most cases where sexual harassment or mental health support is concerned, the legislation will require at minimum both a lawful basis and a condition for processing “special category” data and/or data that includes potential allegations of a criminal act. Criminal offences and allegations data and special category data (which includes data relating to an individual’s health, sex life and sexual orientation) are subject to heightened controls under the legislation.

    Without getting into the fine detail it can often be necessary to consider individuals’ rights and interests in light of the specific circumstances. This is brought into sharp focus when considering matters such as:

    • Sharing information with an emergency contact in scenarios that might fall short of a clear “life or death” situation.
    • Considering what information to provide to a student who has made a complaint about sexual harassment by another student or staff member in relation to the outcome of their complaint and of any sanction imposed.

It’s also important not to forget other legal frameworks that may be relevant to data flows. These include express or implied duties of confidentiality that can arise where sensitive information is concerned. Careful thought needs to be given to making clear in relevant policies and documents when it is envisaged that information might need to be shared, provided the law permits it.

    A range of other legal frameworks can also be relevant, such as consumer law, equality law and freedom of information obligations. And of course, aside from the legal issues, there will be potential reputational and institutional risks if something does go wrong. It’s important that senior management and governing bodies have sufficient oversight and involvement to encourage a culture of organisational awareness and compliance across the range of information governance issues that can arise.

    Managing the flow of information

    Institutions ought to have processes to keep their data governance under review, including measures that map out the flows and uses of data in accordance with relevant legal frameworks. The responsibility for oversight of data governance lies not only with any Data Protection Officer, but also with senior management and governors who can play a key part in ensuring a good data governance culture within institutions.

Compliance mechanisms also need regular review and refresh, including matters such as how privacy information is provided to individuals in a clear and timely way. Data governance needs to be embedded throughout the lifecycle of each item of data. And where new activities, policies or technologies are being considered, data governance needs to be a central part of project plans at the earliest stages, to ensure that appropriate due diligence and other compliance requirements – such as data processing agreements or data protection impact assessments – are in place.

    Effective management of the flow ensures that the right data gets in front of the right people, at the right time – and means everyone can be confident the right balance has been struck between maintaining privacy and sharing vital information.

    This article is published in association with Mills & Reeve.

    Source link

  • Data futures, reviewed | Wonkhe


    As a sector, we should really have a handle on how many students we have and what they are like.

    Data Futures – the multi-year programme that was designed to modernise the collection of student data – has become, among higher education data professionals, a byword for delays, stress, and mixed messages.

It was designed to deliver in-year data (so 2024-25 data arriving within the 2024-25 academic year) three times a year, drive efficiency in data collection (by allowing for process streamlining and automation), and remove “data duplication” (becoming a single collection that could be used for multiple purposes by statutory customers and others). To date it has achieved none of these benefits, and has instead (for 2022-23 data) driven one of the sector’s most fundamental pieces of data infrastructure into such chaos that all forward uses of data require heavy caveats.

    The problem with the future

    In short – after seven years of work (at the point the review was first mooted), and substantial investment, we are left with more problems than we started with. Most commentary has focused on four key difficulties:

    • The development of the data collection platform, starting with Civica in 2016 and later taken over by Jisc, has been fraught with difficulties, frequently delayed, and experienced numerous changes in scope
    • The documentation and user experience of the data collection platform has been lacking. Rapid changes have not resulted in updates for those who use the platform within providers, or those who support those providers (the HESA Liaison team). The error handling and automated quality rules have caused particular issues – indeed the current iteration of the platform still struggles with fields that require responses involving decimal fractions.
• The behaviour of some statutory customers – frequently modifying requirements, changing deadlines, and putting unhelpful regulatory pressure on providers – has not helped matters.
    • The preparedness of the sector has been inconsistent between providers and between software vendors. This level of preparedness has not been fully understood – in part because of a nervousness among providers around regulatory consequences for late submissions.

    These four interlinked strands have been exacerbated by an underlying fifth issue:

    • The quality of programme management, programme delivery, and programme documentation has not been of the standards required for a major infrastructure project. Parts of this have been due to problems in staffing, and problems in programme governance – but there are also reasonable questions to be asked about the underlying programme management process.

    Decisions to be made

    An independent review was originally announced in November 2023, overlapping a parallel internal Jisc investigation. The results we have may not be timely – the review didn’t even appear to start until early 2024 – but even the final report merely represents a starting point for some of the fundamental discussions that need to happen about sector data.

    I say a “starting point” because many of the issues raised by the review concern decisions about the projected benefits of doing data futures. As none of the original benefits of the programme have been realised in any meaningful way, the future of the programme (if it has one) needs to be focused on what people actually want to see happen.

The headline is in-year data collection. To the external observer, it is embarrassing that other parts of the education sector can return data on a near-real-time basis while higher education cannot – universities update the records they hold on students on a regular basis, so it should not be impossible to update external data too. It should not come as a surprise, then, that the review poses the question:

    As a priority, following completion of the 2023-24 data collection, the Statutory Customers (with the help of Jisc) should revisit the initial statement of benefits… in order to ascertain whether a move to in-year data collection is a critical dependent in order to deliver on the benefits of the data futures programme.

This isn’t just an opportunity for regulators to consider their shopping list – a decision to continue needs to be swiftly followed by a cost-benefit analysis, reassessing the value of in-year collection and determining whether or when to pursue it. And the decision is that there will, one day, be in-year student data. In a joint statement the four statutory customers said:

    After careful consideration, we intend to take forward the collection of in-year student data

    highlighting the need for data to contribute to “robust and timely regulation”, and reminding institutions that they will need “adequate systems in place to record and submit student data on time”.

    The bit that interests me here is the implications for programme management.

    Managing successful programmes

    If you look at the government’s recent record in delivering large and complex programmes you may be surprised to learn of the existence of a Government Functional Standard covering portfolio, programme, and project management. What’s a programme? Well:

    A programme is a unique, temporary, flexible organisation created to co-ordinate, direct and oversee the implementation of a set of projects and other related work components to deliver outcomes and benefits related to a set of strategic objectives

    Language like this, and the concepts underpinning it come from what remains the gold standard programme management methodology, Managing Successful Programmes (MSP). If you are more familiar with the world of project management (project: “a unique temporary management environment, undertaken in stages, created for the purpose of delivering one or more business products or outcomes”) it bears a familial resemblance to PRINCE2.

    If you do manage projects for a living, you might be wondering where I have been for the last decade or so. The cool kids these days are into a suite of methodologies that come under the general description of “agile” – PRINCE2 these days is seen primarily as a cautionary tale: a “waterfall” (top down, documentation centered, deadline focused) management practice rather than an “iterative” (emergent, development centered, short term) one.

    Each approach has strengths and weaknesses. Waterfall methods are great if you want to develop something that meets a clearly defined need against clear milestones and a well understood specification. Agile methods are a nice way to avoid writing reports and updating documentation.

    Data futures as a case study

    In the real world, the distinction is less clear cut. Most large programmes in the public sector use elements of waterfall methods (regular project reports, milestones, risk and benefits management, senior responsible owners, formal governance) as a scaffold in which sit agile elements at a more junior level (short development cycle, regular “releases” of “product” prioritised above documentation). While this can be done well it is very easy for the two ideologically separate approaches to drift apart – and it doesn’t take much to read this into what the independent review of data futures reveals.

    Recommendation B1 calls, essentially, for clarity:

    • Clarity of roles and responsibilities
    • Clarity of purpose for the programme
    • Clarity on the timetable, and on how and when the scope of the programme can be changed

    This is amplified by recommendation C1, which looks for specific clarifications around “benefits realisation” – which itself underpins the central recommendation relating to in-year data.

In classic programme management (like MSP) the business case will include a map of programme benefits: that is, all of the good things that will come about as a result of the hard work of the programme. Like the business case’s risk register (a list of all the bad things that might happen and what could be done if they did) it is supposed to be regularly updated and signed off by the Programme Board – which is made up of the most senior staff responsible for the work of the programme (the Senior Responsible Owners, in the lingo).

The statement of benefits languished for some time without a full update (there was an incomplete attempt in February 2023, and a promise to make another one after the completed 2022-23 collection – we are not told whether the second ever happened). In proper, grown-up programme management this is supposed to be done in a systematic way: at every programme board meeting you review the benefits and the risk register. It’s dull (most of the time!) but it is important. The board needs an eye on whether the programme still offers value overall (based on an analysis of projected benefits). And if the scope needed to change, the board would have the final say on that.

    The issue with Data Futures was clarity over whether this level of governance actually had the power to do these things, and – if not – who was actually doing them. The Office for Students latterly put together quite a complex and unwieldy governance structure, with a quarterly review board having oversight of the main programme board. This QRB was made up of very senior staff at the statutory customers (OfS, HEFCW, SFC, DoE(NI)), Jisc, and HESA (plus one Margaret Monckton – now chair of this independent review! – as an external voice).

The QRB oversaw the work of the programme board – meaning that decisions made by the senior staff nominally responsible for the direction of the programme were often second-guessed by their direct line managers. The programme board was supposed to have its own assurance function and an independent observer – it had neither (despite the budget being there for it).

    Stop and go

    Another role of the board is to make what are more generally called “stop-go” decisions, and are here described as “approval to proceed”. This is an important way of making sure the programme is still on track – you’d set (in advance) the criteria that needed to be fulfilled in terms of delivery (was the platform ready, had the testing been done) before you moved on to the next work package. Below this, incremental approvals are made by line managers or senior staff as required, but reported upwards to the board.

What seems to have happened a lot in the Data Futures programme is what’s called conditional approval – where some of these conditions were waived based on assurances that the remaining required work would be completed. This is fine as far as it goes (not everything lines up all the time) but as the report notes:

    While the conditions of the approvals were tracked in subsequent increment approval documents, they were not given a deadline, assignee or accountable owner for the conditions. Furthermore, there were cases where conditions were not met by the time of the subsequent approval

Why would you do that? Well, you’d be tempted if you had another board above you – comprising very senior staff and key statutory customers – concerned about the very public problems with Data Futures and looking for progress. The Quarterly Review Board (QRB), as it turned out, only ended up making five decisions (and in three of these cases it just punted the issue back down to the programme board – the other two, for completists, were to delay plans for in-year collection).

    What it was meant to be doing was “providing assurance on progress”, “acting as an escalation point” and “approving external assurance activities”. As we’ve already seen, it didn’t really bother with external assurance. And on the other points the review is damning:

    From the minutes provided, the extent to which the members of the QRG actively challenged the programme’s progress and performance in the forum appears to be limited. There was not a clear delegation of responsibilities between the QRG, Programme Board and other stakeholders. In practice, there was a lack of clarity also on the role of the Data Futures governance structure and the role of the Statutory Customers separately to the Data Futures governance structure; some decisions around the data specification were taken outside of the governance structure.

    Little wonder that the section concludes:

    Overall, the Programme Board and QRG were unable to gain an independent, unbiased view on the progress and success of the project. If independent project assurance had been in place throughout the Data Futures project, this would have supported members of the Programme Board in oversight of progress and issues may have been raised and resolved sooner

    Resourcing issues

    Jisc, as developer, took on responsibility for technical delivery in late 2019. Incredibly, Jisc was not provided with funding to do this work until March 2020.

As luck would have it, March 2020 saw the onset of a series of lockdowns and a huge upswing in demand for the kind of technical and data skills needed to deliver a programme like Data Futures. Jisc struggled to fill key posts, most notably running for a substantial period of time without a testing lead in post.

If you think back to the 2022-23 collection, the accepted explanation around the sector for what – at heart – had gone wrong was a failure to test “edge cases”. Students, it turns out, are complex and unpredictable things – with combinations of characteristics and registrations that you might not expect to find. A properly managed programme of testing would have focused on these edge cases – and there would have been fewer issues when the collection went live.

    Underresourcing and understaffing are problems in their own right, but these were exacerbated by rapidly changing data model requirements, largely coming from statutory customers.

To quote the detail from the report:

    The expected model for data collection under the Data Futures Programme has changed repeatedly and extensively, with ongoing changes over several years on the detail of the data model as well as the nature of collection and the planned number of in-year collections. Prior to 2020, these changes were driven by challenges with the initial implementation. The initial data model developed was changed substantially due to technical challenges after a number of institutions had expended significant time and resource working to develop and implement it. Since 2020, these changes were made to reflect evolving requirements of the return from Statutory Customers, ongoing enhancements to the data model and data specification and significantly, the ongoing development of quality rules and necessary technical changes determined as a result of bugs identified after the return had ‘gone live’. These changes have caused substantial challenges to delivery of the Data Futures Programme – specifically reducing sector confidence and engagement as well as resulting in a compressed timeline for software development.

    Sector readiness

    It’s not enough to conjure up a new data specification and platform – it is hugely important to be sure that your key people (“operational contacts”) within the universities and colleges that would be submitting data are ready.

At a high level, this did happen – there were numerous surveys of provider readiness, and the programme also worked with the small number of software vendors that supply student information systems to the sector. This formal programme communication came alongside the more established links between the sector and the HESA Liaison team.

    However, such was the level of mistrust between universities and the Office for Students (who could technically have found struggling providers in breach of condition of registration F4), that it is widely understood that answers to these surveys were less than honest. As the report says:

    Institutions did not feel like they could answer the surveys honestly, especially in instances where the institution was not on track to submit data in line with the reporting requirements, due to the outputs of the surveys being accessible to regulators/funders and concerns about additional regulatory burden as a result.

The decision to scrap a planned mandatory trial of the platform, taken in March 2022 by the Quarterly Review Group, was ostensibly made to reduce burden – but, coupled with the unreliable survey responses, it meant that HESA was unable to identify cases where support was needed.

    This is precisely the kind of risk that should have been escalated to programme board level – a lack of transparency between Jisc and the board about readiness made it harder to take strategic actions on the basis of evidence about where the sector really was. And the issue continued into live collection – because Liaison were not made aware of common problems (“known issues”, in fact) the team often struggled with out-of-date documentation: meaning that providers got conflicting messages from different parts of Jisc.

Liaison, for their part, dealt with more than 39,000 messages between October and December 2023 (the peak of issues raised during the collection process) – and even given the problems noted above, they resolved 61 per cent of queries on the first try. Given the level of stress in the sector (queries came in at all hours of the day) and the longstanding and special relationship that data professionals have with HESA Liaison, you could hardly criticise that team for making the best of a near-impossible situation.

    I am glad to see that the review notes:

    The need for additional staff, late working hours, and the pressure of user acceptance testing highlights the hidden costs and stress associated with the programme, both at institutions and at Jisc. Several institutions talked about teams not being able to take holidays over the summer period due to the volume of work to be delivered. Many of the institutions we spoke to indicated that members of their team had chosen to move into other roles at the institution, leave the sector altogether, experienced long term sickness absence or retired early as a result of their experiences, and whilst difficult to quantify, this will have a long-term impact on the sector’s capabilities in this complex and fairly niche area.

    Anyone who was even tangentially involved in the 2022-23 collection, or attended the “Data Futures Redux” session at the Festival of Higher Education last year, will find those words familiar.

    Moving forward

    The decision on in-year data has been made – it will not happen before the 2026-27 academic year, but it will happen. The programme delivery and governance will need to improve, and there are numerous detailed recommendations to that end: we should expect more detail and the timeline to follow.

    It does look as though there will be more changes to the data model to come – though the recommendation is that this should be frozen 18 months before the start of data collection which by my reckoning would mean a confirmed data model printed out and on the walls of SROC members in the spring of 2026. A subset of institutions would make an early in-year submission, which may not be published to “allow for lower than ideal data quality”.

On arrangements for the 2024-25 and 2025-26 collections there are no firm recommendations – it is hoped that data model changes will be minimal and that the time will be used to ensure that the sector and Jisc are genuinely ready for the advent of the data future.

    Source link

  • Student Debt by Ethnicity | HESA


    Hi all. Just a quick one today, this time on some data I recently got from StatsCan.

We know a fair bit about student debt in Canada, especially with respect to distribution by gender, type of institution, province, etc. (Chapter 6 of The State of Postsecondary Education in Canada is just chock full of this kind of data if you’re minded to take a deeper dive). But to my knowledge no one has ever pulled and published the data on debt by ethnicity, even though this data has been collected for quite some time through the National Graduates Survey (NGS). So I ordered the data, and here’s what I discovered.

Figure 1 shows the incidence of borrowing for the graduating class of 2020, combined for all graduates of universities and colleges, for the eight largest ethnicities covered by the NGS (and before anyone asks, “indigeneity” is not considered an ethnicity, so anyone indicating an indigenous ethnicity is unfortunately excluded from this data… there’s more below on the challenges of getting additional data). And the picture it shows is…a bit complex.

    Figure 1: Incidence of Borrowing, College and University Graduates Combined, Class of 2020

If you just look at the data on government loan programs (the orange bars), you see that only Arab students have borrowing rates in excess of 1 in 2. But for certain ethnicities, the borrowing rate is much lower. For Latin American and Chinese students, the borrowing rate is below 1 in 3, and among South Asian students the borrowing rate is barely 1 in 5. Evidence of big differences in attitudes towards borrowing!

Except…well, when you add in borrowing from private sources (e.g. from banks and family) so as to look at overall rates of borrowing incidence, the differences in borrowing rates are a lot narrower. Briefly, Asian and Latin American students borrow a lot more money from private sources (mainly family) than do Arab, white, and Black students. These loans probably come with slightly easier repayment terms, but it’s hard to know for sure. An area almost certainly worthy of further research.

    There is a similarly nuanced picture when we look at median levels of indebtedness among graduates who had debt. This is shown below in Figure 2.

    Figure 2: Median Borrowing, College and University Graduates Combined, Class of 2020

Now, there isn’t a huge amount of difference in debt levels at graduation by ethnicity: the gap is only about $6,000 between the lowest total debt levels (Filipinos) and the highest (Chinese). But part of the problem here is that we can’t distinguish the reasons for the various debt levels. Based on what we know about ethnic patterns of postsecondary education, we can probably guess that Filipino students have low debt levels not because they are especially wealthy and can afford to go to post-secondary without financial assistance, but rather because they are more likely to go to college and thus spend less time, on average, in school paying fees and accumulating debt. Similarly, Chinese students don’t have the highest debt because they have low incomes; they have higher debt because they are the ethnic group most likely to attend university and so spend more time paying (higher) fees.

    (Could we get the data separately for universities and colleges to clear up the confound? Yes, we could. But it cost me $3K just to get this data. Drilling down a level adds costs, as would getting data based on indigenous identity, and this is a free email, and so for the moment what we have above will have to do. If anyone wants to pitch in a couple of grand to do more drilling-down, let me know and I would be happy to coordinate some data liberation).

It is also possible to use NGS data to look at post-graduation income by ethnicity. I obtained the data in fairly large ranges (e.g. $0-20K, $20-60K, etc.), but on that basis it is possible to estimate roughly what median incomes are (put it this way: the exact numbers are not exactly right, but the ordinal ranking of incomes across the various ethnicities is probably accurate). My estimates of the median 2023 income of 2020 graduates—which include those graduates who are not in the labour market full-time, if you’re wondering why the numbers look a little low—are shown below in Figure 3.

Figure 3: Estimated Median 2023 Income, College and University Graduates Combined, Class of 2020
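For readers curious about the mechanics, here is a minimal sketch of how a median can be estimated from income reported in ranges, assuming linear interpolation within the bin that contains the median; the bins and counts are invented for illustration and are not the actual NGS figures.

```python
# Estimate a median from binned (grouped) data by linear interpolation.
# All numbers below are hypothetical, not actual NGS results.

def grouped_median(bins, counts):
    """bins: list of (lower, upper) bounds; counts: respondents per bin."""
    total = sum(counts)
    half = total / 2
    cumulative = 0
    for (lower, upper), count in zip(bins, counts):
        if cumulative + count >= half:
            # Interpolate the median's position within this bin
            within = (half - cumulative) / count
            return lower + within * (upper - lower)
        cumulative += count
    raise ValueError("empty distribution")

# Hypothetical example using ranges like those described above
bins = [(0, 20_000), (20_000, 60_000), (60_000, 100_000)]
counts = [120, 310, 95]
print(round(grouped_median(bins, counts)))  # ~38,000 in this made-up case
```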

Are there differences in income here? Yes, but they aren’t huge. Most ethnic groups have median post-graduation incomes between $44,000 and $46,000. The two lowest-earning groups (Latin Americans and Filipinos) are both disproportionately enrolled in community colleges, which is part of what is going on in this data (if you want disaggregated data, see above).

Now, the data from the previous graphs can be combined to look at debt-to-income ratios, both for students with debt and for all students (that is, including those who do not borrow). This is shown below in Figure 4.

    Figure 4: Estimated Median 2023 Debt-to-Income Ratios, College and University Graduates Combined, Class of 2020

If you just divide indebtedness by income (the blue bars), you get a picture that looks a lot like the debt picture in Figure 2, because differences in income are pretty small. But if you look at debt-to-income ratios across all students (including those who do not borrow) you get a very different picture, because as we saw in Figure 1, there are some pretty significant differences in overall borrowing rates. So, for instance, Chinese students go from having the worst debt-to-income ratio on one measure to being middle of the pack on the other because they have a relatively low incidence of borrowing; similarly, students of Latin American origin go from being middle-of-the-pack to having nearly the lowest debt-to-income ratios because they are a lot less likely to borrow than others. Black students end up having among the highest debt-to-income ratios not because they earn significantly less than other graduates, but because both the incidence and the amount of their borrowing are relatively high.
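To make the two measures concrete, here is a rough sketch with invented numbers; it assumes the all-students figure simply scales the borrower ratio by the incidence of borrowing, which is one plausible reading of the approach rather than the exact method used for Figure 4.

```python
# Illustrative debt-to-income calculation; all figures are hypothetical.

median_debt_borrowers = 28_000   # median debt among graduates who borrowed
median_income = 45_000           # median 2023 income
incidence_of_borrowing = 0.40    # share of graduates who borrowed at all

ratio_borrowers = median_debt_borrowers / median_income
ratio_all_students = ratio_borrowers * incidence_of_borrowing

print(f"Debt-to-income (borrowers only): {ratio_borrowers:.2f}")  # 0.62
print(f"Debt-to-income (all students): {ratio_all_students:.2f}")  # 0.25
```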

But I think the story to go with here is that while there are differences between ethnic groups in terms of borrowing, debt, and repayment ratios—and it is worth trying to do something to narrow them—the differences in these rates are not enormous. Overall, it appears that as a country we are achieving reasonably good things here, with the caveat that if this data were disaggregated by university/college, the story might not be quite as promising.

    And so ends the first-ever analysis of student debt and repayment by ethnic background. Hope you found it moderately enlightening.

    Source link

  • Canadian study permit approvals fall far below cap targets


    Canadian study permit approvals are on track to fall by 45% in 2024, rather than the 35% planned reduction of last year’s controversial international student caps, new IRCC data analysed by ApplyBoard has revealed.  

    “The caps’ impact was significantly underestimated,” ApplyBoard founder Meti Basiri told The PIE News. “Rapidly introduced policy changes created confusion and had an immense impact on student sentiment and institutional operations.  

    “While aiming to manage student numbers, these changes failed to account for the perspectives of students, and their importance to Canada’s future economy and communities,” he continued.  

    The report reveals the far-reaching impact of Canada’s study permit caps, which were announced in January 2024 and followed by a tumultuous year of policy changes that expanded restrictions and set new rules for post-graduate work permit eligibility, among other changes.  

    For the first 10 months of 2024, Canada’s study permit approval rate hovered just above 50%, resulting in an estimated maximum of 280,000 approvals from K-12 to postgraduate levels. This represents the lowest number of approvals in a non-pandemic year since 2019. 

    Source: IRCC. Disclaimer: Data for 2021-Oct 2024 is sourced from IRCC. Full-year 2024 figures are estimates extrapolated from Jan-Oct 2024 and full-year 2021-2023 IRCC data. Projections may be subject to change based on changing conditions and source data.
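For those wondering what the extrapolation described in the disclaimer might look like mechanically, here is a hedged sketch; the prior-year shares and partial-year count are placeholders, not actual IRCC figures, and the exact methodology used for the report is not published here.

```python
# Rough full-year extrapolation from January-October data using prior-year
# seasonality. All numbers are placeholders for illustration only.

jan_oct_share_prior_years = [0.86, 0.84, 0.85]   # hypothetical 2021-2023 shares
avg_share = sum(jan_oct_share_prior_years) / len(jan_oct_share_prior_years)

approvals_jan_oct_2024 = 238_000                 # placeholder partial-year count
estimated_full_year_2024 = approvals_jan_oct_2024 / avg_share
print(round(estimated_full_year_2024))           # ~280,000 in this made-up case
```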

    “Even from the early days of the caps, decreased student interest outpaced government estimates,” noted the report, with stakeholders highlighting the reputational damage to Canada as a study destination.  

    “Approvals for capped programs fell by 60%, but even cap-exempt programs declined by 27%. Major source countries like India, Nigeria, and Nepal saw over 50% declines, showing how policies have disrupted demand across all study levels,” said Basiri.  

    Following major PGWP and study permit changes announced by the IRCC in September 2024, four out of five international student counsellors surveyed by ApplyBoard agreed that Canada’s caps had made it a less desirable study destination. 

    Though stakeholders across Canada recognised the need to address fraud and student housing issues, many had urged the federal government to wait until the impact of the initial caps was clear before going ahead with seemingly endless policy changes.  

    At the CBIE conference in November 2024, immigration minister Marc Miller said he “profoundly disagreed” with the prevailing sector view that the caps and subsequent PGWP and permanent residency restrictions had been an “overcorrection”.

    Post-secondary programs, which were the primary focus of the 2024 caps, were hit hardest by the restrictions, with new international enrolments at colleges estimated to have dropped by 60% as a result of the policies.  

While Canada’s largest source countries saw major declines, the caps were not felt evenly across sending countries. Senegal, Guinea and Vietnam maintained year-over-year growth, signalling potential sources of diversity for Canada’s cap era.

The report also highlighted Ghana’s potential as a source country, where approvals – though declining from last year – remain 175% higher than figures from 2022.


    The significant drop in study permit approvals was felt across all provinces, but Ontario – which accounted for over half of all study permit approvals in 2023 – and Nova Scotia have seen the largest impact, falling by 55% and 54.5% respectively.

Notably, the number of study permit applications processed by the IRCC dropped by a projected 35% in 2024, in line with the government’s targets, but approval rates have not kept pace.

    When setting last year’s targets, minister Miller only had the power to limit the number of applications processed by the IRCC, not the number of study permits that are approved.  

    The initial target of 360,000 approved study permits was based on an estimated approval rate of 60%, resulting in a 605,000 cap on the number of applications processed. 

    Following new policies such as the inclusion of postgraduate programs in the 2025 cap, Basiri said he anticipated that study permit approvals would remain below pre-cap levels.  

    “While overall student numbers may align with IRCC’s targets, the broader impact on institutional readiness and Canada’s reputation will be key areas to watch in 2025,” he added.  

    Source link

  • Crafting technology-driven IEPs



Individualized Education Plans (IEPs) have been the foundation of special education for decades, and the process by which these documents are written has evolved over the years.

As technology has evolved, so has the writing of these documents. Before programs existed to streamline the IEP writing process, creating IEPs was a daunting task of paper and pencil. Not only has the process of writing the IEP evolved, but IEPs themselves are becoming technology-driven.

Enhancing IEP goal progress with data-driven insights using technology: A variety of learning platforms can monitor a student’s performance in real time, tailoring to their individual needs and identifying areas for improvement. Data from these programs can be used to create students’ annual IEP goals. This study mentions that the ReadWorks program, used for progress monitoring IEP goals, has 1.2 million teachers and 17 million students using its resources, which provide content, curricular support, and digital tools. ReadWorks provides all its resources free of charge and has both printed and digital versions of the material available to teachers and students (Education Technology Nonprofit, 2021).

Student engagement and involvement with technology-driven IEPs: Technology-driven IEPs can also empower students to take an active role in their education plan. According to this study, research shows that special education students benefit from educational technology, especially in concept teaching and in practice-feedback instructional activities (Carter & Center, 2005; Hall, Hughes & Filbert, 2000; Hasselbring & Glaser, 2000). It is vital for students to take ownership of their learning. When students on an IEP reach a certain age, it is important for them to take the active lead in their plan. Digital tools used for technology-driven IEPs can provide students with visual representations of their progress, such as dashboards or graphs. When students are given a visual representation of their progress, their engagement and motivation increase.

    Technology-driven IEPs make learning fun: This study discusses technology-enhanced and game based learning for children with special needs. Gamified programs, virtual reality (VR), and augmented reality (AR) change the learning experience from traditional to transformative. Gamified programs are intended to motivate students with rewards, personalized feedback, and competition with leaderboards and challenges to make learning feel like play. Virtual reality gives students an immersive experience that they would otherwise only be able to experience outside of the classroom. It allows for deep engagement and experiential learning via virtual field trips and simulations, without the risk of visiting dangerous places or costly field trip fees that not all districts or students can afford. Augmented reality allows students to visualize abstract concepts such as anatomy or 3D shapes in context. All these technologies align with technology-driven IEPs by providing personalized, accessible, and measurable learning experiences that address diverse needs. These technologies can adapt to a student’s individual skill level, pace, and goals, supporting their IEP.

Challenges with technology-driven IEPs: Although there are many benefits to technology-driven IEPs, it is important to address the potential challenges to ensure equity across school districts. Access to technology in underfunded school districts can be challenging without proper investment in infrastructure, devices, and network connectivity. Student privacy and data must also be properly addressed. With the use of technologies for technology-driven IEPs, school districts must take into consideration laws such as the Family Educational Rights and Privacy Act (FERPA).

The integration of technology into the IEP process to create technology-driven IEPs represents a shift from a traditional process to a transformative one. Technology-driven IEPs create more student-centered learning experiences by implementing digital tools, enhancing collaboration, and personalizing learning. These experiences will enhance student engagement and motivation and allow students to take control of their own learning, making them leaders in their IEP process. However, as technology continues to evolve, it is important to address the equity gap that may arise in underfunded school districts.


    Source link

  • Keep talking about data | Wonkhe


    How’s your student data this morning?

    Depending on how close you sit to your institutional student data systems, your answer may range from a bemused shrug to an anguished yelp.

For the most part, we remain blissfully unaware of how much work it currently takes to derive useful and actionable insights from the various data traces our students leave behind them. We’ve all seen the advertisements promising seamless systems integration and a tangible improvement in the student experience, but in most cases the reality is far different.

    James Gray’s aim is to start a meaningful conversation about how we get there and what systems need to be in place to make it happen – at a sector as well as a provider level. As he says:

There is a genuine predictive value in using data to design future solutions to engage students and drive improved outcomes. We now have the technical capability to bring content, data, and context together in a way that simply has not been possible before now.

    All well and good, but just because we have the technology doesn’t mean we have the data in the right place or the right format – the problem is, as Helen O’Sullivan has already pointed out on Wonkhe, silos.

    Think again about your student data.

    Some of it is in your student information system (assessment performance, module choices), which may or may not link to the application tracking systems that got students on to courses in the first place. You’ll also have information about how students engage with your virtual learning environment, what books they are reading in the library, how they interact with support services, whether and how often they attend in person, and their (disclosed) underlying health conditions and specific support needs.
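To make the silo problem concrete, here is a minimal sketch of what joining those sources on a common student identifier might look like; the table and column names are invented for illustration only.

```python
# Joining records held in separate systems on a shared student identifier.
# The data and field names are hypothetical.
import pandas as pd

sis = pd.DataFrame({"student_id": [1, 2], "module_marks": [68, 55]})
vle = pd.DataFrame({"student_id": [1, 2], "weekly_logins": [9, 2]})
library = pd.DataFrame({"student_id": [1, 2], "loans_this_term": [4, 0]})

# Outer joins keep students who appear in one system but not another
combined = (
    sis.merge(vle, on="student_id", how="outer")
       .merge(library, on="student_id", how="outer")
)

# A single joined view is what makes cross-cutting questions answerable,
# e.g. which students show low engagement across several systems?
print(combined[(combined.weekly_logins < 3) & (combined.loans_this_term == 0)])
```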

    The value of this stuff is clear – but without a whole-institution strategic approach to data it remains just a possibility. James notes that:

    We have learned that a focus on the student digital journey and institutional digital transformation means that we need to bring data silos together, both in terms of use and collection. There needs to be a coherent strategy to drive deployment and data use.

But how do we get there? From what James has seen overseas, in big online US providers like Georgia Tech and Arizona State, data is managed strategically at the highest levels of university leadership. It’s perhaps a truism to suggest that if you really care about something it needs ownership at a senior level, but having that level of buy-in unlocks the resource and momentum that a big project like this needs.

We also talked about the finer-grained aspects of implementation – James felt that the way to bring students and staff on board is to clearly demonstrate the benefits, and to listen (and respond) to concerns. The latter is essential because “you will annoy folks”.

    Is it worth this annoyance to unlock gains in productivity and effectiveness? Ideally, we’d all be focused on getting the greatest benefit from our resources – but often processes and common practices are arranged in sub-optimal ways for historical reasons, and rewiring large parts of someone’s role is a big ask. The hope is that the new way will prove simpler and less arduous, so it absolutely makes sense to focus on potential benefits and their realisation – and bringing in staff voices at the design stage can make for gains in autonomy and job satisfaction.

    The other end of the problem concerns procurement. Many providers have updated their student records systems in recent years in response to the demands of the Data Futures programme. The trend has been away from bespoke and customised solutions and towards commercial off-the-shelf (COTS) procurement: the thinking here being that updates and modifications are easier to apply consistently with a standard install.

As James outlines, providers are looking at a “buy, build, or partner” decision – and institutions with different goals (and at different stages of data maturity) may choose different options. There is, though, enormous value in senior leaders talking across institutions about decisions such as these. “We had to go through the same process,” James outlined. “In the end we decided to focus on our existing partnership with Microsoft to build a cutting edge data warehouse, and data ingestion, hierarchy and management process leveraging Azure and MS Fabric with direct connectivity to Gen AI capabilities to support our university customers with their data, and digital transformation journey.” There is certainly both knowledge and hard-won experience out there about the different trade-offs, but what university leader wants to tell a competitor about the time they spent thousands of pounds on a platform that didn’t communicate with the rest of their data ecosystem?

    As Claire Taylor recently noted on Wonkhe there is a power in relationships and networks among senior leaders that exist to share learning for the benefit of many. It is becoming increasingly clear that higher education is a data-intensive sector – so every provider should feel empowered to make one of the most important decisions they will make in the light of a collective understanding of the landscape.

    This article is published in association with Kortext. Join us at an upcoming Kortext LIVE event in London, Manchester and Edinburgh in January and February 2025 to find out more about Wonkhe and Kortext’s work on leading digital capability for learning, teaching and student success.

    Source link

  • Re-capturing the early 80s | HESA


Most of the time when I talk about the history of university financing, I show a chart that looks like this, showing that government funding to the sector is up by a factor of about 2.3 after inflation over the last 40-odd years, while total funding is up by a factor of 3.6.

    Figure 1: Canadian University Income by source, 1979-80 to 2022-23, in billions of constant $2022

That’s just a straight-up expression of how universities get their money. But what it doesn’t take account of are changes in enrolment, which, as Figure 2 shows, were a pretty big deal. Universities have admitted a *lot* more students over time. The university system has nearly doubled since the end of the 1990s and nearly tripled since the start of the 1990s.

    Figure 2: Full-time Equivalent Enrolment, Canada, Universities, 1978-79 to 2022-23

So, the question is, really, how have funding pattern changes interacted with changes in enrolment? Well, folks, wonder no more, because I have toiled through some unbelievably badly-organized Excel data to bring you funding data that goes back to the 1980s (I did a version of this back here, but I only captured national-level data—the toil here involved getting data granular enough to look at individual provinces). Buckle up for a better understanding of how we got to our present state!

    Figure 3 is what I would call the headline graph: University income per student by source, from 1980-81 to the present, in constant $2022. Naturally, it looks a bit like Figure 1, but more muted because it takes enrolment growth into account.

    Figure 3: University income per student by source, from 1980-81 to the present, in constant $2022

There’s nothing revolutionary here, but it shows a couple of things quite clearly. First, government funding per student has been falling for most of the past 40 years; the brief period from about 1999 to 2009 stands out as the exception rather than the norm. Second, despite that, total funding per student is still quite high compared with the 1990s. Institutions have found ways to replace government income with income from other sources. That doesn’t mean the quality of the money is the same. As I have said before, hustling for money incurs costs that don’t occur if governments are just writing cheques.

    As usual, though, looking at the national picture often disguises variation at the provincial level. Let’s drill one level down and see what happened to government spending at the sub-national level. A quick note here: “government spending” means *all* government spending, not just provincial government spending. So, Ontario and Quebec probably look better than they otherwise would because they receive an outsized chunk of federal government research spending, while the Atlantic provinces probably look worse. I doubt the numbers are affected much because overall revenues from federal sources are pretty small compared to provincial ones, but it’s worth keeping in mind as you read the following.

    Figure 4 looks at government spending per student in the “big three” provinces, which together make up over 75% of the Canadian post-secondary system. Nationally, per-student spending fell from $22,800 per year to $17,600 per year. But there are differences here: Ontario spent the entire 42-year period below that average, while BC and Quebec spent nearly all of that period above it. Quebec has seen notably little fluctuation in per-student funding, while BC has been more volatile. Ontario saw a recovery in spending during the McGuinty years but has since experienced a drop of about 35%. Of note, perhaps, is that most of this decline happened before the arrival of the current Ford government.

    Figure 4: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and the “Big Three” provinces, 1980-81 to 2022-23

    Figure 5 shows that spending volatility was much higher in the three oil provinces of Alberta, Saskatchewan, and Newfoundland & Labrador. All three provinces spent virtually the entire period with above-average spending levels, but the gap between these provinces and the national average was quite large both in the early 1980s and from about 2005 onwards: i.e., when oil prices were at their highest. Alberta, of course, has seen per-student funding drop by about 50% in the last fifteen years, but at the same time, it is close to where it was 25 years ago. So, was the outlier the dramatic fall, or the steep rise that preceded it?

    Figure 5: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and the “Oil provinces”, 1980-81 to 2022-23

    Figure 6 shows the other four provinces for the sake of completeness. New Brunswick and Nova Scotia were the lowest spenders in the country for most of the period we’re looking at, only catching up to the national average in the mid-aughts. Interestingly, the two provinces took two different paths to raise per-student spending: Nova Scotia did it almost entirely by raising spending, while in New Brunswick this feat was to a considerable extent “achieved” by a significant fall in student numbers (this is a ratio, folks, both the numerator and the denominator matter!).

    Figure 6: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and selected provinces, 1980-81 to 2022-23

    An interesting question, of course, is what it would have cost to keep public spending at 1980 per-student levels. It is interesting because, remember, total spending did in fact rise quite substantially (see Figure 1): it just didn’t rise as fast as student numbers. So, in Figure 7, I show what it would have cost to keep per-student expenditures stable at 1980-81 levels, both if student numbers had stayed constant and given actual student numbers.

    Figure 7: Funds required to return to 1980-81 levels of per-student government investment in universities, Canada, in millions of constant $2022

    Weird-looking graph, right? But here’s how to interpret it. Per-student public funding did fall in the 80s and early 90s. But it rose again in the early aughts, to the point where per-student funding was back where it had been in 1980, even though the number of students in the system had doubled in the meantime. From about 2008 onwards, though, public investment started falling off again in per-student terms, going back to mid/late-90s levels even as overall student numbers continued to rise. We are now at the point where getting back to the levels of 1980-81, or even just 2007-08, would require an increase of between $6 billion and $6.5 billion.
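    For the curious, the arithmetic behind that last number is just the per-student gap multiplied by current enrolment. A minimal sketch, using the national per-student figures quoted above and an illustrative (not official) enrolment count:

    ```python
    # Back-of-envelope version of the Figure 7 calculation: how much more governments
    # would need to spend to restore 1980-81 per-student funding at today's enrolment.
    # Per-student figures are the national numbers quoted in the text; the FTE count
    # is an illustrative placeholder, not the actual series.

    per_student_1980 = 22_800      # constant $2022 per FTE, from the text
    per_student_now = 17_600       # constant $2022 per FTE, from the text
    fte_now = 1_150_000            # illustrative current FTE enrolment (assumption)

    gap_per_student = per_student_1980 - per_student_now
    total_gap = gap_per_student * fte_now

    print(f"Gap per student: ${gap_per_student:,.0f}")
    print(f"Total required:  ${total_gap / 1e9:.1f} billion")   # roughly $6 billion
    ```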

    Anyways, that’s enough sunshine for one morning. Have a great day.

    Source link

  • College Financials 2022-23 | HESA

    College Financials 2022-23 | HESA

    StatsCan dropped some college financial data over the XMAS holidays.  I know you guys are probably sick of this subject, but it’s still good to have some national data—even if it is eighteen months out of date and doesn’t really count the last frenzied months of the international student gold rush (aka “doing the Conestoga”).  But it does cover the year in which everyone now agrees student visa numbers “got out of control,” so there are some interesting things to be learned here nonetheless.

    To start, let’s look quickly at college income by source. Figure 1, below, shows that college income did rise somewhat in 2022-23, due mainly to an increase in tuition income (up 35% between the nadir COVID year of 2020-21 and 2022-23). But overall, once inflation is taken into account, the increase in college income really wasn’t all that big: about a billion dollars in 2021-22 and about the same again in 2022-23, or about 6-7% per year after inflation. Good? Definitely. Way above what universities were managing, and well above most sectors internationally. But it’s not exactly the banditry that some communicators (including the unofficial national minister of higher education, Marc Miller) like to imply.

    Figure 1: College Income by Source, Canada, 2017-18 to 2022-23, in Billions of $2022

    Now I know a few of you are looking at this and scratching your heads, asking what the hell is going on in Figure 1.  After all, haven’t I (among others) made the point about record surpluses in the college sector?  Well, yes.  But I’ve only ever really been talking about Ontario, which is the only province where international tuition fees have really taken flight.  In Figure 2, I put the results for Ontario and for the other nine provinces side-by-side.  And you can see how different the two are.  Ontario has seen quite large increases in income, mainly through tuition fees and by ancillary income bouncing back to where it was pre-COVID, while in the other nine provinces income growth is basically non-existent in any of the three categories.

    Figure 2a/b: College Income by Source, Ontario vs Other Nine Provinces, 2017-18 to 2022-23, in Billions of $2022

    (As an aside, just note here that over 70% of all college tuition income is collected in the province of Ontario, which is kind of wild.  At the national level, Canada’s college sector is not really a sector at all…their aims, goals, tools, and income patterns all diverge enormously.)

    Figure 3 drills down a little bit on the issue of tuition fee income to show where it has been growing and where it has not. One might look at this and think it is irreconcilable with Figure 2, since tuition fees in the seven smaller provinces seem to be increasing at a rate similar to Ontario’s. What that should tell you, though, is that the base tuition from which these figures are rising is pretty meagre in the seven smallest provinces, and quite substantial in Ontario. (Also, remember that in Ontario, domestic tuition fees fell by 20% or so after inflation between 2019-20 and 2022-23, so this chart actually underplays the growth in international fees in that province a bit.)

    Figure 3: Change in Real Aggregate Tuition Income by Province, 2017-18 to 2022-23 (2017-18 = 100)
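    To make the base-effect point concrete, here is a toy version of the index arithmetic behind a chart like Figure 3 (all dollar figures invented for illustration): a small province and an Ontario-sized province can post the same index value even though the underlying dollar changes differ by an order of magnitude.

    ```python
    # Toy illustration of an index with 2017-18 = 100, and why a small base makes
    # growth look dramatic. All dollar figures below are invented for illustration.

    def tuition_index(aggregate_tuition: dict, base_year: str = "2017-18") -> dict:
        base = aggregate_tuition[base_year]
        return {year: 100 * value / base for year, value in aggregate_tuition.items()}

    # A small province adding $50M to a $100M base reads as 150 on the index;
    # an Ontario-sized province adding $1.5B to a $3B base also reads as 150,
    # even though the dollar change is thirty times larger.
    small_province = {"2017-18": 100e6, "2022-23": 150e6}
    ontario_like   = {"2017-18": 3.0e9, "2022-23": 4.5e9}
    print(tuition_index(small_province))   # {'2017-18': 100.0, '2022-23': 150.0}
    print(tuition_index(ontario_like))     # {'2017-18': 100.0, '2022-23': 150.0}
    ```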

    Now I want to look specifically at some of the data with respect to expenditures and to try to ask the question: where did that extra $2.2 billion that the sector acquired in 21-22 and 22-23 (of which, recall, over 70% went to Ontario alone) go?

    Figure 4 answers this question in precise detail, and once again the answer depends on whether you are talking about Ontario or the rest of the country. The biggest jump in expenditures by far is “contracted services” in Ontario—an increase of over $500M in just two years. This is probably as close a look as we will ever get at the economics of those PPP colleges that were set up around the GTA, since most of this sum is almost certainly made up of public college payments to those institutions for teaching the new students who arrived in those two years. If you assume the increase in international students at those colleges was about 40,000 (for a variety of reasons, an exact count is difficult), then that implies that colleges were paying their PPP partners about $12,500 per student on average and pocketing the difference, which would have been anywhere between about $2,500 and $10,000, depending on the campus and program. And of course, most of the funds spent on PPP went one way or another to teaching expenses for these students.

    Figure 4: Change in Expenditures/Surplus, Canadian Colleges 2022-23 vs 2020-21, Ontario vs. Other 9 Provinces, in Millions of $2022
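    To spell out the back-of-envelope arithmetic above: the roughly $500 million increase in contracted services comes from the expenditure data, while the 40,000-student increase and the implied tuition range are assumptions, so treat the sketch below as illustrative only.

    ```python
    # Rough reconstruction of the PPP back-of-envelope from the paragraph above.
    # The $500M increase in "contracted services" is from the data discussed in the
    # text; the 40,000-student increase and the tuition range are stated assumptions.

    extra_contracted_services = 500_000_000   # increase over two years, Ontario colleges
    extra_intl_students = 40_000              # assumed increase at PPP partner campuses

    payment_per_student = extra_contracted_services / extra_intl_students
    print(f"Implied payment to PPP partner: ${payment_per_student:,.0f} per student")  # ~$12,500

    # If international tuition at these programs ran roughly $15,000-$22,500 (an
    # assumption), the public college's margin per student would be:
    for tuition in (15_000, 22_500):
        print(f"Tuition ${tuition:,}: college keeps ${tuition - payment_per_student:,.0f}")
    ```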

    On top of that, Ontario colleges threw an extra $300 million into new construction (this is a bit of an exaggeration because 2020-21 was a COVID year and building expenses were abnormally low) and put an extra $260 million (half a billion in total) into reserve funds for future years. This last is money that probably would have ended up as capital expenditures in future years if the feds hadn’t come crashing in and destroyed the whole system last year, but it will now probably get used to cover losses over the next year or two instead. Meanwhile, in the rest of Canada, surpluses decreased between 2020-21 and 2022-23, and such spending increases as occurred came mostly under the categories “miscellaneous” and “ancillary enterprises.”

    2022-23, of course, was not quite “peak international student,” so this analysis can’t quite tell the full story of how international students affected colleges. We’ll need to wait another 11 months for that data to show up. But I doubt the story I have outlined, based on the data available to date, will change much. In short, the financials show that:

    • Colleges outside Ontario were really not making bank on international students.
    • Within Ontario, over a third of the additional revenue from international students generated in the 2020-21 to 2022-23 period was paid out to PPP partners, who would have spent most of that on instruction.
    • Of the remaining billion or so, about a third went into new construction and another 20% was “surplus,” which probably meant it was intended for future capital expenditure.
    • The increase in core college salary mass was minuscule—in fact, only about 3% after inflation.

    If there was “empire building” going on, it was in the form of constructing new buildings, not in terms of massive salary rises or hiring sprees. 

    Source link

  • Deafening Silence on PIAAC | HESA

    Deafening Silence on PIAAC | HESA

    Last month, right around the time the blog was shutting down, the OECD released its report on the second iteration of the Programme for the International Assessment of Adult Competencies (PIAAC), titled “Do Adults Have the Skills They Need to Thrive in a Changing World?”. Think of it perhaps as PISA for grown-ups, providing a broadly useful cross-national comparison of the basic cognitive skills that are key to labour market success and overall productivity. You are forgiven if you didn’t hear about it: its news impact was equivalent to the proverbial tree falling in a forest. Today, I will skim briefly over the results, but more importantly, ponder why this kind of data does not generate much news.

    First administered in 2011, PIAAC consists of three parts: tests of literacy, numeracy, and what is now called “adaptive problem solving” (this last one has changed a bit—in the previous iteration it was “problem-solving in technology-rich environments”). Each test is scored on a scale from 0 to 500, and individuals are categorized into one of six “bands” (1 through 5, with 5 being the highest, plus a “below 1,” which is the lowest). National scores across all three areas are highly correlated, which is to say that if a country is at the top or bottom, or even in the middle, on literacy, it is almost certainly in pretty much the same rank order for numeracy and problem solving as well. National average scores all cluster in the 200 to 300 range.

    One of the interesting—and frankly somewhat terrifying—discoveries of PIAAC 2 is that literacy and numeracy scores are down in most of the OECD outside of northern Europe. Across all participating countries, literacy is down fifteen points, and numeracy by seven. Canada is about even in literacy and up slightly in numeracy—this is one trend it’s good to buck. The reason for this is somewhat mysterious—an aging population probably has something to do with it, because literacy and numeracy do start to fall off with age (scores peak in the 25-34 age bracket)—but I would be interested to see more work on the role of smart phones. Maybe it isn’t just teenagers whose brains are getting wrecked?

    The overall findings actually aren’t that interesting. The OECD hasn’t repeated some of the analyses that made the first report so fascinating (the results were a little too interesting, I guess), so what we get are some fairly broad banalities: scores rise with education levels, but also with parents’ education levels; employment rates and income rise with skill levels; there is broadly a lot of skill mismatch across all economies, and this is a Bad Thing (I am not sure it is anywhere near as bad as the OECD assumes, but whatever). What remains interesting, once you read the whole report, are the subtle differences one picks up in the results from one country to another.

    So, how does Canada do, you ask? Well, as Figure 1 shows, we are ahead of the OECD average, which is good so far as it goes. However, we’re not at the top. At the head of the class across all measures are Finland, Japan, and Sweden, followed reasonably closely by the Netherlands and Norway. Canada sits in the peloton behind them, alongside Denmark, Germany, Switzerland, Estonia, the Flemish region of Belgium, and maybe England. This is basically Canada’s sweet spot in everything when it comes to education, skills, and research: good but not great, and it looks worse once you adjust for the amount of money we spend on this stuff.

    Figure 1: Key PIAAC scores, Canada vs OECD, 2022-23

    Canadian results can also be broken down by province, as in Figure 2, below. Results do not vary much across most of the country: Nova Scotia, Ontario, Saskatchewan, Manitoba, Prince Edward Island, and Quebec all cluster pretty tightly around the national average. British Columbia and Alberta are significantly above that average, while New Brunswick and Newfoundland are significantly below it. Partly, of course, this has to do with things you’d expect, like provincial income, school policies, etc. But remember that this is measured across entire populations, not school leavers, so interprovincial migration plays a role here too. Broadly speaking, New Brunswick and Newfoundland lose a lot of skills to places further west, while British Columbia and Alberta are big recipients of migrants from places further east (international migration, by contrast, tends to reduce average scores: language skills matter, and taking the test in a non-native tongue tends to produce lower results).

    Figure 2: Average PIAAC scores by province, 2022-23

    Anyways, none of this is particularly surprising or perhaps even all that interesting. What I think is interesting is how differently this data release was handled compared to the one ten years ago. When the first PIAAC was released a decade ago, Statistics Canada and the Council of Ministers of Education, Canada (CMEC) published a 110-page analysis of the results (which I analyzed in two posts, one on Indigenous and immigrant populations, and another on Canadian results more broadly) and an additional 300(!)-page report lining up the PIAAC data with data on formal and informal adult learning. It was, all in all, pretty impressive. This time, CMEC published a one-pager linking to a StatsCan page that contains all of three charts and two infographics (fortunately, the OECD itself put out a 10-pager that is significantly better than anything produced domestically). But I think all of this points to something pretty important, which is this:

    Canadian governments no longer care about skills. At least not in the sense that PIAAC (or PISA for that matter) measures them.

    What they care about instead are shortages of very particular types of skilled workers, specifically health professions and the construction trades (which together make up about 20% of the workforce). Provincial governments will throw any amount of money at training in these two sets of occupations because they are seen as bottlenecks in a couple of key sectors of the economy. They won’t think about the quality of the training being given or the organization of work in the sector (maybe we wouldn’t need to train as many people if the labour produced by such training was more productive?). God forbid. I mean that would be difficult. Complex. Requiring sustained expert dialogue between multiple stakeholders/partners. No, far easier just to crank out more graduates, by lowering standards if necessary (a truly North Korean strategy).

    But actual transversal skills? The kind that make the whole economy (not just a politically sensitive 20%) more productive? I can’t name a single government in Canada that gives a rat’s hairy behind. They used to, twenty or thirty years ago. But then we started eating the future. Now, policy capacity around this kind of thing has atrophied to the point where literally no one cares when a big study like PIAAC comes out.

    I don’t know why we bother, to be honest. If provincial governments and their ministries of education in particular (personified in this case by CMEC) can’t be arsed to care about something as basic as the skill level of the population, why spend millions collecting the data? Maybe just admit our profound mediocrity and move on.

    Source link