Category: Data

  • Policy Proposals Lack Clarity About How to Evaluate Graduates’ Additional Degrees

    Title: Accounting for Additional Credentials in Postsecondary Earnings Data

    Authors: Jason Delisle, Jason Cohn, and Bryan Cook

    Source: The Urban Institute

    As policymakers across both parties consider how to evaluate postsecondary outcomes and earnings data, the authors of a new brief from the Urban Institute pose a major question: How should students who earn multiple credentials be included in data collection for the college that awarded their first degree?

    For example, should the earnings of a master’s degree recipient be included in the data for the institution where they earned their bachelor’s degree? Similarly, students who finish an associate degree at a community college are likely to earn higher wages once they complete a bachelor’s degree at another institution. Multiple perspectives therefore need to be considered to help both policymakers and institutions understand and interpret how additional degrees are treated in earnings data.

    Additional key findings include:

    Earnings Data and Accountability Policies

    Many legislative proposals would expand the use of earnings data to impose further accountability and federal aid restrictions. For example, the House Republicans’ College Cost Reduction Act, proposed in 2024, would put institutions at risk of losing funding if they have low student loan repayment rates. The brief’s authors note that the bill does not indicate whether students who earn additional credentials should be counted in the cohort for the program where they completed their first credential.

    The recently implemented gainful employment rule from the Biden administration is explicit in its inclusion of those who earn additional credentials. Under the rule, students who earn an additional degree are included in the calculations both for their most recent degree and for the program that awarded their first credential.

    How Much Do Additional Credentials Affect Earnings Data?

    Determining how much additional credentials affect wages and earnings for different programs is difficult. The first earnings measurement—the first year after students leave school—is usually too early to include additional income information from a second credential.

    Although the full data picture is incomplete, a comparison between first- and fifth-year earnings suggests that the number of students earning additional degrees may be very high for some programs. For example, students who earn associate degrees in liberal arts and general studies often see some of their fastest earnings growth during these first five years. A potential explanation is that many of these students go on to complete a bachelor’s degree at a four-year institution.

    Policy Implications: How Should Earnings Data Approach Subsequent Credentials?

    In general, it seems that many policymakers have not focused on this complicated question of students who earn additional degrees. However, policy and data professionals may benefit from excluding students who earn additional credentials to more closely measure programs’ return on investment. This can be especially helpful when examining the costs of bachelor’s programs and their subsequent earnings benchmarks, by excluding additional earnings premiums generated from master’s programs.

    Additionally, excluding students who earn additional credentials may be particularly valuable to students in making consumer and financial aid decisions if the payoff from a degree is extremely different depending on whether students pursue an additional credential.

    However, some programs are intended to prepare students for an additional degree, and excluding data for students who earn another degree would mean excluding most graduates and paint a misleading picture.

    To read the full report from the Urban Institute, click here.

    —Austin Freeman


    If you have any questions or comments about this blog post, please contact us.

    Source link

  • Embracing a growth mindset when reviewing student data

    In the words of Carol Dweck, “Becoming is better than being.” As novice sixth grade math and English teachers, we’ve learned to approach our mid-year benchmark assessments not as final judgments but as tools for reflection and growth. Many of our students entered the school year below grade level, and while achieving grade-level mastery is challenging, a growth mindset allows us to see their potential, celebrate progress, and plan for further successes. This perspective transforms data analysis into an empowering process: data becomes a tool for improvement rather than a measure of failure.

    A growth mindset is the belief that abilities grow through effort and persistence. This mindset shapes how we view data. Instead of focusing on what students can’t do, we emphasize what they can achieve. For us, this means turning gaps into opportunities for growth and modeling optimism and resilience for our students. When reviewing data, we don’t dwell on weaknesses. We set small and achievable goals to help students move forward to build confidence and momentum.

    Celebrating progress is vital. Even small wins (e.g., moving from a kindergarten level to a 1st- or 2nd-grade level, or significant growth in one domain) are causes for recognition. Highlighting these successes motivates students and shows them that effort leads to results.

    Involving students in the process is also advantageous. At student-led conferences, our students presented their data via slideshows that they created after they reviewed their growth, identified their strengths, and generated next steps with their teachers. This gave them tremendous ownership over their learning. In addition, interdisciplinary collaboration at our weekly professional learning communities (PLCs) has strengthened this process. To support students who struggle in English and math, we work together to address overlapping challenges (e.g., teaching math vocabulary, chunking word problems) to ensure students build skills in connected and meaningful ways.

    We also address the social-emotional side of learning. Many students come to us with fixed mindsets, believing they’re just “bad at math” or “not good readers.” We counter this by celebrating effort, normalizing struggle, and creating a safe and supportive environment where mistakes are part of learning. Progress is often slow, but it’s real. Students may not reach grade-level standards in one year, but gains in confidence, skills, and mindset set the stage for future success, as evidenced by our students’ mid-year benchmark results. We emphasize the concept of a “growth mindset” because, in the words of Denzel Washington, “The road to success is always under construction.” By embracing growth and seeing potential in every student, we create the conditions for improvement, resilience, hope, and a brighter future.

    Source link

  • Fun with Participation Rate Data

    Just a quick one today, mostly charts.

    Back in the fall, StatsCan released a mess of data from the Labour Force Survey looking at education participation rates—that is, the percentage of any given age cohort that is attending education—over the past 25 years. So, let’s go see what it says.

    Figure 1 shows total education participation rates, across all levels of education, from age 15 to 29, for selected years over the past quarter century. At the two ends of the graph, the numbers look pretty similar. At age 15, we’ve always had 95%+ of our population enrolled in school (almost exclusively secondary education), and from age 26 and above, we’ve always been in the low teens or high single digits. The falling-off in participation is fairly steady: for every age-year above 17, about 10% of the population exits education, up until the age of 26. The big increase in education enrolments that we’ve seen over the past couple of decades has really occurred in the 18-24 range, where participation rates (almost exclusively in universities, as we shall see) have increased enormously.

    Figure 1: Participation rates in Education (all institutions) by Age, Canada, select years 1999-00 to 2023-24

    Figure 2 shows current participation rates by age and type of postsecondary institution. People sometimes have the impression that colleges cater to an “older” clientele, but in fact, at any given age under 30, Canadians are much more likely to be enrolled in universities than in colleges. Colleges have a very high base in the teens because of the way the CEGEP system works in Quebec (I’ll come back to regional diversity in a minute), and it is certainly true that there is a very wide gap in favour of universities among Canadians in their mid-20s. But while the participation rate gap narrows substantially at about age 25, it is never the case that the college participation rate surpasses the university one.

    Figure 2: Participation Rates by Age and Institution Type, Canada, 2023-24

    Figure 3 shows college participation rates by age over time. What you should take from this is that there has been a slight decline in college participation rates over time in the 19-23 age range, but beyond that not much has changed.

    Figure 3: College Participation Rates by Age, Selected Years, 1999-2000 to 2023-24

    Figure 4 uses the same lens as Figure 3, only for universities. And it’s about as different as it can be. In 1999, fewer than one in ten Canadians aged 18 was in university: now it is three in ten. In 1999, only one in four 21-year-olds was in university; now it is four in ten. These aren’t purely the effects of increased demand; the elimination of grade 13 in Ontario had a lot to do with the changes for 18-year-olds, and Alberta and British Columbia converting a number of their institutions from colleges to universities in the late 00s probably juices these numbers a bit, too. But on the whole, what we’ve seen is a significant increase in the rate at which young people are choosing to attend universities between the ages of 18 and 24. However, beyond those ages the growth is less pronounced. There was certainly growth in older student participation rates between 1999-00 and 2011-12, but since then none at all.

    Figure 4: University Participation Rates by Age, Selected Years, 1999-2000 to 2023-24

    So much for the national numbers: what’s going on at the provincial level? Well, because this is the Labour Force Survey, which unlike administrative data has sample size issues, we can’t quite get the same level of granularity. We can’t look at individual ages, but we can look at age ranges, in this case ages 20-24. In figures 5 and 6 (I broke them up so they are a bit easier to read), I show each province’s university and college participation rates in 2000 vs. 2023.

    Figure 5: University Participation Rates for 20-24 Year-olds, Four Largest Provinces, 2000-01 vs. 2023-24

    Figure 6: University Participation Rates for 20-24 Year-olds, Six Remaining Provinces, 2000-01 vs. 2023-24

    Some key facts emerge from these two graphs:

    • The highest participation rates in the country are in Ontario, Quebec, and British Columbia.
    • In all provinces, the participation rate in universities is higher than it is for colleges, ranging from 2.5x in Quebec to over 4x in Saskatchewan.
    • Over the past quarter century, overall postsecondary participation rates and university participation rates have gone up in all provinces; Alberta and British Columbia alone have seen a decline in college participation rates, due to the aforementioned decision to convert certain colleges to university status in the 00s.
    • Growth in participation rates since 2000 has been universal, but it has been more significant in the country’s four largest provinces, where the average gain has been nine percentage points, than in the country’s six smaller provinces, where the gain has been just under five percentage points.
    • Over twenty-five years, British Columbia has gone from ninth to second in the country in terms of university participation rates, while Nova Scotia has gone from second to ninth.
    • New Brunswick has consistently been in last place for overall participation rates for the entire century.

    Just think: three minutes ago, you probably knew very little about participation rates in Canada by age and geography, now you know almost everything there is to know about participation rates in Canada by age and geography. Is this a great way to start your day or what?

    Source link

  • Student Aid in Canada: The Long View

    Note: this is a short version of a paper which has just appeared in issue 72:4 of the Canadian Tax Journal. How short? I’m trying for under 1,000 words. Let’s see how I do.

    Canadian student aid programs had existed in scattered forms since just after World War I but became a “national program” when the Dominion-Provincial Student Aid Program (DPSAP) was created in 1939. Under this program, the Government of Canada provided block cash grants to provinces, which administered their own scholarship programs providing aid based on some combination of need and merit. The actual details of the program varied significantly from one province to another; at the time, the Government of Canada did not place much importance on “national programs” with common elements.

    In 1964, the DPSAP was replaced by the Canada Student Loans Program (CSLP)—recently renamed the Canada Student Financial Assistance Program (CSFAP). This has always been a joint federal-provincial enterprise. But where the earlier program was a block grant, this program would be a single national entity run more or less consistently across all provinces, albeit with provincial governments still in place as responsible administrative agencies able to supplement the plan as they wished. Some provinces would opt out of this program and receive compensation to run their own solo programs (Quebec at the program’s birth, the Northwest Territories in 1984 and Nunavut in 1999). The others, for the most part, built grant programs that kicked in once a student had exhausted their Canada Student Loan eligibility.

    Meanwhile, a complementary student aid program grew up in the tax system, mainly because it was a way to give money to students that didn’t involve negotiations with provinces. Tuition fees plus a monthly education amount were made into a tax deduction in 1961 and then converted to a tax credit in 1987. Registered Education Savings Plans (RESPs), which are basically tax-free growth savings accounts, showed up in 1971.

    Although the CSLP was made somewhat more generous over time in order to keep up with rising student costs, program rules went largely unchanged between 1964 and 1993. Then, during the extremely short Kim Campbell government, a new system came into being. The federal government decided to make loans much larger, but also to force participating provinces to start cost-sharing in a different manner—basically, they had to step up from a student’s first dollar of need instead of just taking students with high need. Since this was the era of stupidly high deficits, provinces responded to these additional responsibilities by cutting the generosity of their programs, transforming them from pure grants to forgivable loans. For the rest of the decade, student debt rose—in some cases quite quickly: in total, loans issued doubled between 1993 and 1997.

    And then, everything went into reverse.

    In a series of federal budgets between 1996 and 2000, billions of dollars were thrown into grants, tax credits and a new program called “Canada Education Savings Grants,” a form of matching grant for contributions to RESPs. Grants and total aid rose; loans issued fell by a third, mainly between 1997 and 2001 (a recovering economy helped quite a bit). Tax expenditures soared, which, thanks to a rule change allowing tax credits to be carried forward, meant students either got to keep more of their work income or could reduce their taxes once they started working.

    Since this period of rapid change at the turn of the century, student aid has doubled in real terms. And nearly all of that has been an increase in non-repayable aid. Institutional scholarships? Tripled. Education scholarships? Quadrupled. Loans? They are up, too, but there the story is a bit more complicated.

    Figure 1: Student Aid by Source, Canada, 1993-94 to 2022-23, in thousands of constant $2022

    For the period from about 2000 to 2015, all forms of aid were increasing at about inflation plus 3%. Then, in 2016, we entered another period of rapid change. The Governments of Canada and Ontario eliminated a bunch of tax credits and re-invested the money into grants. Briefly, this led to targeted free tuition in Ontario, before the Ford government took an axe to the system. Then, COVID hit and the CSFAP doubled grants. Briefly, in 2020-21, total student aid exceeded $23 billion/year (the figure above does not include the $4 billion per year paid out through the Canada Emergency Student Benefit), with less than 30% of it made up of loans.

    One important thing to understand about all this is that while the system became much larger and much less loan-based, something else was going on, too. It was becoming much more federal. Over the past three decades, provincial outlays have risen about 30% in real terms; meanwhile, federal ones have quadrupled. In the early 1990s, the system was about 45-55 federal-provincial; now, it’s about 70-30 federal. It’s a stunning example of “uploading” of responsibilities in an area of shared jurisdiction.
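
    As a quick back-of-envelope check (a sketch using only the approximate figures quoted above, not actual outlay data), those two growth rates are consistent with the shift from a roughly 45-55 to a roughly 70-30 federal-provincial split:

    ```python
    # Back-of-envelope check using the approximate figures quoted in the text
    # (not actual outlay data): index the early-1990s total to 100.
    federal_start, provincial_start = 45, 55   # ~45-55 federal-provincial split
    federal_now = federal_start * 4            # federal outlays roughly quadrupled
    provincial_now = provincial_start * 1.3    # provincial outlays up ~30% in real terms

    federal_share = federal_now / (federal_now + provincial_now)
    print(f"{federal_share:.0%}")              # ~72%, i.e. roughly the 70-30 split cited
    ```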

    Figure 2: Government Student Aid by Source, Canada, 1993-94 to 2022-23, in thousands of constant $2022

    So there you go: a century of Canadian student aid in less than 850 words. Hope you enjoyed it.

    Source link

  • Institutions may be holding themselves back by not sharing enough data

    Wonkhe readers need little persuasion that information flows are vital to the higher education sector. But without properly considering those flows and how to minimise the risk of something going wrong, institutions can find themselves at risk of substantial fines, claims and reputational damage. These risks need organisational focus from the top down as well as regular review.

    Information flows in higher education occur not only in teaching and research but in every other area of activity such as accommodation arrangements, student support, alumni relations, fundraising, staff and student complaints and disciplinary matters. Sometimes these flows are within organisations, sometimes they involve sharing data externally.

    Universities hold both highly sensitive research information and personal data. Examples of the latter include information about individuals’ physical and mental health, family circumstances, care background, religion, financial information and a huge range of other personal information.

    The public narrative on risks around data tends to focus on examples of inadvertently sharing protected information – such as the recent case in which the Information Commissioner decided to fine the Police Service of Northern Ireland £750,000 in relation to the inadvertent disclosure of personal information about over 9,000 officers and staff in response to a freedom of information request. The same breach has also resulted in individuals bringing legal claims against the PSNI, with media reports suggesting a potential bill of up to £240m.

    There is also the issue of higher education institutions being a target for cyber attack by criminal and state actors. Loss of data through such attacks again has the potential to result in fines and other regulatory action as well as claims by those affected.

    Oversharing and undersharing

    But inadvertent sharing of information and cyberattacks are not the only areas of risk. In some circumstances a failure to ensure that information is properly collected and shared lawfully may also be a risk. And ensuring effective and appropriate flows of information to the governing body is key to it being able to fulfil its oversight function.

    One aspect of the tragic circumstances mentioned in the High Court appeal ruling in the case concerning Natasha Abrahart is the finding that there had been a failure to pass on information about a suicide attempt to key members of staff, which might have enabled action to be taken to remove pressure on Natasha.

    Another area of focus concerns sharing of information related to complaints of sexual harassment and misconduct and subsequent investigations. OfS condition E6 and its accompanying guidance, which come fully into effect on 1 August 2025, include measures on matters such as reporting potential complaints and the sensitive handling and fair use of information. The condition and guidance require the provider to set out comprehensively, and in an easy-to-understand manner, how it ensures that those “directly affected” by decisions are directly informed about those decisions and the reasons for them.

    There are also potential information flows concerning measures intended to protect students from any actual or potential abuse of power or conflict of interest in respect of what the condition refers to as “intimate personal relationships” between “relevant staff members” and students.

    All of these data flows are highly sensitive and institutions will need to ensure that appropriate thought is given to policies, procedures and systems security as well as identifying the legal basis for collecting, holding and sharing information, taking appropriate account of individual rights.

    A blanket approach will not serve

    Whilst there are some important broad principles in data protection law that should be applied when determining the legal basis for processing personal data, in sensitive cases like allegations of sexual harassment the question of exactly what information can be shared with another person involved in the process often needs to be considered against the particular circumstances.

    Broadly speaking, in most cases where sexual harassment or mental health support is concerned, the legislation will require at minimum both a lawful basis and a condition for processing “special category” data and/or data that includes potential allegations of a criminal act. Criminal offences and allegations data and special category data (which includes data relating to an individual’s health, sex life and sexual orientation) are subject to heightened controls under the legislation.

    Without getting into the fine detail it can often be necessary to consider individuals’ rights and interests in light of the specific circumstances. This is brought into sharp focus when considering matters such as:

    • Sharing information with an emergency contact in scenarios that might fall short of a clear “life or death” situation.
    • Considering what information to provide to a student who has made a complaint about sexual harassment by another student or staff member in relation to the outcome of their complaint and of any sanction imposed.

    It’s also important not to forget other legal frameworks that may be relevant to data flows. This includes express or implied duties of confidentiality that can arise where sensitive information is concerned. Careful thought needs to be given to making clear, in relevant policies and documents, when it is envisaged that information might need to be shared, provided the law permits it.

    A range of other legal frameworks can also be relevant, such as consumer law, equality law and freedom of information obligations. And of course, aside from the legal issues, there will be potential reputational and institutional risks if something does go wrong. It’s important that senior management and governing bodies have sufficient oversight and involvement to encourage a culture of organisational awareness and compliance across the range of information governance issues that can arise.

    Managing the flow of information

    Institutions ought to have processes to keep their data governance under review, including measures that map out the flows and uses of data in accordance with relevant legal frameworks. The responsibility for oversight of data governance lies not only with any Data Protection Officer, but also with senior management and governors who can play a key part in ensuring a good data governance culture within institutions.

    Compliance mechanisms also need regular review and refresh, including matters such as how privacy information is provided to individuals in a clear and timely way. Data governance needs to be embedded throughout the lifecycle of each item of data. And where new activities, policies or technologies are being considered, data governance needs to be a central part of project plans at the earliest stages, to ensure that appropriate due diligence and other compliance requirements, such as data processing agreements or data protection impact assessments, are in place.

    Effective management of the flow ensures that the right data gets in front of the right people, at the right time – and means everyone can be confident the right balance has been struck between maintaining privacy and sharing vital information.

    This article is published in association with Mills & Reeve.

    Source link

  • Data futures, reviewed | Wonkhe

    As a sector, we should really have a handle on how many students we have and what they are like.

    Data Futures – the multi-year programme that was designed to modernise the collection of student data – has become, among higher education data professionals, a byword for delays, stress, and mixed messages.

    It was designed to deliver in-year data (so 2024-25 data arriving within the 2024-25 academic year) three times a year, drive efficiency in data collection (by allowing for process streamlining and automation), and remove “data duplication” (becoming a single collection that could be used for multiple purposes by statutory customers and others). To date it has achieved none of these benefits, and has instead (for 2022-23 data) driven one of the sector’s most fundamental pieces of data infrastructure into such chaos that all forward uses of data require heavy caveats.

    The problem with the future

    In short – after seven years of work (at the point the review was first mooted), and substantial investment, we are left with more problems than we started with. Most commentary has focused on four key difficulties:

    • The development of the data collection platform, starting with Civica in 2016 and later taken over by Jisc, has been fraught with difficulties, frequently delayed, and experienced numerous changes in scope
    • The documentation and user experience of the data collection platform has been lacking. Rapid changes have not resulted in updates for those who use the platform within providers, or those who support those providers (the HESA Liaison team). The error handling and automated quality rules have caused particular issues – indeed the current iteration of the platform still struggles with fields that require responses involving decimal fractions.
    • The behaviour of some statutory customers – frequently modifying requirements, changing deadlines, and putting unhelpful regulatory pressure on providers – has not helped matters.
    • The preparedness of the sector has been inconsistent between providers and between software vendors. This level of preparedness has not been fully understood – in part because of a nervousness among providers around regulatory consequences for late submissions.

    These four interlinked strands have been exacerbated by an underlying fifth issue:

    • The quality of programme management, programme delivery, and programme documentation has not been of the standards required for a major infrastructure project. Parts of this have been due to problems in staffing, and problems in programme governance – but there are also reasonable questions to be asked about the underlying programme management process.

    Decisions to be made

    An independent review was originally announced in November 2023, overlapping a parallel internal Jisc investigation. The results we have may not be timely – the review didn’t even appear to start until early 2024 – but even the final report merely represents a starting point for some of the fundamental discussions that need to happen about sector data.

    I say a “starting point” because many of the issues raised by the review concern decisions about the projected benefits of doing data futures. As none of the original benefits of the programme have been realised in any meaningful way, the future of the programme (if it has one) needs to be focused on what people actually want to see happen.

    The headline is in-year data collection. To the external observer, it is embarrassing that other parts of the education sector can return data on a near-real-time basis while higher education cannot – universities update the records they hold on students on a regular basis, so it should not be impossible to update external data too. It should not come as a surprise, then, that the review poses the question:

    As a priority, following completion of the 2023-24 data collection, the Statutory Customers (with the help of Jisc) should revisit the initial statement of benefits… in order to ascertain whether a move to in-year data collection is a critical dependent in order to deliver on the benefits of the data futures programme.

    This isn’t just an opportunity for regulators to consider their shopping list – a decision to continue needs to be swiftly followed by a cost-benefit analysis, reassessing the value of in-year collection and determining whether or when to pursue it. And the decision is that there will, one day, be in-year student data. In a joint statement the four statutory customers said:

    After careful consideration, we intend to take forward the collection of in-year student data

    highlighting the need for data to contribute to “robust and timely regulation”, and reminding institutions that they will need “adequate systems in place to record and submit student data on time”.

    The bit that interests me here is the implications for programme management.

    Managing successful programmes

    If you look at the government’s recent record in delivering large and complex programmes you may be surprised to learn of the existence of a Government Functional Standard covering portfolio, programme, and project management. What’s a programme? Well:

    A programme is a unique, temporary, flexible organisation created to co-ordinate, direct and oversee the implementation of a set of projects and other related work components to deliver outcomes and benefits related to a set of strategic objectives

    Language like this, and the concepts underpinning it come from what remains the gold standard programme management methodology, Managing Successful Programmes (MSP). If you are more familiar with the world of project management (project: “a unique temporary management environment, undertaken in stages, created for the purpose of delivering one or more business products or outcomes”) it bears a familial resemblance to PRINCE2.

    If you do manage projects for a living, you might be wondering where I have been for the last decade or so. The cool kids these days are into a suite of methodologies that come under the general description of “agile” – PRINCE2 these days is seen primarily as a cautionary tale: a “waterfall” (top down, documentation centered, deadline focused) management practice rather than an “iterative” (emergent, development centered, short term) one.

    Each approach has strengths and weaknesses. Waterfall methods are great if you want to develop something that meets a clearly defined need against clear milestones and a well understood specification. Agile methods are a nice way to avoid writing reports and updating documentation.

    Data futures as a case study

    In the real world, the distinction is less clear cut. Most large programmes in the public sector use elements of waterfall methods (regular project reports, milestones, risk and benefits management, senior responsible owners, formal governance) as a scaffold in which sit agile elements at a more junior level (short development cycle, regular “releases” of “product” prioritised above documentation). While this can be done well it is very easy for the two ideologically separate approaches to drift apart – and it doesn’t take much to read this into what the independent review of data futures reveals.

    Recommendation B1 calls, essentially, for clarity:

    • Clarity of roles and responsibilities
    • Clarity of purpose for the programme
    • Clarity on the timetable, and on how and when the scope of the programme can be changed

    This is amplified by recommendation C1, which looks for specific clarifications around “benefits realisation” – which itself underpins the central recommendation relating to in-year data.

    In classic programme management (like MSP) the business case will include a map of programme benefits: that is, all of the good things that will come about as a result of the hard work of the programme. Like the business case’s risk register (a list of all the bad things that might happen and what could be done if they did), it is supposed to be regularly updated and signed off by the Programme Board – which is made up of the most senior staff responsible for the work of the programme (the Senior Responsible Owners, in the lingo).

    The statement of benefits languished for some time without a full update (there was an incomplete attempt in February 2023, and a promise to make another one after the completed 2022-23 collection – we are not told whether the second had happened). In proper, grown-up, programme management this is supposed to be done in a systematic way: every programme board meeting you review the benefits and the risk register. It’s dull (most of the time!) but it is important. The board needs an eye on whether the programme still offers value overall (based on an analysis of projected benefits). And if the scope needed to change, the board would have final say on that.

    The issue with Data Futures was clarity over whether this level of governance actually had the power to do these things, and – if not – who was actually doing them. The Office for Students latterly put together quite a complex and unwieldy governance structure, with a Quarterly Review Group having oversight of the main programme board. This QRG was made up of very senior staff at the statutory customers (OfS, HEFCW, SFC, DoE(NI)), Jisc, and HESA (plus one Margaret Monckton – now chair of this independent review! – as an external voice).

    The QRG oversaw the work of the programme board – meaning that decisions made by the senior staff nominally responsible for the direction of the programme were often second-guessed by their direct line managers. The programme board was supposed to have its own assurance function and an independent observer – it did not (despite the budget being there for it).

    Stop and go

    Another role of the board is to make what are more generally called “stop-go” decisions, and are here described as “approval to proceed”. This is an important way of making sure the programme is still on track – you’d set (in advance) the criteria that needed to be fulfilled in terms of delivery (was the platform ready, had the testing been done) before you moved on to the next work package. Below this, incremental approvals are made by line managers or senior staff as required, but reported upwards to the board.

    What seems to have happened a lot in the Data Futures programme is what’s called conditional approvals – where some of these conditions were waived based on assurances that the remaining required work was completed. This is fine as it goes (not everything lines up all the time) but as the report notes:

    While the conditions of the approvals were tracked in subsequent increment approval documents, they were not given a deadline, assignee or accountable owner for the conditions. Furthermore, there were cases where conditions were not met by the time of the subsequent approval

    Why would you do that? Well, you’d be tempted if you had another board above you – comprising very senior staff and key statutory customers – concerned about the very public problems with Data Futures and looking for progress. The Quarterly Review Group (QRG), as it turned out, only ended up making five decisions (and in three of these cases it just punted the issue back down to the programme board – the other two, for completists, were to delay plans for in-year collection).

    What it was meant to be doing was “providing assurance on progress”, “acting as an escalation point” and “approving external assurance activities”. As we’ve already seen, it didn’t really bother with external assurance. And on the other points the review is damning:

    From the minutes provided, the extent to which the members of the QRG actively challenged the programme’s progress and performance in the forum appears to be limited. There was not a clear delegation of responsibilities between the QRG, Programme Board and other stakeholders. In practice, there was a lack of clarity also on the role of the Data Futures governance structure and the role of the Statutory Customers separately to the Data Futures governance structure; some decisions around the data specification were taken outside of the governance structure.

    Little wonder that the section concludes:

    Overall, the Programme Board and QRG were unable to gain an independent, unbiased view on the progress and success of the project. If independent project assurance had been in place throughout the Data Futures project, this would have supported members of the Programme Board in oversight of progress and issues may have been raised and resolved sooner

    Resourcing issues

    Jisc, as developer, took on responsibility for technical delivery in late 2019. Incredibly, Jisc was not provided with funding to do this work until March 2020.

    As luck would have it, March 2020 saw the onset of a series of lockdowns and a huge upswing in demand for the kind of technical and data skills needed to deliver a programme like Data Futures. Jisc struggled to fill key posts, most notably running for a substantial period of time without a testing lead in post.

    If you think back to the 2022-23 collection, the accepted explanation around the sector for what – at heart – had gone wrong was a failure to test “edge cases”. Students, it turns out, are complex and unpredictable things – with combinations of characteristics and registrations that you might not expect to find. A properly managed programme of testing would have focused on these edge cases, and there would have been fewer issues when the collection went live.

    Underresourcing and understaffing are problems in their own right, but these were exacerbated by rapidly changing data model requirements, largely coming from statutory customers.

    To quote the detail from the report:

    The expected model for data collection under the Data Futures Programme has changed repeatedly and extensively, with ongoing changes over several years on the detail of the data model as well as the nature of collection and the planned number of in-year collections. Prior to 2020, these changes were driven by challenges with the initial implementation. The initial data model developed was changed substantially due to technical challenges after a number of institutions had expended significant time and resource working to develop and implement it. Since 2020, these changes were made to reflect evolving requirements of the return from Statutory Customers, ongoing enhancements to the data model and data specification and significantly, the ongoing development of quality rules and necessary technical changes determined as a result of bugs identified after the return had ‘gone live’. These changes have caused substantial challenges to delivery of the Data Futures Programme – specifically reducing sector confidence and engagement as well as resulting in a compressed timeline for software development.

    Sector readiness

    It’s not enough to conjure up a new data specification and platform – it is hugely important to be sure that your key people (“operational contacts”) within the universities and colleges that would be submitting data are ready.

    On a high level, this did happen – there were numerous surveys of provider readiness, and the programme also worked with the small number of software vendors that supply student information systems to the sector. This formal programme communication came alongside the more established links between the sector and the HESA Liaison team.

    However, such was the level of mistrust between universities and the Office for Students (who could technically have found struggling providers in breach of condition of registration F4), that it is widely understood that answers to these surveys were less than honest. As the report says:

    Institutions did not feel like they could answer the surveys honestly, especially in instances where the institution was not on track to submit data in line with the reporting requirements, due to the outputs of the surveys being accessible to regulators/funders and concerns about additional regulatory burden as a result.

    The decision to scrap a planned mandatory trial of the platform, taken in March 2022 by the Quarterly Review Group, was ostensibly intended to reduce burden – but, coupled with the unreliable survey responses, it meant that HESA was unable to identify cases where support was needed.

    This is precisely the kind of risk that should have been escalated to programme board level – a lack of transparency between Jisc and the board about readiness made it harder to take strategic actions on the basis of evidence about where the sector really was. And the issue continued into live collection – because Liaison were not made aware of common problems (“known issues”, in fact) the team often struggled with out-of-date documentation: meaning that providers got conflicting messages from different parts of Jisc.

    Liaison, for their part, dealt with more than 39,000 messages between October and December 2023 (during the peak of issues raised during the collection process) – and even given the problems noted above, they resolved 61 per cent of queries on the first try. Given the level of stress in the sector (queries came in at all hours of the day) and the longstanding and special relationship that data professionals have with HESA Liaison, you could hardly criticise that team for making the best of a near-impossible situation.

    I am glad to see that the review notes:

    The need for additional staff, late working hours, and the pressure of user acceptance testing highlights the hidden costs and stress associated with the programme, both at institutions and at Jisc. Several institutions talked about teams not being able to take holidays over the summer period due to the volume of work to be delivered. Many of the institutions we spoke to indicated that members of their team had chosen to move into other roles at the institution, leave the sector altogether, experienced long term sickness absence or retired early as a result of their experiences, and whilst difficult to quantify, this will have a long-term impact on the sector’s capabilities in this complex and fairly niche area.

    Anyone who was even tangentially involved in the 2022-23 collection, or attended the “Data Futures Redux” session at the Festival of Higher Education last year, will find those words familiar.

    Moving forward

    The decision on in-year data has been made – it will not happen before the 2026-27 academic year, but it will happen. The programme delivery and governance will need to improve, and there are numerous detailed recommendations to that end: we should expect more detail and the timeline to follow.

    It does look as though there will be more changes to the data model to come – though the recommendation is that it should be frozen 18 months before the start of data collection, which by my reckoning would mean a confirmed data model printed out and on the walls of SROC members in the spring of 2026. A subset of institutions would make an early in-year submission, which may not be published, to “allow for lower than ideal data quality”.

    On arrangements for collections for 2024-25 and 2025-26 there are no firm recommendations – it is hoped that data model changes will be minimal and the time used to ensure that the sector and Jisc are genuinely ready for the advent of the data future.

    Source link

  • Student Debt by Ethnicity | HESA

    Hi all. Just a quick one today, this time on some data I recently got from StatsCan.

    We know a fair bit about student debt in Canada, especially with respect to distribution by gender, type of institution, province, etc. (Chapter 6 of The State of Postsecondary Education in Canada is just chock full of this kind of data if you’re minded to take a deeper dive). But to my knowledge no one has ever pulled and published the data on debt by ethnicity, even though this data has been collected for quite some time through the National Graduates Survey (NGS). So I ordered the data, and here’s what I discovered.

    Figure 1 shows incidence of borrowing for the graduating class of 2020, combined for all graduates of universities and colleges, for the eight largest ethnicities covered by the NGS (and before anyone asks, “indigeneity” is not considered an ethnicity, so anyone indicating an indigenous identity is unfortunately excluded from this data… there’s more below on the challenges of getting additional data). And the picture it shows is… a bit complex.

    Figure 1: Incidence of Borrowing, College and University Graduates Combined, Class of 2020

    If you just look at the data on government loan programs (the orange bars), only Arab students have borrowing rates in excess of 1 in 2. But for certain ethnicities, the borrowing rate is much lower. For Latin American and Chinese students, the borrowing rate is below 1 in 3, and among South Asian students the borrowing rate is barely 1 in 5. Evidence of big differences in attitudes towards borrowing!

    Except… well, when you add in borrowing from private sources (e.g., banks and family) to look at overall rates of borrowing incidence, the differences in borrowing rates are a lot narrower. Briefly, Asian and Latin American students borrow a lot more money from private sources (mainly family) than do Arab, white, and Black students. These loans probably come with slightly easier repayment terms, but it’s hard to know for sure. An area almost certainly worthy of further research.

    There is a similarly nuanced picture when we look at median levels of indebtedness among graduates who had debt. This is shown below in Figure 2.

    Figure 2: Median Borrowing, College and University Graduates Combined, Class of 2020

    Now, there isn’t a huge amount of difference in debt levels at graduation by ethnicity: the gap is only about $6,000 between the lowest total debt levels (Filipinos) and the highest (Chinese). But part of the problem here is that we can’t distinguish the reasons for the various debt levels. Based on what we know about ethnic patterns of postsecondary education, we can probably guess that Filipino students have low debt levels not because they are especially wealthy and can afford to go to postsecondary education without financial assistance, but rather because they are more likely to go to college and thus spend less time, on average, in school paying fees and accumulating debt. Similarly, Chinese students don’t have the highest debt because they have low incomes; they have higher debt because they are the ethnic group most likely to attend university and so spend more time paying (higher) fees.

    (Could we get the data separately for universities and colleges to clear up the confound? Yes, we could. But it cost me $3K just to get this data. Drilling down a level adds costs, as would getting data based on indigenous identity, and this is a free email, and so for the moment what we have above will have to do. If anyone wants to pitch in a couple of grand to do more drilling-down, let me know and I would be happy to coordinate some data liberation).

    It is also possible to use NGS data to look at post-graduation income by ethnicity. I obtained the data in fairly large ranges (e.g., $0-20K, $20-60K, etc.), but on that basis it’s possible to estimate roughly what median incomes are (put it this way: the exact numbers are not exactly right, but the ordinal ranking of incomes across the various ethnicities is probably accurate). My estimates of the median 2023 income of 2020 graduates—which include those graduates who are not in the labour market full-time, if you’re wondering why the numbers look a little low—are shown below in Figure 3.
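
    For readers curious about the mechanics, here is a minimal sketch of that kind of estimate: interpolating a median from income reported only in broad ranges. The bin edges and respondent counts below are purely illustrative, not the actual NGS figures.

    ```python
    # Estimate a median from grouped (binned) data by linear interpolation within
    # the bin containing the 50th percentile. Bins and counts are illustrative only.
    def median_from_bins(bin_edges, counts):
        """bin_edges has len(counts) + 1 entries, e.g. [0, 20_000, 60_000, 100_000]."""
        total = sum(counts)
        target = total / 2
        cumulative = 0
        for i, n in enumerate(counts):
            if cumulative + n >= target:
                lower, upper = bin_edges[i], bin_edges[i + 1]
                # Assume respondents are spread evenly within the bin
                return lower + (target - cumulative) / n * (upper - lower)
            cumulative += n
        return bin_edges[-1]

    # Hypothetical respondent counts per income bin
    print(median_from_bins([0, 20_000, 60_000, 100_000], [180, 520, 300]))  # ≈44,600
    ```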

    Figure 3: Estimated Median 2023 Income, College and University Graduates Combined, Class of 2020

    Are there differences in income here? Yes, but they aren’t huge. Most ethnic groups have median post-graduation incomes between $44,000 and $46,000. The two lowest-earning groups (Latin Americans and Filipinos) are both disproportionately enrolled in community colleges, which is part of what is going on in this data (if you want disaggregated data, see above).

    Now, the data from the previous graphs can be combined to look at debt-to-income ratios, both for students with debt and for all students (that is, including those who do not borrow). This is shown below in Figure 4.

    Figure 4: Estimated Median 2023 Debt-to-Income Ratios, College and University Graduates Combined, Class of 2020

    If you’re just dividing indebtedness by income (the blue bars), you get a picture that looks a lot like the debt picture in Figure 2, because differences in income are pretty small. But if you look at debt-to-income ratios across all students (including those who do not borrow), you get a very different picture, because, as we saw in Figure 1, there are some pretty significant differences in overall borrowing rates. So, for instance, Chinese students go from having the worst debt-to-income ratio on one measure to being middle of the pack on another because they have a relatively low incidence of borrowing; similarly, students of Latin American origin go from being middle of the pack to having nearly the lowest debt-to-income ratios because they are a lot less likely to borrow than others. Black students end up having among the highest debt-to-income ratios not because they earn significantly less than other graduates, but because both the incidence and the amount of their borrowing are relatively high.
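
    As a rough sketch of how those two series could be constructed (the exact construction of the “all students” measure isn’t spelled out here, and the numbers below are hypothetical rather than actual NGS figures), one plausible approach treats non-borrowers as carrying zero debt, which effectively scales the borrower ratio by the incidence of borrowing:

    ```python
    # Hypothetical inputs shaped like Figures 1-3 (not actual NGS data)
    median_debt_borrowers = 22_000   # median debt among graduates who borrowed (Fig. 2)
    median_income = 45_000           # estimated median 2023 income (Fig. 3)
    incidence_of_borrowing = 0.45    # share of graduates with any debt (Fig. 1)

    # Blue bars: indebtedness divided by income, for students with debt only
    ratio_borrowers = median_debt_borrowers / median_income

    # "All students" measure: one plausible construction counts non-borrowers at
    # zero debt, which roughly scales the borrower ratio by borrowing incidence
    ratio_all_students = ratio_borrowers * incidence_of_borrowing

    print(f"{ratio_borrowers:.2f} vs {ratio_all_students:.2f}")  # e.g. 0.49 vs 0.22
    ```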

    But I think the story to go with here is that while there are differences between ethnic groups in terms of borrowing, debt, and repayment ratios, and it’s worth trying to do something to narrow them, the differences in these rates are not enormous. Overall, it appears that as a country we are achieving reasonably good things here, with the caveat that if this data were disaggregated by university/college, the story might not be quite as promising.

    And so ends the first-ever analysis of student debt and repayment by ethnic background. Hope you found it moderately enlightening.

    Source link

  • Canadian study permit approvals fall far below cap targets

    Canadian study permit approvals are on track to fall by 45% in 2024, rather than the 35% reduction planned under last year’s controversial international student caps, new IRCC data analysed by ApplyBoard has revealed.

    “The caps’ impact was significantly underestimated,” ApplyBoard founder Meti Basiri told The PIE News. “Rapidly introduced policy changes created confusion and had an immense impact on student sentiment and institutional operations.  

    “While aiming to manage student numbers, these changes failed to account for the perspectives of students, and their importance to Canada’s future economy and communities,” he continued.  

    The report reveals the far-reaching impact of Canada’s study permit caps, which were announced in January 2024 and followed by a tumultuous year of policy changes that expanded restrictions and set new rules for post-graduate work permit eligibility, among other changes.  

    For the first 10 months of 2024, Canada’s study permit approval rate hovered just above 50%, resulting in an estimated maximum of 280,000 approvals from K-12 to postgraduate levels. This represents the lowest number of approvals in a non-pandemic year since 2019. 

    Source: IRCC. Disclaimer: Data for 2021-Oct 2024 is sourced from IRCC. Full-year 2024 figures are estimates extrapolated from Jan-Oct 2024 and full-year 2021-2023 IRCC data. Projections may be subject to change based on changing conditions and source data.
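
    For readers wondering how that kind of extrapolation typically works, here is a minimal sketch (the year-to-date total and monthly split below are hypothetical placeholders, not IRCC figures): scale the January-October count by the share of prior years’ approvals that had arrived by the end of October.

    ```python
    # Hypothetical illustration of a full-year extrapolation from Jan-Oct data.
    jan_oct_2024_approvals = 232_000                 # placeholder year-to-date total
    jan_oct_share_prior_years = [0.84, 0.83, 0.82]   # share of each full year (2021-23)
                                                     # approved by end of October
    avg_share = sum(jan_oct_share_prior_years) / len(jan_oct_share_prior_years)

    full_year_2024_estimate = jan_oct_2024_approvals / avg_share
    print(f"{full_year_2024_estimate:,.0f}")         # ≈280,000 with these placeholders
    ```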

    “Even from the early days of the caps, decreased student interest outpaced government estimates,” noted the report, with stakeholders highlighting the reputational damage to Canada as a study destination.  

    “Approvals for capped programs fell by 60%, but even cap-exempt programs declined by 27%. Major source countries like India, Nigeria, and Nepal saw over 50% declines, showing how policies have disrupted demand across all study levels,” said Basiri.  

    Following major PGWP and study permit changes announced by the IRCC in September 2024, four out of five international student counsellors surveyed by ApplyBoard agreed that Canada’s caps had made it a less desirable study destination. 

    Though stakeholders across Canada recognised the need to address fraud and student housing issues, many had urged the federal government to wait until the impact of the initial caps was clear before going ahead with seemingly endless policy changes.  

    At the CBIE conference in November 2024, immigration minister Marc Miller said he “profoundly disagreed” with the prevailing sector view that the caps and subsequent PGWP and permanent residency restrictions had been an “overcorrection”.

    Post-secondary programs, which were the primary focus of the 2024 caps, were hit hardest by the restrictions, with new international enrolments at colleges estimated to have dropped by 60% as a result of the policies.  

    While Canada’s largest source countries saw major declines, the caps were not felt evenly across sending countries. Senegal, Guinea and Vietnam maintained year-over-year growth, signalling potential sources of diversity for Canada’s cap era.

    The report also highlighted Ghana’s potential as a source country, where approvals – though declining from last year – remain 175% higher than figures from 2022.

    The significant drop in study permit approvals was felt across all provinces, but Ontario – which accounted for over half of all study permit approvals in 2023 – and Nova Scotia have seen the largest impact, falling by 55% and 54.5% respectively.

    Notably, the number of study permit applications processed by the IRCC dropped by a projected 35% in 2024, in line with the government’s targets, but approval rates have not kept pace.

    When setting last year’s targets, minister Miller only had the power to limit the number of applications processed by the IRCC, not the number of study permits that are approved.  

    The initial target of 360,000 approved study permits was based on an estimated approval rate of 60%, resulting in a 605,000 cap on the number of applications processed. 
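
    A rough reconciliation of these figures (a sketch only, assuming the pre-cap approval rate was close to the 60% planning figure cited above) shows how a 35% drop in applications processed combines with a roughly 50% approval rate to produce something close to the 45% fall in approvals reported earlier:

    ```python
    # Back-of-envelope reconciliation using figures quoted in the article; the ~60%
    # pre-cap approval rate is an assumption based on the government's planning figure.
    processed_drop = 0.35          # applications processed fell ~35%, in line with targets
    approval_rate_2024 = 0.50      # observed approval rate, Jan-Oct 2024
    approval_rate_precap = 0.60    # assumed pre-cap approval rate (planning assumption)

    approvals_drop = 1 - (1 - processed_drop) * (approval_rate_2024 / approval_rate_precap)
    print(f"{approvals_drop:.0%}")  # ~46%, close to the ~45% decline in approvals
    ```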

    Following new policies such as the inclusion of postgraduate programs in the 2025 cap, Basiri said he anticipated that study permit approvals would remain below pre-cap levels.  

    “While overall student numbers may align with IRCC’s targets, the broader impact on institutional readiness and Canada’s reputation will be key areas to watch in 2025,” he added.  

    Source link

  • Crafting technology-driven IEPs

    Crafting technology-driven IEPs

    Key points:

    Individualized Education Plans (IEPs) have been the foundation of special education for decades, and the process by which these documents are written has evolved over the years.

    As technology has evolved, so has the way these documents are written. Before software existed to streamline the process, creating an IEP was a daunting paper-and-pencil task. And it is not only the writing process that has changed: IEPs themselves are becoming technology-driven.

    Enhancing IEP goal progress with data-driven insights using technology: A variety of learning platforms can monitor a student’s performance in real time, tailor instruction to their individual needs, and intervene in areas needing improvement. Data from these programs can be used to write students’ annual IEP goals. This study notes that ReadWorks, a program used for progress monitoring IEP goals, reaches 1.2 million teachers and 17 million students with content, curricular support, and digital tools. ReadWorks provides all of its resources free of charge, with both printed and digital versions of the material available to teachers and students (Education Technology Nonprofit, 2021).
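    As an illustration of the kind of calculation such platforms automate, the sketch below summarises hypothetical weekly reading-fluency probes against an annual IEP goal. It is not tied to ReadWorks or any other real product; the student, scores, and goal are all invented.

    ```python
    # Minimal sketch: summarizing reading-fluency probe scores against an annual IEP goal.
    # The goal, baseline, and weekly scores below are invented for illustration.
    from statistics import mean

    annual_goal_wcpm = 110   # target words correct per minute by the annual review
    baseline_wcpm = 62       # score at the start of the IEP year

    weekly_probes = [62, 65, 64, 70, 73, 75, 79, 82]  # hypothetical progress-monitoring scores

    recent_avg = mean(weekly_probes[-3:])
    progress = (recent_avg - baseline_wcpm) / (annual_goal_wcpm - baseline_wcpm)

    print(f"Recent average: {recent_avg:.1f} wcpm")
    print(f"Progress toward annual goal: {progress:.0%}")
    ```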

    Student engagement and involvement with technology-driven IEPs: Technology-driven IEPs can also empower students to take an active role in their education plan. According to this study, research shows that special education students benefit from educational technology, especially in concept teaching and in practice-and-feedback instructional activities (Carter & Center, 2005; Hall, Hughes & Filbert, 2000; Hasselbring & Glaser, 2000). It is vital for students to take ownership of their learning, and once students on an IEP reach a certain age, it is important for them to take the active lead in their plan. Digital tools used for technology-driven IEPs can give students visual representations of their progress, such as dashboards or graphs; when students can see their progress this way, their engagement and motivation increase.

    Technology-driven IEPs make learning fun: This study discusses technology-enhanced and game-based learning for children with special needs. Gamified programs, virtual reality (VR), and augmented reality (AR) change the learning experience from traditional to transformative. Gamified programs are intended to motivate students with rewards, personalized feedback, and competition via leaderboards and challenges, making learning feel like play. Virtual reality gives students an immersive experience that they would otherwise only encounter outside the classroom; it allows for deep engagement and experiential learning through virtual field trips and simulations, without the risks of visiting dangerous places or the field trip fees that not all districts or students can afford. Augmented reality allows students to visualize abstract concepts, such as anatomy or 3D shapes, in context. All these technologies align with technology-driven IEPs by providing personalized, accessible, and measurable learning experiences that address diverse needs, and they can adapt to a student’s individual skill level, pace, and goals in support of the IEP.

    Challenges with technology-driven IEPs: Although there are many benefits to technology-driven IEPs, it is important to address the potential challenges to ensure equity across school districts. Access to technology in underfunded districts can be limited without proper investment in infrastructure, devices, and network connectivity. Student privacy and data must also be properly protected: when adopting the technologies behind technology-driven IEPs, districts must take into account laws such as the Family Educational Rights and Privacy Act (FERPA).

    The integration of technology into the IEP process represents a shift from a traditional process to a transformative one. Technology-driven IEPs create more student-centered learning experiences by implementing digital tools, enhancing collaboration, and personalizing learning. These experiences increase student engagement and motivation and allow students to take control of their own learning, making them leaders in their IEP process. However, as technology continues to evolve, it is important to address the equity gap that may arise in underfunded school districts.


    Source link

  • Keep talking about data | Wonkhe

    Keep talking about data | Wonkhe

    How’s your student data this morning?

    Depending on how close you sit to your institutional student data systems, your answer may range from a bemused shrug to an anguished yelp.

    For the most part, we remain blissfully unaware of how much work it currently takes to derive useful and actionable insights from the various data traces our students leave behind them. We’ve all seen the advertisements promising seamless systems integration and a tangible improvement in the student experience, but in most cases the reality is far different.

    James Gray’s aim is to start a meaningful conversation about how we get there and what systems need to be in place to make it happen – at a sector as well as a provider level. As he says:

    “There is a genuine predictive value in using data to design future solutions to engage students and drive improved outcomes. We now have the technical capability to bring content, data, and context together in a way that simply has not been possible before now.”

    All well and good, but just because we have the technology doesn’t mean we have the data in the right place or the right format – the problem is, as Helen O’Sullivan has already pointed out on Wonkhe, silos.

    Think again about your student data.

    Some of it is in your student information system (assessment performance, module choices), which may or may not link to the application tracking systems that got students on to courses in the first place. You’ll also have information about how students engage with your virtual learning environment, what books they are reading in the library, how they interact with support services, whether and how often they attend in person, and their (disclosed) underlying health conditions and specific support needs.
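    As a toy illustration of what joining those silos can look like at the data level, the sketch below merges hypothetical extracts from a student information system, a VLE, and a library system into a single record keyed on a shared student identifier. The system names, fields, and values are all made up.

    ```python
    # Toy illustration of joining student data held in separate systems.
    # All system names, fields, and values are hypothetical.

    sis = {  # student information system: module choices and assessment marks
        "s123": {"modules": ["ECON101", "STAT110"], "avg_mark": 64},
    }
    vle = {  # virtual learning environment: engagement signals
        "s123": {"logins_last_30d": 12, "videos_watched": 9},
    }
    library = {  # library system: loans
        "s123": {"loans_this_term": 4},
    }

    def unified_record(student_id: str) -> dict:
        """Merge per-system extracts into one view of a student."""
        record = {"student_id": student_id}
        for source in (sis, vle, library):
            record.update(source.get(student_id, {}))
        return record

    print(unified_record("s123"))
    ```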

    The value of this stuff is clear – but without a whole-institution strategic approach to data it remains just a possibility. James notes that:

    We have learned that a focus on the student digital journey and institutional digital transformation means that we need to bring data silos together, both in terms of use and collection. There needs to be a coherent strategy to drive deployment and data use.

    But how do we get there? From what James has seen overseas, at big online US providers like Georgia Tech and Arizona State, data is managed strategically at the highest levels of university leadership. It’s perhaps a truism to suggest that if you really care about something it needs ownership at a senior level, but having that level of buy-in unlocks the resource and momentum that a big project like this needs.

    We also talked about the finer-grained aspects of implementation – James felt that the way to bring students and staff on board is to clearly demonstrate the benefits, and to listen (and respond) to concerns. The latter is essential because “you will annoy folks”.

    Is it worth this annoyance to unlock gains in productivity and effectiveness? Ideally, we’d all be focused on getting the greatest benefit from our resources – but often processes and common practices are arranged in sub-optimal ways for historical reasons, and rewiring large parts of someone’s role is a big ask. The hope is that the new way will prove simpler and less arduous, so it absolutely makes sense to focus on potential benefits and their realisation – and bringing in staff voices at the design stage can make for gains in autonomy and job satisfaction.

    The other end of the problem concerns procurement. Many providers have updated their student records systems in recent years in response to the demands of the Data Futures programme. The trend has been away from bespoke and customised solutions and towards commercial off-the-shelf (COTS) procurement: the thinking here being that updates and modifications are easier to apply consistently with a standard install.

    As James outlines, providers are looking at a “buy, build, or partner” decision – and institutions with different goals (and at different stages of data maturity) may choose different options. There is, though, enormous value in senior leaders talking across institutions about decisions such as these. “We had to go through the same process,” James outlined. “In the end we decided to focus on our existing partnership with Microsoft to build a cutting edge data warehouse, and data ingestion, hierarchy and management process leveraging Azure and MS Fabric with direct connectivity to Gen AI capabilities to support our university customers with their data, and digital transformation journey.” There is certainly both knowledge and hard-won experience out there about the different trade-offs, but what university leader wants to tell a competitor about the time they spent thousands of pounds on a platform that didn’t communicate with the rest of their data ecosystem?

    As Claire Taylor recently noted on Wonkhe, there is power in the relationships and networks among senior leaders that exist to share learning for the benefit of many. It is becoming increasingly clear that higher education is a data-intensive sector – so every provider should feel empowered to make one of its most important decisions in the light of a collective understanding of the landscape.

    This article is published in association with Kortext. Join us at an upcoming Kortext LIVE event in London, Manchester and Edinburgh in January and February 2025 to find out more about Wonkhe and Kortext’s work on leading digital capability for learning, teaching and student success.

    Source link