
  • Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce


    Before our show starts today, I just want to take a minute to note the passing of Professor Claire Callender, OBE. For the last two and a half decades, she was one of the most important figures in UK higher education studies, in particular with respect to student loans and student finance. Holder of a joint professorship at the UCL Institute of Education and Birkbeck, University of London, she was also instrumental in setting up the ESRC Centre for Global Higher Education, of which she later became deputy director. I just want to quote the short obituary that her colleague Simon Marginson wrote last week after her passing from lung cancer. He said, “What we’ll remember about Claire is the way she focused her formidable capacity for rational thought on matters to which she was committed, her gravitas that held the room when speaking, and the warmth that she evoked without fail in old and new acquaintances.”

    My thoughts and condolences to her partner Annette, and to her children. We’ll all miss Claire. 


    I suspect most of you are familiar with the OECD’s Program for International Student Assessment, or PISA. That’s a triennial test of 15-year-olds around the world that tries to compare how teenagers fare on real-world tests of literacy and numeracy. But you might not be as familiar with PISA’s cousin, the Program for the International Assessment of Adult Competencies, or PIAAC. To simplify enormously, it’s PISA, but for adults, and it only comes out once a decade, with the latest edition having appeared on December 10th of last year. Now, if you’re like most people, you’re probably asking yourself: what does PIAAC measure, exactly?

    PISA is pretty clearly telling us something about school systems. But adults, the subjects of the PIAAC test, have been out of school for a long time. What do test results mean for people who’ve been out of school for, in some cases, decades? And what kinds of meaningful policies might be made on the basis of this data?

    Today my guest is the CEO of Canada’s Future Skills Centre, Noel Baldwin. Over the past decade, both in his role at FSC and in his previous ones at the Council of Ministers of Education, Canada, he’s arguably been one of the country’s most dedicated users of PIAAC data. As part of Canada’s delegation to the OECD committee in charge of PIAAC, he also had a front-row seat to the development of these tests and the machinery behind these big international surveys.

    Over the course of the next 20 or so minutes, you’ll hear Noel and me, both fellow members of the Canada Millennium Scholarship Foundation Mafia, discuss such issues as how the wording of international surveys gets negotiated, why we seem to be witnessing planet-wide declines in adult literacy, what research questions PIAAC is best suited to answer, and, maybe most intriguingly, what PIAAC 3 might look like a decade from now.

    I really enjoyed this conversation and I hope you do too. Anyway, over to Noel.


    The World of Higher Education Podcast
    Episode 3.28 | Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Transcript

    Alex Usher (AU): Noel, some of our listeners might be familiar with big international testing programs like PISA—the Program for International Student Assessment. But what is the Program for the International Assessment of Adult Competencies? What does it aim to measure, and why?

    Noel Baldwin (NB): It’s somewhat analogous to PISA, but it’s primarily focused on working-age adults. Like PISA, it’s a large-scale international assessment organized by the OECD—specifically by both the education and labor secretariats. It’s administered on the ground by national statistical agencies or other government agencies in participating countries.

    PIAAC is mainly focused on measuring skills like literacy and numeracy. Over time, though, the OECD has added other skill areas relevant to the intersection of education and labor markets—things like digital skills, technology use, problem solving, and social-emotional skills.

    In addition to the assessment itself, there’s a large battery of background questions that gather a lot of demographic information—details about respondents’ work life, and other factors like health and wellbeing. This allows researchers to draw correlations between the core skills being measured and how those skills are used, or what kind of impact they have on people’s lives.

    AU: How do they know that what’s being measured is actually useful in the workplace? I mean, the literacy section is reading comprehension, and the math is sort of like, you know, “If two trains are moving toward each other, one from Chicago and one from Pittsburgh…” It’s a bit more sophisticated than that, but that kind of thing. How do they know that actually measures anything meaningful for workplace competencies?

    NB: That’s a good question. One thing to start with is that the questions build from fairly easy and simple tasks to much more complex ones. That allows the OECD to create these scales, and they talk a lot about proficiency levels—level one up to five, and even below level one in some cases, for people with the weakest skill levels.

    And while PIAAC itself is relatively new, the assessment of these competencies isn’t. It actually dates back to the early 1990s. There’s been a lot of research—by the OECD and by psychometricians and other researchers—on the connections between these skills and broader outcomes.

    The key thing to understand is that, over time, there’s been strong evidence linking higher literacy and numeracy skills to a range of life outcomes, especially labor market outcomes. It’s a bit like educational attainment—these things often act as proxies for one another. But the stronger your skills, the more likely you are to be employed, to earn higher wages, to avoid unemployment, and to be adaptable and resilient.

    And it’s not just about work. It extends to other areas too—life satisfaction, for instance. There are even some interesting findings about democratic participation and people’s perceptions of how their society is doing. So there are pretty strong correlations between higher-level skills and a variety of positive outcomes.

    AU: But I can imagine that the nature of an economy—whether it’s more manufacturing-based or service-based—might affect what kinds of skills are relevant. So different countries might actually want to measure slightly different things. How do you get 50—or however many dozens of—countries to agree on what skills to assess and how to measure them?

    NB: The point at which OECD countries agreed to focus on literacy and numeracy actually predates me—and it also predates a lot of today’s focus on more digitally oriented skills. It was a much more analog world when this started, and so literacy and numeracy made a lot of sense. At the time, most of the information people consumed came in some form of media that required reading comprehension and the ability to navigate text. And then, on the numeracy side, the ability to do anything from basic to fairly advanced problem solving with numbers was highly relevant. So I suspect that when this was being developed—through the 1980s and into the early 1990s—there was a high degree of consensus around focusing on those core skills.

    The development of the instruments themselves is also an international effort. It’s led by the OECD, but they work with experts from a range of countries to test and validate the items used in the assessment. Educational Testing Service (ETS) in the U.S. is quite involved, and there are also experts from Australia and Canada. In fact, Canada was very involved in the early stages—both through Statistics Canada and other experts—particularly in developing some of the initial tools for measuring literacy. So, the consensus-building process includes not just agreeing on what to measure and how to administer it, but also developing the actual assessment items and ensuring they’re effective. They do field testing before rolling out the main assessment to make sure the tools are as valid as possible.

    AU: Once the results are in and published, what happens next? How do governments typically use this information to inform policy?

    NB: I’ll admit—even having been on the inside of some of this—it can still feel like a bit of a black box. In fact, I’d say it’s increasingly becoming one, and I think we’ll probably get into that more as the conversation goes on.

    That said, different countries—and even different provinces and territories within Canada—use the information in different ways. It definitely gets integrated into various internal briefings. I spent some time, as you know, at the Council of Ministers of Education, and we saw that both in our own work and in the work of officials across the provinces and territories.

    After the last cycle of PIAAC, for instance, Quebec produced some fairly detailed reports analyzing how Quebecers performed on the PIAAC scales—comparing them to other provinces and to other countries. That analysis helped spark conversations about what the results meant and what to do with them. New Brunswick, for example, launched a literacy strategy shortly after the last PIAAC cycle, which suggests a direct link between the data and policy action.

    So there are examples like that, but it’s also fair to say that a lot of the data ends up being used internally—to support conversations within governments. Even since the most recent PIAAC cycle was released in December, I’ve seen some of that happening. But there’s definitely less in the public domain than you might expect—and less than there used to be, frankly.

    AU: Among the findings in this latest PIAAC cycle, the headline that got the most traction, I think, was that we’re seeing declines in literacy and numeracy scores across much of the OECD. A few countries bucked the trend—Canada saw only a small decline, and parts of Northern Europe did okay—but most countries were down. What are the possible explanations for this trend? And should we be concerned?

    NB: I think we should really be paying attention. When it comes to concern, though, I’m always a bit hesitant to declare a crisis. There’s a lot of work still to be done to unpack what’s going on in this PIAAC cycle.

    One thing to keep in mind is that most of the responses were collected during a time of ongoing global turmoil. The data was gathered in 2022, so we were still in the middle of the pandemic. Just getting the sample collected was a major challenge—and a much bigger one than usual.

    With that caveat in mind, the OECD has started to speculate a bit, especially about the literacy side. One of the things they’re pointing to is how radically the way people consume information has changed over the past 10 years.

    People are reading much shorter bits of text now, and they’re getting information in a much wider variety of formats. There are still items in the literacy assessment that resemble reading a paragraph in a printed newspaper—something that just doesn’t reflect how most people engage with information anymore. These days, we get a lot more of it through video and audio content.

    So I think those shifts in how we consume information are part of the story. But until we see more analysis, it’s hard to say for sure. There are some signals—differences in gender performance across countries, for example—that we need to unpack. And until we do that, we’re not going to have a great sense of why outcomes look the way they do.

    AU: Let’s focus on Canada for a second. As with most international education comparisons, we end up in the top third—but at the bottom of it, basically. It doesn’t seem to matter what we do or when—it’s always that pattern. Looking at global trends, do you think Canada stands out in any way, positively or negatively? Are there things we’re doing right? Or things we’re getting wrong?

    NB: Well, I’d say we continue to see something that the OECD points out almost every time we do one of these assessments: the gap between our top performers and our lowest performers is smaller than in many other countries. That’s often taken as a sign of equity, and I’d say that’s definitely a good news story.

    In the global comparison, we held pretty much steady on literacy, while many countries saw declines. Holding steady when others are slipping isn’t a bad outcome. And in numeracy, we actually improved.

    The distribution of results across provinces was also more even than in the last cycle. Last time, there was much more variation, with several provinces falling below the OECD or Canadian average. This time around, we’re more tightly clustered, which I think is another positive.

    If you dig a little deeper, there are other encouraging signs. For example, while the OECD doesn’t have a perfect measure of immigration status, it can identify people who were born outside a country or whose parents were. Given how different Canada’s demographic profile is from nearly every other participating country—especially those in Northern Europe—I think we’re doing quite well in that regard.

    And in light of the conversations over the past few years about immigration policy and its impacts across our society, I think it’s a pretty good news story that we’re seeing strong performance among those populations as well.

    AU: I know we’ll disagree about this next question. My impression is that, in Canada, the way PIAAC gets used has really changed over the last decade. The first round of PIAAC results got a lot of attention—StatsCan and the Council of Ministers of Education both published lengthy analyses.

    And maybe “crickets” is too strong a word to describe the reaction this time, but it’s definitely quieter. My sense is that governments just don’t care anymore. When they talk about skills, the narrative seems focused solely on nursing and the skilled trades—because those are seen as bottlenecks on the social side and the private sector side.

    But there’s very little interest in improving transversal skills, and even less knowledge or strategy about how to do so. Make me less cynical.

    NB: Well, it’s funny—this question is actually what kicked off the conversation that led to this podcast. And I’ll confess, you’ve had me thinking about it for several weeks now.

    One thing I want to distinguish is caring about the skills themselves versus how the data is being released and used publicly. There’s no denying that we’re seeing less coming out publicly from the governments that funded the study. That’s just true—and I’m not sure that’s going to change.

    I think that reflects a few things. Partly, it’s the changed fiscal environment and what governments are willing to pay for. But it’s also about the broader information environment we’re in today compared to 2013.

    As I’ve been reflecting on this, I wonder if 2012 and 2013 were actually the tail end of the era of evidence-based policymaking—and that now we’re in the era of vibes-based policymaking. And if that’s the case, why would you write up detailed reports about something you’re mostly going to approach from the gut?

    On the skills side, though, I still think there’s an interesting question. A few weeks ago, I felt more strongly about this, but I still believe it’s not that governments don’t care about these foundational skills. Rather, I think the conversation about skills has shifted.

    We may have lost sight of how different types of skills build on one another—starting from foundational literacy and numeracy, then layering on problem-solving, and eventually reaching digital competencies. That understanding might be missing in the current conversation.

    Take the current moment around AI, for example. Maybe “craze” is too strong a word, but there’s a belief that people will become great at prompt engineering without any formal education. Mark Cuban—on Bluesky or wherever, I’m not sure what they call posts there—made a point recently that you won’t need formal education with generative AI. If you can get the right answers out of a large language model, you’ll outperform someone with an advanced degree.

    But that completely overlooks how much you need to understand in order to ask good questions—and to assess whether the answers you get are worth anything. So we may start to see that shift back.

    That said, you’re right—there has definitely been a move in recent years toward thinking about workforce issues rather than broader skill development. And that may be a big part of what’s going on.

    AU: What do you think is the most interesting or under-explored question that PIAAC data could help answer, but that we haven’t fully investigated yet? This dataset allows for a lot of interesting analysis. So if you could wave a magic wand and get some top researchers working on it—whether in Canada or internationally—where would you want them to focus?

    NB: First, I’ll just make a small plug. We’ve been working on what we hope will become a PIAAC research agenda—something that responds to the things we care about at the Future Skills Centre, but that we hope to advance more broadly in the coming weeks and months. So we are actively thinking about this.

    There are a bunch of areas that I think are really promising. One is the renewed conversation about productivity in Canada. I think PIAAC could shed light on the role that skills play in that. The Conference Board of Canada did a piece a while back looking at how much of the productivity gap between Canada and the U.S. is due to skill or labor factors. Their conclusion was that it wasn’t a huge part—but I think PIAAC gives us tools to continue digging into that question.

    Another area the OECD often highlights when talking about Canada is the extent to which workers are overqualified or overskilled for the jobs they’re in. That’s a narrative that’s been around for a while, but one where I think PIAAC could offer deeper insights.

    It becomes even more interesting when you try to link it to broader labor supply questions—like the role of immigration. Some people have suggested that one reason Canada lags in things like technology integration or capital investment is that we’ve substituted skilled labor for that kind of investment.

    With PIAAC, we might be able to explore whether overqualification or overskilling is connected to the way we’ve managed immigration over the last couple of decades.

    So, there are a few areas there that I think are both relevant and under-explored. And of course, on the international side, you’re right—we should be looking for examples of countries that have had success, and thinking about what we can emulate, borrow from, or be inspired by.

    AU: I don’t know if either of us wants to still be doing this in 10 years, but if we were to have this conversation again a decade from now, what do you think—or hope—will have changed? What will the long-term impact of PIAAC Cycle 2 have been, and how do you think PIAAC 3 might be different?

    NB: Well, I think I need to say this out loud: I’m actually worried there won’t be a PIAAC 3.

    We’re recording this in early 2025, which is a pretty turbulent time globally. One of the things that seems clear is that the new U.S. administration isn’t interested in the Department of Education—which likely means they won’t be interested in continuing the National Center for Education Statistics.

    And like with many international initiatives, the U.S. plays a big role in driving and valuing efforts like PIAAC. So I do worry about whether there will be a third cycle. If it happens without U.S. participation, it would be a very different kind of study.

    But I hope that in 10 years, we are talking about a robust PIAAC 3—with strong participation from across OECD countries.

    I also hope there’s continued investment in using PIAAC data to answer key research questions. It’s just one tool, of course, but it’s a big one. It’s the only direct assessment of adult skills we have—where someone is actually assessed on a defined set of competencies—so it’s really valuable.

    For an organization like ours, which is focused on adult skills in the workforce, it’s up to us to push forward and try to get answers to some of these questions. And I hope the research we and others are doing will find its way into policy conversations—especially as we think about how workforce needs, skills, and the broader economy are going to change over the next decade.

    It would be a wasted opportunity if it didn’t.

    AU: Noel, thanks so much for being with us today.

    NB: Thanks Alex.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

    This episode is sponsored by Studiosity. Student success, at scale — with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning — not corrections — to develop critical thinking, agency, and retention — empowering educators with learning insight. For future-ready graduates — and for future-ready institutions. Learn more at studiosity.com.


  • Deafening Silence on PIAAC | HESA


    Last month, right around the time the blog was shutting down, the OECD released its report on the second iteration of the Programme for the International Assessment of Adult Competencies (PIAAC), titled “Do Adults Have the Skills They Need to Thrive in a Changing World?”. Think of it perhaps as PISA for grown-ups, providing a broadly useful cross-national comparison of the basic cognitive skills that are key to labour market success and overall productivity. You are forgiven if you didn’t hear about it: its news impact was equivalent to the proverbial tree falling in a forest. Today, I will skim briefly over the results, but more importantly, ponder why this kind of data does not generate much news.

    First administered in 2011, PIAAC consists of three parts: tests of literacy, numeracy, and what they call “adaptive problem solving” (this last one has changed a bit—in the previous iteration it was something called “problem-solving in technology-rich environments”). The test scale runs from 0 to 500, and individuals are categorized into one of six “bands” (1 through 5, with 5 being the highest, plus a “below 1,” which is the lowest). National scores across all three of these areas are highly correlated, which is to say that if a country is at the top or bottom, or even in the middle, on literacy, it is almost certainly pretty close to the same rank order for numeracy and problem solving as well. National scores all cluster in the 200 to 300 range.
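    To make those bands concrete, here is a minimal sketch of how a single scale score maps onto a proficiency band. The cut-points are assumptions based on the 50-point intervals the OECD has used in earlier PIAAC reporting (below 176 is “below level 1,” 176–225 is level 1, and so on), not figures taken from this report, so treat them as illustrative.

    ```python
    # Minimal sketch: map a PIAAC scale score (0-500) to a proficiency band.
    # Cut-points are assumptions based on the 50-point intervals used in
    # earlier PIAAC reporting; consult the OECD technical documentation for
    # the exact values in the current cycle.

    def piaac_band(score: float) -> str:
        """Return the proficiency band for a 0-500 PIAAC scale score."""
        if not 0 <= score <= 500:
            raise ValueError("PIAAC scores are reported on a 0-500 scale")
        if score < 176:
            return "below 1"
        # Bands 1-4 each span 50 points; anything above 375 is band 5.
        for level, upper in enumerate((225, 275, 325, 375), start=1):
            if score <= upper:
                return str(level)
        return "5"

    # National averages cluster between 200 and 300, i.e. mostly bands 2-3.
    print(piaac_band(271))  # -> "2"
    print(piaac_band(290))  # -> "3"
    ```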

    One of the interesting—and frankly somewhat terrifying—discoveries of PIAAC 2 is that literacy and numeracy scores are down in most of the OECD outside of northern Europe. Across all participating countries, literacy is down fifteen points and numeracy seven. Canada is about even in literacy and up slightly in numeracy—this is one trend it’s good to buck. The reason for the decline is somewhat mysterious. An aging population probably has something to do with it, because literacy and numeracy do start to fall off with age (scores peak in the 25-34 age bracket), but I would be interested to see more work on the role of smartphones. Maybe it isn’t just teenagers whose brains are getting wrecked?

    The overall findings actually aren’t that interesting. The OECD hasn’t repeated some of the analyses that made the first report so fascinating (the results were a little too interesting, I guess), so what we get are some fairly broad banalities—scores rise with education levels, but also with parents’ education levels; employment rates and income rise with skill levels; there is broadly a lot of skill mismatch across all economies, and this is a Bad Thing (I am not sure it is anywhere near as bad as the OECD assumes, but whatever). What remains interesting, once you read through the whole report, are the subtle differences one picks up in the results from one country to another.

    So, how does Canada do, you ask? Well, as Figure 1 shows, we are ahead of the OECD average, which is good so far as it goes. However, we’re not at the top. At the head of the class across all measures are Finland, Japan, and Sweden, followed reasonably closely by the Netherlands and Norway. Canada sits in a peloton behind that, with a group including Denmark, Germany, Switzerland, Estonia, the Flemish region of Belgium, and maybe England. This is basically Canada’s sweet spot in everything when it comes to education, skills, and research: good but not great, and it looks worse if you adjust for the amount of money we spend on this stuff.

    Figure 1: Key PIAAC scores, Canada vs OECD, 2022-23

    Canadian results can also be broken down by province, as in Figure 2, below. Results do not vary much across most of the country. Nova Scotia, Ontario, Saskatchewan, Manitoba, Prince Edward Island, and Quebec all cluster pretty tightly around the national average. British Columbia and Alberta are significantly above that average, while New Brunswick and Newfoundland are significantly below it. Partly, of course, this has to do with things you’d expect, like provincial income, school policies, etc. But remember that this is measured across entire populations, not school leavers, so internal migration plays a role here too. Broadly speaking, New Brunswick and Newfoundland lose a lot of skilled people to places further west, while British Columbia and Alberta are big recipients of migrants from places further east (international migration, by contrast, tends to reduce average scores: language skills matter, and taking the test in a non-native tongue tends to produce lower results).

    Figure 2: Average PIAAC scores by province, 2022-23

    Anyways, none of this is particularly surprising or perhaps even all that interesting. What I think is interesting is how differently this data release was handled from the one ten years ago. When the first PIAAC was released a decade ago, Statistics Canada and the Council of Ministers of Education, Canada (CMEC) published a 110-page analysis of the results (which I analyzed in two posts, one on Indigenous and immigrant populations, and another on Canadian results more broadly) and an additional 300(!)-page report lining up the PIAAC data with data on formal and informal adult learning. It was, all in all, pretty impressive. This time, CMEC published a one-pager linking to a Statscan page that contains all of three charts and two infographics (fortunately, the OECD itself put out a 10-pager that is significantly better than anything produced domestically). But I think all of this points to something pretty important, which is this:

    Canadian governments no longer care about skills. At least not in the sense that PIAAC (or PISA for that matter) measures them.

    What they care about instead are shortages of very particular types of skilled workers, specifically the health professions and the construction trades (which together make up about 20% of the workforce). Provincial governments will throw any amount of money at training in these two sets of occupations because they are seen as bottlenecks in a couple of key sectors of the economy. They won’t think about the quality of the training being given or the organization of work in the sector (maybe we wouldn’t need to train as many people if the labour produced by such training were more productive?). God forbid. I mean, that would be difficult. Complex. Requiring sustained expert dialogue between multiple stakeholders/partners. No, far easier just to crank out more graduates, by lowering standards if necessary (a truly North Korean strategy).

    But actual transversal skills? The kind that make the whole economy (not just a politically sensitive 20%) more productive? I can’t name a single government in Canada that gives a rat’s hairy behind. They used to, twenty or thirty years ago. But then we started eating the future. Now, policy capacity around this kind of thing has atrophied to the point where literally no one cares when a big study like PIAAC comes out.

    I don’t know why we bother, to be honest. If provincial governments and their ministries of education in particular (personified in this case by CMEC) can’t be arsed to care about something as basic as the skill level of the population, why spend millions collecting the data? Maybe just admit our profound mediocrity and move on.
