Tag: PISA

  • Overrepresentation of female teachers and gender differences in PISA 2022: what cross-national evidence can and cannot tell us

    Over the weekend HEPI published blogs on AI in legal education and knowledge and skills in higher education.

    Today’s blog was kindly authored by Hans Luyten, University of Twente, Netherlands ([email protected]).

    Across many education systems, secondary-school teaching remains a predominantly female profession. While this fact is well known, less is understood about whether the gender composition of the teaching workforce relates to gender differences in student achievement at the system level. My recently published paper, Overrepresentation of Female Teachers in Secondary Education and Gender Achievement Gaps in PISA 2022 (Studies in Educational Evaluation), takes up this question using recent international data.

    The study investigates whether gender differences in reading, mathematics, and science among 15-year-olds vary according to the extent to which women are overrepresented among secondary-school teachers, relative to their share in each country’s labour force.

    Data and analytical approach

    The analysis draws on two international datasets:

    1. PISA 2022: Providing country-level average scores for 15-year-olds in reading, mathematics, and science. Gender achievement gaps are operationalised as the difference between the average score for girls and that for boys.
    2. Labour-market data: Measuring the proportion of women among secondary-school teachers in each country and the proportion of women in the wider labour force.

    Female overrepresentation is defined as the difference between these two proportions.
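Both measures are simple differences, which can be made concrete in a few lines of code. This is a minimal sketch with made-up numbers; the function and variable names are my own, not taken from the paper:

```python
def gender_gap(girls_mean, boys_mean):
    """Gender achievement gap: girls' average score minus boys' average."""
    return girls_mean - boys_mean

def female_overrepresentation(share_teachers, share_labour_force):
    """Overrepresentation: share of women among secondary-school teachers
    minus share of women in the wider labour force."""
    return share_teachers - share_labour_force

# Hypothetical country: girls average 505 in reading, boys 480;
# 65% of teachers are women versus 47% of the labour force.
gap = gender_gap(505, 480)                       # 25 points in favour of girls
overrep = female_overrepresentation(0.65, 0.47)  # 0.18, i.e. 18 percentage points
```

A positive gap thus means girls outperform boys, and a positive overrepresentation means women hold a larger share of teaching posts than of the labour force generally.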

    Although the analysis focuses on statistical correlations at the country level, it does not rely on simple bivariate associations. A wide range of control variables is included to account for differences between countries in:

    • Students’ out-of-school lives, such as gender differences in family support;
    • School resources, such as the availability of computers;
    • School staff characteristics, such as the percentage of certified teachers.

    These controls help ensure that the observed relationships are not simply reflections of broader cross-national differences in socioeconomic conditions or school quality.
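The control strategy described above amounts to a country-level multiple regression: the gender gap is regressed on overrepresentation together with the control variables, so the coefficient on overrepresentation reflects the association net of those controls. The sketch below illustrates the idea on synthetic data; it is not the paper's actual model, variable set, or estimation method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60  # hypothetical number of countries

# Synthetic country-level data (illustrative only).
overrep = rng.normal(0.15, 0.05, n)        # female overrepresentation among teachers
family_support = rng.normal(0.0, 1.0, n)   # control: gender difference in family support
pct_certified = rng.normal(0.8, 0.1, n)    # control: share of certified teachers
gap = 50 * overrep + 2 * family_support + rng.normal(0, 3, n)

# Design matrix: intercept, predictor of interest, then the controls.
X = np.column_stack([np.ones(n), overrep, family_support, pct_certified])
coef, *_ = np.linalg.lstsq(X, gap, rcond=None)

print(f"coefficient on overrepresentation (net of controls): {coef[1]:.1f}")
```

Holding the controls fixed in this way is what lets the study distinguish the overrepresentation association from broader cross-national differences in resources or socioeconomic conditions.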

    Key findings

    Three main results emerge from the analysis:

    First, gender achievement gaps tend to be larger in favour of girls in countries where women are more strongly overrepresented among secondary-school teachers.

Second, this association holds across all three domains (reading, mathematics, and science), although the size and direction of the gender gap differ by subject.

    Third, the relationship becomes more pronounced as the degree of female overrepresentation increases. Countries with only modest overrepresentation tend to have smaller gender gaps, whereas those with large overrepresentation tend to have wider gaps.

    These findings concern gender differences in performance, not the absolute levels of boys’ or girls’ achievement. The study does not examine, and therefore does not draw conclusions about, whether boys or girls perform better or worse in absolute terms in countries with different levels of female teacher overrepresentation.

    Interpreting the results

    The analysis identifies a robust statistical association at the country level, after accounting for a broad set of background variables. However, as with any cross-national correlational study, it cannot establish causality. Other country-specific characteristics (cultural, institutional, or organisational) may also contribute to the observed patterns.

    It is also important to note that the study addresses a different question from research that examines the effects of individual teachers’ gender on the achievement of individual students. Earlier classroom- and school-level studies often find little or no systematic effect of teacher gender on student outcomes. The present study, by contrast, examines the overall gender composition of the teaching workforce and its relation to system-level gender achievement gaps.

    Implications

    Although the findings do not directly point to specific policy interventions, they suggest that the gender composition of the secondary-school teaching workforce is a feature of educational systems that merits closer attention when interpreting international variation in gender gaps. Teacher demographics form part of the broader context within which student achievement develops, and system-level gender imbalances may interact with other structural characteristics in shaping performance differences between girls and boys.

    Final remarks

    The full paper provides a detailed description of the data, analyses, and limitations. It is available open access at: https://doi.org/10.1016/j.stueduc.2025.101544

    I hope this summary brings the findings to a wider audience and encourages further research on how system-level characteristics relate to gender differences in educational outcomes.

  • English skills are more essential than ever – the first PISA FLA proves it

    There has been much hype over the role AI can play, with increased speculation that, as this technology evolves, the need for learning languages will become less important. 

    This is obviously not the case.

Used properly, AI can bring enormous benefits to classrooms. But there’s really no substitute for human-to-human learning with a skilled language teacher. It remains critical for students in school systems around the world to continue to learn real-life communicative language skills. AI can teach you a substantial number of words and a good deal of grammar, but language is about real-life communication, and that takes practice and guidance that AI just can’t provide.

    When it comes to testing language skills, it’s the same picture. AI can give an indication of knowledge, but it cannot reliably measure what students can do with the language and how well they can communicate.

The introduction of the in-depth English test for PISA

The need for quality English skills in the age of AI is recognised worldwide. This is best demonstrated by the fact that, for the first time, the PISA survey has added an assessment of foreign language skills – starting with English.

The PISA Foreign Language Assessment (FLA) uses in-depth, high-quality tests, developed by Cambridge, to ensure it gives an accurate picture of each participant’s language skills. By this, we mean their ability to interact, understand nuance and apply their language skills to real-world situations.

This first PISA FLA is currently testing the English skills of thousands of students in 21 countries and economies around the world, providing unprecedented insights into what makes English language teaching and learning effective: insights that are vital during this time of rapid change. Having a clear picture of what works in language teaching in schools around the world, as a basis for improving future generations’ language skills, means we can measure change, learn and evolve.

    Why communicative language skills matter

    The benefits of learning communicative language skills are well documented. A recent paper by Cambridge and the OECD describes the benefits of learning another language in terms of the positive impact it can have on employability, critical thinking skills, and boosting cultural awareness – essential skills in today’s interconnected world.

    The importance of quality English skills was highlighted further in a recent article in the Financial Times, where journalist Simon Kuper comments that fluency in English “has become a non-negotiable qualification for high-level jobs in many professions.” He references a paper for the OECD that studied job vacancies across the EU and in the UK in 2021: 22% explicitly required knowledge of English. This is meaningful – as generative AI makes it easier for people to have a “passable grasp” of English, excellence in a language becomes a true differentiator in business and elsewhere.

    But of course, it’s not just about learning English. While English is an essential skill in so many areas, it’s equally important that people do not neglect their first language and that they take the time to learn other languages. Whether it’s a foreign language, the regional language of the place they live, the language of their parents or communities, or even the language of their favourite holiday destination, individuals can gain enormous benefits from learning more than one language.

    The impact of the PISA FLA

We have a clear understanding of the benefits that English skills can bring. So, it is surprising that there has not been a comprehensive study in this area since 2011, when SurveyLang assessed the language competence of 50,000 pupils across 15 countries in Europe. The findings highlighted the importance of starting to learn English at an early age – and the benefits of exposure to language outside the classroom, through films, music, travel and other opportunities, to incorporate the language into students’ lives. Whilst those findings remain insightful, the study is now over 14 years old, and we need contemporary and reliable data.

For this reason, the results of the PISA FLA will mark a turning point for language education. Although it is too early to speculate on the findings, the survey’s data has unprecedented potential to transform language policy around the world. Leaders and policymakers will get access to the data they need to make decisions on which teaching methods and learning environments really work, where to focus resources and how to design curriculums. One of the ways it will achieve this is by assessing against the Common European Framework of Reference for Languages (CEFR).

    The PISA FLA also demonstrates how meaningful language testing can be delivered at scale. The English test used in PISA – and developed through a partnership between Cambridge and the OECD – is a cutting-edge, multi-level, computer-adaptive assessment, and tests the spoken production of language via a computer-delivered test for the first time in a global survey of this kind.

    We are at an exciting moment of change. How we teach, how we learn, how we work and how we live is evolving every day. As providers of quality education, we have a responsibility to stay abreast of this change and ensure we are continually adding value – serving the current and very real needs of our learners.

    When it comes to language education, that means understanding how we can shape learning, teaching and assessment that will empower generations of learners to come. It also means understanding how we can contribute to an educational system fuelled by insights and data. The PISA FLA is the first step on this journey.

    Written by: Francesca Woodward, Global Managing Director, English at Cambridge University Press & Assessment

  • Deafening Silence on PIAAC | HESA

Last month, right around the time the blog was shutting down, the OECD released its report on the second iteration of the Programme for the International Assessment of Adult Competencies (PIAAC), titled “Do Adults Have the Skills They Need to Thrive in a Changing World?”. Think of it perhaps as PISA for grown-ups, providing a broadly useful cross-national comparison of basic cognitive skills which are key to labour market success and overall productivity. You are forgiven if you didn’t hear about it: its news impact was equivalent to the proverbial tree falling in a forest. Today, I will skim briefly over the results, but more importantly, ponder why this kind of data does not generate much news.

First administered in 2011, PIAAC consists of three parts: tests of literacy, numeracy, and what the OECD calls “adaptive problem solving” (this last one has changed a bit—in the previous iteration it was something called “problem-solving in technology-rich environments”). The test scale runs from 0 to 500, and individuals are categorized as being in one of six “bands” (1 through 5, with 5 being the highest, plus a “below 1,” which is the lowest). National scores across all three of these areas are highly correlated, which is to say that if a country is at the top or bottom, or even in the middle, on literacy, it’s almost certainly pretty close to the same rank order for numeracy and problem solving as well. National scores all cluster in the 200 to 300 range.
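The banding logic is straightforward to express in code. Only the 0–500 scale and the six-band structure come from the survey; the cutoff values below are illustrative placeholders of my own, not official PIAAC thresholds (which differ by cycle and domain):

```python
def piaac_band(score):
    """Map a 0-500 score to one of six bands: 'below 1' plus levels 1-5.
    Cutoffs here are illustrative placeholders, not official PIAAC thresholds."""
    if not 0 <= score <= 500:
        raise ValueError("PIAAC scores run from 0 to 500")
    cutoffs = [(176, "below 1"), (226, "1"), (276, "2"),
               (326, "3"), (376, "4"), (501, "5")]
    for upper, band in cutoffs:
        if score < upper:
            return band

# With these placeholder cutoffs, a typical national average of 250 lands in band 2.
```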

One of the interesting—and frankly somewhat terrifying—discoveries of PIAAC 2 is that literacy and numeracy scores are down in most of the OECD outside of northern Europe. Across all participating countries, literacy is down fifteen points, and numeracy by seven. Canada is about even in literacy and up slightly in numeracy—this is one trend it’s good to buck. The reason for the decline is somewhat mysterious—an aging population probably has something to do with it, because literacy and numeracy do start to fall off with age (scores peak in the 25-34 age bracket)—but I would be interested to see more work on the role of smartphones. Maybe it isn’t just teenagers whose brains are getting wrecked?

The overall findings actually aren’t that interesting. The OECD hasn’t repeated some of the analyses that made the first report so fascinating (the results were a little too interesting, I guess), so what we get are some fairly broad banalities—scores rise with education levels, but also with parents’ education levels; employment rates and income rise with skill levels; there is broadly a lot of skill mismatch across all economies, and this is a Bad Thing (I am not sure it is anywhere near as bad as the OECD assumes, but whatever). What remains interesting, once you read through the whole report, are the subtle differences one picks up in the results from one country to another.

So, how does Canada do, you ask? Well, as Figure 1 shows, we are considered to be ahead of the OECD average, which is good so far as it goes. However, we’re not at the top. At the head of the class across all measures are Finland, Japan, and Sweden, followed reasonably closely by the Netherlands and Norway. Canada sits in a peloton behind that, with a group including Denmark, Germany, Switzerland, Estonia, the Flemish region of Belgium, and maybe England. This is basically Canada’s sweet spot in everything when it comes to education, skills, and research: good but not great, and it looks worse if you adjust for the amount of money we spend on this stuff.

    Figure 1: Key PIAAC scores, Canada vs OECD, 2022-23

Canadian results can also be broken down by province, as in Figure 2, below. Results do not vary much across most of the country. Nova Scotia, Ontario, Saskatchewan, Manitoba, Prince Edward Island, and Quebec all cluster pretty tightly around the national average. British Columbia and Alberta are significantly above that average, while New Brunswick and Newfoundland are significantly below it. Partly, of course, this has to do with things you’d expect, like provincial income, school policies, etc. But remember that this is across entire populations, not school leavers, and so internal migration plays a role here too. Broadly speaking, New Brunswick and Newfoundland lose a lot of skills to places further west, while British Columbia and Alberta are big recipients of migrants from places further east (international migration tends to reduce average scores: language skills matter, and taking the test in a non-native tongue tends to result in lower overall results).

    Figure 2: Average PIAAC scores by province, 2022-23

Anyways, none of this is particularly surprising or perhaps even all that interesting. What I think is interesting is how differently this data release was handled from the one ten years ago. When the first PIAAC was released a decade ago, Statistics Canada and the Council of Ministers of Education, Canada (CMEC) published a 110-page analysis of the results (which I analyzed in two posts, one on Indigenous and immigrant populations, and another on Canadian results more broadly) and an additional 300(!)-page report lining up the PIAAC data with data on formal and informal adult learning. It was, all in all, pretty impressive. This time, CMEC published a one-pager which linked to a Statscan page containing all of three charts and two infographics (fortunately, the OECD itself put out a 10-pager that is significantly better than anything produced domestically). But I think all of this points to something pretty important, which is this:

    Canadian governments no longer care about skills. At least not in the sense that PIAAC (or PISA for that matter) measures them.

    What they care about instead are shortages of very particular types of skilled workers, specifically health professions and the construction trades (which together make up about 20% of the workforce). Provincial governments will throw any amount of money at training in these two sets of occupations because they are seen as bottlenecks in a couple of key sectors of the economy. They won’t think about the quality of the training being given or the organization of work in the sector (maybe we wouldn’t need to train as many people if the labour produced by such training was more productive?). God forbid. I mean that would be difficult. Complex. Requiring sustained expert dialogue between multiple stakeholders/partners. No, far easier just to crank out more graduates, by lowering standards if necessary (a truly North Korean strategy).

    But actual transversal skills? The kind that make the whole economy (not just a politically sensitive 20%) more productive? I can’t name a single government in Canada that gives a rat’s hairy behind. They used to, twenty or thirty years ago. But then we started eating the future. Now, policy capacity around this kind of thing has atrophied to the point where literally no one cares when a big study like PIAAC comes out.

    I don’t know why we bother, to be honest. If provincial governments and their ministries of education in particular (personified in this case by CMEC) can’t be arsed to care about something as basic as the skill level of the population, why spend millions collecting the data? Maybe just admit our profound mediocrity and move on.
