Tag: Reveals

  • ICE Reveals How It Targeted International Students

    Federal immigration officials targeted student visa holders by running their names through a federal database of criminal histories, according to court testimony given by Department of Homeland Security officials on Tuesday and reported by Politico.

As part of the Student Criminal Alien Initiative, as officials dubbed the effort, 20 ICE agents and several federal contractors ran the names of 1.3 million potential student visa holders through the database, searching for those who were still enrolled in programs and had had some brush with the criminal justice system. Many of those students had only minor infractions, such as traffic violations, on their records, and often they had never been charged. ICE used that information to terminate students’ records in the Student and Exchange Visitor Information System (SEVIS).

Officials testified that ICE ultimately flagged around 6,400 SEVIS records for termination and used the data to revoke more than 3,000 student visas—far more than the 1,800 that Inside Higher Ed tracked over the past month.

The officials’ testimony came in a hearing for one of many lawsuits filed by international students and immigration attorneys challenging the sudden and unexplained terminations; dozens of the cases have been successful so far. Last week the agency restored international students’ SEVIS records amid the flurry of court losses and said it would release an updated policy in the near future.

On Monday, the Trump administration released a draft of that policy, which vastly expands the prior one and makes visa revocation grounds for terminating a student’s legal residency as well.

  • Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

Before our show starts today, I just want to take a minute to note the passing of Professor Claire Callender, OBE. For the last two and a half decades, she’s been one of the most important figures in UK higher education studies, in particular with respect to student loans and student finance. Holder of a joint professorship at the UCL Institute of Education and Birkbeck, University of London, she was also instrumental in setting up the ESRC Centre for Global Higher Education, of which she later became deputy director. I just want to quote the short obituary that her colleague Simon Marginson wrote for her last week after her passing from lung cancer. He said, “What we’ll remember about Claire is the way she focused her formidable capacity for rational thought on matters to which she was committed, her gravitas that held the room when speaking, and the warmth that she evoked without fail in old and new acquaintances.”

    My thoughts and condolences to her partner Annette, and to her children. We’ll all miss Claire. 


I suspect most of you are familiar with the OECD’s Program for International Student Assessment, or PISA. That’s a triennial test of 15-year-olds around the world. It tries to compare how teenagers fare in real-world tests of literacy and numeracy. But you might not be as familiar with PISA’s cousin, the Program for the International Assessment of Adult Competencies, or PIAAC. To simplify enormously, it’s PISA, but for adults, and it only comes out once a decade, with the latest edition having appeared on December 10th of last year. Now, if you’re like most people, you’re probably asking yourself: what does PIAAC measure, exactly?

PISA is pretty clearly telling us something about school systems. But adults, the subjects of the PIAAC test, have been out of school for a long time. What do test results mean for people who’ve been out of school for, in some cases, decades? And what kinds of meaningful policies might be made on the basis of this data?

Today my guest is the CEO of Canada’s Future Skills Centre, Noel Baldwin. Over the past decade, both in his role at FSC and in his previous ones at the Council of Ministers of Education, Canada, he’s arguably been one of the country’s most dedicated users of PIAAC data. As part of Canada’s delegation to the OECD committee in charge of PIAAC, he also had a front-row seat to the development of these tests and the machinery behind these big international surveys.

Over the course of the next 20 or so minutes, you’ll hear Noel and me, both fellow members of the Canada Millennium Scholarship Foundation Mafia, discuss such issues as how the wording of international surveys gets negotiated, why we seem to be witnessing planet-wide declines in adult literacy, what research questions PIAAC is best suited to answer, and, maybe most intriguingly, what PIAAC 3 might look like a decade from now.

    I really enjoyed this conversation and I hope you do too. Anyway, over to Noel.


    The World of Higher Education Podcast
    Episode 3.28 | Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Transcript

    Alex Usher (AU): Noel, some of our listeners might be familiar with big international testing programs like PISA—the Program for International Student Assessment. But what is the Program for the International Assessment of Adult Competencies? What does it aim to measure, and why?

    Noel Baldwin (NB): It’s somewhat analogous to PISA, but it’s primarily focused on working-age adults. Like PISA, it’s a large-scale international assessment organized by the OECD—specifically by both the education and labor secretariats. It’s administered on the ground by national statistical agencies or other government agencies in participating countries.

    PIAAC is mainly focused on measuring skills like literacy and numeracy. Over time, though, the OECD has added other skill areas relevant to the intersection of education and labor markets—things like digital skills, technology use, problem solving, and social-emotional skills.

    In addition to the assessment itself, there’s a large battery of background questions that gather a lot of demographic information—details about respondents’ work life, and other factors like health and wellbeing. This allows researchers to draw correlations between the core skills being measured and how those skills are used, or what kind of impact they have on people’s lives.

    AU: How do they know that what’s being measured is actually useful in the workplace? I mean, the literacy section is reading comprehension, and the math is sort of like, you know, “If two trains are moving toward each other, one from Chicago and one from Pittsburgh…” It’s a bit more sophisticated than that, but that kind of thing. How do they know that actually measures anything meaningful for workplace competencies?

    NB: That’s a good question. One thing to start with is that the questions build from fairly easy and simple tasks to much more complex ones. That allows the OECD to create these scales, and they talk a lot about proficiency levels—level one up to five, and even below level one in some cases, for people with the weakest skill levels.

    And while PIAAC itself is relatively new, the assessment of these competencies isn’t. It actually dates back to the early 1990s. There’s been a lot of research—by the OECD and by psychometricians and other researchers—on the connections between these skills and broader outcomes.

    The key thing to understand is that, over time, there’s been strong evidence linking higher literacy and numeracy skills to a range of life outcomes, especially labor market outcomes. It’s a bit like educational attainment—these things often act as proxies for one another. But the stronger your skills, the more likely you are to be employed, to earn higher wages, to avoid unemployment, and to be adaptable and resilient.

    And it’s not just about work. It extends to other areas too—life satisfaction, for instance. There are even some interesting findings about democratic participation and people’s perceptions of how their society is doing. So there are pretty strong correlations between higher-level skills and a variety of positive outcomes.

AU: But I can imagine that the nature of an economy—whether it’s more manufacturing-based or service-based—might affect what kinds of skills are relevant. So different countries might actually want to measure slightly different things. How do you get 50 countries—or however many dozens it is—to agree on what skills to assess and how to measure them?

    NB: The point at which OECD countries agreed to focus on literacy and numeracy actually predates me—and it also predates a lot of today’s focus on more digitally oriented skills. It was a much more analog world when this started, and so literacy and numeracy made a lot of sense. At the time, most of the information people consumed came in some form of media that required reading comprehension and the ability to navigate text. And then, on the numeracy side, the ability to do anything from basic to fairly advanced problem solving with numbers was highly relevant. So I suspect that when this was being developed—through the 1980s and into the early 1990s—there was a high degree of consensus around focusing on those core skills.

    The development of the instruments themselves is also an international effort. It’s led by the OECD, but they work with experts from a range of countries to test and validate the items used in the assessment. Educational Testing Service (ETS) in the U.S. is quite involved, and there are also experts from Australia and Canada. In fact, Canada was very involved in the early stages—both through Statistics Canada and other experts—particularly in developing some of the initial tools for measuring literacy. So, the consensus-building process includes not just agreeing on what to measure and how to administer it, but also developing the actual assessment items and ensuring they’re effective. They do field testing before rolling out the main assessment to make sure the tools are as valid as possible.

    AU: Once the results are in and published, what happens next? How do governments typically use this information to inform policy?

    NB: I’ll admit—even having been on the inside of some of this—it can still feel like a bit of a black box. In fact, I’d say it’s increasingly becoming one, and I think we’ll probably get into that more as the conversation goes on.

    That said, different countries—and even different provinces and territories within Canada—use the information in different ways. It definitely gets integrated into various internal briefings. I spent some time, as you know, at the Council of Ministers of Education, and we saw that both in our own work and in the work of officials across the provinces and territories.

    After the last cycle of PIAAC, for instance, Quebec produced some fairly detailed reports analyzing how Quebecers performed on the PIAAC scales—comparing them to other provinces and to other countries. That analysis helped spark conversations about what the results meant and what to do with them. New Brunswick, for example, launched a literacy strategy shortly after the last PIAAC cycle, which suggests a direct link between the data and policy action.

    So there are examples like that, but it’s also fair to say that a lot of the data ends up being used internally—to support conversations within governments. Even since the most recent PIAAC cycle was released in December, I’ve seen some of that happening. But there’s definitely less in the public domain than you might expect—and less than there used to be, frankly.

AU: The finding from this latest PIAAC cycle that got the most traction, I think, was that we’re seeing declines in literacy and numeracy scores across much of the OECD. A few countries bucked the trend—Canada saw only a small decline, and parts of Northern Europe did okay—but most countries were down. What are the possible explanations for this trend? And should we be concerned?

    NB: I think we should be really aware. When it comes to concern, though, I’m always a bit hesitant to declare a crisis. There’s a lot of work still to be done to unpack what’s going on in this PIAAC cycle.

    One thing to keep in mind is that most of the responses were collected during a time of ongoing global turmoil. The data was gathered in 2022, so we were still in the middle of the pandemic. Just getting the sample collected was a major challenge—and a much bigger one than usual.

    With that caveat in mind, the OECD has started to speculate a bit, especially about the literacy side. One of the things they’re pointing to is how radically the way people consume information has changed over the past 10 years.

    People are reading much shorter bits of text now, and they’re getting information in a much wider variety of formats. There are still items in the literacy assessment that resemble reading a paragraph in a printed newspaper—something that just doesn’t reflect how most people engage with information anymore. These days, we get a lot more of it through video and audio content.

    So I think those shifts in how we consume information are part of the story. But until we see more analysis, it’s hard to say for sure. There are some signals—differences in gender performance across countries, for example—that we need to unpack. And until we do that, we’re not going to have a great sense of why outcomes look the way they do.

AU: Let’s focus on Canada for a second. As with most international education comparisons, we end up near the top—but at the bottom of the top third, basically. It doesn’t seem to matter what we do or when—it’s always that pattern. Looking at global trends, do you think Canada stands out in any way, positively or negatively? Are there things we’re doing right? Or things we’re getting wrong?

    NB: Well, I’d say we continue to see something that the OECD points out almost every time we do one of these assessments: the gap between our top performers and our lowest performers is smaller than in many other countries. That’s often taken as a sign of equity, and I’d say that’s definitely a good news story.

    In the global comparison, we held pretty much steady on literacy, while many countries saw declines. Holding steady when others are slipping isn’t a bad outcome. And in numeracy, we actually improved.

    The distribution of results across provinces was also more even than in the last cycle. Last time, there was much more variation, with several provinces falling below the OECD or Canadian average. This time around, we’re more tightly clustered, which I think is another positive.

    If you dig a little deeper, there are other encouraging signs. For example, while the OECD doesn’t have a perfect measure of immigration status, it can identify people who were born outside a country or whose parents were. Given how different Canada’s demographic profile is from nearly every other participating country—especially those in Northern Europe—I think we’re doing quite well in that regard.

    And in light of the conversations over the past few years about immigration policy and its impacts across our society, I think it’s a pretty good news story that we’re seeing strong performance among those populations as well.

    AU: I know we’ll disagree about this next question. My impression is that, in Canada, the way PIAAC gets used has really changed over the last decade. The first round of PIAAC results got a lot of attention—StatsCan and the Council of Ministers of Education both published lengthy analyses.

    And maybe “crickets” is too strong a word to describe the reaction this time, but it’s definitely quieter. My sense is that governments just don’t care anymore. When they talk about skills, the narrative seems focused solely on nursing and the skilled trades—because those are seen as bottlenecks on the social side and the private sector side.

    But there’s very little interest in improving transversal skills, and even less knowledge or strategy about how to do so. Make me less cynical.

    NB: Well, it’s funny—this question is actually what kicked off the conversation that led to this podcast. And I’ll confess, you’ve had me thinking about it for several weeks now.

    One thing I want to distinguish is caring about the skills themselves versus how the data is being released and used publicly. There’s no denying that we’re seeing less coming out publicly from the governments that funded the study. That’s just true—and I’m not sure that’s going to change.

    I think that reflects a few things. Partly, it’s the changed fiscal environment and what governments are willing to pay for. But it’s also about the broader information environment we’re in today compared to 2013.

    As I’ve been reflecting on this, I wonder if 2012 and 2013 were actually the tail end of the era of evidence-based policymaking—and that now we’re in the era of vibes-based policymaking. And if that’s the case, why would you write up detailed reports about something you’re mostly going to approach from the gut?

    On the skills side, though, I still think there’s an interesting question. A few weeks ago, I felt more strongly about this, but I still believe it’s not that governments don’t care about these foundational skills. Rather, I think the conversation about skills has shifted.

    We may have lost sight of how different types of skills build on one another—starting from foundational literacy and numeracy, then layering on problem-solving, and eventually reaching digital competencies. That understanding might be missing in the current conversation.

Take the current moment around AI, for example. Maybe “craze” is too strong a word, but there’s a belief that people will become great at prompt engineering without any formal education. Mark Cuban—on Bluesky or wherever, I’m not sure what they call posts there—made the point recently that you won’t need formal education with generative AI. If you can get the right answers out of a large language model, you’ll outperform someone with an advanced degree.

    But that completely overlooks how much you need to understand in order to ask good questions—and to assess whether the answers you get are worth anything. So we may start to see that shift back.

    That said, you’re right—there has definitely been a move in recent years toward thinking about workforce issues rather than broader skill development. And that may be a big part of what’s going on.

    AU: What do you think is the most interesting or under-explored question that PIAAC data could help answer, but that we haven’t fully investigated yet? This dataset allows for a lot of interesting analysis. So if you could wave a magic wand and get some top researchers working on it—whether in Canada or internationally—where would you want them to focus?

    NB: First, I’ll just make a small plug. We’ve been working on what we hope will become a PIAAC research agenda—something that responds to the things we care about at the Future Skills Centre, but that we hope to advance more broadly in the coming weeks and months. So we are actively thinking about this.

    There are a bunch of areas that I think are really promising. One is the renewed conversation about productivity in Canada. I think PIAAC could shed light on the role that skills play in that. The Conference Board of Canada did a piece a while back looking at how much of the productivity gap between Canada and the U.S. is due to skill or labor factors. Their conclusion was that it wasn’t a huge part—but I think PIAAC gives us tools to continue digging into that question.

    Another area the OECD often highlights when talking about Canada is the extent to which workers are overqualified or overskilled for the jobs they’re in. That’s a narrative that’s been around for a while, but one where I think PIAAC could offer deeper insights.

    It becomes even more interesting when you try to link it to broader labor supply questions—like the role of immigration. Some people have suggested that one reason Canada lags in things like technology integration or capital investment is that we’ve substituted skilled labor for that kind of investment.

    With PIAAC, we might be able to explore whether overqualification or overskilling is connected to the way we’ve managed immigration over the last couple of decades.

    So, there are a few areas there that I think are both relevant and under-explored. And of course, on the international side, you’re right—we should be looking for examples of countries that have had success, and thinking about what we can emulate, borrow from, or be inspired by.

    AU: I don’t know if either of us wants to still be doing this in 10 years, but if we were to have this conversation again a decade from now, what do you think—or hope—will have changed? What will the long-term impact of PIAAC Cycle 2 have been, and how do you think PIAAC 3 might be different?

    NB: Well, I think I need to say this out loud: I’m actually worried there won’t be a PIAAC 3.

    We’re recording this in early 2025, which is a pretty turbulent time globally. One of the things that seems clear is that the new U.S. administration isn’t interested in the Department of Education—which likely means they won’t be interested in continuing the National Center for Education Statistics.

    And like with many international initiatives, the U.S. plays a big role in driving and valuing efforts like PIAAC. So I do worry about whether there will be a third cycle. If it happens without U.S. participation, it would be a very different kind of study.

    But I hope that in 10 years, we are talking about a robust PIAAC 3—with strong participation from across OECD countries.

    I also hope there’s continued investment in using PIAAC data to answer key research questions. It’s just one tool, of course, but it’s a big one. It’s the only direct assessment of adult skills we have—where someone is actually assessed on a defined set of competencies—so it’s really valuable.

    For an organization like ours, which is focused on adult skills in the workforce, it’s up to us to push forward and try to get answers to some of these questions. And I hope the research we and others are doing will find its way into policy conversations—especially as we think about how workforce needs, skills, and the broader economy are going to change over the next decade.

    It would be a wasted opportunity if it didn’t.

    AU: Noel, thanks so much for being with us today.

    NB: Thanks Alex.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

This episode is sponsored by Studiosity. Student success, at scale, with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning, not corrections: developing critical thinking, agency, and retention, and empowering educators with learning insight. For future-ready graduates, and for future-ready institutions. Learn more at studiosity.com.

  • Study Reveals Key Factors Driving Student College Choice in 2025

    A comprehensive new study by education research firm EAB has identified the most influential factors shaping how students choose colleges, with academic program variety, campus safety, and student organizations emerging as the top three drivers of student attraction.

    The research, analyzing data from U.S. four-year colleges, found that schools offering a wider range of majors see significantly higher student interest, with each additional program contributing to increased application and enrollment rates. Campus safety measures and the number of available student organizations were also found to be major factors in students’ decision-making process.

    “What’s particularly interesting is how these factors play out differently across institution types,” said Dr. Ryan Gardner-Cook, the project director. “For example, smaller schools gain more from incremental improvements in campus amenities and academic offerings compared to larger institutions.”

    The study also revealed that affordability remains a critical factor, especially for first-generation and low-income students. Schools with lower net prices and stronger financial aid packages showed notably higher attraction rates among these demographics.

    Environmental factors like climate and location also play a significant role. Schools in temperate climates and growing urban areas generally showed stronger appeal to prospective students. State-level political environments were found to increasingly influence student choice as well.

    The research identified nine distinct “institutional personas” ranging from “Accessible Education Anchors” to “Rigorous Academic Giants,” each with unique characteristics and challenges in attracting students. This classification system aims to help institutions better understand their competitive position and develop more effective recruitment strategies.

    For institutions looking to improve their student attraction, the study recommends focusing on controllable factors like admissions processes, student life offerings, and academic programs while finding ways to mitigate challenges related to location or cost.

    The findings come at a crucial time as higher education institutions face evolving student preferences and increasing competition for enrollment.

  • Report Reveals Harvard MBAs Struggling to Get Jobs (Palki Sharma)

A new report has revealed that 23% of Harvard MBAs were still jobless three months after graduation. Similar trends have been reported at top B-schools across the world. An MBA was once considered a sure-shot ticket to success—so what explains the degree’s changing fortunes?

  • Major parent survey reveals widespread dissatisfaction with state’s schools

A new survey of more than 400 New Mexico parents of school-aged children shows that parents are widely dissatisfied with the state’s public schools, that communication gaps between schools and parents are a serious concern, and that many parents have misperceptions about their children’s academic achievement.

Results of the survey, “The State of Educational Opportunity in New Mexico,” were released Oct. 2 by NewMexicoKidsCAN, an education advocacy organization (and parent organization of New Mexico Education) focused on improving New Mexico’s public education system.

The state survey was part of a national report authored by 50CAN, of which NewMexicoKidsCAN is an affiliate. 50CAN is “focused on building the future of American education,” according to the organization’s website. That 214-page report, “The State of Educational Opportunity in America,” provides a deep, 50-state dive into parental views of public education in their home states.

Researchers surveyed more than 20,000 parents across the country, making it one of the largest education-focused surveys of parents in the past decade. The survey explores the ecosystem of educational opportunities inside and outside of school, and how those opportunities interrelate and affect a child’s success.

“With such a large sample size, we are able to dig into the findings by state and across a range of important audiences. By making the findings publicly available, this is a gift of data that can inform conversations among communities and elected officials,” said Pam Loeb, Principal at Edge Research.

    The New Mexico survey provides insight into the educational opportunities available to children across New Mexico.

    The New Mexico survey uncovered key findings, including:

• Parental dissatisfaction is widespread: Only about a third of New Mexico parents say they are “very satisfied” with their child’s school, compared with 45% of parents nationally who reported high satisfaction. New Mexico was one of the lower-ranked states for parental satisfaction.
• Communication gaps between schools and parents: Only 29% of New Mexico parents report feeling extremely confident in their understanding of their child’s academic progress, ranking New Mexico second to last in the nation.
• Misperceptions about student achievement: 41% of New Mexico parents believe their child is above grade level in reading, yet state assessments show only 39% of students are reading at grade level.
• Afterschool programs show promise: New Mexico ranks 22nd nationally in student participation in supervised afterschool programs, surpassing 28 other states. This success is likely attributable to increased state investment through the Extended Learning Time Program, which may have boosted overall participation rates.

    “This survey amplifies the voices of New Mexico parents,” said Amanda Aragon, Executive Director of NewMexicoKidsCAN. “The results reveal significant misperceptions about student performance, serious communication gaps between schools and parents, and widespread concerns about school satisfaction. 

    “It’s clear that many parents are not getting the information they need about their children’s academic progress. We must do more to close this communication gap and empower parents to be true partners in their child’s education.”
