Tag: Reveals

  • Texas Study Reveals Power of Combined Accelerated Programs for College Success


    High school students who combine dual enrollment courses with Advanced Placement or International Baccalaureate programs are significantly more likely to graduate from college and earn higher salaries in their early twenties than peers who pursue only one type of accelerated coursework, according to a new report from the Community College Research Center.

The study, which tracked Texas high school students expected to graduate in 2015-16 and 2016-17 for six years after high school, found that 71% of students who took both dual enrollment and AP/IB courses earned a postsecondary credential within six years—including 60% who completed a bachelor’s degree. By comparison, only 10% of students who took no accelerated coursework completed any postsecondary credential.

“Most dual enrollment students in Texas also take other accelerated courses, and those who do tend to have stronger college and earnings trajectories,” said Dr. Tatiana Velasco, CCRC senior research associate. “It’s a pattern we hadn’t fully appreciated before, which offers clues for how to expand the benefits of dual enrollment to more students.”

    The financial benefits of combining accelerated programs extend well beyond graduation. Students who took both dual enrollment and AP/IB courses earned an average of $10,306 per quarter at age 24—more than $1,300 per quarter above students who took dual enrollment alone and nearly $1,400 per quarter more than those who took only AP/IB courses.

    These advantages persisted even after researchers controlled for student demographics, test scores, and school characteristics, suggesting the combination of programs provides genuine educational value rather than simply reflecting differences in student backgrounds.

While the study revealed promising outcomes for students combining dual enrollment with career and technical education (CTE) programs, participation in this pathway remains critically low. Fewer than 5% of students combine a CTE focus—defined as taking 10 or more CTE courses—with dual enrollment.

    Yet those who do show remarkable success. By age 24, dual enrollment students with a CTE focus earned an average of $9,746 per quarter, substantially more than CTE-focused students who didn’t take dual enrollment ($8,097) and second only to the dual enrollment/AP-IB combination group.

    The findings suggest a significant missed opportunity, particularly for students seeking technical career paths who could benefit from early college exposure while building specialized skills.

    The report highlights concerning equity gaps in accelerated coursework access. Students who combine dual enrollment with AP/IB courses are less diverse than those taking AP/IB alone, raising questions about which students have opportunities to maximize the benefits of accelerated learning.

    Early college high schools present a partial solution to this challenge. These specialized schools, where students can earn an associate degree while completing high school, serve more diverse student populations than other accelerated programs. Their graduates complete associate degrees at higher rates and earn more than Texas students overall by age 21. However, early college high schools serve only 5% of Texas students statewide.

With fewer than 40% of Texas students who take no accelerated coursework enrolling in any postsecondary institution, and only one in five Texas students taking dual enrollment, researchers see substantial room for expansion.

    The report’s authors recommend that K-12 districts and colleges work to expand dual enrollment participation while ensuring these programs complement rather than compete with AP/IB offerings. They also call for increased access to dual enrollment for CTE students and additional support structures to promote student success in college-level coursework during high school.


    Source link

  • AAUP v. Rubio Reveals Details of Deportation Efforts


    Today is the final day of the American Association of University Professors v. Rubio trial, in which the association, its chapters at Rutgers and Harvard Universities, and the Middle East Studies Association sued to stop the Trump administration from the “ideological deportation” of international students.

    The lawsuit argues that the deportations violate international students’ right to free expression and their Fifth Amendment right not to have laws enforced against them arbitrarily or discriminatorily. It also claims that the arrests of student protesters chilled speech on campuses—something witnesses corroborated.

The trial, conducted during the last two weeks, revealed new details about the administration’s targeting of international students, including high-profile cases like those of graduate students Mahmoud Khalil and Rümeysa Öztürk, who were detained by Immigration and Customs Enforcement in March. (Both have since been released.)

    Here are some of the key takeaways from the trial ahead of the parties’ closing statements.

    1. Dossiers about the targeted students included information about their protest activities.

On Friday, John Armstrong, the most senior official at the State Department’s Bureau of Consular Affairs, testified that the memos written by State Department officials recommending deportation actions and visa revocations contained details about student and faculty members’ activism.

The memos have been designated “attorneys’ eyes only”—the most restrictive possible designation for sensitive information in a trial, one that prevents even the plaintiffs and defendants from viewing them. But attorneys and witnesses quoted excerpts from them during the trial.

    The action memo for Öztürk highlighted an op-ed she had co-written supporting a call for her institution, Tufts University, to divest from companies with ties to Israel, Armstrong said, according to trial transcripts published by the Knight First Amendment Institute at Columbia University, which is representing the plaintiffs. But he insisted that the op-ed was not a “key factor” in the decision to revoke her visa and detain her.

    Another memo, regarding Columbia student activist Mohsen Mahdawi, specifically noted that “a court may consider his actions inextricably tied to speech protected under the First Amendment,” according to an excerpt read by Alexandra Conlon, an attorney for the plaintiffs.

    2. Investigators weren’t given guidance about what constitutes antisemitism.

The State Department hasn’t released any guidance as to what, exactly, should be considered antisemitism, Armstrong acknowledged on Friday. He also stated that, to his knowledge, the officials who have written action memos about protesters haven’t received any training about what constitutes antisemitism.

    That’s significant, because at least one memo, Mahdawi’s, referred specifically to “antisemitic conduct.”

    “I do know that there’s a common understanding in our culture, in our society of what antisemitism is,” Armstrong said.

    When U.S. District Judge William G. Young pushed him to describe that “common understanding,” he responded: “In my opinion, antisemitism is unjustified views, biases, or prejudices, or actions against Jewish people, or Israel, that are the result of hatred towards them.”

    3. ICE officials leaned on the Canary Mission website to find students and professors to target.

For over a decade, the anonymously operated site Canary Mission has been publishing the identities of students and professors it deems antisemitic. Several of those listed on the website, including Khalil, Mahdawi, and Öztürk, have been targeted since the Trump administration began taking aim at student protesters.

    On the third day of the trial, Peter Hatch, a senior ICE official, stated that “many of the names, even most of the names” on a list of noncitizen students presented to ICE’s “Tiger Team” for investigation came from the Canary Mission site.

    Hatch said that other names came from Betar USA, the American chapter of an international Zionist organization, which the Anti-Defamation League has labeled an extremist group.

    4. ICE agents said they prioritized the arrest of activists at the urging of their higher-ups.

ICE agents who oversaw the arrests of Öztürk, Khalil, Mahdawi, and Badar Khan Suri, a Georgetown University professor, said last Tuesday that the cases were unusual not just because of the legal grounds on which the activists were detained but also because the orders came from high-ranking officials in the organization.

    Patrick Cunningham, an agent with ICE’s Homeland Security Investigations office in Boston, said that the agency’s leaders were “inquiring” about Öztürk’s case, leading his office to prioritize her arrest.

    “I can’t recall a time that it’s come top-down like this with a Visa revocation, um, under my purview anyway,” Cunningham said, according to the transcript. “And so with the superiors that were, you know, inquiring about this, it made it a priority, because we worked for them.”

    5. Students and faculty confirmed they stopped protesting out of fear.

    Over the trial’s first two days, five noncitizen faculty members took the stand to describe how news about activists being targeted had caused them to stop engaging in various political activities. They said they decided not to attend protests or sign statements related to Israel’s war in Gaza after hearing about Khalil’s and Öztürk’s arrests.

    One Brown University professor, Nadje Al-Ali, said she cancelled longstanding plans to travel to Beirut and Baghdad for research into women artists and gender-based violence in the Middle East.

“Following the arrest and the detention and the threat of deportation of several students, graduate students, and also I think one post-doc—I mean, most prominently Mahmoud Khalil but others as well—I started to think that it is not a good idea,” she said. “I felt that it was too risky for me to do research in the Middle East, come back, and then my pro-Palestinian speech would be flagged. And as a green card holder and also as a prior director for the Center For Middle East Studies that had been under attack, and there are a lot of sort of false allegations about, I felt very vulnerable.”

The fear also extended beyond speech related to the Middle East; Al-Ali also refrained from attending a protest on No Kings Day, a massive day of demonstrations opposing President Donald Trump’s policies in his second presidency, including cuts to federal government offices, the defunding of research and social services, and his mass deportation campaign.

    Source link

  • ICE Reveals How It Targeted International Students


    Federal immigration officials targeted student visa holders by running their names through a federal database of criminal histories, according to court testimony given by Department of Homeland Security officials on Tuesday and reported by Politico.

As part of the Student Criminal Alien Initiative, as officials dubbed the effort, 20 ICE agents and several federal contractors ran the names of 1.3 million potential student visa holders through the database, searching for those who were still enrolled in programs and had had some brush with the criminal justice system. Many of those students had only minor infractions, such as traffic violations, on their records, and often they had never been charged. ICE used that information to terminate students’ records in the Student and Exchange Visitor Information System (SEVIS).

Officials testified that ICE ultimately flagged around 6,400 SEVIS records for termination and used the data to revoke more than 3,000 student visas—far more than the 1,800 that Inside Higher Ed tracked over the past month.

    The officials’ testimony came in a hearing for one of many lawsuits filed by international students and immigration attorneys challenging the sudden and unexplained visa terminations; dozens of the cases have been successful so far. Last week the agency restored international students’ visas amid the flurry of court losses and said it would release an updated policy in the near future. 

On Monday, the Trump administration released a draft of that policy, which vastly expands the prior one and makes visa revocation grounds for terminating a student’s legal residency as well.

    Source link

  • Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce


Before our show starts today, I just want to take a minute to note the passing of Professor Claire Callender, OBE. For the last two and a half decades, she was one of the most important figures in UK higher education studies, in particular with respect to student loans and student finance. Holder of a joint professorship at the UCL Institute of Education and Birkbeck, University of London, she was also instrumental in setting up the ESRC Centre for Global Higher Education, of which she later became deputy director. I just want to quote the short obituary that her colleague Simon Marginson wrote for her last week after her passing from lung cancer. He said, “What we’ll remember about Claire is the way she focused her formidable capacity for rational thought on matters to which she was committed, her gravitas that held the room when speaking, and the warmth that she evoked without fail in old and new acquaintances.”

    My thoughts and condolences to her partner Annette, and to her children. We’ll all miss Claire. 


I suspect most of you are familiar with the OECD’s Program for International Student Assessment, or PISA. That’s a triennial test of 15-year-olds around the world. It tries to compare how teenagers fare in real-world tests of literacy and numeracy. But you might not be as familiar with PISA’s cousin, the Program for the International Assessment of Adult Competencies, or PIAAC. To simplify enormously, it’s PISA, but for adults, and it only comes out once a decade, with the latest edition having appeared on December 10th of last year. Now, if you’re like most people, you’re probably asking yourself: what does PIAAC measure, exactly?

PISA pretty clearly is telling us something about school systems. But adults, the subjects of the PIAAC test, have been out of school for a long time. What do test results mean for people who’ve been out of school for, in some cases, decades? And what kinds of meaningful policies might be made on the basis of this data?

Today my guest is the CEO of Canada’s Future Skills Centre, Noel Baldwin. Over the past decade, both in his role at FSC and his previous ones at the Council of Ministers of Education, Canada, he’s arguably been one of the country’s most dedicated users of PIAAC data. As part of Canada’s delegation to the OECD committee in charge of PIAAC, he also had a front-row seat to the development of these tests and the machinery behind these big international surveys.

Over the course of the next 20 or so minutes, you’ll hear Noel and me, both fellow members of the Canada Millennium Scholarship Foundation Mafia, discuss such issues as how the wording of international surveys gets negotiated, why we seem to be witnessing planet-wide declines in adult literacy, what research questions PIAAC is best suited to answer, and, maybe most intriguingly, what PIAAC 3 might look like a decade from now.

    I really enjoyed this conversation and I hope you do too. Anyway, over to Noel.


    The World of Higher Education Podcast
    Episode 3.28 | Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Transcript

    Alex Usher (AU): Noel, some of our listeners might be familiar with big international testing programs like PISA—the Program for International Student Assessment. But what is the Program for the International Assessment of Adult Competencies? What does it aim to measure, and why?

    Noel Baldwin (NB): It’s somewhat analogous to PISA, but it’s primarily focused on working-age adults. Like PISA, it’s a large-scale international assessment organized by the OECD—specifically by both the education and labor secretariats. It’s administered on the ground by national statistical agencies or other government agencies in participating countries.

    PIAAC is mainly focused on measuring skills like literacy and numeracy. Over time, though, the OECD has added other skill areas relevant to the intersection of education and labor markets—things like digital skills, technology use, problem solving, and social-emotional skills.

    In addition to the assessment itself, there’s a large battery of background questions that gather a lot of demographic information—details about respondents’ work life, and other factors like health and wellbeing. This allows researchers to draw correlations between the core skills being measured and how those skills are used, or what kind of impact they have on people’s lives.

    AU: How do they know that what’s being measured is actually useful in the workplace? I mean, the literacy section is reading comprehension, and the math is sort of like, you know, “If two trains are moving toward each other, one from Chicago and one from Pittsburgh…” It’s a bit more sophisticated than that, but that kind of thing. How do they know that actually measures anything meaningful for workplace competencies?

    NB: That’s a good question. One thing to start with is that the questions build from fairly easy and simple tasks to much more complex ones. That allows the OECD to create these scales, and they talk a lot about proficiency levels—level one up to five, and even below level one in some cases, for people with the weakest skill levels.

    And while PIAAC itself is relatively new, the assessment of these competencies isn’t. It actually dates back to the early 1990s. There’s been a lot of research—by the OECD and by psychometricians and other researchers—on the connections between these skills and broader outcomes.

    The key thing to understand is that, over time, there’s been strong evidence linking higher literacy and numeracy skills to a range of life outcomes, especially labor market outcomes. It’s a bit like educational attainment—these things often act as proxies for one another. But the stronger your skills, the more likely you are to be employed, to earn higher wages, to avoid unemployment, and to be adaptable and resilient.

    And it’s not just about work. It extends to other areas too—life satisfaction, for instance. There are even some interesting findings about democratic participation and people’s perceptions of how their society is doing. So there are pretty strong correlations between higher-level skills and a variety of positive outcomes.

    AU: But, I can imagine that the nature of an economy—whether it’s more manufacturing-based or service-based—might affect what kinds of skills are relevant. So different countries might actually want to measure slightly different things. How do you get 50—or however many, dozens of countries—to agree on what skills to assess and how to measure them?

    NB: The point at which OECD countries agreed to focus on literacy and numeracy actually predates me—and it also predates a lot of today’s focus on more digitally oriented skills. It was a much more analog world when this started, and so literacy and numeracy made a lot of sense. At the time, most of the information people consumed came in some form of media that required reading comprehension and the ability to navigate text. And then, on the numeracy side, the ability to do anything from basic to fairly advanced problem solving with numbers was highly relevant. So I suspect that when this was being developed—through the 1980s and into the early 1990s—there was a high degree of consensus around focusing on those core skills.

    The development of the instruments themselves is also an international effort. It’s led by the OECD, but they work with experts from a range of countries to test and validate the items used in the assessment. Educational Testing Service (ETS) in the U.S. is quite involved, and there are also experts from Australia and Canada. In fact, Canada was very involved in the early stages—both through Statistics Canada and other experts—particularly in developing some of the initial tools for measuring literacy. So, the consensus-building process includes not just agreeing on what to measure and how to administer it, but also developing the actual assessment items and ensuring they’re effective. They do field testing before rolling out the main assessment to make sure the tools are as valid as possible.

    AU: Once the results are in and published, what happens next? How do governments typically use this information to inform policy?

    NB: I’ll admit—even having been on the inside of some of this—it can still feel like a bit of a black box. In fact, I’d say it’s increasingly becoming one, and I think we’ll probably get into that more as the conversation goes on.

    That said, different countries—and even different provinces and territories within Canada—use the information in different ways. It definitely gets integrated into various internal briefings. I spent some time, as you know, at the Council of Ministers of Education, and we saw that both in our own work and in the work of officials across the provinces and territories.

    After the last cycle of PIAAC, for instance, Quebec produced some fairly detailed reports analyzing how Quebecers performed on the PIAAC scales—comparing them to other provinces and to other countries. That analysis helped spark conversations about what the results meant and what to do with them. New Brunswick, for example, launched a literacy strategy shortly after the last PIAAC cycle, which suggests a direct link between the data and policy action.

    So there are examples like that, but it’s also fair to say that a lot of the data ends up being used internally—to support conversations within governments. Even since the most recent PIAAC cycle was released in December, I’ve seen some of that happening. But there’s definitely less in the public domain than you might expect—and less than there used to be, frankly.

    AU: Some of the findings in this latest PIAAC cycle—the headline that got the most traction, I think—was the fact that we’re seeing declines in literacy and numeracy scores across much of the OECD. A few countries bucked the trend—Canada saw a small decline, and parts of Northern Europe did okay—but most countries were down. What are the possible explanations for this trend? And should we be concerned?

    NB: I think we should be really aware. When it comes to concern, though, I’m always a bit hesitant to declare a crisis. There’s a lot of work still to be done to unpack what’s going on in this PIAAC cycle.

    One thing to keep in mind is that most of the responses were collected during a time of ongoing global turmoil. The data was gathered in 2022, so we were still in the middle of the pandemic. Just getting the sample collected was a major challenge—and a much bigger one than usual.

    With that caveat in mind, the OECD has started to speculate a bit, especially about the literacy side. One of the things they’re pointing to is how radically the way people consume information has changed over the past 10 years.

    People are reading much shorter bits of text now, and they’re getting information in a much wider variety of formats. There are still items in the literacy assessment that resemble reading a paragraph in a printed newspaper—something that just doesn’t reflect how most people engage with information anymore. These days, we get a lot more of it through video and audio content.

    So I think those shifts in how we consume information are part of the story. But until we see more analysis, it’s hard to say for sure. There are some signals—differences in gender performance across countries, for example—that we need to unpack. And until we do that, we’re not going to have a great sense of why outcomes look the way they do.

    AU: Let’s focus on Canada for a second. As with most international education comparisons, we end up in the top—but at the bottom of the top third, basically. It doesn’t seem to matter what we do or when—it’s always that pattern. Looking at global trends, do you think Canada stands out in any way, positively or negatively? Are there things we’re doing right? Or things we’re getting wrong?

    NB: Well, I’d say we continue to see something that the OECD points out almost every time we do one of these assessments: the gap between our top performers and our lowest performers is smaller than in many other countries. That’s often taken as a sign of equity, and I’d say that’s definitely a good news story.

    In the global comparison, we held pretty much steady on literacy, while many countries saw declines. Holding steady when others are slipping isn’t a bad outcome. And in numeracy, we actually improved.

    The distribution of results across provinces was also more even than in the last cycle. Last time, there was much more variation, with several provinces falling below the OECD or Canadian average. This time around, we’re more tightly clustered, which I think is another positive.

    If you dig a little deeper, there are other encouraging signs. For example, while the OECD doesn’t have a perfect measure of immigration status, it can identify people who were born outside a country or whose parents were. Given how different Canada’s demographic profile is from nearly every other participating country—especially those in Northern Europe—I think we’re doing quite well in that regard.

    And in light of the conversations over the past few years about immigration policy and its impacts across our society, I think it’s a pretty good news story that we’re seeing strong performance among those populations as well.

    AU: I know we’ll disagree about this next question. My impression is that, in Canada, the way PIAAC gets used has really changed over the last decade. The first round of PIAAC results got a lot of attention—StatsCan and the Council of Ministers of Education both published lengthy analyses.

    And maybe “crickets” is too strong a word to describe the reaction this time, but it’s definitely quieter. My sense is that governments just don’t care anymore. When they talk about skills, the narrative seems focused solely on nursing and the skilled trades—because those are seen as bottlenecks on the social side and the private sector side.

    But there’s very little interest in improving transversal skills, and even less knowledge or strategy about how to do so. Make me less cynical.

    NB: Well, it’s funny—this question is actually what kicked off the conversation that led to this podcast. And I’ll confess, you’ve had me thinking about it for several weeks now.

    One thing I want to distinguish is caring about the skills themselves versus how the data is being released and used publicly. There’s no denying that we’re seeing less coming out publicly from the governments that funded the study. That’s just true—and I’m not sure that’s going to change.

    I think that reflects a few things. Partly, it’s the changed fiscal environment and what governments are willing to pay for. But it’s also about the broader information environment we’re in today compared to 2013.

    As I’ve been reflecting on this, I wonder if 2012 and 2013 were actually the tail end of the era of evidence-based policymaking—and that now we’re in the era of vibes-based policymaking. And if that’s the case, why would you write up detailed reports about something you’re mostly going to approach from the gut?

    On the skills side, though, I still think there’s an interesting question. A few weeks ago, I felt more strongly about this, but I still believe it’s not that governments don’t care about these foundational skills. Rather, I think the conversation about skills has shifted.

    We may have lost sight of how different types of skills build on one another—starting from foundational literacy and numeracy, then layering on problem-solving, and eventually reaching digital competencies. That understanding might be missing in the current conversation.

    Take the current moment around AI, for example. Maybe “craze” is too strong a word, but there’s a belief that people will become great at prompt engineering without any formal education. Mark Cuban—on BlueSky or wherever, I’m not sure what they call posts there—made a point recently that you won’t need formal education with generative AI. If you can get the right answers out of a large language model, you’ll outperform someone with an advanced degree.

    But that completely overlooks how much you need to understand in order to ask good questions—and to assess whether the answers you get are worth anything. So we may start to see that shift back.

    That said, you’re right—there has definitely been a move in recent years toward thinking about workforce issues rather than broader skill development. And that may be a big part of what’s going on.

    AU: What do you think is the most interesting or under-explored question that PIAAC data could help answer, but that we haven’t fully investigated yet? This dataset allows for a lot of interesting analysis. So if you could wave a magic wand and get some top researchers working on it—whether in Canada or internationally—where would you want them to focus?

    NB: First, I’ll just make a small plug. We’ve been working on what we hope will become a PIAAC research agenda—something that responds to the things we care about at the Future Skills Centre, but that we hope to advance more broadly in the coming weeks and months. So we are actively thinking about this.

    There are a bunch of areas that I think are really promising. One is the renewed conversation about productivity in Canada. I think PIAAC could shed light on the role that skills play in that. The Conference Board of Canada did a piece a while back looking at how much of the productivity gap between Canada and the U.S. is due to skill or labor factors. Their conclusion was that it wasn’t a huge part—but I think PIAAC gives us tools to continue digging into that question.

    Another area the OECD often highlights when talking about Canada is the extent to which workers are overqualified or overskilled for the jobs they’re in. That’s a narrative that’s been around for a while, but one where I think PIAAC could offer deeper insights.

    It becomes even more interesting when you try to link it to broader labor supply questions—like the role of immigration. Some people have suggested that one reason Canada lags in things like technology integration or capital investment is that we’ve substituted skilled labor for that kind of investment.

    With PIAAC, we might be able to explore whether overqualification or overskilling is connected to the way we’ve managed immigration over the last couple of decades.

    So, there are a few areas there that I think are both relevant and under-explored. And of course, on the international side, you’re right—we should be looking for examples of countries that have had success, and thinking about what we can emulate, borrow from, or be inspired by.

    AU: I don’t know if either of us wants to still be doing this in 10 years, but if we were to have this conversation again a decade from now, what do you think—or hope—will have changed? What will the long-term impact of PIAAC Cycle 2 have been, and how do you think PIAAC 3 might be different?

    NB: Well, I think I need to say this out loud: I’m actually worried there won’t be a PIAAC 3.

    We’re recording this in early 2025, which is a pretty turbulent time globally. One of the things that seems clear is that the new U.S. administration isn’t interested in the Department of Education—which likely means they won’t be interested in continuing the National Center for Education Statistics.

    And like with many international initiatives, the U.S. plays a big role in driving and valuing efforts like PIAAC. So I do worry about whether there will be a third cycle. If it happens without U.S. participation, it would be a very different kind of study.

    But I hope that in 10 years, we are talking about a robust PIAAC 3—with strong participation from across OECD countries.

    I also hope there’s continued investment in using PIAAC data to answer key research questions. It’s just one tool, of course, but it’s a big one. It’s the only direct assessment of adult skills we have—where someone is actually assessed on a defined set of competencies—so it’s really valuable.

    For an organization like ours, which is focused on adult skills in the workforce, it’s up to us to push forward and try to get answers to some of these questions. And I hope the research we and others are doing will find its way into policy conversations—especially as we think about how workforce needs, skills, and the broader economy are going to change over the next decade.

    It would be a wasted opportunity if it didn’t.

    AU: Noel, thanks so much for being with us today.

    NB: Thanks Alex.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

This episode is sponsored by Studiosity. Student success, at scale, with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning, not corrections, to develop critical thinking, agency, and retention, empowering educators with learning insight. For future-ready graduates, and for future-ready institutions. Learn more at studiosity.com.

    Source link

  • Study Reveals Key Factors Driving Student College Choice in 2025


    A comprehensive new study by education research firm EAB has identified the most influential factors shaping how students choose colleges, with academic program variety, campus safety, and student organizations emerging as the top three drivers of student attraction.

    The research, analyzing data from U.S. four-year colleges, found that schools offering a wider range of majors see significantly higher student interest, with each additional program contributing to increased application and enrollment rates. Campus safety measures and the number of available student organizations were also found to be major factors in students’ decision-making process.

    “What’s particularly interesting is how these factors play out differently across institution types,” said Dr. Ryan Gardner-Cook, the project director. “For example, smaller schools gain more from incremental improvements in campus amenities and academic offerings compared to larger institutions.”

    The study also revealed that affordability remains a critical factor, especially for first-generation and low-income students. Schools with lower net prices and stronger financial aid packages showed notably higher attraction rates among these demographics.

    Environmental factors like climate and location also play a significant role. Schools in temperate climates and growing urban areas generally showed stronger appeal to prospective students. State-level political environments were found to increasingly influence student choice as well.

    The research identified nine distinct “institutional personas” ranging from “Accessible Education Anchors” to “Rigorous Academic Giants,” each with unique characteristics and challenges in attracting students. This classification system aims to help institutions better understand their competitive position and develop more effective recruitment strategies.

    For institutions looking to improve their student attraction, the study recommends focusing on controllable factors like admissions processes, student life offerings, and academic programs while finding ways to mitigate challenges related to location or cost.

    The findings come at a crucial time as higher education institutions face evolving student preferences and increasing competition for enrollment.


    Source link

  • Report Reveals Harvard MBAs Struggling to Get Jobs (Palki Sharma)


A new report has revealed that 23% of Harvard MBAs were jobless even three months after their graduation. Similar trends have been reported at top B-schools across the world. The MBA was once considered a sure-shot ticket to success, so what explains the degree’s changing fortunes?

    Source link

  • Major parent survey reveals widespread dissatisfaction with state’s schools


A new survey of more than 400 New Mexico parents of school-aged children shows that parents are widely dissatisfied with the state’s public schools, that communication gaps between schools and parents are a serious concern, and that many parents hold misperceptions about their children’s academic achievement.

Results of the survey, “The State of Educational Opportunity in New Mexico,” were released Oct. 2 by NewMexicoKidsCAN, an education advocacy organization (and parent organization of New Mexico Education) focused on improving New Mexico’s public education system.

The state survey was part of a national report authored by 50CAN, of which NewMexicoKidsCAN is an affiliate. 50CAN is “focused on building the future of American education,” according to the organization’s website. That 214-page report, “The State of Educational Opportunity in America,” provides a deep, 50-state dive into parental views of public education in their home states.

    Researchers surveyed more than 20,000 parents across the country, making it one of the largest education-focused surveys of parents in the past decade. This survey explores the ecosystem of educational opportunities inside and outside of school, and how they interrelate and impact a child’s success.

“With such a large sample size, we are able to dig into the findings by state and across a range of important audiences. By making the findings publicly available, this is a gift of data that can inform conversations among communities and elected officials,” said Pam Loeb, principal at Edge Research.

    The New Mexico survey provides insight into the educational opportunities available to children across New Mexico.

    The New Mexico survey uncovered key findings, including:

• Parental dissatisfaction is widespread: Only about a third of New Mexico parents say they are “very satisfied” with their child’s school, compared with 45 percent of parents nationally. New Mexico was one of the lower-ranked states in terms of parental satisfaction.
• Communication gaps between schools and parents: Only 29% of New Mexico parents report feeling extremely confident in their understanding of their child’s academic progress, ranking New Mexico second to last in the nation.
• Misperceptions about student achievement: 41% of New Mexico parents believe their child is above grade level in reading, yet state assessments show only 39% of students are reading at grade level.
• Afterschool programs show promise: New Mexico ranks 22nd nationally in student participation in supervised afterschool programs, surpassing 28 other states. This success is likely attributable to increased state investment through the Extended Learning Time Program, which may have boosted overall participation rates.

    “This survey amplifies the voices of New Mexico parents,” said Amanda Aragon, Executive Director of NewMexicoKidsCAN. “The results reveal significant misperceptions about student performance, serious communication gaps between schools and parents, and widespread concerns about school satisfaction. 

    “It’s clear that many parents are not getting the information they need about their children’s academic progress. We must do more to close this communication gap and empower parents to be true partners in their child’s education.”


    Source link