Category: Data

  • Otus Wins Gold Stevie® Award for Customer Service Department of the Year

    CHICAGO, IL (GLOBE NEWSWIRE) — Otus, a leading provider of K-12 student data and assessment solutions, has been awarded a prestigious Gold Stevie® Award in the category of Customer Service Department of the Year at the 2025 American Business Awards®. This recognition celebrates the company’s unwavering commitment to supporting educators, students, and families through exceptional service and innovation.

    In addition to the Gold award, Otus also earned two Silver Stevie® Awards: one for Company of the Year – Computer Software – Medium Size, and another honoring Co-founder and President Chris Hull as Technology Executive of the Year.

    “It is an incredible honor to be recognized, but the real win is knowing our work is making a difference for educators and students,” said Hull. “As a former teacher, I know how difficult it can be to juggle everything that is asked of you. At Otus, we focus on building tools that save time, surface meaningful insights, and make student data easier to use—so teachers can focus on what matters most: helping kids grow.”

    The American Business Awards®, now in their 23rd year, are the premier business awards program in the United States, honoring outstanding performances in the workplace across a wide range of industries. The competition receives more than 12,000 nominations every year. Judges selected Otus for its outstanding 98.7% customer satisfaction with chat interactions, and exceptional 89% gross retention in 2024. They also praised the company’s unique blend of technology and human touch, noting its strong focus on educator-led support, onboarding, data-driven product evolution, and professional development.

    “We believe great support starts with understanding the realities educators face every day. Our Client Success team is largely made up of former teachers and school leaders, so we speak the same language. Whether it’s during onboarding, training, or day-to-day communication, we’re here to help districts feel confident and supported. This recognition is a reflection of how seriously we take that responsibility and energizes us to keep raising the bar,” said Phil Collins, Ed.D., Chief Customer Officer at Otus.

    Otus continues to make significant strides in simplifying teaching and learning by offering a unified platform that integrates assessment, data, and instruction—all in one place. Otus has supported over 1 million students nationwide by helping educators make data-informed decisions, monitor progress, and personalize learning. These honors reflect the company’s growth, innovation, and steadfast commitment to helping school communities succeed.

    About Otus

    Otus, an award-winning edtech company, empowers educators to maximize student performance with a comprehensive K-12 assessment, data, and insights solution. Committed to student achievement and educational equity, Otus combines student data with powerful tools that provide educators, administrators, and families with the insights they need to make a difference. Built by teachers for teachers, Otus creates efficiencies in data management, assessment, and progress monitoring to help educators focus on what matters most—student success. Today, Otus partners with school districts nationwide to create informed, data-driven learning environments. Learn more at Otus.com.

    Stay connected with Otus on LinkedIn, Facebook, X, and Instagram.

    Source link

  • Govt. data error sparks doubt over US international enrolments

    The reliability of federal datasets is under scrutiny after an error was identified on the Student and Exchange Visitor Information System (SEVIS) website, where published data appeared to show stagnating international student numbers from August 2024 to the present.

    The error, brought to The PIE News’s attention by EnglishUSA, casts doubt on recent headlines and media reports about declining international student enrolments in the US, with SEVIS data appearing to show an enrolment decline of 11% between March 2024 and March 2025.  

    “Starting in August 2024, the data appears to be duplicated month after month, with flatlined totals for students on F and M visas. These figures show virtually no fluctuation during a period when natural enrolment shifts would be expected,” explained EnglishUSA executive director, Cheryl Delk-Le Good.  

    “This irregularity comes at a time of heightened concern within the field, particularly as educators and administrators manage the fallout from widespread SEVIS terminations and the resulting confusion around visa status for international students,” added Delk-Le Good.  

    The US Department of Homeland Security (DHS), which runs SEVIS, was alerted to the error on April 14 and said it was “working to resolve the issue”.  

    As of April 25, the dataset has not been updated, and DHS has not responded to The PIE’s request for comment.  

    Source: US International Trade Administration, Market Diversification Tool for International Education, 2023 (retrieved April 11, 2025).

    Notably, the inaccuracies begin in August 2024 and span both US administrations, suggesting “a computer glitch rather than an intentional act,” said Mark Algren – interim director of the Applied English Center at the University of Kansas and a contributor to EnglishUSA’s data initiatives – who noticed the anomaly.  

    However, Algren added that he had “no idea why someone didn’t catch it,” with the glitch’s considerable duration likely to undermine confidence in federal datasets that institutions rely on and that help ensure transparency in the system.

    Total F&M visa holders in the US: 

    Month  Total F&M  Change from previous month 
    August 24   1,091,134  -59,822 
    September 24   1,091,137  +3 
    October 24  1,091,141  +4 
    November 24  1,091,144  +3 
    January 25  1,091,142  -2 
    February 25  1,091,155  +13 
    March 25  1,091,161  +11 
    Source: SEVIS
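
    To make the anomaly concrete, here is a minimal sketch, assuming nothing beyond the snapshot totals in the table above, that computes the month-over-month changes and flags months where the total barely moves; the threshold of 20 students is an arbitrary illustration, not an official rule.

    ```python
    # Month-over-month changes in the SEVIS F&M snapshot totals listed above.
    # The "flatline" threshold of 20 students is an arbitrary illustration.
    totals = [
        ("Aug 2024", 1_091_134),
        ("Sep 2024", 1_091_137),
        ("Oct 2024", 1_091_141),
        ("Nov 2024", 1_091_144),
        ("Jan 2025", 1_091_142),  # no December snapshot appears in the table
        ("Feb 2025", 1_091_155),
        ("Mar 2025", 1_091_161),
    ]

    for (prev_month, prev_total), (month, total) in zip(totals, totals[1:]):
        change = total - prev_total
        flag = "  <- virtually no fluctuation" if abs(change) < 20 else ""
        print(f"{month}: {total:,} ({change:+,} vs {prev_month}){flag}")
    ```

    Set against the drop of nearly 60,000 students recorded for August 2024, single- and double-digit monthly changes over seven consecutive snapshots are exactly the kind of “virtually no fluctuation” Delk-Le Good describes.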

    It is important to note that each monthly dataset recorded by SEVIS is a snapshot of a given day that month, and the drop recorded in August 2024 (which is considered the last accurate figure) could have been taken before many students arrived for the fall academic term.  

    For this reason, “it’s hard to say that an August report is representative of the following fall term,” said Algren, with the true figures yet to be seen.  

    At the start of the 2024/25 academic year, IIE’s fall snapshot reported a 3% rise in international student enrolment, building on sustained growth over the last three years. 

    Despite recent uncertainty in the US caused by the current administration’s attacks on higher education, the period of SEVIS’s misreporting covers an earlier timeframe, before the impact of Trump’s policies came into effect.

    Source link

  • Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Before our show starts today, I just wanna take a minute to note the passing of Professor Claire Callender, OBE. For the last two and a half decades, she’s been one of the most important figures in UK higher education studies, in particular with respect to student loans and student finance. Holder of a joint professorship at UCL Institute of Education and Birkbeck University of London, she was also instrumental in setting up the ESRC Centre for Global Higher Education, of which she later became deputy director. I just want to quote the short obituary that her colleague Simon Marginson wrote for her last week after her passing from lung cancer. He said, “What we’ll remember about Claire is the way she focused her formidable capacity for rational thought on matters to which she was committed, her gravitas that held the room when speaking, and the warmth that she evoked without fail in old and new acquaintances.”

    My thoughts and condolences to her partner Annette, and to her children. We’ll all miss Claire. 


    I suspect most of you are familiar with the OECD’s Program for International Student Assessment, or PISA. That’s a triennial test of 15-year-olds around the world. It tries to compare how teenagers fare in real world tests of literacy and numeracy. But you might not be as familiar with PISA’s cousin, the Program for the International Assessment of Adult Competencies, or PIAAC. To simplify enormously, it’s PISA, but for adults, and it only comes out once a decade, with the latest edition having appeared on December 10th of last year. Now, if you’re like most people, you’re probably asking yourself, what does PIAAC measure exactly?

    PISA pretty clearly is telling us something about school systems. Adults, the subject of the PIAAC test, they’ve been out of school for a long time. What do test results mean for people who’ve been out of school for, in some cases, decades? And what kinds of meaningful policies might be made on the basis of this data?

    Today my guest is the CEO of Canada’s Future Skills Centre, Noel Baldwin. Over the past decade, both in his roles at FSC and his previous ones at the Council of Ministers of Education, Canada, he’s arguably been one of the country’s most dedicated users of PIAAC data. As part of Canada’s delegation to the OECD committee in charge of PIAAC, he also had a front row seat to the development of these tests and the machinery behind these big international surveys. 

    Over the course of the next 20 or so minutes, you’ll hear Noel and me, both fellow members of the Canada Millennium Scholarship Foundation Mafia, discuss such issues as how the wording of international surveys gets negotiated, why we seem to be witnessing planet-wide declines in adult literacy, what research questions PIAAC is best suited to answer, and, maybe most intriguingly, what PIAAC 3 might look like a decade from now.

    I really enjoyed this conversation and I hope you do too. Anyway, over to Noel.


    The World of Higher Education Podcast
    Episode 3.28 | Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Transcript

    Alex Usher (AU): Noel, some of our listeners might be familiar with big international testing programs like PISA—the Program for International Student Assessment. But what is the Program for the International Assessment of Adult Competencies? What does it aim to measure, and why?

    Noel Baldwin (NB): It’s somewhat analogous to PISA, but it’s primarily focused on working-age adults. Like PISA, it’s a large-scale international assessment organized by the OECD—specifically by both the education and labor secretariats. It’s administered on the ground by national statistical agencies or other government agencies in participating countries.

    PIAAC is mainly focused on measuring skills like literacy and numeracy. Over time, though, the OECD has added other skill areas relevant to the intersection of education and labor markets—things like digital skills, technology use, problem solving, and social-emotional skills.

    In addition to the assessment itself, there’s a large battery of background questions that gather a lot of demographic information—details about respondents’ work life, and other factors like health and wellbeing. This allows researchers to draw correlations between the core skills being measured and how those skills are used, or what kind of impact they have on people’s lives.

    AU: How do they know that what’s being measured is actually useful in the workplace? I mean, the literacy section is reading comprehension, and the math is sort of like, you know, “If two trains are moving toward each other, one from Chicago and one from Pittsburgh…” It’s a bit more sophisticated than that, but that kind of thing. How do they know that actually measures anything meaningful for workplace competencies?

    NB: That’s a good question. One thing to start with is that the questions build from fairly easy and simple tasks to much more complex ones. That allows the OECD to create these scales, and they talk a lot about proficiency levels—level one up to five, and even below level one in some cases, for people with the weakest skill levels.

    And while PIAAC itself is relatively new, the assessment of these competencies isn’t. It actually dates back to the early 1990s. There’s been a lot of research—by the OECD and by psychometricians and other researchers—on the connections between these skills and broader outcomes.

    The key thing to understand is that, over time, there’s been strong evidence linking higher literacy and numeracy skills to a range of life outcomes, especially labor market outcomes. It’s a bit like educational attainment—these things often act as proxies for one another. But the stronger your skills, the more likely you are to be employed, to earn higher wages, to avoid unemployment, and to be adaptable and resilient.

    And it’s not just about work. It extends to other areas too—life satisfaction, for instance. There are even some interesting findings about democratic participation and people’s perceptions of how their society is doing. So there are pretty strong correlations between higher-level skills and a variety of positive outcomes.

    AU: But, I can imagine that the nature of an economy—whether it’s more manufacturing-based or service-based—might affect what kinds of skills are relevant. So different countries might actually want to measure slightly different things. How do you get 50—or however many, dozens of countries—to agree on what skills to assess and how to measure them?

    NB: The point at which OECD countries agreed to focus on literacy and numeracy actually predates me—and it also predates a lot of today’s focus on more digitally oriented skills. It was a much more analog world when this started, and so literacy and numeracy made a lot of sense. At the time, most of the information people consumed came in some form of media that required reading comprehension and the ability to navigate text. And then, on the numeracy side, the ability to do anything from basic to fairly advanced problem solving with numbers was highly relevant. So I suspect that when this was being developed—through the 1980s and into the early 1990s—there was a high degree of consensus around focusing on those core skills.

    The development of the instruments themselves is also an international effort. It’s led by the OECD, but they work with experts from a range of countries to test and validate the items used in the assessment. Educational Testing Service (ETS) in the U.S. is quite involved, and there are also experts from Australia and Canada. In fact, Canada was very involved in the early stages—both through Statistics Canada and other experts—particularly in developing some of the initial tools for measuring literacy. So, the consensus-building process includes not just agreeing on what to measure and how to administer it, but also developing the actual assessment items and ensuring they’re effective. They do field testing before rolling out the main assessment to make sure the tools are as valid as possible.

    AU: Once the results are in and published, what happens next? How do governments typically use this information to inform policy?

    NB: I’ll admit—even having been on the inside of some of this—it can still feel like a bit of a black box. In fact, I’d say it’s increasingly becoming one, and I think we’ll probably get into that more as the conversation goes on.

    That said, different countries—and even different provinces and territories within Canada—use the information in different ways. It definitely gets integrated into various internal briefings. I spent some time, as you know, at the Council of Ministers of Education, and we saw that both in our own work and in the work of officials across the provinces and territories.

    After the last cycle of PIAAC, for instance, Quebec produced some fairly detailed reports analyzing how Quebecers performed on the PIAAC scales—comparing them to other provinces and to other countries. That analysis helped spark conversations about what the results meant and what to do with them. New Brunswick, for example, launched a literacy strategy shortly after the last PIAAC cycle, which suggests a direct link between the data and policy action.

    So there are examples like that, but it’s also fair to say that a lot of the data ends up being used internally—to support conversations within governments. Even since the most recent PIAAC cycle was released in December, I’ve seen some of that happening. But there’s definitely less in the public domain than you might expect—and less than there used to be, frankly.

    AU: Some of the findings in this latest PIAAC cycle—the headline that got the most traction, I think—was the fact that we’re seeing declines in literacy and numeracy scores across much of the OECD. A few countries bucked the trend—Canada saw a small decline, and parts of Northern Europe did okay—but most countries were down. What are the possible explanations for this trend? And should we be concerned?

    NB: I think we should be really aware. When it comes to concern, though, I’m always a bit hesitant to declare a crisis. There’s a lot of work still to be done to unpack what’s going on in this PIAAC cycle.

    One thing to keep in mind is that most of the responses were collected during a time of ongoing global turmoil. The data was gathered in 2022, so we were still in the middle of the pandemic. Just getting the sample collected was a major challenge—and a much bigger one than usual.

    With that caveat in mind, the OECD has started to speculate a bit, especially about the literacy side. One of the things they’re pointing to is how radically the way people consume information has changed over the past 10 years.

    People are reading much shorter bits of text now, and they’re getting information in a much wider variety of formats. There are still items in the literacy assessment that resemble reading a paragraph in a printed newspaper—something that just doesn’t reflect how most people engage with information anymore. These days, we get a lot more of it through video and audio content.

    So I think those shifts in how we consume information are part of the story. But until we see more analysis, it’s hard to say for sure. There are some signals—differences in gender performance across countries, for example—that we need to unpack. And until we do that, we’re not going to have a great sense of why outcomes look the way they do.

    AU: Let’s focus on Canada for a second. As with most international education comparisons, we end up in the top—but at the bottom of the top third, basically. It doesn’t seem to matter what we do or when—it’s always that pattern. Looking at global trends, do you think Canada stands out in any way, positively or negatively? Are there things we’re doing right? Or things we’re getting wrong?

    NB: Well, I’d say we continue to see something that the OECD points out almost every time we do one of these assessments: the gap between our top performers and our lowest performers is smaller than in many other countries. That’s often taken as a sign of equity, and I’d say that’s definitely a good news story.

    In the global comparison, we held pretty much steady on literacy, while many countries saw declines. Holding steady when others are slipping isn’t a bad outcome. And in numeracy, we actually improved.

    The distribution of results across provinces was also more even than in the last cycle. Last time, there was much more variation, with several provinces falling below the OECD or Canadian average. This time around, we’re more tightly clustered, which I think is another positive.

    If you dig a little deeper, there are other encouraging signs. For example, while the OECD doesn’t have a perfect measure of immigration status, it can identify people who were born outside a country or whose parents were. Given how different Canada’s demographic profile is from nearly every other participating country—especially those in Northern Europe—I think we’re doing quite well in that regard.

    And in light of the conversations over the past few years about immigration policy and its impacts across our society, I think it’s a pretty good news story that we’re seeing strong performance among those populations as well.

    AU: I know we’ll disagree about this next question. My impression is that, in Canada, the way PIAAC gets used has really changed over the last decade. The first round of PIAAC results got a lot of attention—StatsCan and the Council of Ministers of Education both published lengthy analyses.

    And maybe “crickets” is too strong a word to describe the reaction this time, but it’s definitely quieter. My sense is that governments just don’t care anymore. When they talk about skills, the narrative seems focused solely on nursing and the skilled trades—because those are seen as bottlenecks on the social side and the private sector side.

    But there’s very little interest in improving transversal skills, and even less knowledge or strategy about how to do so. Make me less cynical.

    NB: Well, it’s funny—this question is actually what kicked off the conversation that led to this podcast. And I’ll confess, you’ve had me thinking about it for several weeks now.

    One thing I want to distinguish is caring about the skills themselves versus how the data is being released and used publicly. There’s no denying that we’re seeing less coming out publicly from the governments that funded the study. That’s just true—and I’m not sure that’s going to change.

    I think that reflects a few things. Partly, it’s the changed fiscal environment and what governments are willing to pay for. But it’s also about the broader information environment we’re in today compared to 2013.

    As I’ve been reflecting on this, I wonder if 2012 and 2013 were actually the tail end of the era of evidence-based policymaking—and that now we’re in the era of vibes-based policymaking. And if that’s the case, why would you write up detailed reports about something you’re mostly going to approach from the gut?

    On the skills side, though, I still think there’s an interesting question. A few weeks ago, I felt more strongly about this, but I still believe it’s not that governments don’t care about these foundational skills. Rather, I think the conversation about skills has shifted.

    We may have lost sight of how different types of skills build on one another—starting from foundational literacy and numeracy, then layering on problem-solving, and eventually reaching digital competencies. That understanding might be missing in the current conversation.

    Take the current moment around AI, for example. Maybe “craze” is too strong a word, but there’s a belief that people will become great at prompt engineering without any formal education. Mark Cuban—on BlueSky or wherever, I’m not sure what they call posts there—made a point recently that you won’t need formal education with generative AI. If you can get the right answers out of a large language model, you’ll outperform someone with an advanced degree.

    But that completely overlooks how much you need to understand in order to ask good questions—and to assess whether the answers you get are worth anything. So we may start to see that shift back.

    That said, you’re right—there has definitely been a move in recent years toward thinking about workforce issues rather than broader skill development. And that may be a big part of what’s going on.

    AU: What do you think is the most interesting or under-explored question that PIAAC data could help answer, but that we haven’t fully investigated yet? This dataset allows for a lot of interesting analysis. So if you could wave a magic wand and get some top researchers working on it—whether in Canada or internationally—where would you want them to focus?

    NB: First, I’ll just make a small plug. We’ve been working on what we hope will become a PIAAC research agenda—something that responds to the things we care about at the Future Skills Centre, but that we hope to advance more broadly in the coming weeks and months. So we are actively thinking about this.

    There are a bunch of areas that I think are really promising. One is the renewed conversation about productivity in Canada. I think PIAAC could shed light on the role that skills play in that. The Conference Board of Canada did a piece a while back looking at how much of the productivity gap between Canada and the U.S. is due to skill or labor factors. Their conclusion was that it wasn’t a huge part—but I think PIAAC gives us tools to continue digging into that question.

    Another area the OECD often highlights when talking about Canada is the extent to which workers are overqualified or overskilled for the jobs they’re in. That’s a narrative that’s been around for a while, but one where I think PIAAC could offer deeper insights.

    It becomes even more interesting when you try to link it to broader labor supply questions—like the role of immigration. Some people have suggested that one reason Canada lags in things like technology integration or capital investment is that we’ve substituted skilled labor for that kind of investment.

    With PIAAC, we might be able to explore whether overqualification or overskilling is connected to the way we’ve managed immigration over the last couple of decades.

    So, there are a few areas there that I think are both relevant and under-explored. And of course, on the international side, you’re right—we should be looking for examples of countries that have had success, and thinking about what we can emulate, borrow from, or be inspired by.

    AU: I don’t know if either of us wants to still be doing this in 10 years, but if we were to have this conversation again a decade from now, what do you think—or hope—will have changed? What will the long-term impact of PIAAC Cycle 2 have been, and how do you think PIAAC 3 might be different?

    NB: Well, I think I need to say this out loud: I’m actually worried there won’t be a PIAAC 3.

    We’re recording this in early 2025, which is a pretty turbulent time globally. One of the things that seems clear is that the new U.S. administration isn’t interested in the Department of Education—which likely means they won’t be interested in continuing the National Center for Education Statistics.

    And like with many international initiatives, the U.S. plays a big role in driving and valuing efforts like PIAAC. So I do worry about whether there will be a third cycle. If it happens without U.S. participation, it would be a very different kind of study.

    But I hope that in 10 years, we are talking about a robust PIAAC 3—with strong participation from across OECD countries.

    I also hope there’s continued investment in using PIAAC data to answer key research questions. It’s just one tool, of course, but it’s a big one. It’s the only direct assessment of adult skills we have—where someone is actually assessed on a defined set of competencies—so it’s really valuable.

    For an organization like ours, which is focused on adult skills in the workforce, it’s up to us to push forward and try to get answers to some of these questions. And I hope the research we and others are doing will find its way into policy conversations—especially as we think about how workforce needs, skills, and the broader economy are going to change over the next decade.

    It would be a wasted opportunity if it didn’t.

    AU: Noel, thanks so much for being with us today.

    NB: Thanks Alex.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

    This episode is sponsored by Studiosity. Student success, at scale – with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning — not corrections – to develop critical thinking, agency, and retention — empowering educators with learning insight. For future-ready graduates — and for future-ready institutions. Learn more at studiosity.com.

    Source link

  • Data shows growing GenAI adoption in K-12

    Key points:

    • K-12 GenAI adoption rates have grown–but so have concerns 
    • A new era for teachers as AI disrupts instruction
    • With AI coaching, a math platform helps students tackle tough concepts
    • For more news on GenAI, visit eSN’s AI in Education hub

    More than half of K-12 educators (55 percent) have positive perceptions of GenAI, despite concerns and perceived risks around its adoption, according to updated data from Cengage Group’s “AI in Education” research series, which regularly evaluates AI’s impact on education.


    Source link

  • Three-quarters of global study decisions determined by cost

    International students are increasingly looking for affordable destinations and alternative programs rather than giving up on study abroad as costs rise, a new ApplyBoard survey has shown.

    While 77% of surveyed students ranked affordable tuition fees as the most important factor shaping study decisions, only 9% said they planned to defer their studies based on these concerns, according to the edtech firm’s survey.

    “Students weren’t planning to wait for things to change,” said ApplyBoard senior communications manager Brooke Kelly. “They’re considering new destinations, adjusting which programs they apply to, and accepting that they have to balance work with study, but they’re still planning to study abroad,” she maintained.

    Just over one in four students said they were considering different study destinations than originally planned, with Denmark, Finland, Nigeria and Italy the most popular emerging destinations.  

    Additionally, 55% of students said they would have to work part-time to afford their study abroad program.  

    After affordability came employability (57%), career readiness (49%), high-quality teaching (47%), and program reputation (45%) as the factors shaping student decision-making.

    With students increasingly thinking about work opportunities, software and civil engineering topped students’ career choices, with nursing as the second most popular field. Tech fields including IT, cybersecurity, and data analysis also showed strong interest. 

    What’s more, interest in PhD programs saw a 4% rise on the previous year, while over half of students were considering master’s degrees, indicating that students are increasingly prioritising credentials and post-study work opportunities.  


    The study surveyed over 3,500 students from 84 countries, with the most represented countries being Nigeria, Ghana, Canada, Pakistan, Bangladesh and India.  

    Notably, given its share of the global international student population, China is absent from the top 10 most represented countries.

    As students’ priorities shift and currencies fluctuate, “diversity will be key to mitigate against increased volatility and to ensure campuses remain vibrant with students from all around the world,” said Kelly.  

    Meanwhile, institutions should increase communication about scholarships and financial aid, offer more hybrid learning experiences and highlight programs on different timelines such as accelerated degrees, she advised.  

    While alternative markets are on the rise, 65% of respondents said they were only interested in studying in one of the six major destinations, with Canada followed by the US, UK, Australia, Germany and Ireland, in order of popularity.  

    Despite Canada’s international student caps, the largest proportion of students said they were ‘extremely’, ‘very’ or ‘moderately’ interested in the study destination, highlighting its enduring appeal among young people.  

    While stricter controls on post-study work were implemented in Canada last year, in a rare easing of policy, IRCC recently said that all college graduates would once again be eligible for post-study work.

    This change, combined with the fact that international students can still be accompanied by their dependants while studying in Canada, is likely to have contributed to the country maintaining its attractiveness, according to Kelly.

    Source link

  • What the End of DoED Means for the EdTech Industry

    The federal government’s influence over school districts had implications beyond just funding and data. Eliminating the Office of Educational Technology (OET) will create significant gaps in educational technology research, validation, and equity assurance. Kris Astle, Education Strategist for SMART Technologies, discusses how industry self-governance, third-party organizations, and increased vendor responsibility might fill these gaps, while emphasizing the importance of research-backed design and implementation to ensure effective technology deployment in classrooms nationwide. Have a listen:


    Source link

  • Capability for change – preparing for digital learning futures

    Digital transformation is an ongoing journey for higher education institutions, but there is something quite distinctive about the current moment.

    The combination of financial uncertainty, changing patterns of student engagement, and the seismic arrival of artificial intelligence points to a future for higher education learning and teaching, and for the digital student experience, that will certainly have some core elements in common with current practice but is likely in many respects to look rather different.

    At the moment I see myself and my colleagues trying to cling to what we always did and what we always know. And I really do think the whole future of what we do and how we teach our students, and what we teach our students is going to accelerate and change very, very quickly now, in the next five years. Institutional leader

    Our conversations with sector leaders and experts over the past six months indicate an ambition to build consistent, inclusive and engaging digital learning environments and to deploy data much more strategically. Getting it right opens up all kinds of possibilities to extend the reach of higher education and to innovate in models for engagement. But future change demands different kinds of technological capabilities, and working practices, and institutions are saying that they are hindered by legacy systems, organisational silos, and a lack of a unified vision.

    Outdated systems do not “talk to each other,” and the same is true at a cultural level, as departments and central teams also do not “talk to each other” – or may struggle to find a common language. And rather than making life easier, many feel that technology creates significant inefficiencies, forcing staff to spend more time on administrative tasks and less on what truly matters.

    I think the problem always is when we hope something’s going to make it more efficient. But then it just adds a layer of complexity into what we’re doing…I think that’s what we struggle with – what can genuinely deliver some time savings and efficiencies as opposed to putting another layer in a process? Institutional leader

    In the spirit of appreciative inquiry, our report Capability for change – preparing for digital learning futures draws on a series of in depth discussions with leaders of learning and teaching, and digital technology, digital experts and students’ union representatives. We explore the sorts of change that are already in train, and surface insight about how institutions are thinking in terms of building whole-organisation capabilities. “Digital dexterity” – the ability to deploy technology strategically, efficiently, and innovatively to achieve core objectives – may be yet another tech buzzword, but it captures a sense of where organisations are trying to get to.

    While immediate financial pressures may require cutting costs and reprofiling investment, long-term sustainability depends on moving forward with change – finding ways not to do more with less, but to do things differently. To realise the most value from technology investment, institutional leaders need to find ways to ensure that, across the institution, staff teams have the knowledge, the motivation and the tools to deploy technology in the service of student success.

    How institutions are building organisational capability

    Running through all our conversations was a tension, albeit a potentially productive one: there needs to be much more consistency and clarity about the primary strategic objectives of the institution and the core technology platforms and applications that enable them. But the effect of, in essence, imposing a more streamlined “central” vision, expectations and processes should be to enable and empower the academic and professional teams to do the things that make for a great student experience. Our research indicates that institutions are focusing on three areas: leadership and strategy; digital capabilities of institutional staff; and breaking down the vertical silos that can hamper effective cross-organisational working.

    A number of reflections point to strategy-level improvements – such as ensuring there is strategic alignment between institutional objectives for student success, and technology and digital strategies; listening to the feedback from students and staff about what they need from technology; setting priorities, and resourcing those priorities from end to end from technology procurement to deployment and evaluation of impact. One institutional leader described what happens when digital strategies get lost in principles and forget to align with the wider success of the organisation:

    The old strategy is fairly similar, I imagine, to many digital strategies that you would have seen – it talks about being user focused, talks about lean delivery, talks about agile methodologies, product and change management and delivering value through showing, not telling. So it was a very top level strategy, but really not built with outcomes at its absolute core, like, what are the things that are genuinely going to change for people, for students? Institutional leader

    Discussions of staff digital capabilities recognised that institutional staff are often hampered by organisational complexity and bureaucracy which too often is mirrored in the digital sphere. One e-learning professional suggested that there is a need for research to really understand why there is a tendency towards proliferation of processes and systems, and confront the impact on staff workloads.

    There may also be limits to what can reasonably be expected from teaching staff in terms of digital learning design:

    You need to establish minimum benchmarks and get everyone to that place, and then some people will be operating well beyond that. You can be clear about basic benchmark expectations around student experience – and then beyond that you need to put in actual support [such as learning design experts] to implement the curriculum framework. E-learning professional

    But the broader insight on staff development was around shifting from provision of training on how to operate systems or tools to a more context-specific exploration of how the available technologies and data can help educators achieve their student success ambitions. Value is more systematically created across the organisation when those academic and professional teams who work directly with students are able to use the technology and data available creatively to enhance their practice and to problem solve.

    Where data has been used before it’s very much sat with senior colleagues in the institution. And you know it’s helped in decision making. But the next step is to try and empower colleagues at the coal face to use data in their day to day interventions with their students… How can they use the data to inform how they support their students? Institutional leader

    Decisive leadership may be successful in setting priorities and streamlining the processes and technologies that underpin them; strong focus on professional development may engage and enable institutional staff. But culture change will come when institutions find ways to systematically build “horizontals” across silos – mechanisms for collaborative and shared activity that bridge different perspectives, languages and disciplinary and professional cultures.

    Some examples we saw included embedding digital professionals in faculties and academic business processes such as recruitment panels, convening of cross-organisation thinking on shared challenges, and appointment of “change agent” roles with a skillset and remit to roam across boundaries.

    Technology providers must be part of the solution – acting as strategic partners rather than suppliers. One way to do that is to support institutions to pilot, test, and develop proof of concept before they decide to invest in large-scale change. Another is to work with institutions to understand how technology is deployed in practice, and the evolving needs of user communities. To be a great partner to the higher education sector means having a deep understanding not only of the technological capabilities that could help the sector but how these might weave into an organisation’s wider mission and values. In this way, technology providers can help to build capability for change.

    This article is published in association with Kortext. You can download the Capability for change report on Kortext’s website. The authors would like to thank all those who shared their insight to inform the report. 

    Source link

  • Check-in on Administrative Bloat, 2025 Edition

    It’s been a little over five years since I took a serious dive into the question of “administrative bloat,” which apparently exists everywhere but in the statistics. Still, always good to check assumptions every once in a while, and I thought five years was long enough to make a new look at the data worthwhile. So here goes:

    Let’s start by reviewing what we can and cannot know about staffing at Canadian universities. StatsCan tracks the number of permanent ranked faculty pretty accurately through its University and College Academic Staff Survey (UCASS), and in a loosey-goosier fashion through the Labour Force Survey. The latter gives much higher numbers than the former, as shown below in Figure 1, which compares the number of “ranked” academics from UCASS with the number of permanent, full-time academics from the LFS.

    Figure 1 – Full-time Academic Staff Numbers According to LFS and UCASS

    StatsCan also tracks the total number of employees—both salaried and hourly—in the university sector using the Survey of Employment, Payroll and Hours (SEPH). So, in theory, if you subtract the number of FT academic staff from the number of total staff, you should be able to get the total number of non-academic staff, right? Well, unfortunately, this is where the discrepancy between UCASS and LFS causes some problems. In Figure 2, I show the implied number of non-academics using both methods. The growth rates are different because of the difference in observations in the early period, but the two estimates do both converge on the observation that there are about 130,000 non-academic staff at Canadian universities, or about two and a half times the complement of academic staff.

    Figure 2 – Implied Non-Academic Staff Numbers using SEPH, LFS and UCASS

    So, that’s evidence of bloat, right? Well, maybe. Personally, what I take from Figure 2 is that either the LFS numbers or the SEPH numbers (or both) are probably flaming hot garbage. There’s simply no way that the number of non-academic staff has increased by 170% in the past twenty years, as a combination of the SEPH and LFS data suggests. For reasons that will become apparent shortly, I also have serious doubts that it has increased by 85%, as the combination of SEPH and UCASS suggests. That’s because there is a second set of data available to look at this question—one that shows expenditure on salaries—and it shows a much different picture.

    The annual FIUC survey shows how much money is spent on wages for ranked academics as well as how much is spent on non-academics (it also shows wages for instructional staff without academic rank, but I exclude these here for ease of analysis). Over the past three years, it is true that the non-academic salary mass has risen while the academic one has not (score one for the bloat theory!), but looked at with a 25-year lens, Figure 3 shows that the rates of increase are about the same (score one against).

    Figure 3 – Total Expenditures on Salaries by Employee Group, in millions of $2023

    Basically, the salary data in Figure 3 tells a completely different story than the SEPH/LFS/UCASS data in Figure 2. All you have to do is divide the spending data by the implied headcounts to see what I mean (which I do below). Figure 4 shows the implied change in average academic pay and average administrative and support (A&S) pay, dividing total FIUC pay by the UCASS academic staff numbers, and by the A&S staff numbers implied by subtracting the UCASS numbers from the SEPH numbers (i.e., the orange line from Figure 2). To believe both sets of data, you have to believe that average academic salaries have increased substantially while average salaries for non-academics have declined substantially.

    Figure 4 – Change in Implied Average Pay, Academic Staff vs. A&S Staff, 2001-02 = 100

    In Figure 4, the blue line representing academic salaries is more or less consistent with the long-term trend we have seen by looking at salary survey data (which I last did back here): significant growth in the 00s and much slower growth thereafter. There are no staff salary surveys to use for comparison, but let’s put it this way: when people talk about “bloat” in non-academic staff positions, they normally mean that the bloat is coming from expensive, overpaid A&S staff. For Figure 4 to be true, the growth in staff numbers would need to come almost entirely from more junior, less well-paid staff. It’s not impossible that this is true, but it’s not consistent with the general vibe about bloat, either.
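
    To make the arithmetic concrete, here is a minimal sketch of the implied-headcount and implied-pay calculation described above. The figures are invented placeholders, not actual UCASS, SEPH, or FIUC values, and the indexing to 2001-02 = 100 mirrors the approach in Figure 4.

    ```python
    # Sketch of the implied-pay calculation described above.
    # All figures are invented placeholders, not actual UCASS/SEPH/FIUC values.

    # Headcounts: UCASS ranked academics and SEPH total university employees
    ucass_academics = {"2001-02": 33_000, "2023-24": 46_000}
    seph_total_staff = {"2001-02": 100_000, "2023-24": 176_000}

    # FIUC salary mass by employee group, in millions of constant $2023
    fiuc_academic_pay = {"2001-02": 3_500, "2023-24": 5_600}
    fiuc_nonacademic_pay = {"2001-02": 4_000, "2023-24": 6_400}

    def implied_average_pay(year: str) -> tuple[float, float]:
        """Return (average academic pay, average non-academic pay) in $2023."""
        academics = ucass_academics[year]
        # Non-academic headcount is implied: total staff minus ranked academics
        non_academics = seph_total_staff[year] - academics
        avg_academic = fiuc_academic_pay[year] * 1_000_000 / academics
        avg_non_academic = fiuc_nonacademic_pay[year] * 1_000_000 / non_academics
        return avg_academic, avg_non_academic

    base_acad, base_nonacad = implied_average_pay("2001-02")
    cur_acad, cur_nonacad = implied_average_pay("2023-24")

    # Index both series to 2001-02 = 100, as in Figure 4
    print(f"Academic pay index (2023-24):     {100 * cur_acad / base_acad:.0f}")
    print(f"Non-academic pay index (2023-24): {100 * cur_nonacad / base_nonacad:.0f}")
    ```

    With these placeholder inputs the academic index comes out above 100 and the non-academic index below it, which is the combination the two datasets jointly imply – and which the salary survey evidence makes hard to believe.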

    So who knows, really? There’s a lot of contradictory data here, some of which argues strongly in favour of the bloat argument, but quite a bit of which points in the other direction. Better data is needed to answer this question, but it probably isn’t forthcoming.

    Meanwhile, we can take one last look at A&S expenditure data. We can check to see if the pattern of A&S salary expenditures across university operating functions has changed over time. As Figure 5 shows, the answer is “a little bit.” Central Administration now takes up 25% of total A&S salary expenditures, up from 22% 20 years ago. Student services and external relations are up much more sharply in proportional terms, but since they were both starting from a low base, they don’t impact the overall numbers that much. Libraries, physical plant, and non-credit instruction are the categories losing share.

    Figure 5: Share of Total A&S Salary Mass by Function, Canadian University Operating Grants, Select Years

    And there you have it: more data than you probably needed on administrative bloat. See you back here again in 2030.

    Source link

  • Professional services staff need equal recognition – visibility in sector data would be a good start

    Achieving recognition for the significant contribution of professional services staff is a collaborative, cross-sector effort.

    With HESA’s second consultation on higher education staff statistics welcoming responses until 3 April, AGCAS has come together with a wide range of membership bodies representing professional services staff across higher education to release a statement warmly welcoming HESA’s proposal to widen coverage of the higher education staff record to include technical staff and professional and operational staff.

    By creating a more complete staff record, HESA aim to deliver better understanding of the diverse workforce supporting the delivery of UK higher education. AGCAS, together with AHEP, AMOSSHE, ASET, CRAC-Vitae, NADP and UMHAN, welcome these proposals. We have taken this collaborative approach because we have a common goal of seeking wider recognition for the outstanding contributions and work of our members in professional services roles, and the impact they make on their institutions, regions, graduates and students.

    A matter of visibility

    Since the 2019–20 academic year, higher education providers in England and Northern Ireland have had the option to return data on non-academic staff to HESA. However, this has led to a lack of comprehensive visibility for many professional services staff. In the 2023–24 academic year, out of 228 providers only 125 opted to return data on all their non-academic staff – leaving 103 providers opting out.

    This gap in data collection has raised concerns about the recognition and visibility of these essential staff members – and has not gone unnoticed by professional services staff themselves. As one AGCAS member noted:

    Professional service staff have largely remained invisible when reporting on university staff numbers. Professional services provide critical elements of student experience and outcomes, and this needs to be recognised and reflected better in statutory reporting.

    This sentiment underscores the importance of the proposed changes by HESA, and the reason for our shared response.

    Who is and is not

    A further element of the consultation considers a move away from the term “non-academic” to better reflect the roles and contributions of these staff members and proposes to collect data on staff employment functions.

    Again, we collectively strongly support these proposed changes, which have the potential to better understand and acknowledge the wide range of staff working to deliver outstanding higher education across the UK. The term non-academic has long been contentious across higher education. While continuing to separate staff into role types may cause issues for those in the third space, shifting away from a term and approach that defines professional services staff by othering them is a welcome change.

    As we move forward, it is essential to continue fostering collaboration and mutual respect between academic and professional services staff. Challenging times across higher education can create or enhance partnership working between academic and professional services staff, in order to tackle shared difficulties, increase collaboration and form strategic alliances.

    A better environment

    By working in this way, we can create a more inclusive and supportive environment that recognises the diverse contributions of all staff members, ultimately enhancing outcomes for all higher education stakeholders, particularly students.

    Due to the nature of our memberships, our shared statement focuses on professional services staff in higher education – but we also welcome the clear focus on operational and technical staff from HESA, who again make vital contributions to their institutions.

    We all know that representation matters to our members, and the higher education staff that we collectively represent. HESA’s proposed changes could help to start a move towards fully and equitably recognising the vital work of professional services staff across higher education. By expanding data collection to include wider staff roles and moving away from the term “non-academic”, we can better understand and acknowledge the wide range of contributions that support the higher education sector.

    This is just the first step towards better representation and recognition, but it is an important one.


  • Student experience is becoming more transactional – but that doesn’t make it less meaningful

    Student experience is becoming more transactional – but that doesn’t make it less meaningful

    It seems that few can agree on what the future student experience will look like, but there is a growing consensus that for the majority of higher education institutions (bar a few outliers) it will – and probably should – look different from today.

    For your institution, that might look like a question of curriculum – addressing student demand for practical skills, career competencies and civic values to be more robustly embedded in academic courses. It might be about the structure of delivery – with the Lifelong Learning Entitlement funding-per-credit model due to roll out in the next few years and the associated opportunity to flex how students access programmes of study and accrue credit. It might be a question of modality – responding to demands for flexibility by making learning materials accessible remotely through technology.

    When you combine all these changes and trends you potentially arrive at a more fragmented and transient model of higher education, with students passing through campus or logging in remotely to pick up their higher education work alongside their other commitments. Academic community – at least in the traditional sense of the campus being the locus of daily activity for students and academics – already appears at risk, and some worry that there is a version of the future in which it is much-reduced or disappears altogether.

    Flexibility, not fragmentation

    With most higher education institutions facing difficult financial circumstances and no immediate prospect of external relief, the likelihood is that cost-saving measures will reduce both the institutional capacity to provide wraparound services and the opportunities for the kind of human-to-human contact that shows up organically when everyone is co-located.

    One of the challenges for higher education in the decade ahead will be how to sustain motivation and engagement, build connection and belonging, and support students’ wellbeing, while responding to that shifting pattern of how students practically encounter learning.

    The current model still relies on high-quality person-to-person interaction in classrooms, labs, on placement, in accessing services, and in extra-curricular activities. When there is enough of that kind of rich human interaction it is possible, to some extent, to tolerate a degree of (for want of a better word) shonkiness in students’ functional and administrative interactions with their institution.

    That’s not a reflection of the skills and professionalism of the staff who manage those interactions; it’s testament to the messiness of decades of technology systems procurement that has not kept up with the changing demands of higher education operational management. The amount of institutional resource devoted to maintaining and updating these systems, setting up workarounds when they don’t serve desired institutional processes, and extracting and translating data from them is no longer justifiable in the current environment.

    Lots of institutional leaders accept that change is coming. Many are leading significant transformation and reform programmes that respond to one or more of the changes noted above. But they are often trying – at some expense – to build a change agenda on top of a fragile foundational infrastructure. And this is where a change in mindset and culture will be needed to allow institutions to build the kind of student experiences that we think are likely to become dominant within the next decade.

    Don’t fear the transactional

    Maintaining quality when resources are constrained requires a deep appreciation of the “moments that matter” in the student experience – those that will have a lasting impact on students’ sense of academic identity and connection, and by association their success – and those that can be, essentially, transactional.

    If, as seems to be the case, the sector is moving towards a world in which a greater share of students’ interaction with their institution sits in that “transactional” bucket, two things follow:

    One is that the meaningful bits of learning, teaching, academic support and student development have to be REALLY meaningful, enriching encounters for both students and the staff who are educating them – because it’s these moments that will bring the education experience to life and have a transformative effect on students. To some degree how each institution creates that sense of meaningfulness and where it chooses to focus its pedagogical efforts may act as a differentiator to guide student choice.

    The second is that the transactional bits have to REALLY work – at a baseline be low-friction, designed with the user in mind, and make the best possible use of technologies to support a more grab-and-go, self-service, accessible-anywhere model that can be scaled for a diverse student body with complicated lives.

    Transactional should not mean “one-size-fits-all” – in fact, careful investment in technology should make it possible to build a more inclusive experience by adapting to students’ needs, whether that’s deploying translation software, integrating assistive technologies, or offering natural language search functionality.

    Optimally, institutions will be seeking to get to the point where it is possible to track a student right from their first interaction with the institution all the way through to becoming an alumnus – and to accommodate a student being several things at once, or moving “backwards” along that critical path as well as “forwards.” Having the data foundations in place to understand where a student is now, where they have come from, and even where they want to get to makes it possible to build a genuinely personalised experience.
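
    To make that idea a little more concrete, below is a minimal sketch of what a lifecycle-aware student record might look like, expressed in TypeScript. Every name, stage and field in it is an illustrative assumption rather than a reference to any particular student record product or sector data standard.

        // Illustrative sketch only: a lifecycle-aware student record that can hold
        // several concurrent statuses and move "backwards" as well as "forwards".
        // All names and stages here are hypothetical, not a sector standard.

        type LifecycleStage =
          | "enquirer"
          | "applicant"
          | "enrolled"
          | "on_placement"
          | "interrupted"   // a "backwards" move along the path
          | "alumnus";

        interface StageRecord {
          stage: LifecycleStage;
          from: string;      // ISO date the stage began
          to?: string;       // open-ended if the stage is still current
        }

        interface StudentRecord {
          id: string;
          // A student can hold more than one stage at once, e.g. enrolled on one
          // programme while an alumnus of another.
          currentStages: LifecycleStage[];
          // An append-only history shows where the student has come from.
          history: StageRecord[];
          // A declared goal captures where they want to get to.
          intendedOutcome?: string;
        }

        // Recording a new stage never erases earlier ones, so overlapping and
        // repeated stages are representable by design.
        function addStage(record: StudentRecord, stage: LifecycleStage, from: string): StudentRecord {
          return {
            ...record,
            currentStages: Array.from(new Set([...record.currentStages, stage])),
            history: [...record.history, { stage, from }],
          };
        }

        // Example: an alumnus returning as an enquirer for a short credit-bearing course.
        const returning: StudentRecord = addStage(
          { id: "s-001", currentStages: ["alumnus"], history: [{ stage: "alumnus", from: "2024-07-01" }] },
          "enquirer",
          "2025-02-10",
        );

    The point of the shape is simply that current status is a set rather than a single value, and the history is append-only – so “backwards” moves and overlapping roles are not exceptions to be worked around but part of the model.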

    In this “transactional” domain, there is much less opportunity for strategic differentiation with competitor institutions – though there is a lot of opportunity for hygiene failure, if students who find their institution difficult to deal with decide to take their credits and port them elsewhere. Institutional staff, too, need to be able to quickly and easily conduct transactional business with the institution, so that their time is devoted as much as possible to the knowledge and student engagement work that is simply more important.

    Critically, the more that institutions adopt common core frameworks and processes in that transactional bucket of activity, the more efficient the whole sector can be, and the more value can be realised in the “meaningful” bucket. That means resisting the urge to tinker and adapt, letting go of the myth of exceptionalism, and embracing an “adopt not adapt” mindset.

    Fixing the foundations

    To get there, institutions need to go back to basics in the engine room of the student experience – the student record system. The student system of 15–20 years ago was a completely internally focused statutory engine, existing for award board grids and HESA returns. The student record system is now seen as a student-centric platform that happens to support other outputs and outcomes: both student-facing interactions and the management information that can drive decision-making about where resource input is generating the best returns.

    The breadth of things in the student experience that need to be supported has expanded rapidly, and will continue to expand. Right now, institutions need their student record system to be able to feed data into other platforms to allow (within institutional data ethics frameworks) useful reporting on things like usage and engagement patterns. Increasingly ubiquitous AI functionality in information search, student support, and analytics needs to be underpinned by high-quality data, or it will not realise any value when rolled out.

    Going further, as institutions start to explore opportunities for strategic collaboration, co-design of qualifications and pathways in response to regional skills demands, or start to diversify their portfolio to capture the benefits of the LLE funding model, moving toward a common data framework and standards will be a key enabler for new opportunities to emerge.

    The extent to which the sector is able to adopt a common set of standards and interoperability expectations for student records is the extent to which it can move forward collectively in establishing a high-quality baseline for managing the parts of the student experience that may be “transactional” in function, but that matter greatly as the foundations for the parts that really do create lasting value.

    This article is published in association with KPMG.
