Category: Data

  • Supporting Student Wellbeing in Uncertain Times

Higher education is operating in a time of rapid change and uncertainty. Shifts in federal and state policy, changes in funding, and increasing polarization are reshaping campus environments and profoundly affecting many students’ experiences. For leaders, it is critical to understand how these forces are affecting student wellbeing—and what actions institutions can take to adapt and strengthen their support for students.

The Action Network for Equitable Wellbeing (ANEW) is a networked community of higher education changemakers working together to advance systems-level transformation to improve student wellbeing. Drawing on the involvement of more than 200 colleges and universities, we have found that while there is no single solution, institutions can act quickly and intentionally to strengthen student support using a practical, data-driven, human-centered approach.

    Through this collaborative work, we’ve identified three strategies that are helping campuses respond more effectively to the rapidly evolving needs of their students: using real-time disaggregated data, conducting empathy interviews, and building a rhythm of frequent data collection and sense-making.

    Collect real-time quantitative data and analyze it thoughtfully

How students are doing can change rapidly as policies and rhetoric shift, the availability of external resources changes, significant events occur on campus or in the world, and new barriers or supports emerge. Relying on older data (e.g., survey data collected nine months ago) can miss important changes. Without timely insight, decisions may rest on outdated information or an incomplete understanding. Systematically collecting real-time data helps institutions stay aligned with students’ current realities.

    To support this kind of real-time data collection, ANEW institutions have used the Wellbeing Improvement Survey for Higher Education Settings (WISHES)—a short survey, available at no cost, that provides institutions with timely and actionable data on a range of outcomes and experiences influencing student wellbeing. WISHES helps institutions monitor student wellbeing and stay responsive to the present moment.

But aggregate data tell only part of the story. Disaggregating data by relevant student characteristics can reveal patterns hidden in campus-wide averages, showing how different groups of students are faring and allowing institutions to focus support where it is most needed—for example, on groups of students who may be disproportionately struggling.
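Disaggregation itself is mechanically simple; the value lies in which groupings you examine. A minimal sketch of the idea (the column names and scores below are hypothetical, not the actual WISHES schema):

```python
import pandas as pd

# Hypothetical survey responses; "student_group" and "thriving"
# are illustrative field names, not the real WISHES schema.
responses = pd.DataFrame({
    "student_group": ["A", "A", "A", "B", "B", "B"],
    "thriving": [1, 1, 0, 0, 0, 1],  # 1 = student reports thriving
})

# The campus-wide average hides group differences.
overall = responses["thriving"].mean()  # 0.5

# Disaggregating by group reveals the gap the average conceals.
by_group = responses.groupby("student_group")["thriving"].mean()
print(overall)
print(by_group)
```

Here the overall rate is 0.5, while group A sits at roughly 0.67 and group B at roughly 0.33—the kind of gap that would prompt the follow-up questions described below.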

    In fall 2023, the University of California, Irvine administered WISHES, disaggregated its data, and found that Middle Eastern students seemed to be experiencing more challenges than their peers in some measures. “Aggregate data really doesn’t tell you anything [about what to do]—you have to disaggregate,” said Doug Everhart, director of student wellness and health promotion at UC Irvine. “In order to find meaning behind the data, you have to follow up and ask questions to dig into the lived experience and the ‘why’. That focus is what makes [the ANEW] approach so useful.” The real-time disaggregated data allowed the team to better understand the Middle Eastern student experience and develop strategies responsive to their needs.

    Conduct empathy interviews to develop actionable, human-centered insights

Real-time disaggregated survey data can reveal where differences exist—but it likely won’t explain them. The empathy interview is a method used across many sectors and settings to understand what lies behind the patterns in quantitative data. These insights are important for informing what specific changes are needed to better support students.

    An empathy interview is a one-on-one session that uses deep listening and responsive prompts to explore the lived experience of an individual on a specific topic such as wellbeing. Empathy interviews uncover holistic and nuanced perspectives about a student’s life—including what they’re facing, what matters to them, and how they navigate challenges and opportunities. Empathy interviews are not formal research, but they offer a structured way for leaders to move beyond assumptions and gain insights that are authentic, revealing, and actionable from those who are most affected.

    Katy Redd, executive director of the Longhorn Wellness Center at the University of Texas at Austin, reflected on the value of this strategy, “Going through this process pushed us to confront the gap between how we assume students experience college and what their day-to-day reality actually looks like for low-income students. Listening closely helped us notice invisible norms and structures that many students are expected to navigate without support. It shifted our mindset—away from surface-level solutions and toward deeper questions about how our systems function and for whom.”

    Michelle Kelly, assistant vice president for health and wellbeing at the University of Texas at Arlington, described a similar shift in perspective: “There was a moment after our empathy interviews where it just clicked: we’d been asking students to navigate systems we ourselves hadn’t fully mapped. It was humbling—but also motivating. Hearing their stories reminded us that the data isn’t just about trends—it’s about real people trying to make it through college while juggling a hundred other things.”

    These interviews, coupled with WISHES data, revealed insights that were difficult to uncover through other methods and have helped institutions think and act more systematically about what’s shaping students’ experiences and outcomes.

    Develop a rhythm of frequent data collection and sense-making

    Being responsive to student needs isn’t about changing course in response to every complaint—it’s about noticing patterns early and adjusting when needed, which requires more than one-time or yearly data collection. Institutions that build a regular rhythm of frequent data collection and sense-making are better equipped to detect shifts, learn from them, and adapt in ways that support student wellbeing.

WISHES is most effective when administered multiple times per semester over many years. Data collected frequently over time provide helpful context for understanding how students are affected by significant events on campus or in the world. Institutions can better answer questions like: are students struggling more or less than they were at similar points in the semester in previous years? In times of extraordinary change, it is easy to assume that students are doing worse than before. Frequent data collection and sense-making allow institutions to determine objectively whether that assumption holds.

    ANEW institutions that frequently collect data over time using WISHES have been able to understand in close to real time how large external events—such as the pandemic, October 7, and the shifting political environment—have impacted student wellbeing. Schools have reported that WISHES data enabled them to check their assumptions about the impact these events had on student wellbeing. In some cases, assumptions have been disproven using data, allowing schools to avoid trying to solve nonexistent problems or the wrong problem.

    As the University of Maryland reflects, “We’ve administered WISHES 10 times over the past two years and have seen firsthand the benefits of frequent data collection and are excited for the future. We most recently have begun to build a dashboard to display our WISHES metrics over time and democratize these critical insights to a myriad of roles within our campus community, which we hope will lead to more effective support for students across our university.”

    In the face of today’s challenges, higher education has a powerful opportunity—and responsibility—to lead with empathy, insight, and action. By embracing a data-driven, student-centered approach, institutions can move beyond assumptions and truly understand what their students need to flourish. The experiences shared by ANEW institutions demonstrate that meaningful change is not only possible but already underway. Now is the time for campuses to lean in, listen deeply, and build the systems that will support every student’s wellbeing.


    This post was written by Joanna Adams (Rochester Institute of Technology), Jennifer Maltby (Rochester Institute of Technology), and Allison Smith (New York University), with the co-leadership and insights of hundreds of changemakers contributing to the Action Network for Equitable Wellbeing.


    If you have any questions or comments about this blog post, please contact us.


  • Ohio District Awarded CoSN Trusted Learning Environment Mini Seal for Student Data Privacy Practices

Washington, D.C. – CoSN today awarded Delaware Area Career Center in Delaware, Ohio, the Trusted Learning Environment (TLE) Mini Seal in the Business practice area. The CoSN TLE Seal is a national distinction awarded to school districts implementing rigorous privacy policies and practices to help protect student information. Delaware Area Career Center is the sixth school district in Ohio to earn a TLE Seal or TLE Mini Seal. To date, TLE Seal recipients have improved privacy protections for over 1.2 million students.

    The CoSN TLE Seal program requires that school systems uphold high standards for protecting student data privacy across five key practice areas: Leadership, Business, Data Security, Professional Development and Classroom. The TLE Mini Seal program enables school districts nationwide to build toward earning the full TLE Seal by addressing privacy requirements in one or more practice areas at a time. All TLE Seal and Mini Seal applicants receive feedback and guidance to help them improve their student data privacy programs.

    “CoSN is committed to supporting districts as they address the complex demands of student data privacy. We’re proud to see Delaware Area Career Center take meaningful steps to strengthen its privacy practices and to see the continued growth of the TLE Seal program in Ohio,” said Keith Krueger, CEO, CoSN.

    “Earning the TLE Mini Seal is a tremendous acknowledgement of the work we’ve done to uphold high standards in safeguarding student data. This achievement inspires confidence in our community and connects us through a shared commitment to privacy, transparency and security at every level,” said Rory Gaydos, Director of Information Technology, Delaware Area Career Center.

    The CoSN TLE Seal is the only privacy framework designed specifically for school systems. Earning the TLE Seal requires that school systems have taken measurable steps to implement, maintain and improve organization-wide student data privacy practices. All TLE Seal recipients are required to demonstrate that improvement through a reapplication process every two years.

    To learn more about the TLE Seal program, visit www.cosn.org/trusted.

About CoSN

CoSN, the world-class professional association for K-12 EdTech leaders, stands at the forefront of education innovation. We are driven by a mission to equip current and aspiring K-12 education technology leaders, their teams, and school districts with the community, knowledge, and professional development they need to cultivate engaging learning environments. Our vision is rooted in a future where every learner reaches their unique potential, guided by our community. CoSN represents over 13 million students and continues to grow as a powerful and influential voice in K-12 education. www.cosn.org

About the CoSN Trusted Learning Environment Seal Program

The CoSN Trusted Learning Environment (TLE) Seal Program is the nation’s only data privacy framework for school systems, focused on building a culture of trust and transparency. The TLE Seal was developed by CoSN in collaboration with a diverse group of 28 school system leaders nationwide and with support from AASA, The School Superintendents Association, the Association of School Business Officials International (ASBO) and ASCD. School systems that meet the program requirements will earn the TLE Seal, signifying their commitment to student data privacy to their community. TLE Seal recipients also commit to continuous examination and demonstrable future advancement of their privacy practices. www.cosn.org/trusted

About Delaware Area Career Center

Delaware Area Career Center provides unique elective courses to high school students in Delaware County and surrounding areas. We work with partner high schools to enhance academic education with hands-on instruction focused on each individual student’s area of interest. DACC students still graduate from their home high school, but they do so with additional college credits, industry credentials, and valuable experiences. www.delawareareacc.org


  • Improving State Longitudinal Data Systems

    Title: Powering Potential: Using Data to Support Postsecondary Access, Completion, and Return on Investment
    Source: The Data Quality Campaign

    To make decisions about when and where to pursue their next educational credential, students and their families need to be able to understand the full picture of pursuing further education. They need access to real-time program information, which includes data on enrollment and completion, program performance, financial aid availability, employment, and return on investment.

    A new publication from the Data Quality Campaign highlights the current landscape and challenges of state data systems for postsecondary education and offers recommendations to align state and institutional data systems.

    Key findings include:

    How the existing postsecondary and workforce data landscape varies

    According to the report, nearly all states have agencies that oversee postsecondary institutions and collect some student or programmatic data within postsecondary student unit record systems (PSURSs). However, the authors note that agency-specific data are often disconnected from other sectors’ data. As a result, student information cannot connect with postgraduation outcomes, as is possible with statewide longitudinal data systems.

    Education and workforce data systems differ greatly across states. Sixty-eight percent of PSURSs connect to workforce data, but only 11 percent identify the industry and general occupation that individuals are employed in.

States collect a range of postsecondary data from institutions through a variety of methods, but the report emphasizes that states identify many common uses for the data, such as supporting workforce alignment.

    Data challenges that states are facing

    The report observes that federal funding for states to develop data systems has been increasingly siloed, with different grant programs focusing on the development of data systems that each have a narrow focus (e.g., workforce and K–12 data).

    Education and workforce data systems identify students using different methods, making connecting individuals’ data and tracking their pathways difficult. However, the authors note that some states are making changes to improve matching accuracy.
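One common approach to this matching problem is deterministic linkage on a normalized composite key. A minimal sketch with hypothetical field names and records (real systems typically layer probabilistic or fuzzy matching on top of a deterministic pass like this):

```python
# Two systems with no shared ID; field names here are hypothetical.
education = [
    {"name": "Ana Diaz", "dob": "2000-05-14", "credential": "AAS Nursing"},
    {"name": "Ben Lee",  "dob": "1999-11-02", "credential": "BS Biology"},
]
workforce = [
    {"name": "ana diaz", "dob": "2000-05-14", "industry": "Health Care"},
]

def key(rec):
    # Normalize the name and pair it with date of birth as a match key,
    # so trivial formatting differences don't break the join.
    return (rec["name"].strip().lower(), rec["dob"])

jobs_by_key = {key(r): r for r in workforce}

# Join education records to post-graduation outcomes where keys match.
linked = [
    {**edu, "industry": jobs_by_key[key(edu)]["industry"]}
    for edu in education
    if key(edu) in jobs_by_key
]
print(linked)
```

In this toy example only one of the two graduates can be linked to an employment record—which is exactly the kind of match-rate gap the states described in the report are working to close.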

    Recommendations for states to proactively use data in cooperation with postsecondary institutions

    The report recommends that states ensure data are used in collaboration with postsecondary institutions to inform policy and practice. This includes creating guided pathways and aligning institutions’ educational offerings with their states’ workforce needs. By evaluating trends in postsecondary completion, employment outcomes, and employment needs, policymakers can refine programs that guide students into pathways with high completion and high-paying careers.

Institutions collect a variety of information about students, including enrollment demographics and course grades. According to the report, given many institutions’ limited capacity for robust analysis, this information should be integrated with statewide data systems.

    States can use data to make the admissions and financial aid application processes easier for students and to streamline the process of enrolling in high-demand educational offerings. States and institutions can also leverage their shared data to identify students at higher risk of not completing their postsecondary program and tailor financial support, emergency aid, and academic supports to provide on-time interventions to these students.

    To read the full report from the Data Quality Campaign, click here.

    —Austin Freeman




  • Is data infrastructure the missing backbone of UK international HE?

IHEC’s report, Towards a Future UK International Higher Education Strategy: Resilience, Purpose and Precision, released in April 2025, describes accurate data and timely insights as “the lifeblood” of an effective international education strategy.

The Commission is calling on the government to develop a digital data portal for international student information, accessible to universities and relevant public bodies.

    Its vision is a significant leap from the fragmented systems the sector currently relies on – where data is outdated and siloed across agencies.

    Stakeholders frequently point out that UK policy often trails real-world data by nearly two years.

    The Commission envisions a secure portal compiling data from various sources – Home Office visa issuance, HESA enrolments, accommodation, and health service usage – tracking, almost in real time, where international students are coming from and enrolling.

    Imagine a world where universities can instantly access up-to-date visa grant statistics by country, and local councils can anticipate the number of international students arriving in their area.

    With real-time insights at their fingertips, IHEC suggests that institutions, policymakers, and stakeholders could plan proactively – enhancing housing, support services, and infrastructure.

    “A system like this is entirely within our competence to establish,” according to IHEC.

    This isn’t the only tool the Commission has in its sights. As part of its ambitions, it also advocates for a market intelligence platform that would equip the UK with the insights needed to stay ahead of global competitors.

    “Via a public-private partnership (perhaps a tender to specialist data firms), we could build a system that aggregates data on international education demand worldwide – including demographics, economic indicators, competitor country trends, search engine, and agent application data – to predict future demand patterns,” outlined the report.

    The platform would answer key questions like: “Which emerging markets are gaining interest?” or “What’s the projected demand for STEM Masters over the next five years?”

“The sector must have access to better and more timely data about what is happening in international recruitment markets, as well as how this is playing out at institutional and sector levels, to more effectively address challenges and opportunities,” asserted Chris Skidmore, IHEC chair and former UK universities minister.

    With this intelligence, the Commission hopes the UK can spot opportunities early and respond to risks before they grow. It should also include an open-source competitor tracker – comparing performance across countries on things like visa wait times, tuition fees, and scholarship availability – so the UK can see how it stacks up and stay competitive.

    To steer these efforts, the Commission recommends establishing a public-private sector International Education Data and Insight Taskforce, made up of statisticians and analysts from various government departments, as well as industry experts and leaders from the growing number of private sector companies that provide sophisticated data about current and potential future trends.

    The Commission names Enroly, Studyportals, IDP and QS as key players doing valuable work in this area.

    IHEC’s full report ‘Towards a Future UK International Higher Education Strategy: Resilience, Purpose and Precision’ is available here.


  • Otus Wins Gold Stevie® Award for Customer Service Department of the Year

    CHICAGO, IL (GLOBE NEWSWIRE) — Otus, a leading provider of K-12 student data and assessment solutions, has been awarded a prestigious Gold Stevie® Award in the category of Customer Service Department of the Year at the 2025 American Business Awards®. This recognition celebrates the company’s unwavering commitment to supporting educators, students, and families through exceptional service and innovation.

    In addition to the Gold award, Otus also earned two Silver Stevie® Awards: one for Company of the Year – Computer Software – Medium Size, and another honoring Co-founder and President Chris Hull as Technology Executive of the Year.

    “It is an incredible honor to be recognized, but the real win is knowing our work is making a difference for educators and students,” said Hull. “As a former teacher, I know how difficult it can be to juggle everything that is asked of you. At Otus, we focus on building tools that save time, surface meaningful insights, and make student data easier to use—so teachers can focus on what matters most: helping kids grow.”

The American Business Awards®, now in their 23rd year, are the premier business awards program in the United States, honoring outstanding performances in the workplace across a wide range of industries. The competition receives more than 12,000 nominations every year. Judges selected Otus for its outstanding 98.7% customer satisfaction rating for chat interactions and exceptional 89% gross retention in 2024. They also praised the company’s unique blend of technology and human touch, noting its strong focus on educator-led support, onboarding, data-driven product evolution, and professional development.

    “We believe great support starts with understanding the realities educators face every day. Our Client Success team is largely made up of former teachers and school leaders, so we speak the same language. Whether it’s during onboarding, training, or day-to-day communication, we’re here to help districts feel confident and supported. This recognition is a reflection of how seriously we take that responsibility and energizes us to keep raising the bar,” said Phil Collins, Ed.D., Chief Customer Officer at Otus.

    Otus continues to make significant strides in simplifying teaching and learning by offering a unified platform that integrates assessment, data, and instruction—all in one place. Otus has supported over 1 million students nationwide by helping educators make data-informed decisions, monitor progress, and personalize learning. These honors reflect the company’s growth, innovation, and steadfast commitment to helping school communities succeed.

    About Otus

    Otus, an award-winning edtech company, empowers educators to maximize student performance with a comprehensive K-12 assessment, data, and insights solution. Committed to student achievement and educational equity, Otus combines student data with powerful tools that provide educators, administrators, and families with the insights they need to make a difference. Built by teachers for teachers, Otus creates efficiencies in data management, assessment, and progress monitoring to help educators focus on what matters most—student success. Today, Otus partners with school districts nationwide to create informed, data-driven learning environments. Learn more at Otus.com.

    Stay connected with Otus on LinkedIn, Facebook, X, and Instagram.


  • Govt. data error sparks doubt over US international enrolments

    The reliability of federal datasets is under scrutiny after an error was identified on the Student and Exchange Visitor Information System (SEVIS) website that appeared to show stagnating international student numbers from August 2024 to the present.  

    The error, brought to The PIE News’s attention by EnglishUSA, casts doubt on recent headlines and media reports about declining international student enrolments in the US, with SEVIS data appearing to show an enrolment decline of 11% between March 2024 and March 2025.  

    “Starting in August 2024, the data appears to be duplicated month after month, with flatlined totals for students on F and M visas. These figures show virtually no fluctuation during a period when natural enrolment shifts would be expected,” explained EnglishUSA executive director, Cheryl Delk-Le Good.  

    “This irregularity comes at a time of heightened concern within the field, particularly as educators and administrators manage the fallout from widespread SEVIS terminations and the resulting confusion around visa status for international students,” added Delk-Le Good.  

    The US Department of Homeland Security (DHS), which runs SEVIS, was alerted to the error on April 14 and said it was “working to resolve the issue”.  

    As of April 25, the dataset has not been updated, and DHS has not responded to The PIE’s request for comment.  

Figure source: US International Trade Administration, Market Diversification Tool for International Education (2023). Retrieved April 11, 2025.

    Notably, the inaccuracies begin in August 2024 and span both US administrations, suggesting “a computer glitch rather than an intentional act,” said Mark Algren – interim director of the Applied English Center at the University of Kansas and a contributor to EnglishUSA’s data initiatives – who noticed the anomaly.  

    However, Algren added that he had “no idea why someone didn’t catch it,” with the considerable timeframe of the glitch likely to hamper confidence in federal datasets that are relied on by institutions and that ensure transparency in the system.  

    Total F&M visa holders in the US: 

Month          Total F&M   Change from previous month
August 24      1,091,134   -59,822
September 24   1,091,137   +3
October 24     1,091,141   +4
November 24    1,091,144   +3
January 25     1,091,142   -2
February 25    1,091,155   +13
March 25       1,091,161   +11
Source: SEVIS
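The flatline EnglishUSA describes is straightforward to detect programmatically: month-over-month changes of a handful of students in a population of roughly 1.1 million are implausibly small. A minimal Python sketch of such a data-quality check, using the snapshot totals from the table above (the threshold is an illustrative assumption, not an official tolerance):

```python
# Monthly F&M totals from the SEVIS snapshots in the table above.
totals = {
    "Aug 24": 1_091_134,
    "Sep 24": 1_091_137,
    "Oct 24": 1_091_141,
    "Nov 24": 1_091_144,
    "Jan 25": 1_091_142,
    "Feb 25": 1_091_155,
    "Mar 25": 1_091_161,
}

months = list(totals)

# Flag consecutive snapshots whose absolute change is implausibly
# small for a population this size (threshold chosen for illustration).
THRESHOLD = 100
flat = [
    (a, b) for a, b in zip(months, months[1:])
    if abs(totals[b] - totals[a]) < THRESHOLD
]
print(flat)  # every consecutive pair after August 2024 is flagged
```

A check like this, run each time a new snapshot is published, could surface this kind of duplication within a month rather than letting it persist across reporting periods.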

    It is important to note that each monthly dataset recorded by SEVIS is a snapshot of a given day that month, and the drop recorded in August 2024 (which is considered the last accurate figure) could have been taken before many students arrived for the fall academic term.  

    For this reason, “it’s hard to say that an August report is representative of the following fall term,” said Algren, with the true figures yet to be seen.  

    At the start of the 2024/25 academic year, IIE’s fall snapshot reported a 3% rise in international student enrolment, building on sustained growth over the last three years. 

Despite ongoing uncertainty in the US caused by the current administration’s attacks on higher education, the period of SEVIS’s misreporting covers an earlier timeframe, before the impact of Trump’s policies came into effect.


  • Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

Before our show starts today, I just wanna take a minute to note the passing of Professor Claire Callender, OBE. For the last two and a half decades, she’s been one of the most important figures in UK higher education studies, in particular with respect to student loans and student finance. Holder of a joint professorship at the UCL Institute of Education and Birkbeck, University of London, she was also instrumental in setting up the ESRC Centre for Global Higher Education, of which she later became deputy director. I just want to quote the short obituary that her colleague Simon Marginson wrote for her last week after her passing from lung cancer. He said, “What we’ll remember about Claire is the way she focused her formidable capacity for rational thought on matters to which she was committed, her gravitas that held the room when speaking, and the warmth that she evoked without fail in old and new acquaintances.”

    My thoughts and condolences to her partner Annette, and to her children. We’ll all miss Claire. 


I suspect most of you are familiar with the OECD’s Program for International Student Assessment, or PISA. That’s a triennial test of 15-year-olds around the world. It tries to compare how teenagers fare in real-world tests of literacy and numeracy. But you might not be as familiar with PISA’s cousin, the Program for the International Assessment of Adult Competencies, or PIAAC. To simplify enormously, it’s PISA, but for adults, and it only comes out once a decade, with the latest edition having appeared on December 10th of last year. Now, if you’re like most people, you’re probably asking yourself: what does PIAAC measure, exactly?

    PISA pretty clearly is telling us something about school systems. Adults, the subject of the PIAAC test, they’ve been out of school for a long time. What do test results mean for people who’ve been out of school for, in some cases, decades? And what kinds of meaningful policies might be made on the basis of this data?

Today my guest is the CEO of Canada’s Future Skills Centre, Noel Baldwin. Over the past decade, both in his role at FSC and his previous ones at the Council of Ministers of Education, Canada, he’s arguably been one of the country’s most dedicated users of PIAAC data. As part of Canada’s delegation to the OECD committee in charge of PIAAC, he also had a front-row seat to the development of these tests and the machinery behind these big international surveys.

Over the course of the next 20 or so minutes, you’ll hear Noel and me, both fellow members of the Canada Millennium Scholarship Foundation Mafia, discuss such issues as how the wording of international surveys gets negotiated, why we seem to be witnessing planet-wide declines in adult literacy, what research questions PIAAC is best suited to answer, and, maybe most intriguingly, what PIAAC 3 might look like a decade from now.

    I really enjoyed this conversation and I hope you do too. Anyway, over to Noel.


    The World of Higher Education Podcast
    Episode 3.28 | Overskilled and Underused? What PIAAC Reveals About the Canadian Workforce

    Transcript

    Alex Usher (AU): Noel, some of our listeners might be familiar with big international testing programs like PISA—the Program for International Student Assessment. But what is the Program for the International Assessment of Adult Competencies? What does it aim to measure, and why?

    Noel Baldwin (NB): It’s somewhat analogous to PISA, but it’s primarily focused on working-age adults. Like PISA, it’s a large-scale international assessment organized by the OECD—specifically by both the education and labor secretariats. It’s administered on the ground by national statistical agencies or other government agencies in participating countries.

    PIAAC is mainly focused on measuring skills like literacy and numeracy. Over time, though, the OECD has added other skill areas relevant to the intersection of education and labor markets—things like digital skills, technology use, problem solving, and social-emotional skills.

    In addition to the assessment itself, there’s a large battery of background questions that gather a lot of demographic information—details about respondents’ work life, and other factors like health and wellbeing. This allows researchers to draw correlations between the core skills being measured and how those skills are used, or what kind of impact they have on people’s lives.

    AU: How do they know that what’s being measured is actually useful in the workplace? I mean, the literacy section is reading comprehension, and the math is sort of like, you know, “If two trains are moving toward each other, one from Chicago and one from Pittsburgh…” It’s a bit more sophisticated than that, but that kind of thing. How do they know that actually measures anything meaningful for workplace competencies?

    NB: That’s a good question. One thing to start with is that the questions build from fairly easy and simple tasks to much more complex ones. That allows the OECD to create these scales, and they talk a lot about proficiency levels—level one up to five, and even below level one in some cases, for people with the weakest skill levels.

    And while PIAAC itself is relatively new, the assessment of these competencies isn’t. It actually dates back to the early 1990s. There’s been a lot of research—by the OECD and by psychometricians and other researchers—on the connections between these skills and broader outcomes.

    The key thing to understand is that, over time, there’s been strong evidence linking higher literacy and numeracy skills to a range of life outcomes, especially labor market outcomes. It’s a bit like educational attainment—these things often act as proxies for one another. But the stronger your skills, the more likely you are to be employed, to earn higher wages, to avoid unemployment, and to be adaptable and resilient.

    And it’s not just about work. It extends to other areas too—life satisfaction, for instance. There are even some interesting findings about democratic participation and people’s perceptions of how their society is doing. So there are pretty strong correlations between higher-level skills and a variety of positive outcomes.

    AU: But, I can imagine that the nature of an economy—whether it’s more manufacturing-based or service-based—might affect what kinds of skills are relevant. So different countries might actually want to measure slightly different things. How do you get 50—or however many, dozens of countries—to agree on what skills to assess and how to measure them?

    NB: The point at which OECD countries agreed to focus on literacy and numeracy actually predates me—and it also predates a lot of today’s focus on more digitally oriented skills. It was a much more analog world when this started, and so literacy and numeracy made a lot of sense. At the time, most of the information people consumed came in some form of media that required reading comprehension and the ability to navigate text. And then, on the numeracy side, the ability to do anything from basic to fairly advanced problem solving with numbers was highly relevant. So I suspect that when this was being developed—through the 1980s and into the early 1990s—there was a high degree of consensus around focusing on those core skills.

    The development of the instruments themselves is also an international effort. It’s led by the OECD, but they work with experts from a range of countries to test and validate the items used in the assessment. Educational Testing Service (ETS) in the U.S. is quite involved, and there are also experts from Australia and Canada. In fact, Canada was very involved in the early stages—both through Statistics Canada and other experts—particularly in developing some of the initial tools for measuring literacy. So, the consensus-building process includes not just agreeing on what to measure and how to administer it, but also developing the actual assessment items and ensuring they’re effective. They do field testing before rolling out the main assessment to make sure the tools are as valid as possible.

    AU: Once the results are in and published, what happens next? How do governments typically use this information to inform policy?

    NB: I’ll admit—even having been on the inside of some of this—it can still feel like a bit of a black box. In fact, I’d say it’s increasingly becoming one, and I think we’ll probably get into that more as the conversation goes on.

    That said, different countries—and even different provinces and territories within Canada—use the information in different ways. It definitely gets integrated into various internal briefings. I spent some time, as you know, at the Council of Ministers of Education, and we saw that both in our own work and in the work of officials across the provinces and territories.

    After the last cycle of PIAAC, for instance, Quebec produced some fairly detailed reports analyzing how Quebecers performed on the PIAAC scales—comparing them to other provinces and to other countries. That analysis helped spark conversations about what the results meant and what to do with them. New Brunswick, for example, launched a literacy strategy shortly after the last PIAAC cycle, which suggests a direct link between the data and policy action.

    So there are examples like that, but it’s also fair to say that a lot of the data ends up being used internally—to support conversations within governments. Even since the most recent PIAAC cycle was released in December, I’ve seen some of that happening. But there’s definitely less in the public domain than you might expect—and less than there used to be, frankly.

    AU: Some of the findings in this latest PIAAC cycle—the headline that got the most traction, I think—was the fact that we’re seeing declines in literacy and numeracy scores across much of the OECD. A few countries bucked the trend—Canada saw a small decline, and parts of Northern Europe did okay—but most countries were down. What are the possible explanations for this trend? And should we be concerned?

    NB: I think we should be really aware. When it comes to concern, though, I’m always a bit hesitant to declare a crisis. There’s a lot of work still to be done to unpack what’s going on in this PIAAC cycle.

    One thing to keep in mind is that most of the responses were collected during a time of ongoing global turmoil. The data was gathered in 2022, so we were still in the middle of the pandemic. Just getting the sample collected was a major challenge—and a much bigger one than usual.

    With that caveat in mind, the OECD has started to speculate a bit, especially about the literacy side. One of the things they’re pointing to is how radically the way people consume information has changed over the past 10 years.

    People are reading much shorter bits of text now, and they’re getting information in a much wider variety of formats. There are still items in the literacy assessment that resemble reading a paragraph in a printed newspaper—something that just doesn’t reflect how most people engage with information anymore. These days, we get a lot more of it through video and audio content.

    So I think those shifts in how we consume information are part of the story. But until we see more analysis, it’s hard to say for sure. There are some signals—differences in gender performance across countries, for example—that we need to unpack. And until we do that, we’re not going to have a great sense of why outcomes look the way they do.

    AU: Let’s focus on Canada for a second. As with most international education comparisons, we end up in the top—but at the bottom of the top third, basically. It doesn’t seem to matter what we do or when—it’s always that pattern. Looking at global trends, do you think Canada stands out in any way, positively or negatively? Are there things we’re doing right? Or things we’re getting wrong?

    NB: Well, I’d say we continue to see something that the OECD points out almost every time we do one of these assessments: the gap between our top performers and our lowest performers is smaller than in many other countries. That’s often taken as a sign of equity, and I’d say that’s definitely a good news story.

    In the global comparison, we held pretty much steady on literacy, while many countries saw declines. Holding steady when others are slipping isn’t a bad outcome. And in numeracy, we actually improved.

    The distribution of results across provinces was also more even than in the last cycle. Last time, there was much more variation, with several provinces falling below the OECD or Canadian average. This time around, we’re more tightly clustered, which I think is another positive.

    If you dig a little deeper, there are other encouraging signs. For example, while the OECD doesn’t have a perfect measure of immigration status, it can identify people who were born outside a country or whose parents were. Given how different Canada’s demographic profile is from nearly every other participating country—especially those in Northern Europe—I think we’re doing quite well in that regard.

    And in light of the conversations over the past few years about immigration policy and its impacts across our society, I think it’s a pretty good news story that we’re seeing strong performance among those populations as well.

    AU: I know we’ll disagree about this next question. My impression is that, in Canada, the way PIAAC gets used has really changed over the last decade. The first round of PIAAC results got a lot of attention—StatsCan and the Council of Ministers of Education both published lengthy analyses.

    And maybe “crickets” is too strong a word to describe the reaction this time, but it’s definitely quieter. My sense is that governments just don’t care anymore. When they talk about skills, the narrative seems focused solely on nursing and the skilled trades—because those are seen as bottlenecks on the social side and the private sector side.

    But there’s very little interest in improving transversal skills, and even less knowledge or strategy about how to do so. Make me less cynical.

    NB: Well, it’s funny—this question is actually what kicked off the conversation that led to this podcast. And I’ll confess, you’ve had me thinking about it for several weeks now.

    One thing I want to distinguish is caring about the skills themselves versus how the data is being released and used publicly. There’s no denying that we’re seeing less coming out publicly from the governments that funded the study. That’s just true—and I’m not sure that’s going to change.

    I think that reflects a few things. Partly, it’s the changed fiscal environment and what governments are willing to pay for. But it’s also about the broader information environment we’re in today compared to 2013.

    As I’ve been reflecting on this, I wonder if 2012 and 2013 were actually the tail end of the era of evidence-based policymaking—and that now we’re in the era of vibes-based policymaking. And if that’s the case, why would you write up detailed reports about something you’re mostly going to approach from the gut?

    On the skills side, though, I still think there’s an interesting question. A few weeks ago, I felt more strongly about this, but I still believe it’s not that governments don’t care about these foundational skills. Rather, I think the conversation about skills has shifted.

    We may have lost sight of how different types of skills build on one another—starting from foundational literacy and numeracy, then layering on problem-solving, and eventually reaching digital competencies. That understanding might be missing in the current conversation.

    Take the current moment around AI, for example. Maybe “craze” is too strong a word, but there’s a belief that people will become great at prompt engineering without any formal education. Mark Cuban—on BlueSky or wherever, I’m not sure what they call posts there—made a point recently that you won’t need formal education with generative AI. If you can get the right answers out of a large language model, you’ll outperform someone with an advanced degree.

    But that completely overlooks how much you need to understand in order to ask good questions—and to assess whether the answers you get are worth anything. So we may start to see that shift back.

    That said, you’re right—there has definitely been a move in recent years toward thinking about workforce issues rather than broader skill development. And that may be a big part of what’s going on.

    AU: What do you think is the most interesting or under-explored question that PIAAC data could help answer, but that we haven’t fully investigated yet? This dataset allows for a lot of interesting analysis. So if you could wave a magic wand and get some top researchers working on it—whether in Canada or internationally—where would you want them to focus?

    NB: First, I’ll just make a small plug. We’ve been working on what we hope will become a PIAAC research agenda—something that responds to the things we care about at the Future Skills Centre, but that we hope to advance more broadly in the coming weeks and months. So we are actively thinking about this.

    There are a bunch of areas that I think are really promising. One is the renewed conversation about productivity in Canada. I think PIAAC could shed light on the role that skills play in that. The Conference Board of Canada did a piece a while back looking at how much of the productivity gap between Canada and the U.S. is due to skill or labor factors. Their conclusion was that it wasn’t a huge part—but I think PIAAC gives us tools to continue digging into that question.

    Another area the OECD often highlights when talking about Canada is the extent to which workers are overqualified or overskilled for the jobs they’re in. That’s a narrative that’s been around for a while, but one where I think PIAAC could offer deeper insights.

    It becomes even more interesting when you try to link it to broader labor supply questions—like the role of immigration. Some people have suggested that one reason Canada lags in things like technology integration or capital investment is that we’ve substituted skilled labor for that kind of investment.

    With PIAAC, we might be able to explore whether overqualification or overskilling is connected to the way we’ve managed immigration over the last couple of decades.

    So, there are a few areas there that I think are both relevant and under-explored. And of course, on the international side, you’re right—we should be looking for examples of countries that have had success, and thinking about what we can emulate, borrow from, or be inspired by.

    AU: I don’t know if either of us wants to still be doing this in 10 years, but if we were to have this conversation again a decade from now, what do you think—or hope—will have changed? What will the long-term impact of PIAAC Cycle 2 have been, and how do you think PIAAC 3 might be different?

    NB: Well, I think I need to say this out loud: I’m actually worried there won’t be a PIAAC 3.

    We’re recording this in early 2025, which is a pretty turbulent time globally. One of the things that seems clear is that the new U.S. administration isn’t interested in the Department of Education—which likely means they won’t be interested in continuing the National Center for Education Statistics.

    And like with many international initiatives, the U.S. plays a big role in driving and valuing efforts like PIAAC. So I do worry about whether there will be a third cycle. If it happens without U.S. participation, it would be a very different kind of study.

    But I hope that in 10 years, we are talking about a robust PIAAC 3—with strong participation from across OECD countries.

    I also hope there’s continued investment in using PIAAC data to answer key research questions. It’s just one tool, of course, but it’s a big one. It’s the only direct assessment of adult skills we have—where someone is actually assessed on a defined set of competencies—so it’s really valuable.

    For an organization like ours, which is focused on adult skills in the workforce, it’s up to us to push forward and try to get answers to some of these questions. And I hope the research we and others are doing will find its way into policy conversations—especially as we think about how workforce needs, skills, and the broader economy are going to change over the next decade.

    It would be a wasted opportunity if it didn’t.

    AU: Noel, thanks so much for being with us today.

    NB: Thanks Alex.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

    This episode is sponsored by Studiosity. Student success, at scale – with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning — not corrections – to develop critical thinking, agency, and retention — empowering educators with learning insight. For future-ready graduates — and for future-ready institutions. Learn more at studiosity.com.


  • Data shows growing GenAI adoption in K-12

    Data shows growing GenAI adoption in K-12

    Key points:

    • K-12 GenAI adoption rates have grown, but so have concerns

    More than half of K-12 educators (55 percent) have positive perceptions about GenAI, despite concerns and perceived risks in its adoption, according to updated data from Cengage Group’s “AI in Education” research series, which regularly evaluates AI’s impact on education.



  • Three-quarters of global study decisions determined by cost

    Three-quarters of global study decisions determined by cost

    International students are increasingly looking for affordable destinations and alternative programs rather than give up on study abroad due to increasing costs, a new ApplyBoard survey has shown.  

    While 77% of surveyed students ranked affordable tuition fees as the most important factor shaping study decisions, only 9% said they planned to defer their studies based on these concerns, according to a recent student survey from edtech firm ApplyBoard.

    “Students weren’t planning to wait for things to change,” said ApplyBoard senior communications manager Brooke Kelly. “They’re considering new destinations, adjusting which programs they apply to, and accepting that they have to balance work with study, but they’re still planning to study abroad.”

    Just over one in four students said they were considering different study destinations than originally planned, with Denmark, Finland, Nigeria and Italy the most popular emerging destinations.  

    Additionally, 55% of students said they would have to work part-time to afford their study abroad program.  

    After affordability came employability (57%), career readiness (49%), high-quality teaching (47%), and program reputation (45%) as the top factors shaping student decision-making.

    With students increasingly thinking about work opportunities, software and civil engineering topped students’ career choices, with nursing as the second most popular field. Tech fields including IT, cybersecurity, and data analysis also showed strong interest. 

    What’s more, interest in PhD programs saw a 4% rise on the previous year, while over half of students were considering master’s degrees, indicating that students are increasingly prioritising credentials and post-study work opportunities.  


    The study surveyed over 3,500 students from 84 countries, with the most represented countries being Nigeria, Ghana, Canada, Pakistan, Bangladesh and India.  

    Notably, given its global share of international students, China is absent from the top 10 most represented countries.

    As students’ priorities shift and currencies fluctuate, “diversity will be key to mitigate against increased volatility and to ensure campuses remain vibrant with students from all around the world,” said Kelly.  

    Meanwhile, institutions should increase communication about scholarships and financial aid, offer more hybrid learning experiences and highlight programs on different timelines such as accelerated degrees, she advised.  

    While alternative markets are on the rise, 65% of respondents said they were only interested in studying in one of the six major destinations, with Canada followed by the US, UK, Australia, Germany and Ireland, in order of popularity.  

    Despite Canada’s international student caps, the largest proportion of students said they were ‘extremely’, ‘very’ or ‘moderately’ interested in the study destination, highlighting its enduring appeal among young people.  

    While stricter controls on post-study work were implemented in Canada last year, the IRCC recently said, in a rare easing of policies, that all college graduates would once again be eligible for post-study work.

    This change, combined with the fact that international students can still be accompanied by their dependants while studying in Canada, is likely to have contributed to it maintaining its attractiveness, according to Kelly.  


  • What the End of DoED Means for the EdTech Industry

    What the End of DoED Means for the EdTech Industry

    The federal government’s influence over school districts had implications beyond just funding and data. Eliminating the Office of Educational Technology (OET) will create significant gaps in educational technology research, validation, and equity assurance. Kris Astle, Education Strategist for SMART Technologies, discusses how industry self-governance, third-party organizations, and increased vendor responsibility might fill these gaps, while emphasizing the importance of research-backed design and implementation to ensure effective technology deployment in classrooms nationwide. Have a listen:

