Category: Testing

  • the realities of foreign language anxiety

    the realities of foreign language anxiety

    Picture this: you’ve crossed oceans, packed your suitcase, a dictionary (or maybe just Google Translate), your dreams, and a relentless drive to succeed in a US higher education setting. You’ve landed in the United States, ready for college life. But before you can even start worrying about your academic experience or how to navigate campus life and groceries, you’re hit with a more personal challenge: “Will I sound awkward if I say this out loud?”

    For many non-native English speakers, this is not just a fleeting thought. It’s a daily reality known as foreign language anxiety – “the feeling of tension and apprehension specifically associated with second language contexts, including speaking, listening, and learning.” It can undermine a student’s ability to communicate, threaten self-confidence, and, over time, affect academic performance.

    Why it matters more than we think

    Foreign language anxiety is more than a minor inconvenience. International students must maintain full-time enrolment to keep their visa status. If foreign language anxiety leads to missed classes, delayed assignments, or low grades, the consequences can be severe — including losing that status and returning home without a degree.

    Even though incoming students meet minimum language proficiency requirements, many have had little practice using English in real-life spontaneous situations. Passing a standardised test is one thing; responding to a professor’s question in front of a class of native speakers is another. This gap can lead to self-consciousness, fear, and avoidance behaviours that hinder academic and social success.

    The three faces of language anxiety

    Research shows that foreign language anxiety often takes three forms:

    1. Fear of negative evaluation – Worrying about being judged for language mistakes, whether by professors or peers. Some students are comfortable in class but avoid informal conversations. Others avoid eye contact entirely to escape being called on.
    2. Communication apprehension – Feeling uneasy about speaking in a foreign language, even for students who were confident communicators in their home country. Concerns about sounding less capable than native speakers can lead to silence in classroom discussions.
    3. Test anxiety – Stress about organising and expressing ideas under time pressure in a second language. This is not just about knowing the material; it’s about performing under linguistic and cognitive strain.

    These anxieties can actively block learning. When students focus on how they sound rather than what is being said, their ability to process information suffers.

    The role of faculty and administrators

    Faculty and administrators may underestimate how much their approach affects international students’ confidence. Being corrected for grammar in front of others is one of the most anxiety-provoking experiences students report. In contrast, giving students time to answer, offering feedback privately, and creating an environment where mistakes are treated as part of learning can significantly reduce foreign language anxiety.

    University administrators can also make a difference through peer mentoring programs, conversation workshops, and targeted support services. However, these resources are only effective if students are aware of them and feel comfortable using them.

    Why this isn’t just a student problem

    It’s easy to think of foreign language anxiety as a personal obstacle each student must overcome, but it has larger implications. International students bring global perspectives, enrich classroom discussions, and contribute to campus culture.

    Their success is both a moral responsibility and an investment in the overall quality and strength of higher education. When capable, motivated students are held back by the effects of foreign language anxiety, institutions risk losing both talent and the global perspectives these students offer. Taking steps to reduce its impact benefits the entire academic community.

    Moving forward

    Addressing foreign language anxiety is not about lowering academic standards. It’s about giving students a fair chance to meet them by reducing unnecessary barriers. For students, this means practicing conversation in low-anxiety settings, seeking clarification when needed, and accepting that mistakes are a natural part of language learning. For faculty and staff, it means being intentional about communication, offering encouragement, and ensuring that resources are accessible and culturally responsive.

    Foreign language anxiety is a shared challenge that can undermine even the most motivated and capable students. Often, the greatest hurdle of studying abroad is not mastering complex coursework, adjusting to life far from home, or navigating cultural differences – it is the moment a student must raise their hand, speak in a language that is not their own, and hope that their words are understood as intended.

    Beyond academics, foreign language anxiety can affect the kinds of social and academic engagement that are essential for building leadership skills. Group work, class discussions, and participation in student organisations often require students to communicate ideas clearly, respond to feedback, and collaborate across cultures – the same skills needed to lead effectively in professional environments.

    However, literature on foreign language anxiety suggests that students may hesitate to take on visible roles or avoid speaking in group settings altogether, limiting their ability to practice these skills. When students withdraw from such opportunities, they lose more than a chance to participate – they miss experiences that can shape confidence, decision-making, and the ability to work with diverse teams.

    Understanding and addressing the impact of foreign language anxiety, therefore, is not only relevant for academic success but also for preparing graduates to step into leadership roles in a global context.

    Source link

  • A gender gap in STEM widened during the pandemic. Schools are trying to make up lost ground

    A gender gap in STEM widened during the pandemic. Schools are trying to make up lost ground

    IRVING, Texas — Crowded around a workshop table, four girls at de Zavala Middle School puzzled over a Lego machine they had built. As they flashed a purple card in front of a light sensor, nothing happened. 

    The teacher at the Dallas-area school had emphasized that in the building process, there is no such thing as a mistake. Only iterations. So the girls dug back into the box of blocks and pulled out an orange card. They held it over the sensor and the machine kicked into motion.

    “Oh! Oh, it reacts differently to different colors,” said sixth grader Sofia Cruz.

    In de Zavala’s first year as a choice school focused on science, technology, engineering and math, the school recruited a sixth grade class that’s half girls. School leaders are hoping the girls will stick with STEM fields. In de Zavala’s higher grades — whose students joined before it was a STEM school — some elective STEM classes have just one girl enrolled. 

    Efforts to close the gap between boys and girls in STEM classes are picking up after losing steam nationwide during the chaos of the Covid pandemic. Schools have extensive work ahead to make up for the ground girls lost, in both interest and performance.

    In the years leading up to the pandemic, the gender gap nearly closed. But within a few years, girls lost all the ground they had gained in math test scores over the previous decade, according to an Associated Press analysis. While boys’ scores also suffered during Covid, they have recovered faster than girls, widening the gender gap.

    As learning went online, special programs to engage girls lapsed — and schools were slow to restart them. Zoom school also emphasized rote learning, a technique based on repetition that some experts believe may favor boys, instead of teaching students to solve problems in different ways, which may benefit girls. 

    Old practices and biases likely reemerged during the pandemic, said Michelle Stie, a vice president at the National Math and Science Initiative.

    “Let’s just call it what it is,” Stie said. “When society is disrupted, you fall back into bad patterns.”

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.

    In most school districts in the 2008-09 school year, boys had higher average math scores on standardized tests than girls, according to AP’s analysis, which looked at scores across 15 years in over 5,000 school districts. It was based on average test scores for third through eighth graders in 33 states, compiled by the Educational Opportunity Project at Stanford University. 

    A decade later, girls had not only caught up, they were ahead: Slightly more than half of districts had higher math averages for girls.

    Within a few years of the pandemic, the parity disappeared. In 2023-24, boys on average outscored girls in math in nearly 9 out of 10 districts.
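
    The AP’s underlying calculation is conceptually simple: for each district and school year, compare the average math score of boys with that of girls and tally how many districts favor each side. The sketch below illustrates that tally in Python with made-up column names (district, year, gender, mean_score); it is an illustration of the approach described above, not the AP’s actual code.

        # Illustrative tally of district-level gender gaps in average math scores.
        # Column names are hypothetical; this is not the AP's actual analysis code.
        import pandas as pd

        def share_of_districts_where_boys_lead(scores: pd.DataFrame, year: str) -> float:
            """Fraction of districts whose boys' mean math score exceeds the girls' mean."""
            one_year = scores[scores["year"] == year]
            by_gender = one_year.pivot_table(
                index="district", columns="gender", values="mean_score"
            ).dropna()  # keep only districts reporting averages for both groups
            gap = by_gender["boys"] - by_gender["girls"]
            return float((gap > 0).mean())

        # Toy example: boys lead in district A, girls lead in district B -> 0.5
        toy = pd.DataFrame({
            "district": ["A", "A", "B", "B"],
            "year": ["2023-24"] * 4,
            "gender": ["boys", "girls", "boys", "girls"],
            "mean_score": [251.0, 248.5, 239.0, 241.2],
        })
        print(share_of_districts_where_boys_lead(toy, "2023-24"))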

    A separate study by NWEA, an education research company, found gaps between boys and girls in science and math on national assessments went from being practically non-existent in 2019 to favoring boys around 2022.

    Studies have indicated girls reported higher levels of anxiety and depression during the pandemic, plus more caretaking burdens than boys, but the dip in academic performance did not appear outside STEM. Girls outperformed boys in reading in nearly every district nationwide before the pandemic and continued to do so afterward.

    “It wasn’t something like Covid happened and girls just fell apart,” said Megan Kuhfeld, one of the authors of the NWEA study. 

    Related: These districts are bucking the national math slump 

    In the years leading up to the pandemic, teaching practices shifted to deemphasize speed, competition and rote memorization. Through new curriculum standards, schools moved toward research-backed methods that emphasized how to think flexibly to solve problems and how to tackle numeric problems conceptually.

    Educators also promoted participation in STEM subjects and programs that boosted girls’ confidence, including extracurriculars that emphasized hands-on learning and connected abstract concepts to real-life applications. 

    When STEM courses had large male enrollment, Superintendent Kenny Rodrequez noticed girls losing interest as boys dominated classroom discussions at his schools in Grandview C-4 District outside Kansas City. Girls were significantly more engaged after the district moved some of its introductory hands-on STEM curriculum to the lower grade levels and balanced classes by gender, he said.

    When schools closed for the pandemic, the district had to focus on making remote learning work. When in-person classes resumed, some of the teachers had left, and new ones had to be trained in the curriculum, Rodrequez said. 

    “Whenever there’s crisis, we go back to what we knew,” Rodrequez said. 

    Related: One state tried algebra for all eighth graders. It hasn’t gone well

    Despite shifts in societal perceptions, a bias against girls persists in science and math subjects, according to teachers, administrators and advocates. It becomes a message girls can internalize about their own abilities, they say, even at a very young age. 

    In his third grade classroom in Washington, D.C., teacher Raphael Bonhomme starts the year with an exercise where students break down what makes up their identity. Rarely do the girls describe themselves as good at math. Already, some say they are “not a math person.” 

    “I’m like, you’re 8 years old,” he said. “What are you talking about, ‘I’m not a math person?’” 

    Girls also may have been more sensitive to changes in instructional methods spurred by the pandemic, said Janine Remillard, a math education professor at the University of Pennsylvania. Research has found girls tend to prefer learning things that are connected to real-life examples, while boys generally do better in a competitive environment. 

    “What teachers told me during Covid is the first thing to go were all of these sense-making processes,” she said. 

    Related: OPINION: Everyone can be a math person but first we have to make math instruction more inclusive 

    At de Zavala Middle School in Irving, the STEM program is part of a push that aims to build curiosity, resilience and problem-solving across subjects.

    Coming out of the pandemic, Irving schools had to make a renewed investment in training for teachers, said Erin O’Connor, a STEM and innovation specialist there.

    The district last year also piloted a new science curriculum from Lego Education. The lesson involving the machine at de Zavala, for example, had students learn about kinetic energy. Fifth graders learned about genetics by building dinosaurs and their offspring with Lego blocks, identifying shared traits. 

    “It is just rebuilding the culture of, we want to build critical thinkers and problem solvers,” O’Connor said.

    Teacher Tenisha Willis recently led second graders at Irving’s Townley Elementary School through building a machine that would push blocks into a container. She knelt next to three girls who were struggling.

    They tried to add a plank to the wheeled body of the machine, but the blocks didn’t move enough. One girl grew frustrated, but Willis was patient. She asked what else they could try, whether they could flip some parts around. The girls ran the machine again. This time, it worked.

    “Sometimes we can’t give up,” Willis said. “Sometimes we already have a solution. We just have to adjust it a little bit.” 

    Lurye reported from Philadelphia. Todd Feathers contributed reporting from New York. 

    The Associated Press’ education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.

    Source link

  • Nation’s Report Card at risk, researchers say

    Nation’s Report Card at risk, researchers say

    This story was reported and originally published by APM Reports in connection with its podcast Sold a Story: How Teaching Kids to Read Went So Wrong.

    When voters elected Donald Trump in November, most people who worked at the U.S. Department of Education weren’t scared for their jobs. They had been through a Trump presidency before, and they hadn’t seen big changes in their department then. They saw their work as essential, mandated by law, nonpartisan and, as a result, insulated from politics.

    Then, in early February, the Department of Government Efficiency showed up. Led at the time by billionaire CEO Elon Musk, and known by the cheeky acronym DOGE, it gutted the Department of Education’s Institute of Education Sciences, posting on X that the effort would ferret out “waste, fraud and abuse.”

    When it was done, DOGE had cut approximately $900 million in research contracts and more than 90 percent of the institute’s workforce had been laid off. (The current value of the contracts was closer to $820 million, data compiled by APM Reports shows, and the actual savings to the government was substantially less, because in some cases large amounts of money had been spent already.)

    Among staff cast aside were those who worked on the National Assessment of Educational Progress — also known as the Nation’s Report Card — which is one of the few federal education initiatives the Trump administration says it sees as valuable and wants to preserve.

    The assessment is a series of tests administered nearly every year to a national sample of more than 10,000 students in grades 4, 8 and 12. The tests regularly measure what students across the country know in reading, math and other subjects. They allow the government to track how well America’s students are learning overall. Researchers can also combine the national data with the results of tests administered by states to draw comparisons between schools and districts in different states.

    The assessment is “something we absolutely need to keep,” Education Secretary Linda McMahon said at an education and technology summit in San Diego earlier this year. “If we don’t, states can be a little manipulative with their own results and their own testing. I think it’s a way that we keep everybody honest.”

    But researchers and former Department of Education employees say they worry that the test will become less and less reliable over time, because the deep cuts will cause its quality to slip — and some already see signs of trouble.

    “The main indication is that there just aren’t the staff,” said Sean Reardon, a Stanford University professor who uses the testing data to research gaps in learning between students of different income levels.

    All but one of the experts who make sure the questions in the assessment are fair and accurate — called psychometricians — have been laid off from the National Center for Education Statistics. These specialists play a key role in updating the test and making sure it accurately measures what students know.

    “These are extremely sophisticated test assessments that required a team of researchers to make them as good as they are,” said Mark Seidenberg, a researcher known for his significant contributions to the science of reading. Seidenberg added that “a half-baked” assessment would undermine public confidence in the results, which he described as “essentially another way of killing” the assessment.

    The Department of Education defended its management of the assessment in an email: “Every member of the team is working toward the same goal of maintaining NAEP’s gold-standard status,” it read in part.

    The National Assessment Governing Board, which sets policies for the national test, said in a statement that it had temporarily assigned “five staff members who have appropriate technical expertise (in psychometrics, assessment operations, and statistics) and federal contract management experience” to work at the National Center for Education Statistics. No one from DOGE responded to a request for comment.

    Harvard education professor Andrew Ho, a former member of the governing board, said the remaining staff are capable, but he’s concerned that there aren’t enough of them to prevent errors.

    “In order to put a good product up, you need a certain number of person-hours, and a certain amount of continuity and experience doing exactly this kind of job, and that’s what we lost,” Ho said.

    The Trump administration has already delayed the release of some testing data following the cutbacks. The Department of Education had previously planned to announce the results of the tests for 8th grade science, 12th grade math and 12th grade reading this summer; now that won’t happen until September. The board voted earlier this year to eliminate more than a dozen tests over the next seven years, including fourth grade science in 2028 and U.S. history for 12th graders in 2030. The governing board has also asked Congress to postpone the 2028 tests to 2029, citing a desire to avoid releasing test results in an election year. 

    “Today’s actions reflect what assessments the Governing Board believes are most valuable to stakeholders and can be best assessed by NAEP at this time, given the imperative for cost efficiencies,” board chair and former North Carolina Gov. Bev Perdue said earlier this year in a press release.

    The National Assessment Governing Board canceled more than a dozen tests when it revised the schedule for the National Assessment of Educational Progress in April. (Caption from an annotated version of the previous schedule, adopted in 2023, showing which tests were canceled.)

    Recent estimates peg the cost of keeping the national assessment running at about $190 million per year, roughly a tenth of one percent of the department’s 2025 budget of approximately $195 billion.

    Adam Gamoran, president of the William T. Grant Foundation, said multiple contracts with private firms — overseen by Department of Education staff with “substantial expertise” — are the backbone of the national test.

    “You need a staff,” said Gamoran, who was nominated last year to lead the Institute of Education Sciences. He was never confirmed by the Senate. “The fact that NCES now only has three employees indicates that they can’t possibly implement NAEP at a high level of quality, because they lack the in-house expertise to oversee that work. So that is deeply troubling.”

    The cutbacks were widespread — and far outside of what most former employees had expected under the new administration.

    “I don’t think any of us imagined this in our worst nightmares,” said a former Education Department employee, who spoke on condition of anonymity for fear of retaliation by the Trump administration. “We weren’t concerned about the utter destruction of this national resource of data.”

    “At what point does it break?” the former employee asked.

    Related: Suddenly sacked

    Every state has its own test for reading, math and other subjects. But state tests vary in difficulty and content, which makes it tricky to compare results in Minnesota to Mississippi or Montana.

    “They’re totally different tests with different scales,” Reardon said. “So NAEP is the Rosetta stone that lets them all be connected.”

    Reardon and his team at Stanford used statistical techniques to combine the federal assessment results with state test scores and other data sets to create the Educational Opportunity Project. The project, first released in 2016 and updated periodically in the years that followed, shows which schools and districts are getting the best results — especially for kids from poor families. Since the project’s release, Reardon said, the data has been downloaded 50,000 times and is used by researchers, teachers, parents, school boards and state education leaders to inform their decisions.
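
    The “Rosetta stone” role Reardon describes can be pictured with a deliberately simplified calculation: place a district’s state-test average on the NAEP scale by standardizing it against statewide results on the state test and rescaling with the statewide NAEP mean and standard deviation. The function and numbers below are hypothetical and far cruder than the Educational Opportunity Project’s actual methods; they only show the linking logic.

        # Toy linear linking of a district's state-test mean onto the NAEP scale.
        # Illustrative only; the Educational Opportunity Project uses more
        # sophisticated statistical techniques than this.
        def link_to_naep(district_mean_state: float,
                         state_mean_state: float, state_sd_state: float,
                         state_mean_naep: float, state_sd_naep: float) -> float:
            """Standardize against the state test, then rescale to NAEP units."""
            z = (district_mean_state - state_mean_state) / state_sd_state
            return state_mean_naep + z * state_sd_naep

        # Hypothetical numbers: a district half a standard deviation above its
        # state's average on the state test lands half a NAEP standard deviation
        # above the state's NAEP average.
        print(link_to_naep(520.0, 500.0, 40.0, 235.0, 30.0))  # 250.0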

    For instance, the U.S. military used the data to measure school quality when weighing base closures, and superintendents used it to find demographically similar but higher-performing districts to learn from, Reardon said.

    If the quality of the data slips, those comparisons will be more difficult to make.

    “My worry is we just have less-good information on which to base educational decisions at the district, state and school level,” Reardon said. “We would be in the position of trying to improve the education system with no information. Sort of like, ‘Well, let’s hope this works. We won’t know, but it sounds like a good idea.’”

    Seidenberg, the reading researcher, said the national assessment “provided extraordinarily important, reliable information about how we’re doing in terms of teaching kids to read and how literacy is faring in the culture at large.”

    Producing a test without keeping the quality up, Seidenberg said, “would be almost as bad as not collecting the data at all.”

    Source link

  • Australia expands accepted English language tests for visa applications

    Australia expands accepted English language tests for visa applications

    LanguageCert Academic, CELPIP General, and the Michigan English Test (MET) are now officially accepted for use in Australian visa applications.

    With this update, a total of nine tests from eight different providers are now officially recognised for Australian visa purposes. These include previously accepted options such as IELTS, Pearson (PTE Academic), Cambridge English, TOEFL iBT, and OET. Notably, IELTS Academic and IELTS General Training are now registered as separate tests.

    Commenting on the news, Sharon Harvey, CEO of Michigan Language Assessment, said: “We are proud that Michigan Language Assessment has been approved by the government of Australia for MET to be used for Australian visa purposes. This recognition is a clear acknowledgment of the validity and reliability of MET, and of its value in assessing and certifying English language skills.”

    The company said that MET, which launched in 2009 and was enhanced with a secure digital version in 2021, underwent an extensive validation process to earn this status.

    Meanwhile, LanguageCert‘s partnerships and recognitions director Fraser Cargill said the company was “excited to deepen our engagement in Australia, supporting individuals as they pursue opportunities in this dynamic country”.

    “This contract reflects our ongoing commitment to supporting government departments with secure solutions and individuals worldwide in achieving their academic, professional or personal goals through accessible and trusted language assessment,” he added.

    As of August 7, updated score requirements for certain tests have been implemented, with full details available on the Department of Home Affairs website.

    For its part, CELPIP General said it was “pleased to announce” that its test was one of those accepted by the Australian government as proof of English language proficiency for visa purposes.

    “With this designation, we are pleased to provide test takers seeking to attain an Australian visa with the same dedicated assessment of English language proficiency that is tried and true for the government of Canada and other score users,” it said.

    Source link

  • If we are serious about improving student outcomes, we can’t treat teacher retention as an afterthought

    If we are serious about improving student outcomes, we can’t treat teacher retention as an afterthought

    In the race to help students recover from pandemic-related learning loss, education leaders have overlooked one of the most powerful tools already at their disposal: experienced teachers.

    For decades, a myth has persisted in education policy circles that after their first few years on the job, teachers stop improving. This belief has undercut efforts to retain seasoned educators, with many policymakers and administrators treating veteran teachers as replaceable cogs rather than irreplaceable assets.

    But that myth doesn’t hold up. The evidence tells a different story: Teachers don’t hit a plateau after year five. While their growth may slow, it doesn’t stop. In the right environments — with collaborative colleagues, supportive administrators and stable classroom assignments — teachers can keep getting better well into their second decade in the classroom.

    This insight couldn’t come at a more critical time. As schools work to accelerate post-pandemic learning recovery, especially for the most vulnerable students, they need all the instructional expertise they can muster.

    That means not just recruiting new teachers but keeping their best educators in the classroom and giving them the support they need to thrive.

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.

    A new review of 23 longitudinal studies, conducted by the Learning Policy Institute and published by the Thomas B. Fordham Institute, found that in all but one of the studies, teachers generally improve significantly during their first five years. The review also found continued, albeit slower, improvement well into years 6 through 15; several of the studies found improvement into later years of teaching, though at a diminished pace.

    These gains translate into measurable benefits for students: higher test scores, fewer disciplinary issues, reduced absenteeism and increased postsecondary attainment. In North Carolina, for example, students with highly experienced English teachers learned more and were substantially less likely to skip school and more likely to enjoy reading. These effects were strongest for students who were most at risk of falling behind.

    While experience helps all teachers improve, we’re currently failing to build that experience where it’s needed most. Schools serving large populations of low-income Black and Hispanic students are far more likely to be staffed primarily by early career teachers.

    And unfortunately, they’re also more likely to see those teachers leave after just a few years. This churn makes it nearly impossible to build a stable, experienced workforce in high-need schools.

    It also robs novice teachers of the veteran mentors who could help them get better faster and robs students of the opportunity to learn from seasoned educators who have refined their craft over time.

    To fix this, we need to address both sides of the equation: helping teachers improve and keeping them in the classrooms that need them most.

    Research points to several conditions that support continued teacher growth. Beginning teachers are more likely to stay and improve if they have had high-quality preparation and mentoring. Teaching is not a solo sport. Educators who work alongside more experienced peers improve faster, especially in the early years.

    Teachers also improve more when they’re able to teach the same grade level or subject year after year. Unfortunately, those in under-resourced schools are more likely to be shuffled around, undermining their ability to build expertise.

    Perhaps most importantly, schools that have strong leadership and that foster time for collaboration and a culture of professional trust see greater gains in teacher retention over time.

    Teachers who feel supported by their administrators, who collaborate with a team that shares their mission and who aren’t constantly switching subjects or grade levels are far more likely to stay in the profession.

    Pay matters too, especially in high-need schools where working conditions are toughest. But incentives alone aren’t enough. Short-term bonuses can attract teachers, but they won’t keep them if the work environment drives them away.

    Related: One state radically boosted new teacher pay – and upset a lot of teachers

    If we’re serious about improving student outcomes, especially in the wake of the pandemic, we have to stop treating teacher retention as an afterthought. That means retooling our policies to reflect what the research now clearly shows: experience matters, and it can be cultivated.

    Policymakers should invest in high-quality teacher preparation and mentoring programs, particularly in high-need schools. They should create conditions that promote teacher stability and collaboration, such as protected planning time and consistent teaching assignments.

    Principals must be trained not just as managers, but as instructional leaders capable of building strong school cultures. And state and district leaders must consider meaningful financial incentives and other supports to retain experienced teachers in the classrooms that need them most.

    With the right support, teachers can keep getting better. In this moment of learning recovery, a key to success is keeping teachers in schools and consciously supporting their growing effectiveness.

    Linda Darling-Hammond is founding president and chief knowledge officer at the Learning Policy Institute. Michael J. Petrilli is president of the Thomas B. Fordham Institute, a visiting fellow at the Hoover Institution and an executive editor of Education Next.

    Contact the opinion editor at [email protected].

    This story about teacher retention was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

    Source link

  • Release of NAEP science scores

    Release of NAEP science scores

    UPDATE: After this story was published, the Education Department issued a press release Monday afternoon, July 7, announcing that Matthew Soldner will serve as acting commissioner of the National Center for Education Statistics, in addition to his role as acting director of the Institute of Education Sciences. The job of statistics chief had been vacant since March, a vacancy that had prevented the release of assessment results.

    The repercussions from the decimation of staff at the Education Department keep coming. Last week, the fallout led to a delay in releasing results from a national science test.

    The National Assessment of Educational Progress (NAEP) is best known for tests that track reading and math achievement but includes other subjects, too. In early 2024, when the main reading and math tests were administered, there was also a science section for eighth graders. 

    The board that oversees NAEP had announced at its May meeting that it planned to release the science results in June. But that month has since come and gone. 

    Why the delay? There is no commissioner of education statistics to sign off on the score report, a requirement before it is released, according to five current and former officials who are familiar with the release of NAEP scores, but asked to remain anonymous because they were not authorized to speak to the press or feared retaliation. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Peggy Carr, a Biden administration appointee, was dismissed as the commissioner of the National Center for Education Statistics in February, two years before the end of her six-year term set by Congress. Chris Chapman was named acting commissioner, but he was fired in March, along with half the employees at the Education Department. The role has remained vacant since.

    A spokesman for the National Assessment Governing Board, which oversees NAEP,  said the science scores will be released later this summer, but denied that the lack of a commissioner is the obstacle. “The report building is proceeding so the naming of a commissioner is not a bureaucratic hold-up to its progress,” Stephaan Harris said by email.

    The delay matters. Education policymakers have been keen to learn if science achievement had held steady after the pandemic or tumbled along with reading and math. (Those reading and math scores were released in January.)

    The Trump administration has vowed to dismantle the Education Department and did not respond to an emailed question about when a new commissioner would be appointed. 

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    Researchers hang onto data

    Keeping up with administration policy can be head-spinning these days. Education researchers were notified in March that they would have to relinquish federal data they were using for their studies. (The department shares restricted datasets, which can include personally identifiable information about students, with approved researchers.) 

    But researchers learned on June 30 that the department had changed its mind and decided not to terminate this remote access. 

    Lawyers who are suing the Trump administration on behalf of education researchers heralded this about-face as a “big win.” Researchers can now finish projects in progress. 

    Still, researchers don’t have a way of publishing or presenting papers that use this data. Since the mass firings in mid-March, there is no one remaining inside the Education Department to review their papers for any inadvertent disclosure of student data, a required step before public release. And there is no process at the moment for researchers to request data access for future studies. 

    “While ED’s change-of-heart regarding remote access is welcome,” said Adam Pulver of Public Citizen Litigation Group, “other vital services provided by the Institute of Education Sciences have been senselessly, illogically halted without consideration of the impact on the nation’s educational researchers and the education community more broadly.  We will continue to press ahead with our case as to the other arbitrarily canceled programs.”

    Pulver is the lead attorney for one of three suits fighting the Education Department’s termination of research and statistics activities. Judges in the District of Columbia and Maryland have denied researchers a preliminary injunction to restore the research and data cuts. But the Maryland case is now fast-tracked and the court has asked the Trump administration to produce an administrative record of its decision-making process by July 11. (See this previous story for more background on the court cases.)

    Related: Education researchers sue Trump administration, testing executive power

    Some NSF grants restored in California

    Just as the Education Department is quietly restarting some activities that DOGE killed, so is the National Science Foundation (NSF). The federal science agency posted on its website that it had reinstated 114 awards to 45 institutions as of June 30. NSF said it was doing so to comply with a federal court order to reinstate awards to all University of California researchers. It was unclear how many of these research projects concerned education, one of the major areas that NSF funds.

    Researchers and universities outside the University of California system are hoping for the same reversal. In June, the largest professional organization of education researchers, the American Educational Research Association, joined forces with a large coalition of organizations and institutions in filing a legal challenge to the mass termination of grants by the NSF. Education grants were especially hard hit in a series of cuts in April and May. Democracy Forward, a public interest law firm, is spearheading this case.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about delaying the NAEP science score report was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    Source link

  • HOELT tender explores digital tests

    HOELT tender explores digital tests

    The latest round of engagement is being undertaken “to gather market insights on newly available and emerging technology in relation to remote testing, and the viability of incorporating this into the HOELT service,” said the Home Office in a notice on July 2.  

    The notice is the latest update to a government tender to design and maintain a dedicated Secure English Language Test (SELT) owned by the Home Office, with an initial contract value of £1.13bn that has since been reduced to £680m.

    The original tender, launched in August 2024, made no mention of engagement on emerging technologies and digital tests, instead outlining plans for in-person delivery, including invigilators and ID-verification services at physical test centres around the world.  

    As per the latest update, the developer that wins the tender will still be responsible for “establishing and managing global test centres” – of which there are 268 – though the notice suggests that remote testing will also be incorporated into the model. 

    While the sector has embraced online delivery and at-home testing, the Home Office will also be taking stock of rising concerns among the public about the use of AI in English proficiency tests.

    According to a recent YouGov poll, 40% of the public are worried about AI causing a greater risk of cheating on English language tests, with a similar proportion concerned about the ability of AI to properly assess language skills.

    The poll, commissioned by Cambridge University Press & Assessment, asked respondents specifically about tests assessing English language skills for people applying to work and study in the UK.

    Additional findings revealed the public’s unease at the prospect of limited human interaction and concerns that AI-led exams would disadvantage those with limited access to technology – both cited by roughly a quarter of respondents.

    Meanwhile, only 8% said they had “no concerns” about the use of AI in English language tests for people applying to work or study in the UK.

    Under its initial plans, the Home Office proposed disaggregating the service into two lines: the development and ongoing support of a Home Office-branded test to be used globally, and the facilitation of tests around the world, according to the tender.

    However, the government’s slashing of the value of the tender led some stakeholders to speculate that the Home Office might turn to a single supplier for both development and delivery.  

    Despite the additional engagement around emerging technologies and remote testing, the value of the tender remains at £680m (excluding VAT).  

    Since the government put out the HOELT tender last year, there has been little news about which companies are throwing their hats in the ring or what their proposed model would look like.

    Currently, Pearson, LanguageCert, Trinity College London, and IELTS – which is co-owned by IDP, Cambridge English and the British Council – deliver Home Office-approved SELTs in the UK.

    The deadline for the latest round of engagement is July 17.

    Source link

  • A smaller Nation’s Report Card

    A smaller Nation’s Report Card

    As Education Secretary Linda McMahon was busy dismantling her cabinet department, she vowed to preserve one thing: the National Assessment of Educational Progress (NAEP), also known as the Nation’s Report Card. In early April, she told a gathering of ed tech companies and investors that the national exam was “something we absolutely need to keep,” because it’s a “way that we keep everybody honest” about the truth of how much students across the country actually know.  

    That was clearly a promise with an asterisk. 

    Less than two weeks later, on Monday of this week, substantial parts of NAEP came crumbling down when the board that oversees the exam reluctantly voted to kill more than a dozen of the assessments that comprise the Nation’s Report Card over the next seven years. 

    The main reading and math tests, which are required by Congress, were preserved. But to cut costs in an attempt to appease Elon Musk’s Department of Government Efficiency or DOGE, the National Assessment Governing Board (NAGB) scrapped a 2029 administration of the Long-Term Trend NAEP, an exam that has tracked student achievement since the 1970s.* Also cut were fourth grade science in 2028, 12th grade science in 2032 and 12th grade history in 2030. Writing assessments, which had been slated for 2032, were canceled entirely. State and local results were also dropped for an assortment of exams. For example, no state-level results will be reported for 12th grade reading and math in 2028, nor will there be district-level results for eighth grade science that year. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    “These are recommendations that we are making with much pain,” said board chair Beverly Perdue, a former North Carolina governor who was appointed to this leadership role in 2018 during President Donald Trump’s first term. “None of us want to do this.”

    The board didn’t provide an official explanation for its moves. But the vice chair, Martin West, a Harvard professor of education, said in an interview that the cuts were an effort to save the 2026 assessments. “A moment of reckoning came more quickly because of the pressures on the program to reduce expenses in real time,” he said. 

    In other words, the board was effectively cutting off the patient’s appendages to try to save the brain and the heart. Despite the sacrifice, it’s still not clear that the gambit will work.

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    DOGE has been demanding 50 percent cuts to the $190 million a year testing program. Nearly all the work is handled by outside contractors, such as Westat and ETS, and five-year contracts were awarded at the end of 2024. But instead of paying the vendors annually, DOGE has diced the payments into shorter increments, putting pressure on the contractors to accept sharp cuts, according to several former Education Department employees. At the moment, several of the contracts are scheduled to run out of money in May and June, and DOGE’s approval is needed to restart the flow of money. Indeed, DOGE allowed one NAEP contract to run out of funds entirely on March 31, forcing ETS employees to stop work on writing new questions for future exams. 

    Reading and math tests are scheduled to start being administered in schools in January 2026, and so additional disruptions could derail the main NAEP assessment altogether. NAEP is taken by a sample of 450,000 students who are selected to represent all the fourth and eighth graders in the nation, and each student only takes part of a test. This sampling approach avoids the burden of testing every child in the country, but it requires Education Department contractors to make complicated statistical calculations for the number of test takers and the number of test sections needed to produce valid and reliable results. Contractors must then package the test sections into virtual test booklets for students to take online. The Education Department also must get approval from the federal Office of Management and Budget to begin testing in schools — yet another set of paperwork that is handled by contractors. 
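
    One flavor of the “complicated statistical calculations” mentioned above is working out how precise a sampled average will be. A simplified, hypothetical version of that check is sketched below: with students clustered inside schools, the effective sample size shrinks by a design effect, and the standard error of the national average grows accordingly. This is a back-of-the-envelope illustration, not NAEP’s operational procedure.

        # Back-of-the-envelope precision check for a clustered student sample.
        # Illustrative assumptions only; not NAEP's actual design calculations.
        from math import sqrt

        def standard_error_of_mean(score_sd: float, n_students: int,
                                   students_per_school: float,
                                   intraclass_corr: float) -> float:
            """Standard error of a mean under simple cluster sampling."""
            design_effect = 1 + (students_per_school - 1) * intraclass_corr
            return score_sd * sqrt(design_effect / n_students)

        # Hypothetical inputs: score SD of 35 points, 20,000 sampled students,
        # roughly 50 students tested per school, modest within-school similarity.
        print(round(standard_error_of_mean(35.0, 20_000, 50, 0.15), 2))  # ~0.72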

    A DOGE dilemma 

    People familiar with the board’s deliberations were concerned that contractors might be pressured to agree to cuts that could harm the quality and the validity of the exam itself. Significant changes to the exam or its administration could make it impossible to compare student achievement with the 2024 results, potentially undermining the whole purpose of the assessment. 

    Board members were ultimately faced with a dilemma. They could cut corners on the full range of assessments or hope to maintain NAEP’s high quality with a much smaller basket of tests. They chose the latter.

    The cuts were designed to comply with congressional mandates. While the Long-Term Trend assessment is required by Congress, the law does not state how frequently it must be administered, and so the governing board has deferred it until 2033. Many testing experts have questioned whether this exam has become redundant now that the main NAEP has a 35-year history of student performance. The board has discussed scrapping this exam since 2017. “The passage of time raises questions about its continued value,” said West.

    Related: NAEP, the Nation’s Report Card, was supposed to be safe. It’s not

    The writing assessments, originally scheduled for 2032 for grades four, eight and 12, needed an overhaul, and that would have been an expensive, difficult process, especially with current debates over what it means to teach writing in the age of AI.

    The loss of state- and district-level results for some exams, such as high school reading and math, was among the more painful cuts. The ability to compare student achievement across state lines has been one of the most valuable aspects of the NAEP tests because the comparison can provide role models for other states and districts.

    Cost cutting

    “Everyone agrees that NAEP can be more efficient,” said West, who added that the board has been trying to cut costs for many years.  But he said that it is tricky to test changes for future exams without jeopardizing the validity and the quality of the current exam. That dual path can sometimes add costs in the short term. 

    It was unclear how many millions of dollars the governing board saved with its assessment cancellations Monday, but the savings are certainly less than the 50 percent cut that DOGE is demanding. The biggest driver of the costs is the main NAEP test, which is being preserved. The contracts are awarded by task and not by assessment, and so the contractors have to come back with estimates of how much the cancellation of some exams will affect their expenses. For example, now that fourth grade science isn’t being administered in 2028, no questions need to be written for it. But field staff will still need to go to schools that year to administer tests, including reading and math, which haven’t been cut.

    Compare old and new assessment schedules

    Outside observers decried the cuts on social media, with one education commentator saying the cancellations were “starting to cut into the muscle.” Science and history, though not mandated by Congress, are important to many. “We should care about how our schools are teaching students science,” said Allison Socol, who leads preschool to high school policy at EdTrust, a nonprofit that advocates for equity in education. “Any data point you look at shows that future careers will rely heavily on STEM skills.”

    Socol worries that DOGE will not be satisfied with the board’s cuts and demand more. “It’s just so much easier to destroy things than to build them,” she said. “And it’s very easy, once you’ve taken one thing away, to take another one and another one and another one.”

    On April 17, the Education Department announced that the 2026 NAEP would proceed as planned. But after mass layoffs in March, it remained unclear if the department has the capacity to oversee the process, since only two employees with NAEP experience are left out of almost 30 who used to work on the test. McMahon might need to rehire some employees to pull it off, but new hiring would contradict the spirit of Trump’s executive order to close the department.

    Socol fears that the Trump administration doesn’t really want to measure student achievement. “There is a very clear push from the administration, not just in the education sector, to have a lot less information about how our public institutions are serving the people in this country,” Socol said. “It is a lot easier to ignore inequality if you can’t see it, and that is the point.”

    The Education Department did not respond to my questions about their intentions for NAEP. McMahon has been quite forceful in articulating the value of the assessments, but she might not have the final say since DOGE has to approve the NAEP contracts. “What’s very clear is that the office of the secretary does not completely control the DOGE people,” said a person with knowledge of the dynamics inside the Education Department. “McMahon’s views affect DOGE priorities, but McMahon doesn’t have direct control at all.”

    The ball is now in DOGE’s court.  

    Canceled assessments

    • Long-Term Trend (LTT) assessments in math and reading for 9-, 13- and 17-year-olds in 2029. (The Education Department previously canceled the 2025 LTT for 17-year-olds in February 2025.)
    • Science: Fourth-grade in 2028, 12th grade in 2032
    • History: 12th grade in 2030
    • Writing:  Fourth, eighth and 12th grades in 2032
    • State-level results: 12th grade math and reading in 2028 and 2032, eighth grade history in 2030
    • District-level results: Eighth-grade science in 2028 and 2032

    For more details, refer to the new schedule, adopted in April 2025, and compare with the old, now-defunct schedule from 2023. 

    *Correction: An earlier version of this sentence incorrectly said that two administrations of the Long-Term Trend NAEP had been scrapped by the governing board on April 21. Only the 2029 administration was canceled by the board. The 2025 Long-Term Trend NAEP for 17 year olds was canceled by the Education Department in February. Nine- and 13-year-old students had already taken it by April.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about NAEP cuts was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    Source link

  • Biden Announces New COVID-19 Mitigation Plan – CUPA-HR

    Biden Announces New COVID-19 Mitigation Plan – CUPA-HR

    by CUPA-HR | September 9, 2021

    On September 9, President Biden released a new COVID-19 mitigation plan, which includes several new requirements and recommendations for employers, employees, schools and others across the country. The six-part plan includes new policies and strategies to vaccinate more unvaccinated individuals, administer booster shots, keep schools safely open, increase testing facilities and products, protect economic recovery, and improve treatments for COVID-19.

    New Vaccine Requirements for Many Employers and Employees

    In the new plan, the Biden administration announced that the Occupational Safety and Health Administration (OSHA) is developing a rule to require all employers under OSHA’s jurisdiction with 100 or more employees “to ensure their workforce is fully vaccinated or require any workers who remain unvaccinated to produce a negative test result on at least a weekly basis before coming to work.” Additionally, OSHA is developing a rule that will require these same employers to provide paid time off to their employees to allow them to get vaccinated and recover from post-vaccination symptoms. According to the plan, both of these rules will be implemented by OSHA through an Emergency Temporary Standard (ETS), though when the ETS will be issued is still unknown. Importantly, while OSHA’s direct jurisdiction is limited to private sector employers, the ETS requirements could extend to many state and local government employers as detailed in the February 2021 CRS report.

    In addition to the requirements for employers with 100 or more employees, the plan also announced that healthcare workers at Medicare and Medicaid participating healthcare settings will be required to be vaccinated. Unlike the requirements for employers with 100 or more employees, these requirements do not allow for a testing option in lieu of getting vaccinated. According to the announcement, this requirement will apply to over 17 million healthcare workers across the country.

    Lastly, the plan states that President Biden signed two Executive Orders that will require all federal executive branch workers and federal contractors to be vaccinated. Unlike previous Biden administration policies on vaccine mandates for federal employees and contractors, the new requirements will no longer provide the option for unvaccinated employees to undergo regular testing instead of getting vaccinated, “with exceptions only as required by law.”

    COVID-19 Testing and Booster Shots

    In addition to these new vaccine requirements, the plan lays out the Biden administration’s plans to expand and improve testing and to provide booster shots to eligible Americans. According to the plan, booster shots will begin during the week of September 20 after the Food and Drug Administration authorizes their use.

    President Calls on Entertainment Venues to Require Vaccination or Negative Tests

    While not a direct mandate, the president’s plan also calls on entertainment venues such as sports arenas, large concert halls and other venues where large groups of people gather — many of which are common to campuses — to require that their patrons be vaccinated or show a negative test for entry.

    As the Biden administration moves forward with implementing this new plan, CUPA-HR will continue to keep members apprised of any new guidance or requirements that come from this announcement.



    Source link