Tag: Score

  • What the NAEP Proficient Score Really Means for Learning – The 74

    In September, The 74 published Robert Pondiscio’s opinion piece discussing how people without strong reading skills lack what it takes “to effectively weigh competing claims” and “can’t reconcile conflicts, judge evidence or detect bias.” He adds, “They may read the words, but they can’t test the arguments.”

    To make his case, Pondiscio relies on the skill level needed to achieve a proficient score or better on the National Assessment of Educational Progress, a level that only 30% of tested students reached on the 2024 Grade 8 reading exam. Only 16% of Black students and 19% of Hispanic students scored proficient or better.

    Yet naysayers argue that the NAEP standard is simply set too high and that its sobering message is inaccurate. According to them, there is no crisis.

    So, who is right?

    Well, research on the testing performance of Kentucky eighth graders indicates that it’s Pondiscio, not the naysayers, who has it right about the NAEP proficiency score. And Kentucky’s data show this holds true not just for NAEP reading but for NAEP math as well.

    Kentucky offered a unique study opportunity. Starting in 2006, the Bluegrass State began testing all students in several grades with exams developed by ACT, Inc. These included the ACT college entrance exam, administered to all 11th grade public school students, and the EXPLORE test, given to all of Kentucky’s public school eighth graders.

    Both the ACT and EXPLORE featured something unusual: “Readiness Benchmark” scores that ACT, Inc. developed by comparing its test scores with the actual college freshman grades students earned years later. Students reaching the benchmark scores for reading or math had at least a 75% chance of later earning a “C” or better in related college freshman courses.

    So, how did the comparisons between Kentucky’s benchmark score performance and the NAEP work out?

    Analysis found close agreement between NAEP proficiency rates and the share of the same student cohorts reaching EXPLORE’s readiness benchmarks.

    For example, in Grade 8 reading, EXPLORE benchmark performance and NAEP proficiency rates for the same cohorts of students never differed by more than four percentage points for testing in 2008-09, 2010-11, 2012-13 or 2014-15.

    The same close agreement was found when comparing NAEP Grade 8 math proficiency rates with EXPLORE math benchmark percentages.

    EXPLORE and NAEP results were also examined separately for white, Black and learning-disabled students. Regardless of the student group, EXPLORE’s readiness benchmark percentages and NAEP’s proficient-or-above rates agreed closely.

    Doing the analysis with Kentucky’s ACT college entrance test results was a bit more challenging because NAEP doesn’t provide state-level data for high school grades. However, it is possible to compare each cohort’s Grade 8 NAEP performance with the ACT benchmark results that same cohort posted four years later, when it graduated from high school. Data for the graduating classes of 2017, 2019 and 2021 uniformly show close agreement, both for overall average scores and for individual student groups.

    It’s worth noting that all NAEP scores carry statistical sampling error. Once those margins of error are taken into account, the agreement between the NAEP results and the EXPLORE and ACT results looks even better.

    The bottom line: the close agreement between NAEP proficiency rates and ACT benchmark results in Kentucky suggests that NAEP proficiency levels are highly relevant indicators of critical educational performance. Those claiming NAEP’s proficiency standard is set too high are incorrect.

    That leaves us with the realization that the overall performance of public school students in Kentucky and nationwide is very concerning. Many students do not have the reading and math skills needed to navigate modern life. Instead of simply dismissing the troubling results of the latest round of NAEP, education leaders need to double down on building key skills among all students.


    Source link

  • IELTS apologises after technical issue leads to score changes

    • IELTS attributes the situation to a “technical issue” affecting some reading and listening components of its Academic and General Training tests.
    • Testing company says 99% of its tests in the relevant time period were unaffected by the bug and offers apologies and support to test-takers who received incorrect results.
    • Commentators point out the consequences of the score changes could be far-reaching.

    IELTS test-takers around the world have been informed that some results dating back to August 2023 were incorrect, and revised scores have now been issued.

    The incorrect results stem from a “technical issue” affecting a number of listening and reading components of some IELTS Academic and General Training tests. Most corrections are upwards, though some are downwards. The majority of affected test-takers saw changes to component scores, and some also experienced a 0.5 band score change.

    “IELTS recently identified an issue that led to a small proportion of test-takers receiving incorrect results between August 2023 and September 2025,” the company said in a statement.

    “Over 99% of IELTS tests during this time period were unaffected and there are no continuing issues with current IELTS tests. We have contacted affected test-takers to provide updated results, to offer our sincere apologies, and to provide appropriate support. We have also contacted relevant recognising organisations.”

    The organisation maintained it has “strict quality control procedures” in place to protect the integrity of the millions of IELTS tests it administers each year, and said it has taken “all necessary steps” to prevent the issue from happening again.

    IELTS, co-owned by IDP, Cambridge University Press, and the British Council, has launched a help page addressing the issue. The page provides answers to frequently asked questions and guidance for affected candidates and organisations on next steps, including how to access revised scores.

    Affected test-takers are being offered refunds and free resits.

    Michael Goodine, owner of Test Resources in South Korea and a commentator on the testing industry, said the story highlights “how important it is for test makers to identify problems as quickly as possible so that test takers have sufficient time to protect their interests”.

    Goodine worries that those test-takers who originally received lower scores may have “missed out on life-changing academic and professional opportunities for which they needed a particular IELTS score”.

    The PIE contacted IELTS for comment.

    Goodine also has concerns that the technical issue may have prevented some candidates from meeting immigration or residency requirements.

    “It may be too late for some of these individuals to get back on track. I feel for those people,” he said.

    And he worries that candidates who received inflated scores may have found themselves struggling in academic settings.

    “Testing companies serve as gatekeepers for academia and for immigration. When they mess up, the consequences can be far-reaching and profound,” said Goodine.

    IELTS describes the problem as “an internal IELTS issue” and said it has completed a thorough investigation into its cause “to ensure no current or future test takers would be affected, and to rectify the issue for those impacted”.

    Source link

  • Students score universities on experience – Campus Review

    Three private universities offer the best student experience of all Australian institutions, according to the latest student experience survey, with the University of Divinity ranked number one overall.

    Source link

  • Credit Score Penalties in the Home Insurance Market (Nick Graetz)

    On February 4, Nick Graetz joined the University of Michigan’s Stone Center to present “Individualizing Climate Risk: Credit Score Penalties in the Home Insurance Market.”

    Nick Graetz is an Assistant Professor at the University of Minnesota in the Department of Sociology and the Institute for Social Research and Data Innovation. He is also a Fellow at the Climate and Community Institute, a progressive climate policy think tank developing research on the climate and inequality nexus. His work focuses on the intersection of housing, population health, and political economy in the United States. Learn more at ncgraetz.com.

    Source link