Tag: Scores

  • Advanced Teaching Roles Program Shows Improved Test Scores, but Faces Funding Concerns – The 74


    North Carolina’s Advanced Teaching Roles program, which allows highly effective teachers to receive salary supplements for teaching additional students or supporting other teachers, is having positive effects on math and science test scores, according to an evaluation presented by NC State University’s Friday Institute for Educational Innovation at the State Board of Education meeting last week.

    Since 2016, the ATR initiative has allowed districts to create new career pathways and provide salary supplements for highly effective teachers — or Advanced Teachers — who mentor and support other educators while still teaching part of the day. These roles include Adult Leadership teachers, who lead small teams and receive supplements of at least $10,000, and Classroom Excellence teachers, who take on larger student loads and receive supplements of at least $3,000.

    Those in Adult Leadership roles teach for at least 30% of the day, lead a team of 3-8 classroom teachers, and share responsibility for the performance of all those teachers’ students. Classroom Excellence teachers are responsible for at least 20% more students than they taught before entering the role.

    “Our ATR program was designed to allow highly effective classroom educators to reach more students and to support the professional growth of educators,” said Dr. Callie Edwards, the program’s lead evaluator, at the State Board of Education meeting last Wednesday. “ATR aims to improve the quality of classroom instruction, the recruitment and retention of teachers, as well as ultimately impact student academic achievement.”

    In the 2024-25 school year, 26 districts operated ATR programs across 400 schools — 56% of which were elementary schools — employing 1,494 Advanced Teachers who supported nearly 4,000 classroom teachers statewide, according to the evaluation. Edwards said that 88% of Adult Leadership teachers received at least $10,000, and 85% of Classroom Excellence teachers received $3,000 or more.

    Statistical analysis of the 2023-24 school year’s data found that students in ATR schools outperformed their peers in non-ATR schools in math and science, showing statistically significant learning gains. 

    “Across the various programs I’ve evaluated, these are positive results — especially in math and science — where the impact of ATR is equivalent to about a month of extra learning for students,” said Dr. Lam Pham, the lead quantitative evaluator. “The results in ELA are positive but not statistically significant, which has been consistent for the last three years,” Pham said, referring to English Language Arts.

    These effects on math and science grow over time, according to the evaluation. Math scores improved throughout schools’ first six years of ATR implementation, though the gains were no longer statistically significant by the seventh year, according to the presentation. For science, statistically significant gains began in the fifth year after schools adopted ATR.

    Additionally, math teachers in ATR schools posted higher EVAAS (Education Value-Added Assessment System) growth scores than their peers in comparable schools.

    Teachers in ATR schools also reported feeling like they have more time to do their work compared to teachers in non-ATR schools.

    This year’s report featured data on teachers supported by ATR teachers for the first time. The evaluation found no positive effects on test scores for students taught by supported teachers compared to students taught by teachers who are not in the program. The researchers also found no effect on turnover levels for teachers supported by Advanced Teachers. However, the report says additional years of data will be necessary to determine whether those effects appear over time.

    The evaluation recommended that principals in ATR schools should foster collaboration and communicate strategically about the program with staff, beginning during Advanced Teachers’ hiring and onboarding.

    “It’s important to integrate ATR into those processes,” Edwards told the Board. “That means introducing Advanced Teachers to new staff and making collaboration, especially mentoring and coaching, a structured part of the day.”

    Edwards said these practices have been adopted in some schools, but principals reported needing more time and support to build collaboration opportunities into the school schedule.

    The report also urges district administrators to coordinate with Beginning Teacher (BT) programs, advertise ATR in recruitment materials, and improve their data collection practices. It also calls on state leaders to standardize the program to ensure consistency across participating districts.

    “Districts need standardized messaging, professional learning opportunities, and technical assistance to support implementation,” Edwards said. “The state can also create more opportunities for districts to share what’s working with one another and expand the evaluation beyond test scores to capture things like classroom engagement, social-emotional development, and feedback from teachers and principals.”

    The evaluators also said “there’s more to do” to expand the program in western North Carolina after Board members raised concerns about uneven participation across the state’s regions.

    2026-27 participants

    After the Friday Institute’s presentation, Board members heard from Dr. Thomas R. Tomberlin, senior director of educator preparation, licensure, and performance, who presented proposals for the next round of districts to join the ATR program.

    Tomberlin said DPI received 15 proposals representing 22 districts, which were reviewed by seven independent evaluators. The Board had to choose the program’s next participants by Oct. 15 to comply with a legislative requirement.

    The state can only allocate $911,349 for new implementation grants in 2026-27 — less than one-sixth of what would be needed to fund every application. That level of funding is “very low” compared to previous years, Tomberlin said. In the 2023-25 state budget, the General Assembly appropriated $10.9 million in recurring funds for these supplements in each year of the biennium.

    Tomberlin recommended that the Board approve the three highest-scoring proposals for the 2026-27 fiscal year, and fund these districts at 85% of their request. If the Board approves this recommendation, the state would still have $37,981 in planning funds left over for districts approved during the 2026 proposal cycle.
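
    The arithmetic behind that recommendation can be checked. Below is a minimal sketch of the allocation math; the three districts’ individual requested amounts are hypothetical placeholders (chosen to be consistent with the figures above), since the per-district breakdown was not presented.

    ```python
    # Sketch of the 2026-27 ATR grant arithmetic described above.
    # Only the $911,349 allocation, the 85% funding rate, and the $37,981
    # remainder come from the article; the per-district requests are
    # hypothetical placeholders chosen to line up with those figures.

    ALLOCATION = 911_349   # available for new implementation grants
    FUNDING_RATE = 0.85    # each approved proposal funded at 85% of its request

    requests = {           # hypothetical requested amounts
        "District A": 420_000,
        "District B": 350_000,
        "District C": 257_492,
    }

    awards = {d: round(amt * FUNDING_RATE) for d, amt in requests.items()}
    total_awarded = sum(awards.values())
    remaining = ALLOCATION - total_awarded

    for district, award in awards.items():
        print(f"{district}: requested ${requests[district]:,}, awarded ${award:,}")
    print(f"Total awarded: ${total_awarded:,}; left over: ${remaining:,}")
    ```

    With these placeholder requests (totaling $1,027,492), the remainder comes out to the $37,981 in planning funds cited above.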

    Tomberlin said districts are already struggling to pay for the program’s salary supplements. The Friday Institute’s report showed that, despite the high median supplements, some districts are offering supplements of as little as $1,000.

    “Some districts are not able to pay the full $10,000 because they have more ATR teachers than the funding that we can give them in terms of those allotments,” Tomberlin said. “And we had requested the General Assembly, I think, an additional $14 million to cover those supplements, and we didn’t get any.”

    The Senate’s budget proposal this session included funds to expand the ATR program over the biennium, while the House proposal did not. The General Assembly has not yet passed a comprehensive state budget, and its mini-budget did not include ATR program funding.

    Tomberlin said DPI would be in touch with the three districts to confirm whether they can proceed with the program despite limited funding.


    This article first appeared on EdNC and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.



  • How We Outperformed National Reading Scores – And Kept Students at Grade Level – The 74


    As reading scores remain a top concern for schools nationwide, many districts are experimenting with ability-based grouping in the early grades. The idea is to group students in multiple grade levels by their current reading level — not their grade level. A classroom could have seven kindergartners, 10 first graders, and three second graders grouped together for reading because they all read at the same level.

    While this may work for some schools, in our district, Rockwood School District in Missouri, we’ve chosen a different path. We keep students together in their class during whole-class instruction — regardless of ability level — and provide support or enrichment by creating flexible groups based on instructional needs within their grade level.

    We’re building skilled, confident readers not by separating them, but by growing them together.

    Children, like adults, learn and grow in diverse groups. In a Rockwood classroom, every student contributes to the shared learning environment — and every student benefits from being part of it.

    Our approach starts with whole-class instruction. All students, including English multilingual learners and those working toward grade-level benchmarks, participate in daily, grade-level phonics and comprehension lessons. We believe these shared experiences are foundational — not just for building literacy, but for fostering community and academic confidence.

    After our explicit, whole-group lessons, students move into flexible, needs-based small groups informed by real-time data and observations. Some students receive reteaching, while others take on enrichment activities. During these blocks, differentiation is fluid: A student may need decoding help one day and vocabulary enrichment the next. No one is locked into a static tier. Every day is a new opportunity.

    Students also engage in daily independent and partner reading. In addition, reading specialists provide targeted, research-based interventions for striving readers who need additional instruction.

    We build movement into our instruction, as well — not as a brain break, but as a learning tool. We use gestures for phonemes, tapping for spelling and jumping to count syllables. These are “brain boosts,” helping young learners stay focused and engaged.

    We challenge all students, regardless of skill level. During phonics and word work, advanced readers work with more complex texts and tasks. Emerging readers receive the time and scaffolded support they need, such as visual cues and pre-teaching, which means exposing students to a concept or skill before it’s formally taught in a whole-class lesson. That support helps them participate fully in every class. A student might not yet be able to decode or encode every word, but they are exposed to the grade-level standards and are challenged to meet the high expectations we have for all students.

    During shared and interactive reading lessons, all students are able to practice fluency and build their comprehension skills and vocabulary knowledge. Through these shared experiences, every child experiences success.

    There’s a common misconception that mixed-ability classrooms hold back high achievers or overwhelm striving readers. But in practice, engagement depends more on how we teach than on who is in the room. With well-paced, multimodal lessons grounded in grade-level content, every learner finds an entry point.

    You’ll see joy, movement, and mutual respect in our classrooms — because when we treat students as capable, they rise. And when we give them the right tools, not labels, they use them.

    While ability grouping may seem like a practical solution, research suggests it can have a lasting downside. A Northwestern University study of nearly 12,000 students found that those placed in the lowest kindergarten reading groups rarely caught up to their peers. For example, when you group a third grader with first graders, when does the older child get caught up? Even if he learns and progresses with his ability group, he’s still two grade levels behind his third-grade peers.

    This study echoes what researchers refer to as the Matthew Effect in reading: The rich get richer, and the poor get poorer. Lower-track students are exposed to less complex vocabulary and fewer comprehension strategies. Once placed on that path, it’s hard to catch up. Once a student is assigned a label, it’s difficult to change it — for both the student and educators.

    In Rockwood, we’re confident in what we’re doing. We have effective, evidence-based curricula for Tier I phonics and comprehension, and every student receives the same whole-class instruction as every other student in their grade. Then, students receive intervention or enrichment as needed.

    At the end of the 2024–25 school year, our data affirmed what we see every day. Our kindergarteners outperformed national proficiency averages in every skill group — in some cases by more than 17 percentage points, according to our Reading Horizons data. Our first and second graders outpaced national averages across nearly every domain. We don’t claim to have solved the literacy crisis — or know that our model will work for every district, school, classroom or student — but we’re building readers before gaps emerge.

    We’ve learned that when every student receives strong Tier I instruction, no one gets left behind. The key isn’t separating kids by ability. It’s designing instruction that’s universally strong and strategically supported.

    We recognize that every community faces distinct challenges. If you’re a district leader weighing the trade-offs of ability grouping, consider this: When you pull students out of the room during critical learning moments (the rich vocabulary, the shared texts, the academic conversation), you are not closing the learning gap, but creating a bigger one. Those critical moments build more than skills; they build readers.

    In Rockwood, our data confirms what we see every day: students growing not only in skills, but also in confidence, stamina and joy. We’re proving that inclusive, grade-level-first instruction can work — and work well — for all learners.



  • NAEP scores for class of 2024 show major declines, with fewer students college ready


    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    Students from the class of 2024 had historically low scores on a major national test administered just months before they graduated.

    Results from the National Assessment of Educational Progress, or NAEP, released September 9, show that 12th graders’ scores declined in math and reading for all but the highest performing students, and that the gap between high and low performers in math widened. More than half of these students reported being accepted into a four-year college, but the test results indicate that many of them are not academically prepared for college, officials said.

    “This means these students are taking their next steps in life with fewer skills and less knowledge in core academics than their predecessors a decade ago, and this is happening at a time when rapid advancements in technology and society demand more of future workers and citizens, not less,” said Lesley Muldoon, executive director of the National Assessment Governing Board. “We have seen progress before on NAEP, including greater percentages of students meeting the NAEP proficient level. We cannot lose sight of what is possible when we use valuable data like NAEP to drive change and improve learning in U.S. schools.”

    These results reflect similar trends seen in fourth and eighth grade NAEP results released in January, as well as eighth grade science results also released Tuesday.

    In a statement, Education Secretary Linda McMahon said the results show that federal involvement has not improved education, and that states should take more control.

    “If America is going to remain globally competitive, students must be able to read proficiently, think critically, and graduate equipped to solve complex problems,” she said. “We owe it to them to do better.”

    The students who took this test were in eighth grade in March of 2020 and experienced a highly disrupted freshman year of high school because of the pandemic. Those who went to college would now be entering their sophomore year.

    Roughly 19,300 students took the math test and 24,300 students took the reading test between January and March of 2024.

    The math test measures students’ knowledge in four areas: number properties and operations; measurement and geometry; data analysis, statistics, and probability; and algebra. The average score was the lowest it has been since 2005, and 45% of students scored below the NAEP Basic level, even as fewer students scored at NAEP Proficient or above.

    NAEP Proficient typically represents a higher bar than grade-level proficiency as measured on state- and district-level standardized tests. A student scoring in the proficient range might be able to pick the correct algebraic formula for a particular scenario or solve a two-dimensional geometric problem. A student scoring at the basic level likely would be able to determine probability from a simple table or find the population of an area when given the population density.

    Only students in the 90th percentile — the highest achieving students — didn’t see a decline, and the gap between high- and low-performing students in math was wider than on any previous assessment.

    This gap between high and low performers appeared before the pandemic, but has widened in most grade levels and subject areas since. The causes are not entirely clear but might reflect changes in how schools approach teaching as well as challenges outside the classroom.

    Testing officials estimate that 33% of students from the class of 2024 were ready for college-level math, down from 37% in 2019, even as more students said they intended to go to college.

    In reading, students similarly posted lower average scores than on any previous assessment, with only the highest performing students not seeing a decline.

    The reading test measures students’ comprehension of both literary and informational texts and requires students to interpret texts and demonstrate critical thinking skills, as well as understand the plain meaning of the words.

    A student scoring at the basic level likely would understand the purpose of a persuasive essay, for example, or the reaction of a potential audience, while a student scoring at the proficient level would be able to describe why the author made certain rhetorical choices.

    Roughly 32% of students scored below NAEP Basic, 12 percentage points higher than students in 1992, while fewer students scored above NAEP Proficient. An estimated 35% of students were ready for college-level work, down from 37% in 2019.

    In a survey attached to the test, students in 2024 were more likely to report having missed three or more days of school in the previous month than their counterparts in 2019. Students who miss more school typically score lower on NAEP and other tests. Higher performing students were more likely to say they missed no days of school in the previous month.

    Students in 2024 were less likely to report taking pre-calculus, though the rates of students taking both calculus and algebra II were similar in 2019 and 2024. Students reported less confidence in their math abilities than their 2019 counterparts, though students in 2024 were actually less likely to say they didn’t enjoy math.

    Students also reported lower confidence in their reading abilities. At the same time, higher percentages of students than in 2019 reported that their teachers asked them to do more sophisticated tasks, such as identifying evidence in a piece of persuasive writing, and fewer students reported a low interest in reading.

    Chalkbeat is a nonprofit news site covering educational change in public schools.



  • Indiana Middle Schoolers’ English Scores Have Fallen. These Schools are Bucking the Trend. – The 74


    Like their peers nationwide, students at Crawford County Middle School in southern Indiana struggled academically in the pandemic’s wake. Principal Tarra Carothers knew her students needed help to get back on track.

    So two years ago, she decided to double instructional time for math and English. Students now spend two periods per day in these critical subjects. Carothers believes the change has been a success, and a key trend backs her up: Crawford’s ILEARN scores in English language arts increased by over 8 percentage points from 2024 to 2025.

    But overall, Indiana middle schools are heading in the opposite direction when it comes to English. In fact, despite gains in math, middle schoolers are struggling more than students in other grade levels in English, state test scores show. Since 2021, ILEARN English proficiency rates in seventh and eighth grades have fallen, with the dip particularly pronounced for seventh graders. And while their scores are up slightly compared with four years ago, sixth graders’ performance fell over the past year.

    Indiana has made significant and much-publicized investments in early literacy, relying heavily on the science of reading, as many states have in the last few years. But that instructional transformation has come too late for current middle schoolers. Meanwhile, ILEARN English scores for third and fourth graders have risen by relatively small margins since the pandemic, although this improvement has been uneven.

    The Board of Education expressed specific concerns about middle schoolers’ performance at a July 16 meeting. “We’ve gotta pick it up and make sure all of our middle school kids are reading, provide those additional supports,” said Secretary of Education Katie Jenner.

    Some middle school leaders say strategies they’ve used can turn things around. In addition to increasing instructional time for key subjects, they point to participation in a pilot that allows students to take ILEARN at several points over the school year, instead of just once in the spring. Educators say these checkpoints support data-driven reflection and remediation that shows up in better test scores.

    Middle school an ‘optimum time’ for students’ recovery

    Katie Powell, director for middle level programs at the Association for Middle Level Education, said she often asks teachers whether middle schoolers seem different since the pandemic, and “heads nod.” These post-pandemic middle schoolers are harder to motivate and engage, self-report more stress, and are less likely to take risks academically, Powell said.

    When the pandemic hit, “they were young, at the age of school where they’re developing basic reading fluency and math fact fluency,” she said. Current eighth graders, for example, were in second grade when the pandemic shut down schools and many learned online for much of their third grade year. Third grade is when students are supposed to stop learning to read and start “reading to learn,” Powell said.

    Powell noted that middle schoolers are in the stage of rapid development with the most changes for the brain and body outside of infancy.

    “This is actually an optimum time to step in and step up for them,” she said. “It is not too late. But it’s critical that we pay attention to them now.”

    Crawford County Middle School has nine periods every day, and students spend two periods each in both math and English. While many schools have some version of block scheduling, those models often have students attend each class only every other day. But at Crawford, students attend every class every day, so their version of block scheduling results in double the amount of instructional time in math and English.

    To make this switch, sacrifices had to be made. Periods were shortened, resulting in less time for other subjects. Carothers worried that student scores in subjects like science and social studies would decrease. But the opposite occurred, she said. Sixth grade science scores increased, for example, even though students were spending less time in the science classroom, according to Carothers.

    “If they have better math skills and better reading skills, then they’re gonna perform better in social studies and science,” she said.

    Meanwhile, at Cannelton Jr. Sr. High School, on the state line with Kentucky, the first three periods of the day are 90 minutes, rather than the typical 45. Every student has English or math during these first three periods, allowing for double the normal class time.

    Cannelton’s sixth through eighth grade English language arts ILEARN scores increased by nearly nine percentage points last year.

    Schools use more data to track student performance

    Cannelton Principal Brian Garrett believes his school’s reliance on data, and its new approach to getting it, is also part of the secret.

    Students take benchmark assessments early, in the first two or three weeks of school, so that teachers can track their progress and find gaps in knowledge.

    This year, the state is adopting that strategy for schools statewide. Rather than taking ILEARN once near the end of the year, students will take versions of the test three separate times, with a shortened final assessment in the spring. The state ran a pilot for ILEARN checkpoints last school year, with over 70% of Indiana schools taking part.

    The Indiana Department of Education hopes checkpoints will make the data from the test more actionable and help families and teachers ensure a student is on track throughout the year.

    Kim Davis, principal of Indian Creek Middle School in rural Trafalgar, said she believes ILEARN checkpoints, paired with reflection and targeted remediation efforts by teachers, “helped us inform instruction throughout the year instead of waiting until the end of the year to see did they actually master it according to the state test.”

    The checkpoints identified what standards students were struggling with, allowing Indian Creek teachers to tailor their instruction. Students also benefitted from an added familiarity with the test; they could see how questions would be presented when it was time for the final assessment in the spring.

    “It felt very pressure-free, but very informative for the teachers,” Davis said.

    The type of data gathered matters too. In the past, Washington Township middle schools used an assessment called NWEA, taken multiple times throughout the year, to measure student learning, said Eastwood Middle School Principal James Tutin. While NWEA was a good metric for measuring growth, it didn’t align with Indiana state standards, so the scores didn’t necessarily match how a student would ultimately score on a test like ILEARN.

    Last year, the district adopted ILEARN checkpoints instead, and used a service called Otis to collect weekly data.

    Students took approximately six minutes during a class period to answer a few questions, producing information that educators could then put into Otis. That data allowed teachers to target instruction during gaps between ILEARN checkpoints.

    “Not only were they getting the practice through the checkpoints, but they were getting really targeted feedback at the daily and weekly level, to make sure that we’re not waiting until the checkpoint to know how our students are likely going to do,” Tutin said.

    Both Davis and Tutin stressed that simply having students take the checkpoint ILEARN tests was not enough; it had to be paired with reflection and collaboration between teachers, pushing each other to ask the tough questions and evaluate their own teaching.

    “We still have a fire in us to grow further, we’re not content with where we are,” Davis said. “But we’re headed in the right direction and that’s very exciting.”

    This story was originally published by Chalkbeat. Chalkbeat is a nonprofit news site covering educational change in public schools. Sign up for their newsletters at ckbe.at/newsletters



  • Release of NAEP science scores


    UPDATE: After this story was published, the Education Department issued a press release Monday afternoon, July 7, announcing that Matthew Soldner will serve as acting commissioner of the National Center for Education Statistics, in addition to his role as acting director of the Institute of Education Sciences. The job of statistics chief had been vacant since March and had prevented the release of assessment results.

    The repercussions from the decimation of staff at the Education Department keep coming. Last week, the fallout led to a delay in releasing results from a national science test.

    The National Assessment of Educational Progress (NAEP) is best known for tests that track reading and math achievement but includes other subjects, too. In early 2024, when the main reading and math tests were administered, there was also a science section for eighth graders. 

    The board that oversees NAEP had announced at its May meeting that it planned to release the science results in June. But that month has since come and gone. 

    Why the delay? There is no commissioner of education statistics to sign off on the score report, a requirement before it is released, according to five current and former officials who are familiar with the release of NAEP scores, but asked to remain anonymous because they were not authorized to speak to the press or feared retaliation. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Peggy Carr, a Biden administration appointee, was dismissed as the commissioner of the National Center for Education Statistics in February, two years before the end of her six-year term set by Congress. Chris Chapman was named acting commissioner, but he was fired in March, along with half the employees at the Education Department. The role has remained vacant since.

    A spokesman for the National Assessment Governing Board, which oversees NAEP, said the science scores will be released later this summer, but denied that the lack of a commissioner is the obstacle. “The report building is proceeding so the naming of a commissioner is not a bureaucratic hold-up to its progress,” Stephaan Harris said by email.

    The delay matters. Education policymakers have been keen to learn if science achievement had held steady after the pandemic or tumbled along with reading and math. (Those reading and math scores were released in January.)

    The Trump administration has vowed to dismantle the Education Department and did not respond to an emailed question about when a new commissioner would be appointed. 

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    Researchers hang onto data

    Keeping up with administration policy can be head-spinning these days. Education researchers were notified in March that they would have to relinquish federal data they were using for their studies. (The department shares restricted datasets, which can include personally identifiable information about students, with approved researchers.) 

    But researchers learned on June 30 that the department had changed its mind and decided not to terminate this remote access. 

    Lawyers who are suing the Trump administration on behalf of education researchers heralded this about-face as a “big win.” Researchers can now finish projects in progress. 

    Still, researchers don’t have a way of publishing or presenting papers that use this data. Since the mass firings in mid-March, there is no one remaining inside the Education Department to review their papers for any inadvertent disclosure of student data, a required step before public release. And there is no process at the moment for researchers to request data access for future studies. 

    “While ED’s change-of-heart regarding remote access is welcome,” said Adam Pulver of Public Citizen Litigation Group, “other vital services provided by the Institute of Education Sciences have been senselessly, illogically halted without consideration of the impact on the nation’s educational researchers and the education community more broadly.  We will continue to press ahead with our case as to the other arbitrarily canceled programs.”

    Pulver is the lead attorney for one of three suits fighting the Education Department’s termination of research and statistics activities. Judges in the District of Columbia and Maryland have denied researchers a preliminary injunction to restore the research and data cuts. But the Maryland case is now fast-tracked and the court has asked the Trump administration to produce an administrative record of its decision-making process by July 11. (See this previous story for more background on the court cases.)

    Related: Education researchers sue Trump administration, testing executive power

    Some NSF grants restored in California

    Just as the Education Department is quietly restarting some activities that DOGE killed, so is the National Science Foundation (NSF). The federal science agency posted on its website that it had reinstated 114 awards to 45 institutions as of June 30. NSF said it was doing so to comply with a federal court order to reinstate awards to all University of California researchers. It was unclear how many of these research projects concerned education, one of the major areas that NSF funds.

    Researchers and universities outside the University of California system are hoping for the same reversal. In June, the largest professional organization of education researchers, the American Educational Research Association, joined forces with a large coalition of organizations and institutions in filing a legal challenge to the mass termination of grants by the NSF. Education grants were especially hard hit in a series of cuts in April and May. Democracy Forward, a public interest law firm, is spearheading this case.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about delaying the NAEP science score report was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


  • Texas Students Make Gains in Reading but Struggle with Math, STAAR Scores Show – The 74


    Texas’ students saw some wins in reading but continued to struggle to bounce back from pandemic-related learning losses in math, state testing results released Tuesday showed.

    Elementary students who took the State of Texas Assessments of Academic Readiness exam this year made the biggest gains in reading across grade levels. Third graders saw a three percentage point increase in reading, a milestone because early literacy is a strong indicator of future academic success. Progress among middle school students in the subject, meanwhile, slowed.

    “These results are encouraging and reflect the impact of the strategic supports we’ve implemented in recent years,” said Texas Education Agency Commissioner Mike Morath. “We are seeing meaningful signs of academic recovery and progress.”

    This year’s third grade test takers have benefited from state investments in early literacy in recent years. Teachers in their classrooms have completed state-led training in early literacy instruction, known as reading academies. The state also expanded pre-K access and enrollment in 2019.

    Morath did acknowledge students needed more help to make similar gains in math. Five years after pandemic-related school closures, students are still struggling to catch up in that subject, the results showed. About 43% of students met grade-level standards for math, a 2 percentage point increase from the previous year, but still shy of the 50% reached in 2019.

    Low performance in math can effectively shut students out of high-paying, in-demand STEM careers. Economic leaders have been sounding the alarm about the implications that weak math skills can have on the state’s future workforce pipeline.

    The STAAR exam tests all Texas public school students in third through eighth grade in math and reading. A science test is also administered for fifth and eighth graders, as well as a social studies test for eighth graders. Science performance improved among fifth and eighth grades by 3 and 4 percentage points respectively, but students in those grades are still below where they were before the pandemic.

    Students in special education also made small gains. English learners, meanwhile, saw drops in all subjects but one — a 4 percentage point decrease in reading, a 2 point decrease in math, and a 2 point decrease in social studies.

    The test scores give families a snapshot of how Texas students are learning. School accountability ratings — which the Texas Education Agency gives out to each district and campus on an A through F scale as a score for their performance — are also largely based on how students do on the standardized tests.

    The test often casts a shadow over classrooms at the end of the year, with teachers across the state saying they lose weeks of valuable instructional time preparing children to take the test. Some parents also don’t like the test because of its high-stakes nature. They have said their kids don’t want to go to school because of the enormous pressure the hours-long, end-of-year test puts on them.

    A bill that would have scrapped the STAAR test died in the last days of the 2025 legislative session. Both Republican and Democratic legislators expressed a desire to overhaul STAAR, but in the end, the House and Senate could not align on what they wanted out of an alternative test.

    Legislators this session did approve a sweeping school finance package that included academic intervention for students who are struggling before they first take their STAAR test in third grade. The package also requires teachers get training in math instruction, mirroring existing literacy training mandates.

    Parents can look up their students’ test results here.

    Graphics by Edison Wu

    This article originally appeared in The Texas Tribune, a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.



  • “The accreditors are coming!” 4 ways to use student satisfaction scores to prepare


    Does your campus fully utilize its student satisfaction scores at accreditation time? As a reminder, regular assessments of student satisfaction provide data for four key institutional activities:

    • Retention/student success
    • Strategic planning
    • Recruiting new students
    • Accreditation documentation

    The accreditation process can be time-consuming and stressful for your campus staff and leadership, yet it is essential to complete on the designated cycle. And while the official process is something you address once every decade, regularly gathering data from your students and maintaining proactive processes can make the official requirements go much more smoothly.

    My colleague Charles Schroeder likes to say that during self-studies, people on campus begin running around gathering data and shouting, “The accreditors are coming! The accreditors are coming!” To avoid this reaction, our recommendation is to assess student satisfaction not just as part of your self-study, but on a regular cycle, once every two or three years (if not annually).

    4 ways to use student satisfaction scores to prepare for accreditors

    How can you use data from student satisfaction surveys in your accreditation process? I have four suggestions for you.

    1. Match the survey items to your accreditation requirements. As a resource for you, we have mapped the individual items on the Ruffalo Noel Levitz (RNL) Satisfaction-Priorities Surveys (including the Student Satisfaction Inventory, the Adult Student Priorities Survey, and the Priorities Survey for Online Learners) to the individual criteria for all of the regional accreditors across the United States. You can download the relevant mapping document for your survey version and region here. By seeing how the items on each survey are mapped to the regional accrediting agency requirements, you can take the guesswork out of determining how the student feedback lines up with the documentation you need to provide.

    2. Respond to student-identified challenge items. The RNL Satisfaction-Priorities Surveys identify areas of high importance and low satisfaction as challenge items (see the sketch after this list). These are priority areas for improvement based on the perceptions of your students. By actively working to improve the student experience in these areas, you can potentially improve overall student satisfaction, which studies have correlated with better individual student retention, higher institutional graduation rates, higher institutional alumni giving, and lower loan default rates. Improvements in these areas are going to look good for your accreditation.

    3. Document your student-identified strengths. The RNL Satisfaction-Priorities Surveys also reflect student-identified strengths, which are items of high importance and high satisfaction. These are the areas that your students care about, and where they think you are doing a good job. Mentioning your strengths to your accreditors helps to position you in a positive light and to focus the conversation on where you are meeting or exceeding student expectations.

    4. Show improvements over time. As indicated earlier, student satisfaction surveys should not be a “once and done” activity, or even an activity done just once every five to ten years. The institutions we work with that assess student satisfaction systematically every two or three years, and actively work to improve the student experience in the intervening years, are seeing student satisfaction levels increase year over year. This process shows your commitment to your students and to your accreditors, and reflects that continuous quality improvement is valued by your institution.
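
    For campuses that want to run this classification on their own survey exports, here is a minimal sketch of the challenge/strength logic from suggestions 2 and 3 above. The mean-split thresholds and the 7-point item scores are illustrative assumptions, not RNL’s actual scoring methodology.

    ```python
    # Classify survey items into the quadrants described above:
    # challenges = high importance + low satisfaction;
    # strengths  = high importance + high satisfaction.
    # Mean-split cutoffs are an illustrative assumption, not RNL's method.

    from statistics import mean

    # Hypothetical item scores on a 7-point scale: (importance, satisfaction)
    items = {
        "Advising is accessible": (6.4, 4.1),
        "Registration is smooth": (6.1, 5.9),
        "Campus feels safe": (6.5, 6.2),
        "Parking is adequate": (4.2, 3.0),
    }

    imp_cut = mean(imp for imp, _ in items.values())
    sat_cut = mean(sat for _, sat in items.values())

    for name, (imp, sat) in items.items():
        if imp >= imp_cut and sat < sat_cut:
            label = "challenge: prioritize for improvement"
        elif imp >= imp_cut:
            label = "strength: document for accreditors"
        else:
            label = "lower priority"
        print(f"{name}: importance={imp}, satisfaction={sat} -> {label}")
    ```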

    Ready to learn more?

    Are you ready to regularly assess student satisfaction? Are you interested in connecting the results to your accreditation criteria? Do you want to learn more about moving forward with a satisfaction assessment? Contact RNL with any questions you have and we will look forward to assisting you.

    Note: This blog was originally published in November 2016 and was updated with new content in May 2025.


  • Liaison Unveils New Intelligent Names Degree Intent Scores, Enhancing Predictive Power and Reach 


    Liaison, a leader in education technology and data-driven solutions, is excited to announce the release of its 2025 Intelligent Names Degree Intent Scores. These advanced scores represent a transformative leap in identifying adult learners nationwide with the highest potential for pursuing a degree. 

    The 2025 Degree Intent Scores are powered by cutting-edge data science, advanced modeling techniques, and insights from a national survey conducted in late 2024. Combined with responses from Liaison’s extensive consumer database of over 260 million Americans, this enhanced model offers unparalleled precision and reach into the adult learner market.

    Recent testing using a national dataset of graduate program applicants showed a 20% improvement in predicting applicant activity within the highest intent band when comparing the new intent scores to the original. Similarly, an analysis of a national dataset of bachelor’s degree seekers found that Liaison’s Bachelor’s Degree Intent model accurately identified 91% of degree seekers under the age of 25 in the top two quintiles. These findings underscore the model’s remarkable accuracy, effectiveness, and value for higher education institutions. 
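
    As a rough illustration of the quintile statistic cited above, the sketch below shows how a “top two quintiles” capture rate can be computed from model scores. The data is synthetic and the code is not Liaison’s; it only demonstrates the metric.

    ```python
    # Sketch: what share of actual degree seekers land in the top two
    # quintiles of a model's intent scores? All data here is synthetic;
    # the article reports Liaison's model reached 91% for bachelor's
    # degree seekers under age 25.

    import random

    random.seed(42)

    # Hypothetical records: (intent_score, actually_sought_degree)
    records = [(random.random(), random.random() < 0.2) for _ in range(10_000)]

    # Rank by score, highest first, and take the top two of five quintiles.
    ranked = sorted(records, key=lambda r: r[0], reverse=True)
    top_two = ranked[: 2 * (len(ranked) // 5)]

    seekers_total = sum(1 for _, seeker in records if seeker)
    seekers_captured = sum(1 for _, seeker in top_two if seeker)

    print(f"Top-two-quintile capture rate: {seekers_captured / seekers_total:.1%}")
    # Scores uncorrelated with outcomes hover near 40% (2 of 5 quintiles by
    # chance); a strong predictive model pushes this figure much higher.
    ```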

    “The 2025 Degree Intent Scores mark a major milestone in our mission to connect educational institutions with adult learners who are ready to take the next step in their academic journeys,” said Dr. Mark Voortman, Chief Data Scientist at Liaison. “By leveraging large-scale data and state-of-the-art modeling techniques, we’ve significantly enhanced our ability to help institutions identify adult learners most likely to pursue degree opportunities in the near future.” 

    The updated scoring model empowers colleges, universities, and other education providers with deeper, data-driven insights to refine recruitment strategies, enhance student engagement, and achieve enrollment goals more effectively. 

    Learn more about Intelligent Names here.


  • Preliminary results show essentially flat state test scores


    New Mexico students made tiny gains in literacy and dipped slightly in math proficiency last school year, according to preliminary results released Sept. 19 by the Public Education Department.

    Notably, the results look slightly more favorable than those in data also released Sept. 19 by the Legislative Education Study Committee, which showed 2023-24 scores flat. PED attributed the discrepancy to an error it discovered in the data earlier this week.

    Regardless, the results show that a large majority of the state’s public education students continue to fall short of grade-level proficiency. This suggests that New Mexico will remain close to the bottom nationally in student achievement.

    The PED results do not include detailed demographic or grade-level breakdowns. Those results will be released on Oct. 4. 

    According to the PED, 39 percent of K-8 and 11th-grade students scored proficient or better on state literacy assessments, compared to 38 percent in 2023, and up from 34 percent in 2022. The LESC results showed that the literacy proficiency rate was flat at 38 percent.

    In math, PED data show 23 percent of students in grades 3-8 and 11 proficient. That’s down one percentage point from 2023 and two from 2022. LESC numbers showed 22 percent proficient in 2024.

    In science, tested in grades 5, 8 and 11, scores were up three percentage points, from 34 percent proficient last year to 37 percent in 2024.


  • Changes in AP Scores, 2022 to 2024


    Used to be, with a little work, you could download very detailed data on AP results from the College Board website: For every state, and for every course, you could see performance by ethnicity.  And, if you wanted to dig really deep, you could break out details by private and public schools, and by grade level.  I used to publish the data every couple of years.

    Those days are gone. The transparency the College Board touts as a value seems to have its limits, and I understand this to some extent: Racists loved to twist the data using single-factor analysis, and that’s not good for a company that is trying to make business inroads with under-represented communities while cloaking its pursuit of revenue as an altruistic push toward access.

    They still publish data, but as I wrote about in my last post, it’s far less detailed. What is easily accessible is fairly sterile, and the more detailed data seems to be structured in a way that suggests the company doesn’t want you digging down into it.

    But prompted by a series of tweets from Marco Learning, drawing on research by its founder John Moscatiello, I set about scraping the data off the website, as on this page for 2024, this page for 2023, and this page for 2022. After first making a mistake because of the way the data are formatted and laid out, I’ve done manual checks and double-checks on this, especially on the exams where the results look way out of whack with what you would expect.
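
    For anyone who wants to replicate the scraping, a minimal sketch follows. The URL is a placeholder and the column layout is an assumption (College Board has reorganized these pages before), so inspect the live pages first; this is a starting point, not a working recipe for the current site.

    ```python
    # Sketch: pull AP score-distribution tables from an HTML page into pandas.
    # The URL is a placeholder, not the actual College Board address, and the
    # column names are assumptions; check the live page's structure first.

    import pandas as pd

    URL = "https://example.org/ap-score-distributions-2024"  # placeholder

    # read_html() parses every <table> element on the page into a DataFrame.
    tables = pd.read_html(URL)
    print(f"Found {len(tables)} tables")

    # Assume the first table has one row per exam and one column per score.
    scores = tables[0]
    scores.columns = ["exam", "pct_5", "pct_4", "pct_3", "pct_2", "pct_1"]

    # The kind of manual sanity check mentioned above: each row's score
    # percentages should sum to roughly 100.
    scores["total"] = scores[["pct_5", "pct_4", "pct_3", "pct_2", "pct_1"]].sum(axis=1)
    print(scores[["exam", "total"]].head())
    ```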

    Marco Learning’s take was that the score increases were intentional on the part of the College Board, and that they would continue on other exams in the future.

    They also pointed out that this would save students a lot of money in college tuition, and of course, that’s true; if the tests were correctly designed, and students did better, that would be good news.  But the question is really: Should they be getting credit for these results?  Do the changes in performance mean that students are more qualified, or that the tests are easier?  And in some subjects, does giving credit for some courses actually set students up for failure in subsequent classes?

    This is problematic because College Board has spent a lot of money lobbying state legislatures to pass laws requiring public universities to grant credit for AP exams (usually a 3 or above). The assumption on college campuses is that — despite some mistrust of the College Board and its methods — it has good psychometricians who ensure test design meets rigorous standards, so that a grade of 4, for instance, means the same thing today as it did five years ago.

    But the incentive to enforce that rigor is gone, since states have effectively endorsed the outcomes of these exams as valid and worthy.  College Board can now shift to growing market penetration, as they do when they encourage school districts to push AP, and encourage even students who might not be prepared to take AP classes.

    And, of course, as always seems to be the case, there is some measure of hypocrisy in the current statements of College Board compared to things they’ve said in the past. Remember the book “Measuring Success” which was written in large part by College Board staff members and fans, and railed against grade inflation, using data that suggested otherwise? (College Board disavows any formal connection to the book, but their Communications Staff Members were thanked in the foreword.)

    Paul Tough, in his book “The Years that Matter Most,” pointed out that College Board’s own conclusions contradict the evidence they published.

    The data are below, in three views. Before you leap to conclusions, keep in mind that a lot of things might explain why scores on some exams swing so wildly in a year, but College Board’s refusal to publish this data in an easily machine-readable format makes that insight really hard to get at (and they won’t provide it themselves, as they never respond publicly to criticism like this).

    At a bare minimum, when College Board exam results show wild swings like this (especially if they are intentional), I think the company owes it to every university that accepts scores, and every state legislature it has lobbied to approve the tests, to actively notify them of the changes.

    View one (using the tabs across the top) shows thermometer charts: Choose any class using the drop-down box. You’ll find big changes in some of the classes, and some that seem perfectly tuned over time.

    View two shows the same data in a format some might find easier.

    View three shows all exams that have three years of data (thus, excluding African-American Studies and Pre-Calc) for a wider view of the program.
