Category: Proof Points

  • Trump’s admissions data collection strains college administrators

    by Jill Barshay, The Hechinger Report
    January 19, 2026

    Lynette Duncan didn’t expect to spend 20 hours over the past two weeks digging through a mothballed computer system, trying to retrieve admissions data from 2019.

    Duncan is the director of institutional research at John Brown University, a small Christian university in northwest Arkansas, an hour’s drive from Walmart’s headquarters. She runs a one-person office that handles university data collections and analyses, both for internal use and to meet government mandates. Just last year, she spent months collecting and crunching new data to comply with a new federal rule requiring that colleges show that their graduates are prepared for good jobs.

    Then, in mid-December, another mandate abruptly arrived — this one at the request of President Donald Trump. Colleges were ordered to compile seven years of admissions data, broken down by race, sex, grades, SAT or ACT scores, and family income.

    “It’s like one more weight on our backs,” Duncan said. “The workload – it’s not fun.”

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    John Brown University is one of almost 2,200 colleges and universities nationwide now scrambling to comply by March 18 with the new federal reporting requirement, formally known as the Admissions and Consumer Transparency Supplement, or ACTS. By all accounts, it’s a ton of work, and at small institutions, the task falls largely on a single administrator or even the registrar. Failure to submit the data can bring steep fines and, ultimately, the loss of access to federal aid for students.

    After the Supreme Court’s 2023 decision banning affirmative action in college admissions, the Trump administration suspected that colleges might covertly continue to give racial preferences. To police compliance, the White House directed the Department of Education to collect detailed admissions data from colleges nationwide.

    The data collection was unusual not only in its scope, but also in its speed. Federal education data collections typically take years to design, with multiple rounds of analysis, technical review panels, and revisions. This one moved from announcement to launch in a matter of months.

    A rush job

    One tiny indication that this was a rush job is in the Federal Register notice. In a proposal that’s all about admissions enforcement, both “admissions” and “enforce” are misspelled, appearing as “admssions” and “enforece.”

    A December filing with the Office of Management and Budget incorrectly lists the number of institutions that are subject to the new data collection. It is nearly 2,200, not 1,660, according to the Association for Institutional Research, which is advising colleges on how to properly report the data. Community colleges are exempt, but four-year institutions with selective admissions or those that give out their own financial aid must comply. Graduate programs are included as well. That adds up to about 2,200 institutions. 

    Related: Inaccurate, impossible: Experts knock new Trump plan to collect college admissions data

    In another filing with the Office of Management and Budget, the administration disclosed that none of the five remaining career Education Department officials with statistical experience had reviewed the proposal, including Matt Soldner, the acting commissioner of the National Center for Education Statistics. Most of the department’s statistical staff were fired earlier this year as a first step to eliminating the Education Department, one of Trump’s campaign promises. RTI International, the federal contractor in North Carolina that already manages other higher education data collections for the Education Department, is also handling the day-to-day work of this new college admissions collection. 

    During two public comment periods, colleges and higher-education trade groups raised concerns about data quality and missing records, but there is little evidence those concerns substantially altered the final design. One change expanded the retrospective data requirement from five to six years so that at least one cohort of students would have a measurable six-year graduation rate. A second relieved colleges of the burden of making hundreds of complex statistical calculations themselves, instead instructing them to upload raw student data to an “aggregator tool” that would do all the math for them. 

    The Trump administration’s goal is to generate comparisons across race and sex categories, with large gaps potentially triggering further scrutiny.

    Missing data

    The results are unlikely to be reliable, experts told me, given how much of the underlying data is missing or incomplete. In a public comment letter, Melanie Gottlieb, executive director of the American Association of Collegiate Registrars and Admissions Officers, warned that entire years of applicant data may not exist at many institutions. Some states advise colleges to delete the records of applicants who never enrolled once a year has passed. “If institutions are remaining compliant with their state policies, they will not have five years of data,” Gottlieb wrote.

    The organization’s own guidance recommends that four-year colleges retain admissions records for just one year after an application cycle. One reason is privacy. Applicant files contain sensitive personal information, and purging unneeded records reduces the risk of exposing this data in breaches.

    In other cases, especially at smaller institutions, admissions offices may offload applicant data simply to make room for new student records. Duncan said John Brown University has all seven years of required data, but a switch to a new computer system in 2019 has made it difficult to retrieve the first year.

    Even when historical records are available, key details may be missing or incompatible with federal requirements, said Christine Keller, executive director of the Association for Institutional Research, which held a federal contract to train college administrators on accurate data collection until DOGE eliminated it. (The organization now receives some private funds for a reduced amount of training.)

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    Standardized test scores are unavailable for many students admitted under test-optional policies. The department is asking colleges to report an unweighted grade-point average on a four-point scale, even though many applicants submit only weighted GPAs on a five-point scale. In those cases, and there may be many of them, colleges are instructed to report the GPA as “unknown.”

    Some students decline to report their race. Many holes are expected for family income. Colleges generally have income data only for students who completed federal financial-aid forms, which many applicants never file. 

    Ellen Keast, a spokeswoman for the Education Department, said in an email, “Schools are not expected to provide data they don’t have.” She added, “We know that some schools may have missing data for some data elements. We’ll review the extent of missing data before doing further calculations or analyses.”

    Male or female

    Even the category of sex poses problems. The Education Department’s spreadsheet allows only two options: male or female. Colleges, however, may collect sex or gender information using additional categories, such as nonbinary. 

    “That data is going to be, in my estimation, pretty worthless when it comes to really showing the different experiences of men and women,” Keller said. She is urging the department to add a “missing” option to avoid misleading results. “I think some people in the department may be misunderstanding that what’s needed is a missing-data option, not another sex category.”

    The new “aggregator tool” itself is another source of anxiety. Designed to spare colleges from calculating quintile buckets for grades and test scores by race and sex, it can feel like a black box. Colleges are supposed to fill rows and rows of detailed student data into spreadsheets and then upload the spreadsheets into the tool. The tool generates pooled summary statistics, such as the number of Black female applicants and admitted students who score in the top 20 percent at the college. Only the aggregated data will be reported to the federal government.
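
    The department has not published the aggregator tool’s internals, so the following is only a minimal sketch of the kind of pooling described above, assuming a hypothetical applicant spreadsheet with columns named race, sex, admitted and sat; it is not the actual ACTS tool.

    ```python
    # Illustrative sketch only; the real ACTS aggregator tool is not public.
    # Assumes a hypothetical spreadsheet with columns "race", "sex",
    # "admitted" (True/False) and "sat".
    import pandas as pd

    applicants = pd.read_csv("applicants_2024.csv")  # hypothetical file name

    # Bucket applicants into SAT quintiles computed within this one college.
    applicants["sat_quintile"] = pd.qcut(applicants["sat"], q=5, labels=[1, 2, 3, 4, 5])

    # Pooled counts by race and sex among the top 20 percent of scorers,
    # for all applicants and for admitted students.
    top = applicants[applicants["sat_quintile"] == 5]
    summary = top.groupby(["race", "sex"]).agg(
        applicants=("admitted", "size"),
        admitted=("admitted", "sum"),
    )
    print(summary)
    ```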

    At John Brown University, Duncan worries about what those summaries might imply. Her institution is predominantly white and has never practiced affirmative action. But if high school grades or test scores differ by race — as they often do nationwide — the aggregated results could suggest bias where none was intended.

    “That’s a concern,” Duncan said. “I’m hopeful that looking across multiple years of data, it won’t show that. You could have an anomaly in one year.”

    The problem is that disparities are not anomalies. Standardized test scores and academic records routinely vary by race and sex, making it difficult for almost any institution to avoid showing gaps.

    A catch-22 for colleges

    The stakes are high. In an emailed response to my questions, the Education Department pointed to Trump’s Aug. 7 memorandum, which directs the agency to take “remedial action” if colleges fail to submit the data on time or submit incomplete or inaccurate information.

    Under federal law, each violation of these education data-reporting requirements can carry a fine of up to $71,545. Repeated noncompliance can ultimately lead to the loss of access to federal student aid, meaning students could no longer use Pell Grants or federal loans to pay tuition.

    That leaves colleges in a bind. Failing to comply is costly. Complying, meanwhile, could produce flawed data that suggests bias and invites further scrutiny.

    The order itself contradicts another administration goal. President Trump campaigned on reducing federal red tape and bureaucratic burden. Yet ACTS represents a significant expansion of paperwork for colleges. The Office of Management and Budget estimates that each institution will spend roughly 200 hours completing the survey this year — a figure that higher-education officials say may be an understatement.

    Duncan is hoping she can finish the reporting in less than 200 hours, if there are no setbacks when she uploads the data. “If I get errors, it could take double the time,” she said.

    For now, she is still gathering and cleaning old student records and waiting to see the results… all before the March 18 deadline.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about college admissions data was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-admissions-data-collection-strains-colleges/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • Conservatives see married parents as a solution to low student achievement. It’s not that simple

    by Jill Barshay, The Hechinger Report
    January 12, 2026

    Conservatives have long argued that unwed motherhood and single parenting are major drivers of poor student achievement. They contend that traditional two-parent families — ideally with a married mother and father — provide the stability children need to succeed in school. Single-parent households, more common among low-income families, are blamed for weak academic outcomes.

    That argument has resurfaced prominently in Project 2025, a policy blueprint developed by the conservative Heritage Foundation that calls for the federal government to collect and publish more education data broken out by family structure.

    Project 2025 acknowledges that the Education Department already collects some of this data, but asserts that it doesn’t make it public. That’s not true, though you need expertise to extract it. When I contacted the Heritage Foundation, the organization responded that the family-structure data should still be “readily available” to a layman, just like student achievement by race and sex. Fair point.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    With some help, I found the figures and the results complicate the conservative claim.

    Since 2013, the National Assessment of Educational Progress (NAEP), often called the Nation’s Report Card, has asked students about who lives in their home. While the question does not capture every family arrangement, the answers provide a reasonable, albeit imperfect, proxy for family structure, and they allow the public to examine how a nationally representative sample of students from different types of households performs academically.

    I wanted to look at the relationship between family structure and student achievement by family income. Single-parent families are far more common in low-income communities and I didn’t want to conflate achievement gaps by income with achievement gaps by family structure. For example, 43 percent of low-income eighth graders live with only one parent compared with 13 percent of their high-income peers. I wanted to know whether kids who live with only one parent perform worse than kids with the same family income who live with both parents.

    To analyze the most recent data from the 2024 NAEP exam, I used the NAEP Data Explorer, a public tool developed by testing organization ETS for the National Center for Education Statistics (NCES). I told an ETS researcher what I wanted to know and he showed me how to generate the cross-tabulations, which I then replicated independently across four tests: fourth- and eighth-grade reading and math. Finally, I vetted the results with a former senior official at NCES and with a current staff member at the governing board that oversees the NAEP assessment.
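
    For readers who want to see the shape of that analysis, here is a minimal sketch of the cross-tabulation with hypothetical student-level data and column names; a real NAEP analysis must use the assessment’s sampling weights and plausible values, which the NAEP Data Explorer applies automatically.

    ```python
    # Conceptual sketch of the cross-tabulation only. Real NAEP analysis requires
    # the survey's sampling weights and plausible values; the NAEP Data Explorer
    # handles those. File and column names here are hypothetical.
    import pandas as pd

    students = pd.read_csv("naep_grade4_reading.csv")  # hypothetical extract

    # Average scale score by income tercile and household structure.
    table = students.pivot_table(
        values="reading_score",
        index="income_tercile",      # e.g., "low", "middle", "high"
        columns="family_structure",  # e.g., "both parents", "mother only"
        aggfunc="mean",
    )
    print(table.round(0))
    ```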

    The analysis reveals a striking pattern.

    Among low-income students, achievement differs little by family structure. Fourth- and eighth-grade students from low-income households score at roughly the same level whether they live with both parents or with only one parent. Two-parent households do not confer a measurable academic advantage in this group. Fourth-grade reading is a great example. Among the socioeconomic bottom third of students, those who live with both parents scored a 199. Those who live with just mom scored 200. The results are almost identical and, if anything, a smidge higher for the kids of single moms. 

    As socioeconomic status rises, however, differences by family structure become more pronounced. Among middle- and high-income students, those living with both parents tend to score higher than their peers living with only one parent. The gap is largest among the most affluent students. In fourth-grade reading, for example, higher-income kids who live with both parents scored a 238, a whopping 10 points higher than their peers who live with only their moms. Experts argue over the meaning of a NAEP point, but some equate 10 NAEP points to a school year’s worth of learning. It’s substantial.

    Family structure matters less for low-income student achievement

    Still, it’s better to be rich in a single-parent household than poor in a two-parent household. High-income students raised by a single parent substantially outperform low-income students who live with both parents by at least 20 points, underscoring that money and the advantages it brings — such as access to resources, stable housing, and educational support — matter far more than household composition alone. In other words, income far outweighs family structure when it comes to student achievement.

    Despite the NAEP data, Jonathan Butcher, acting director of the center for education policy at the Heritage Foundation, stands by the contention that family structure matters greatly for student outcomes. He points out that research since the landmark Coleman report of 1966 has consistently found a relationship between the two. Most recently, in a 2022 American Enterprise Institute-Brookings report, 15 scholars concluded that children “raised in stable, married-parent families are more likely to excel in school, and generally earn higher grade point averages” than children who are not. Two recent books, Brad Wilcox’s “Get Married” (2024) and Melissa Kearney’s “The Two-Parent Privilege” (2023), make the case, too, and they point out that children raised by married parents are about twice as likely to graduate from college as children who are not. However, it’s unclear to me if all of this analysis has disaggregated student achievement by family income as I did with the NAEP data.

    Related: Trump administration makes good on many Project 2025 education goals

    Family structure is a persistent theme for conservatives. Just last week the Heritage Foundation released a report on strengthening and rebuilding U.S. families. In a July 2025 newsletter, Robert Pondiscio, senior fellow at the American Enterprise Institute, a conservative think tank, wrote that “the most effective intervention in education is not another literacy coach or SEL program. It’s dad.” He cited a June 2025 report, “Good Fathers, Flourishing Kids,” by scholars and advocates. (Disclosure: A group led by one of the authors of this report, Richard Reeves, is among the funders of The Hechinger Report.)

    That conclusion is partially supported by the NAEP data, but only for a relatively small share of students from higher-income families. (The share of high-income children living with only their mother ranges between 7 and 10 percent, and the single-parent rate is higher for eighth graders than for fourth graders.) For low-income students, who are Pondiscio’s and the scholars’ main concern, the data show no such advantage.

    The data has limitations. The NAEP survey does not distinguish among divorced families, grandparent-led households or same-sex parents. Joint custody arrangements are likely grouped with two-parent households because children may say that they live with both mother and father, if not at the same time. Even so, these nuances are unlikely to alter the core finding: For low-income students, academic outcomes are largely similar regardless of whether they live with both parents all of the time, some of the time, or with only one parent.

    The bottom line is that calls for new federal data collection by family structure, like those outlined in Project 2025, may not reveal what advocates expect. A family’s bank account matters more than a wedding ring. 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about family structure and student achievement was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-family-structure-student-achievement/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • Teachers who use math vocabulary help students do better in math

    by Jill Barshay, The Hechinger Report
    January 5, 2026

    Students, parents and school principals all instinctively know that some teachers are better than others. Education researchers have spent decades trying — with mixed success — to calculate exactly how much better.

    What remains far more elusive is why.

    A new study suggests that one surprisingly simple difference between stronger and weaker math teachers may be how often they use mathematical vocabulary, words such as “factors,” “denominators” and “multiples,” in class.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Teachers who used more math vocabulary had students who scored higher on math tests, according to a team of data scientists and education researchers from Harvard University, Stanford University and the University of Maryland. The size of the test score boost was substantial. It amounted to about half of the benefit researchers typically attribute to having a highly effective teacher, which is among the most important school-based factors that help children learn. Students with highly effective teachers can end up months ahead of their peers. 

    “If you’re looking for a good math teacher, you’re probably looking for somebody who’s exposing their students to more mathematical vocabulary,” said Harvard data scientist Zachary Himmelsbach, lead author of the study, which was published online in November 2025.

    The finding aligns with a growing body of research suggesting that language plays a critical role in math learning. A 2021 meta-analysis of 40 studies found that students with stronger math vocabularies tend to perform better in math, particularly on multi-step, complex problems. Understanding what a “radius” is, for example, can make it more efficient to talk about perimeter and area and understand geometric concepts. Some math curricula explicitly teach vocabulary and include glossaries to reinforce these terms.

    Related: Three reasons why so few eighth graders in the poorest schools take algebra

    But vocabulary alone is unlikely to be a magic ingredient.

    “If a teacher just stood in front of the classroom and recited lists of mathematical vocabulary terms, nobody’s learning anything,” said Himmelsbach. 

    Instead, Himmelsbach suspects that vocabulary is part of a broader constellation of effective teaching practices. Teachers who use more math terms may also be providing clearer explanations, walking students through lots of examples step-by-step, and offering engaging puzzles. These teachers might also have a stronger conceptual understanding of math themselves.

    It’s hard to isolate what exactly is driving the students’ math learning and what role vocabulary, in and of itself, is playing, Himmelsbach said.

    Himmelsbach and his research team analyzed transcripts from more than 1,600 fourth- and fifth-grade math lessons in four school districts recorded for research purposes about 15 years ago. They counted how often teachers used more than 200 common math terms drawn from elementary math curriculum glossaries.

    The average teacher used 140 math-related words per lesson. But there was wide variation. The top quarter of the teachers used at least 28 more math terms per lesson than the quarter of the teachers who spoke the fewest math words. Over the course of a school year, that difference amounted to roughly 4,480 additional math terms, meaning that some students were exposed to far richer mathematical language than others, depending on which teacher they happened to have that year.
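
    At its core, the transcript analysis is a counting exercise: tally how often glossary terms appear in each lesson. Here is a simplified sketch of that step with a tiny, hypothetical glossary; the study’s actual pipeline drew on more than 200 terms and handled much messier transcripts.

    ```python
    # Simplified sketch of counting math vocabulary in a lesson transcript.
    # The five-term glossary here is hypothetical; the study used 200+ terms
    # drawn from elementary curriculum glossaries.
    import re
    from collections import Counter

    MATH_TERMS = {"factor", "denominator", "multiple", "quotient", "fraction"}

    def count_math_terms(transcript: str) -> Counter:
        """Count how often each glossary term appears in a transcript."""
        words = re.findall(r"[a-z]+", transcript.lower())
        return Counter(w for w in words if w in MATH_TERMS)

    lesson = "Find a common denominator. What factor do both numbers share?"
    counts = count_math_terms(lesson)
    print(counts)                # Counter({'denominator': 1, 'factor': 1})
    print(sum(counts.values()))  # total math terms in this lesson: 2
    ```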

    The study linked these differences to student achievement. One hundred teachers were recorded over three years, and in the third year, students were randomly assigned to classrooms. That random assignment allowed the researchers to rule out the possibility that higher performing students were simply being clustered with stronger teachers.

    Related: A theory for learning numbers without counting gains popularity

    The lessons came from districts serving mostly low-income students. About two-thirds of students qualified for free or reduced-price lunch, more than 40 percent were Black, and nearly a quarter were Hispanic — the very populations that tend to struggle the most in math and stand to gain the most from effective instruction.

    Interestingly, student use of math vocabulary did not appear to matter as much as teacher use. Although the researchers also tracked how often students used math terms in class, they found no clear link between teachers who used more vocabulary and students who spoke more math words themselves. Exposure and comprehension, rather than verbal facility, may be enough to support stronger math performance.

    The researchers also looked for clues as to why some teachers used more math vocabulary than others. Years of teaching experience made no difference. Nor did the number of math or math pedagogy courses teachers had taken in college. Teachers with stronger mathematical knowledge did tend to use more math terms, but the relationship was modest.

    Himmelsbach suspects that personal beliefs play an important role. Some teachers, he said, worry that formal math language will confuse students and instead favor more familiar phrasing, such as “put together” instead of addition, or “take away” instead of subtraction. While those colloquial expressions can be helpful, students ultimately need to understand how they correspond to formal mathematical concepts, Himmelsbach said.

    This study is part of a new wave of education research that uses machine learning and natural language processing — computer techniques that analyze large volumes of text — to peer inside the classroom, which has long remained a black box. With enough recorded lessons, researchers hope not only to identify which teaching practices matter most, but also provide teachers with concrete, data-driven feedback.

    Related: A little parent math talk with kids might really add up

    The researchers did not examine whether teachers used math terms correctly, but they noted that future models could be trained to do just that, offering feedback on accuracy and context, not just frequency.

    For now, the takeaway is more modest but still meaningful: Students appear to learn more math when their teachers speak the language of math more often.  

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about math vocabulary was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-math-vocabulary/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • Do male teachers make a difference? Not as much as some think

    by Jill Barshay, The Hechinger Report
    November 17, 2025

    The teaching profession is one of the most female-dominated in the United States. Among elementary school teachers, 89 percent are women, and in kindergarten, that number is almost 97 percent.

    Many sociologists, writers and parents have questioned whether this imbalance hinders young boys at the start of their education. Are female teachers less understanding of boys’ need to horse around? Or would male role models inspire boys to learn their letters and times tables? Some advocates point to research that lays out why boys ought to do better with male teachers.

    But a new national analysis finds no evidence that boys perform or behave better with male teachers in elementary school. This challenges a widespread belief that boys thrive more when taught by men, and it raises questions about efforts, such as one in New York City, to spend extra to recruit them.

    “I was surprised,” said Paul Morgan, a professor at the University at Albany and a co-author of the study. “I’ve raised two boys, and my assumption would be that having male teachers is beneficial because boys tend to be more rambunctious, more active, a little less easy to direct in academic tasks.”

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    “We’re not saying gender matching doesn’t work,” Morgan added. “We’re saying we’re not observing it in K through fifth grade.”

    Middle and high school students might see more benefits. Earlier research is mixed and inconclusive. A 2007 analysis by Stanford professor Thomas Dee found academic benefits for eighth-grade boys and girls when taught by teachers of their same gender. And studies where researchers observe and interview a small number of students often show how students feel more supported by same-gender teachers. Yet many quantitative studies, like this newest one, have failed to detect measurable benefits for boys. At least 10 since 2014 have found zero or minimal effects. Benefits for girls are more consistent.

    This latest study, “Fixed Effect Estimates of Teacher-Student Gender Matching During Elementary School,” is a working paper not yet published in a peer-reviewed journal.* Morgan and co-author Eric Hu, a research scientist at Albany, shared a draft with me.

    Morgan and Hu analyzed a U.S. Education Department dataset that followed a nationally representative group of 8,000 students from kindergarten in 2010 through fifth grade in 2017. Half were boys and half were girls. 

    More than two-thirds — 68 percent — of the 4,000 boys never had a male teacher in those years while 32 percent had at least one. (The study focused only on main classroom teachers, not extras like gym or music.)

    Among the 1,300 boys who had both male and female teachers, the researchers compared each boy’s performance and behavior across those years. For instance, if Jacob had female teachers in kindergarten, first, second and fifth grades, but male teachers in third and fourth, his average scores and behavior were compared between the teachers of different genders.
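
    That design is, in spirit, a student fixed-effects comparison: each boy serves as his own control, and his outcomes in years with a male teacher are set against his outcomes in years with a female teacher. Below is a minimal sketch of that within-student contrast with made-up numbers and column names; the working paper itself fits formal fixed-effect regression models.

    ```python
    # Sketch of a within-student contrast with hypothetical data. The actual
    # paper estimates fixed-effect regressions; this only illustrates the idea
    # of comparing each child to himself across teacher genders.
    import pandas as pd

    rows = pd.DataFrame({
        "student":      ["Jacob"] * 6,
        "grade":        ["K", "1", "2", "3", "4", "5"],
        "teacher_male": [0, 0, 0, 1, 1, 0],   # male teachers in grades 3 and 4
        "math_score":   [48, 52, 55, 57, 60, 63],
    })

    # Each student's average score with male vs. female teachers, then the gap.
    by_gender = (
        rows.groupby(["student", "teacher_male"])["math_score"]
            .mean()
            .unstack()
    )
    by_gender["within_student_gap"] = by_gender[1] - by_gender[0]
    print(by_gender)
    ```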

    Related: Plenty of Black college students want to be teachers, but something keeps derailing them

    The researchers found no differences in reading, math or science achievement — or in behavioral and social measures. Teachers rated students on traits like impulsiveness, cooperation, anxiety, empathy and self-control. The children also took annual executive function tests. The results did not vary by the teacher’s gender.

    Most studies on male teachers focus on older students. The authors noted one other elementary-level study, in Florida, that also found no academic benefit for boys. This new research confirms that finding and adds that there seems to be no behavioral or social benefits either.

    For students at these young ages, 11 and under, the researchers also didn’t find academic benefits for girls with female teachers. But there were two non-academic ones: Girls taught by women showed stronger interpersonal skills (getting along, helping others, caring about feelings) and a greater eagerness to learn (represented by skills such as keeping organized and following rules).

    When the researchers combined race and gender, the results grew more complex. Black girls taught by women scored higher on an executive function test but lower in science. Asian boys taught by men scored higher on executive function but had lower ratings on interpersonal skills. Black boys showed no measurable differences when taught by male teachers. (Previous research has sometimes found benefits for Black students taught by Black teachers and sometimes hasn’t.)**

    Related: Bright black students taught by black teachers are more likely to get into gifted-and-talented classrooms

    Even if data show no academic or behavioral benefits for students, there may still be compelling reasons to diversify the teaching workforce, just as in other professions. But we shouldn’t expect these efforts to move the needle on student outcomes.

    “If you had scarce resources and were trying to place your bets,” Morgan said, “then based on this study, maybe elementary school isn’t where you should focus your recruitment efforts” to hire more men.

    To paraphrase Boyz II Men, it’s so hard to say goodbye — to the idea that young boys need male teachers.

    *Clarification: The article has not yet been published in a peer-reviewed journal but has undergone some peer review.

    **Correction: An earlier version incorrectly characterized how researchers analyzed what happened to students of different races. The researchers focused only on the gender of the teachers, but drilled down to see how students of different races responded to teachers of different genders. 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about male teachers was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-male-teachers-elementary-school/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • Advocates warn of risks to higher ed data if Education Department is shuttered

    by Jill Barshay, The Hechinger Report
    November 10, 2025

    Even with the government shut down, lots of people are thinking about how to reimagine federal education research. Public comments on how to reform the Institute of Education Sciences (IES), the Education Department’s research and statistics arm, were due on Oct. 15. A total of 434 suggestions were submitted, but no one can read them because the department isn’t allowed to post them publicly until the government reopens. (We know the number because the comment entry page has an automatic counter.)

    A complex numbers game 

    There’s broad agreement across the political spectrum that federal education statistics are essential. Even many critics of the Department of Education want its data collection efforts to survive — just somewhere else. Some have suggested moving the National Center for Education Statistics (NCES) to another agency, such as the Commerce Department, where the U.S. Census Bureau is housed.

    But Diane Cheng, vice president of policy at the Institute for Higher Education Policy, a nonprofit organization that advocates for increasing college access and improving graduation rates, warns that moving NCES could compromise the quality and usefulness of higher education data. Any move would have to be done carefully, with planning for future interagency coordination, she said.

    “Many of the federal data collections combine data from different sources within ED,” Cheng said, referring to the Education Department. “It has worked well to have everyone within the same agency.”

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    She points to the College Scorecard, the website that lets families compare colleges by cost, student loan debt, graduation rates, and post-college earnings. It merges several data sources, including the Integrated Postsecondary Education Data System (IPEDS), run by NCES, and the National Student Loan Data System, housed in the Office of Federal Student Aid. Several other higher ed data collections on student aid and students’ pathways through college also merge data collected at the statistical unit with student aid figures. Splitting those across different agencies could make such collaboration far more difficult.
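
    Concretely, that kind of cross-source work is a join on a shared institution identifier. Here is a minimal, hypothetical sketch of the step Cheng is describing; file and column names are placeholders, and the real College Scorecard pipeline is far more involved.

    ```python
    # Hypothetical sketch of linking two institution-level files on a shared ID.
    # The real College Scorecard merges IPEDS and federal student-aid records
    # through a much larger, more careful process.
    import pandas as pd

    ipeds = pd.read_csv("ipeds_outcomes.csv")    # e.g., institution_id, grad_rate
    aid = pd.read_csv("student_aid_debt.csv")    # e.g., institution_id, median_debt

    scorecard = ipeds.merge(aid, on="institution_id", how="inner")
    print(scorecard.head())
    ```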

    “If those data are split across multiple federal agencies,” Cheng said, “there would likely be more bureaucratic hurdles required to combine the data.”

    Information sharing across federal agencies is notoriously cumbersome, the very problem that led to the creation of the Department of Homeland Security after 9/11.

    Hiring and $4.5 million in fresh research grants

    Even as the Trump administration publicly insists it intends to shutter the Department of Education, it is quietly rebuilding small parts of it behind the scenes.

    In September, the department posted eight new jobs to replace fired staff who oversaw the National Assessment of Educational Progress (NAEP), the biennial test of American students’ achievement. In November, it advertised four more openings for statisticians inside the Federal Student Aid Office. Still, nothing is expected to be quick or smooth. The government shutdown stalled hiring for the NAEP jobs, and now a new Trump administration directive to form hiring committees by Nov. 17 to approve and fill open positions may further delay these hires.

    At the same time, the demolition continues. Less than two weeks after the Oct. 1 government shutdown, 466 additional Education Department employees were terminated — on top of the roughly 2,000 lost since March 2025 through firings and voluntary departures. (The department employed about 4,000 at the start of the Trump administration.) A federal judge temporarily blocked these latest layoffs on Oct. 15.

    Related: Education Department takes a preliminary step toward revamping its research and statistics arm

    There are other small signs of new life, too. On Sept. 30 — just before the shutdown — the department quietly awarded nine new research and development grants totaling $4.5 million. The grants, listed on the department’s website, are part of a new initiative called the “From Seedlings to Scale” Grants Program (S2S), launched by the Biden administration in August 2024 to test whether the Defense Department’s DARPA-style innovation model could work in education. DARPA, the Defense Advanced Research Projects Agency, invests in new technologies for national security. Its most celebrated project became the basis for the internet.

    Each new project, mostly focused on AI-driven personalized learning, received $500,000 to produce early evidence of effectiveness. Recipients include universities, research organizations and ed tech firms. Projects that show promise could be eligible for future funding to scale up with more students.

    According to a person familiar with the program who spoke on background, the nine projects had been selected before President Donald Trump took office, but the formal awards were delayed amid the department’s upheaval. The Institute of Education Sciences — which lost roughly 90 percent of its staff — was one of the hardest hit divisions.

    Granted, $4.5 million is a rounding error compared with IES’s official annual budget of $800 million. Still, these are believed to be the first new federal education research grants of the Trump era and a faint signal that Washington may not be abandoning education innovation altogether.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about risks to federal education data was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-risks-higher-ed-data/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • What research says about Mamdani and Cuomo’s education proposals

    by Jill Barshay, The Hechinger Report
    November 3, 2025

    New York City, where I live, will elect a new mayor Tuesday, Nov. 4. The two front runners — state lawmaker Zohran Mamdani, the Democratic nominee, and former Gov. Andrew Cuomo, running as an independent — have largely ignored the city’s biggest single budget item: education. 

    One exception has been gifted education, which has generated a sharp debate between the two candidates. The controversy is over a tiny fraction of the student population. Only 18,000 students are in the city’s gifted and talented program out of more than 900,000 public school students. (Another 20,000 students attend the city’s elite exam-entrance high schools.) 

    But New Yorkers are understandably passionate about getting their kids into these “gated” classrooms, which have some of the best teachers in the city. Meanwhile, the racial composition of these separate (some say segregated) classes — disproportionately white and Asian — is shameful. Even many advocates of gifted education recognize that reform is needed. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Mamdani wants to end gifted programs for kindergarteners and wait until third grade to identify advanced students. Cuomo wants to expand gifted education and open up more seats for more children. 

    The primary justification for gifted programs is that some children learn so quickly that they need separate classrooms to progress at an accelerated pace. 

    But studies have found that students in gifted classrooms are not learning faster than their general education peers. And analyses of curricula show that many gifted classes don’t actually teach more advanced material; they simply group mostly white and Asian students together without raising academic rigor.

    In my reporting, I have found that researchers question whether we can accurately spot giftedness in 4- or 5-year-olds. My colleague Sarah Carr recently wrote about the many methods that have been used to try to identify young children with high potential, and how the science underpinning them is shaky. In addition, true giftedness is often domain-specific — a child might be advanced in math but not in reading, or vice versa — yet New York City’s system labels or excludes children globally rather than by subject. 

    Because of New York City’s size — it’s the nation’s largest public school system, enrolling more students than the entire public school systems of 30 individual states — what happens here matters.

    Policy implications

    • Delaying identification until later grades, when cognitive profiles are clearer, could improve accuracy in picking students. 
    • Reforming the curriculum to make sure that gifted classes are truly advanced would make it easier to justify having them. 
    • Educators could consider ways for children to accelerate in a single subject — perhaps by moving up a grade in math or English classes. 
    • How to desegregate these classrooms, and make their racial/ethnic composition less lopsided, remains elusive.

    I’ve covered these questions before in my earlier columns on gifted education.

    Size isn’t everything

    Another important issue in this election is class size. Under a 2022 state law, New York City must reduce class sizes to no more than 20 students in grades K-3 by 2028. (The cap will be 23 students per class in grades 4-8 and 25 students per class in high school.) To meet that mandate, the city will need to hire an estimated 18,000 new teachers.

    During the campaign, Mamdani said he would subsidize teacher training, offering tuition aid in exchange for a three-year commitment to teach in the city’s public schools. The idea isn’t unreasonable, but it’s modest — only $12 million a year, expected to produce about 1,000 additional teachers annually. That’s a small fraction of what’s needed.

    The bigger problem may be the law itself: Schools lack both physical space and enough qualified teachers. What parents want — small classes led by excellent, experienced educators — isn’t something the city can scale quickly. Hiring thousands of novices may not improve learning much, and it will make the job of school principals, who must make all these hires, even harder.

    For more on the research behind class-size reductions, see my earlier columns on the topic.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about education issues in the New York City mayoral election was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-nyc-mayor-election-education/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).


  • Why one reading expert says ‘just-right’ books are all wrong

    by Jill Barshay, The Hechinger Report
    October 27, 2025

    Timothy Shanahan, a professor emeritus at the University of Illinois at Chicago, has spent his career evaluating education research and helping teachers figure out what works best in the classroom. A leader of the National Reading Panel, whose 2000 report helped shape what’s now known as the “science of reading,” Shanahan has long influenced literacy instruction in the United States. He also served on the National Institute for Literacy’s advisory board in both the George W. Bush and Barack Obama administrations.

    Shanahan is a scholar whom I regularly consult when I come across a reading study, and so I was eager to interview him about his new book, “Leveled Reading, Leveled Lives” (Harvard Education Press, September 2025). In it, Shanahan takes aim at one of the most common teaching practices in American classrooms: matching students with “just-right” books.

    He argues that the approach — where students read different texts depending on their assessed reading level — is holding many children back. Teachers spend too much time testing students and assigning leveled books, he says, instead of helping all students learn how to understand challenging texts.

    “American children are being prevented from doing better in reading by a longstanding commitment to a pedagogical theory that insists students are best taught with books they can already read,” Shanahan writes in his book. “Reading is so often taught in small groups — not so teachers can guide efforts to negotiate difficult books, but to ensure the books are easy enough that not much guidance is needed.”

    Comprehension, he says, doesn’t grow that way.

    The trouble with leveled reading

    Grouping students by ability and assigning easier or harder books — a practice known as leveled reading — remains deeply embedded in U.S. schools. A 2018 Thomas B. Fordham Institute survey found that 62 percent of upper elementary teachers and more than half of middle school teachers teach at students’ reading level rather than at grade level.  

    That may sound sensible, but Shanahan says it’s not helping anyone and is even leading teachers to dispense with reading altogether. “In social studies and science, and these days, even in English classes,” he said in an interview, “teachers either don’t assign any readings or they read the texts to the students.” Struggling readers aren’t being given the chance — or the tools — to tackle complex material on their own.

    Instead, Shanahan believes all students should read grade-level texts together, with teachers providing more support for those who need it.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    “What I’m recommending is instructional differentiation,” he said in our interview. “Everyone will have the same instructional goal — we’re all going to learn to read the fourth-grade text. I might teach a whole-class lesson and then let some kids move on to independent work while others get more help. Maybe the ones who didn’t get it, read the text again with my support. By the end, more students will have reached the learning goal — and tomorrow the whole class can take on another text.”

    27 different ways

    Shanahan’s approach doesn’t mean throwing kids into the deep end without help. His book outlines a toolbox of strategies for tackling difficult texts, such as looking up unfamiliar vocabulary, rereading confusing passages, or breaking down long sentences. “You can tip over into successful reading 27 different ways,” he said, and he hopes future researchers discover many more. 

    He is skeptical of drilling students on skills like identifying the main idea or making inferences. “We’ve treated test questions as the skill,” he said. “That doesn’t work.”

    There is widespread frustration over the deterioration of American reading achievement, especially among middle schoolers. (Thirty-nine percent of eighth graders cannot reach the lowest of three achievement levels, called “basic,” on the National Assessment of Educational Progress.) But there is little agreement among reading advocates on how to fix the problem. Some argue that what children primarily need is more knowledge to grasp unfamiliar ideas in a new reading passage; Shanahan counters that background knowledge won’t be sufficient, or as powerful as explicit comprehension instruction. Other reading experts agree. Nonie Lesaux, dean of the Harvard Graduate School of Education, whose own scholarship focuses on literacy, endorsed Shanahan’s argument in an October 2025 online discussion of the new book.

    Shanahan is most persuasive in pointing out that there isn’t strong experimental evidence to show that reading achievement goes up more when students read a text at their individual level. By contrast, a 2024 analysis found that the most effective schools are those that keep instruction at grade level. Still, Shanahan acknowledges that more research is needed to pinpoint which comprehension strategies work best for which students and in which circumstances.

    Misunderstanding Vygotsky

    Teachers often cite the Russian psychologist Lev Vygotsky’s “zone of proximal development” to justify giving students books that are neither too easy nor too hard. But Shanahan says that’s a misunderstanding of Vygotsky’s work.

    Vygotsky believed teachers should guide students to learn challenging things they cannot yet do on their own, he said.

    He offers an analogy: a mother teaching her child to tie their shoes. At first, she demonstrates while narrating the steps aloud. Then the child does one step, and she finishes the rest. Over time, the mother gradually releases control and the child ties a bow on his own. “Leveled reading,” Shanahan said, “is like saying, ‘Why don’t we just get Velcro?’ This is about real teaching. ‘Boys and girls, you don’t know how to ride this bike yet, but I’m going to make sure you do by the time we’re done.’ ”

    Related: What happens to reading comprehension when students focus on the main idea

    Shanahan’s critique of reading instruction applies mainly from second grade onward, after children learn how to read and are focusing on understanding what they read. In kindergarten and first grade, when children are still learning phonics and how to decode the words on the page, the research evidence against small group instruction with different level texts isn’t as strong, he said. 

    Decoding, learning to read the words in the first place, is important, and Shanahan says there are rare exceptions to teaching all children at grade level.

    “If a fifth grader still can’t read,” Shanahan said, “I wouldn’t make that child read a fifth-grade text.” That child might need separate instruction from a reading specialist.

    Advanced readers, meanwhile, can be challenged in other ways, Shanahan suggests, through independent reading time, skipping ahead to higher-grade reading classes, or by exploring complex ideas within grade-level texts.

    The role of AI — and parents

    Artificial intelligence is increasingly being used to rewrite texts for different difficulty levels. Shanahan is skeptical of that approach. Simpler texts, whether written by humans or generated by AI, don’t teach students to improve their reading ability, he argues.

    Still, he’s intrigued by the idea of using AI to help students “climb the stairs” by instantly rewriting a single text at a range of reading levels, say third-, fifth- and seventh-grade versions, and having students read them in quick succession. Whether that boosts comprehension is still unknown and needs to be studied.

    AI might be most helpful to teachers, Shanahan suspects, by pointing out a sentence or passage that tends to confuse students or trip them up. The teacher can then address those common difficulties in class.

    Shanahan worries about what happens outside of school: Kids aren’t reading much at all.

    He urges parents to let children read whatever they enjoy — whether it’s above or below their level — but to set consistent expectations. “Nagging may not be effective,” he said. “But you can be specific: ‘After dinner Thursday, read the first chapter. When you’re done, we’ll talk about it, and then you can play a computer game or go on your phone.’ ”

    Too often, he says, parents back down when kids resist. “They are the kids. We are the adults,” Shanahan said. “We’re responsible. Let’s step up and do what’s right for them.”

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about reading levels was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-shanahan-leveled-reading/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).

  • Cellphone bans can help kids learn — but Black students are suspended more as schools make the shift

    Cellphone bans can help kids learn — but Black students are suspended more as schools make the shift

    Thirty states now limit or ban cellphone use in classrooms, and teachers are noticing children paying attention to their lessons again. But it’s not clear whether this policy — unpopular with students and a headache for teachers to enforce — makes an academic difference. 

    If student achievement goes up after a cellphone ban, it’s tough to know if the ban was the reason. Some other change in math or reading instruction might have caused the improvement. Or maybe the state assessment became easier to pass. Imagine if politicians required all students to wear striped shirts and test scores rose. Few would really think that stripes made kids smarter.

    Two researchers from the University of Rochester and RAND, a nonprofit research organization, figured out a clever way to tackle this question by taking advantage of cellphone activity data from one large school district in Florida, the state that in 2023 became the first to institute school cellphone restrictions. The researchers compared schools that had high cellphone activity before the ban with those that had low usage, to see whether the ban made a bigger difference where phones had been most prevalent.

    Indeed, it did. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Student test scores rose a bit more in high-cellphone-usage schools two years after the ban, compared with schools that had lower cellphone usage to start. Students were also attending school more regularly.

    The policy also came with a troubling side effect. The cellphone bans led to a significant increase in student suspensions in the first year, especially among Black students. But disciplinary actions declined during the second year. 

    “Cellphone bans are not a silver bullet,” said David Figlio, an economist at the University of Rochester and one of the study’s co-authors. “But they seem to be helping kids. They’re attending school more, and they’re performing a bit better on tests.”

    Figlio said he was “worried” about the short-term 16 percent increase in suspensions for Black students. What’s unclear from this data analysis is whether Black students were more likely to violate the new cellphone rules, or whether teachers were more likely to single out Black students for punishment. It’s also unclear from these administrative behavior records if students were first given warnings or lighter punishments before they were suspended. 

    The data suggest that students adjusted to the new rules. A year later, student suspensions, including those of Black students, fell back to what they had been before the cellphone ban.

    “What we observe is a rocky start,” Figlio added. “There was a lot of discipline.”

    The study, “The Impact of Cellphone Bans in Schools on Student Outcomes: Evidence from Florida,” is a draft working paper and has not been peer-reviewed. It was slated to be circulated by the National Bureau of Economic Research on Oct. 20 and the authors shared a draft with me in advance. Figlio and his co-author Umut Özek at RAND believe it is the first study to show a causal connection between cellphone bans and learning rather than just a correlation.

    The academic gains from the cellphone ban were small, less than a percentile point, on average. That’s the equivalent of moving from the 50th percentile on math and reading tests (in the middle) to the 51st percentile (still close to the middle), and this small gain did not emerge until the second year for most students. The academic benefits were strongest for middle schoolers, white students, Hispanic students and male students. The academic gains for Black students and female students were not statistically significant.  

    Related: Suspended for…what? 

    I was surprised to learn that there is data on student cellphone use in school. The authors of this study used information from Advan Research Corp., which collects and analyzes data from mobile phones around the world for business purposes, such as figuring out how many people visit a particular retail store. The researchers were able to obtain this data for schools in one Florida school district and estimate how many students were on their cellphones between 9 a.m. and 1 p.m., before and after the ban went into effect.

    The data showed that more than 60 percent of middle schoolers, on average, were on their phones at least once during the school day before the 2023 ban in this particular Florida district, which was not named but described as one of the 10 largest districts in the country. (Five of the nation’s 10 largest school districts are in Florida.) After the ban, that figure was cut in half, to 30 percent of middle schoolers in the first year, and fell to 25 percent in the second year.

    Elementary school students were less likely to be on cellphones to start with and their in-school usage fell from about 25 percent of students before the ban to 15 percent after the ban. More than 45 percent of high schoolers were on their phones before the ban and that fell to about 10 percent afterwards.

    [Chart: Average daily smartphone visits in schools, by year and grade level. Shows average daily smartphone visits during regular school days (relative to teacher workdays without students) between 9 a.m. and 1 p.m., per 100 enrolled students, in the two months before and then after the 2023 ban took effect in one large urban Florida school district. Source: Figlio and Özek, October 2025 draft paper, figure 2C, p. 23.]

    Florida did not enact a complete cellphone ban in 2023, but imposed severe restrictions. Those restrictions were tightened in 2025 and that additional tightening was not studied in this paper.

    Anti-cellphone policies have become increasingly popular since the pandemic, largely based on our collective adult gut hunches that kids are not learning well when they are consumed by TikTok and Snapchat.

    This is perhaps a rare case in public policy, Figlio said, where the “data back up the hunches.” 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about cellphone bans was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

  • Some social emotional lessons improve how kids do at school, Yale study finds

    Some social emotional lessons improve how kids do at school, Yale study finds

    Social emotional learning — lessons in soft skills like listening to people you disagree with or calming yourself down before a test — has become a flashpoint in the culture wars. 

    The conservative political group Moms for Liberty opposes SEL, as it is often abbreviated, telling parents that its “goal is to psychologically manipulate students to accept the progressive ideology that supports gender fluidity, sexual preference exploration, and systemic oppression.” Critics say that parents should discuss social and emotional matters at home and that schools should stick to academics. Meanwhile, some advocates on the left say standard SEL classes don’t go far enough and should include such topics as social justice and anti-racism training. 

    While the political battle rages on, academic researchers are marshaling evidence for what high-quality SEL programs actually deliver for students. The latest study, by researchers at Yale University, summarizes 12 years of evidence, from 2008 to 2020. It finds that 30 different SEL programs, which put themselves through 40 rigorous evaluations involving almost 34,000 students, tended to produce “moderate” academic benefits.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    The meta-analysis, published online Oct. 8 in the peer-reviewed journal Review of Educational Research, calculated that the grades and test scores of students in SEL classes improved by about 4 percentile points, on average, compared with students who didn’t receive soft-skill instruction. That’s the equivalent of moving from the 50th percentile (in the middle) to the 54th percentile (slightly above average). Reading gains were larger (more than 6 percentile points) than math gains (less than 4 percentile points). Longer-duration SEL programs, extending more than four months, produced double the academic gains — more than 8 percentile points.

    “Social emotional learning interventions are not designed, most of the time, to explicitly improve academic achievement,” said Christina Cipriano, one of the study’s four authors and an associate professor at Yale Medical School’s Child Study Center. “And yet we demonstrated, through our meta-analytic report, that explicit social emotional learning improved academic achievement and it improved both GPA and test scores.”

    Cipriano also directs the Education Collaboratory at Yale, whose mission is to “advance the science of learning and social and emotional development.”

    The academic boost from SEL in this 2025 paper is much smaller than the 11 percentile points documented in an earlier 2011 meta-analysis that summarized research through 2007, when SEL had not yet gained widespread popularity in schools. That has since changed. More than 80 percent of principals of K-12 schools said their schools used an SEL curriculum during the 2023-24 school year, according to a survey by the Collaborative for Academic, Social, and Emotional Learning (CASEL) and the RAND Corporation. 

    Related: A research update on social-emotional learning in schools

    The Yale researchers only studied a small subset of the SEL market: programs that subjected themselves to a rigorous evaluation and included academic outcomes. Three-quarters of the 40 studies were randomized controlled trials, similar to pharmaceutical trials, in which schools or teachers were randomly assigned to teach an SEL curriculum. The remaining studies, in which schools or teachers volunteered to participate, still had control groups, so that researchers could compare the academic gains of students who received SEL instruction with those of students who did not.

    The SEL programs in the Yale study taught a wide range of soft skills, from mindfulness and anger management to resolving conflicts and setting goals. It is unclear which soft skills are driving the academic gains. That’s an area for future research.

    “Developmentally, when we think about what we know about how kids learn, emotional regulation is really the driver,” said Cipriano. “No matter how good that curriculum or that math program or reading curriculum is, if a child is feeling unsafe or anxious or stressed out or frustrated or embarrassed, they’re not available to receive the instruction, however great that teacher might be.”

    Cipriano said that effective programs give students tools to cope with stressful situations. She offered the example of a pop quiz, from the perspective of a student. “You can recognize, I’m feeling nervous, my blood is rushing to my hands or my face, and I can use my strategies of counting to 10, thinking about what I know, and use positive self talk to be able to regulate, to be able to take my test,” she said.

    Related: A cheaper, quicker approach to social-emotional learning?

    The strongest evidence for SEL is in elementary school, where the majority of evaluations have been conducted (two-thirds of the 40 studies). For young students, SEL lessons tend to be short but frequent, for example, 10 minutes a day. There’s less evidence for middle and high school SEL programs because they haven’t been studied as much. Typically, preteens and teens have less frequent but longer sessions, a half hour or even 90 minutes, weekly or monthly. 

    Cipriano said that schools don’t need to spend “hours and hours” on social and emotional instruction in order to see academic benefits. A current trend is to incorporate or embed social and emotional learning within academic instruction, as part of math class, for example. But none of the underlying studies in this paper evaluated whether this was a more effective way to deliver SEL. All of the programs in this study were separate stand-alone SEL lessons. 

    Advice to schools

    Schools are inundated by sales pitches from SEL vendors. Estimates of the market’s size vary wildly, but a half dozen market research firms put it above $2 billion annually. Not all SEL programs are necessarily effective or can be expected to produce the academic gains that the Yale team calculated.

    Cipriano advises schools not to be taken in by slick marketing. Many of the effective programs have no marketing at all and some are free. Unfortunately, some of these programs have been discontinued or have changed after ownership transitions. But she says school leaders can ask questions about which specific skills the SEL program claims to foster, whether those skills will help the district achieve its goals, such as improving school climate, and whether the program has been externally evaluated.

    “Districts invest in things all the time that are flashy and pretty, across content areas, not just SEL,” said Cipriano. “It may never have had an external evaluation, but has a really great social media presence and really great marketing.” 

    Cipriano has also built a new website, improvingstudentoutcomes.org, to track the latest research on SEL effectiveness and to help schools identify proven programs.

    Cipriano says parents should be asking questions too. “Parents should be partners in learning,” said Cipriano. “I have four kids, and I want to know what they’re learning about in school.”

    This meta-analysis probably won’t stop the SEL critics who say that these programs force educators to be therapists. Groups like Moms for Liberty, which holds its national summit this week, say teachers should stick to academics. This paper challenges that dichotomy, suggesting that emotions, social interaction and academics are all interlinked.

    Before criticizing all SEL programs, educators and parents need to consider the evidence.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about SEL benefits was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

  • Education Department takes a preliminary step toward revamping its research and statistics arm

    Education Department takes a preliminary step toward revamping its research and statistics arm

    In his first two months in office, President Donald Trump ordered the closing of the Education Department and fired half of its staff. The department’s research and statistics division, called the Institute of Education Sciences (IES), was particularly hard hit. About 90 percent of its staff lost their jobs and more than 100 federal contracts to conduct its primary activities were canceled.

    But now there are signs that the Trump administration is partially reversing course and wants the federal government to retain a role in generating education statistics and evidence for what works in classrooms — at least to some extent. On Sept. 25, the department posted a notice in the Federal Register asking the public to submit feedback by Oct. 15 on reforming IES to make research more relevant to student learning. The department also asked for suggestions on how to collect data more efficiently.

    The timeline for revamping IES remains unclear, as is whether the administration will invest money in modernizing the agency. For example, it would take time and money to pilot new statistical techniques; in the meantime, statisticians would have to continue using current protocols.

    Still, the signs of rebuilding are adding up. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    At the end of May, the department announced that it had temporarily hired a researcher from the Thomas B. Fordham Institute, a conservative think tank, to recommend ways to reform education research and development. The researcher, Amber Northern, has been “listening” to suggestions from think tanks and research organizations, according to department spokeswoman Madi Biedermann, and now wants more public feedback.  

    Biedermann said that the Trump administration “absolutely” intends to retain a role in education research, even as it seeks to close the department. Closure will require congressional approval, which hasn’t happened yet. In the meantime, Biedermann said the department is looking across the government to find where its research and statistics activities “best fit.”

    Other IES activities also appear to be resuming. In June, the department disclosed in a legal filing that it planned to reinstate 20 of the 101 terminated contracts. Among the activities slated to be restarted are 10 Regional Education Laboratories that partner with school districts and states to generate and apply evidence. It remains unclear how all 20 contracts can be restarted without federal employees to hold competitive bidding processes and oversee them.

    Earlier in September, the department posted eight new jobs to help administer the National Assessment of Educational Progress (NAEP), also called the Nation’s Report Card. These positions would be part of IES’s statistics division, the National Center for Education Statistics. Most of the work in developing and administering tests is handled by outside vendors, but federal employees are needed to award and oversee these contracts. After mass firings in March, employees at the board that oversees NAEP have been on loan to the Education Department to make sure the 2026 NAEP test is on schedule.

    Only a small staff remains at IES. Some education statistics have trickled out since Trump took office, including the agency’s first release of higher education data on Sept. 23. But the data releases have been late and incomplete.

    No new grants are believed to have been issued for education studies since March, according to researchers who are familiar with the federal grant-making process but asked not to be identified for fear of retaliation. A big obstacle is that a contract to conduct peer review of research proposals was canceled, so new ideas cannot be properly vetted. The staff that remains is trying to make annual disbursements for older multi-year studies that haven’t been canceled.

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    With all these changes, it’s becoming increasingly difficult to figure out the status of federally funded education research. One potential source of clarity is a new project launched by two researchers from George Washington University and Johns Hopkins University. Rob Olsen and Betsy Wolf, who was an IES researcher until March, are tracking cancellations and keeping a record of research results for policymakers. 

    If it succeeds, the project will be a much-needed light amid the chaos.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about reforming IES was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.
