Tag: SAT

  • SAT and ACT participation remains below pre-pandemic levels

    Dive Brief:

    • Five years after COVID-19 shut down classrooms and shifted college admissions testing policies, the SAT and ACT are still drawing fewer students than during pre-pandemic years.
    • Some 1.38 million students took the ACT in 2025 compared to 1.78 million in 2019, and about 2 million students took the SAT this year versus 2.22 million in 2019, data released recently by the testing companies show. 
    • SAT scores, meanwhile, increased only slightly from the high school class of 2024 to the class of 2025, while ACT scores stayed about level. In both cases, scores fell below those from the pre-pandemic year of 2019.

    Dive Insight:

    The slight uptick in SAT scores and flat ACT scores for the high school graduating class of 2025 are still positive trends compared with last year, when average scores on both tests declined from 2023.

    Even so, SAT scores remained “substantially lower than average scores prior to the pandemic,” said College Board, the organization that publishes the test. In 2025, average SAT scores were 521 in reading and writing and 508 in math. In 2019, those averages were 10 points higher for reading and writing (531) and 20 points higher for math (528).

    The ACT average composite score, 19.4, also fell below the 2019 average of 20.7.

    For ACT test-takers, 30% met three or more of the four college readiness benchmarks in English, math, reading and science. The ACT benchmarks indicate that students have a 50% chance of earning a B or better, and a 75% chance of a C or better, in first-year college courses in the same subject.
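
    To make the benchmark rule concrete, here is a minimal sketch, in Python, of counting how many benchmarks a student meets; this is not ACT’s official tooling. The cut scores used (English 18, math 22, reading 22, science 23) are ACT’s published benchmark scores, and the student’s scores below are hypothetical.

    ```python
    # ACT College Readiness Benchmark cut scores, as published by ACT.
    BENCHMARKS = {"english": 18, "math": 22, "reading": 22, "science": 23}

    def benchmarks_met(scores: dict[str, int]) -> int:
        """Count the subject scores that meet or exceed their benchmark."""
        return sum(scores[subject] >= cut for subject, cut in BENCHMARKS.items())

    # Hypothetical student: clears English, math, and reading but misses
    # science, placing them among the 30% who met three or more benchmarks.
    student = {"english": 21, "math": 24, "reading": 23, "science": 20}
    print(benchmarks_met(student))  # -> 3
    ```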

    Meanwhile, the dip in overall test takers for both exams continues a trend that dates to at least the pandemic, when colleges shifted toward test-optional policies. For the ACT, however, the numbers began declining much earlier. 

    While testing experts had expected the pandemic to trigger a shift away from K-12 standardized tests, that didn’t materialize to a great degree, and standardized, high-stakes testing remains core to K-12 education.

    More than 90% of four-year colleges in the U.S. were not expected to require applicants for fall 2026 admission to submit ACT or SAT scores, according to data released in September by FairTest, a nonprofit that advocates for limiting college entrance exams. That amounts to more than 2,000 of the nation’s bachelor’s degree-granting institutions.

    Since fall 2020, the number of test-optional or test-free colleges has increased overall, the organization’s annual count shows.

    In the meantime, FairTest said the number of institutions requiring entrance exams increased only slightly, from 154 for fall 2025 admissions to 160 for fall 2026 admissions.

    “While a handful of schools have reinstated testing requirements over the past two admissions cycles for a variety of institutional reasons and in response to external pressures, ACT/SAT-optional and test-blind/score-free policies remain the normal baseline in undergraduate admissions,” said FairTest Executive Director Harry Feder in a September statement. “Test-optional policies continue to dominate at national universities, state flagships, and selective liberal arts colleges.”


  • Why high schools should use teaching assistants

    Competitive exam scores seem to creep up every year, whether it’s the SAT for college applications or an AP score for college credit. While faculty are invaluable, students who recently completed a class or exam offer insight that bridges the gap between the curriculum and the test. I believe students who recently excelled in a course should be allowed, and encouraged, to serve as teaching assistants in high school.

    Often, poor preparation contributes to students’ disappointing exam performance, whether from not understanding the content, being unfamiliar with the test’s layout, or preparing the wrong material. In courses at all levels, educators often emphasize information that will not show up on a standardized test or, in some cases, even in their own exams. This is a widespread issue, as many instructors have a pet topic they like to prioritize. A microbiology professor in a medical school, for example, may devote an entire lecture to a rare microbe because they research it, even though it will never appear on the national board exams or their own course final. The same theme ran through high school, where history teachers loved to share niche facts, and college, where physics professors loved to ask trick questions. Does including this material really benefit the pupil? Are students even being tested fairly on the material if, say, 10 percent of exam questions cover superfluous information?

    Universities get around this issue by employing teaching assistants (TAs). Largely, their responsibilities are grading papers, presenting the occasional lecture, and holding office hours. The lesser-known benefit of speaking with TAs is that they can tell you how to prioritize your studying. They are often older students who previously succeeded in the course, and as a result they can give a student a much better idea of what will be on an exam than the professor can.

    When I was a TA as an undergrad, we were required to hold exam prep sessions the week of a big test. During these sessions, students answered practice questions on concepts similar to those that would show up on the exam. The students who attended my sessions performed extremely well on the tests because they knew exactly which concepts to study, and they finished the course with much higher grades as a result. It is far more effective to give a student practice questions built on the same concepts as the exam than for a professor to hand out a list of topics. Studying for a math test, for example, is more impactful when answering 50 practice questions than when working from a list of general concepts such as: “Be able to manipulate inequalities and understand the order of operations.”

    Universities seem to know that professors might not provide the best exam advice, or at the least, they have used TAs as a workable solution to the problem. In my opinion, this style of assistance would benefit student outcomes in high school. Having a senior help in a junior-level class, say, could work wonders. Teachers could shed responsibilities according to what they trust their TA to do: grading, running review sessions, and making exam prep materials. In essence, all the unseen work that great teachers do could be done more efficiently with a TA. Every student has had an amazing teacher who provides an excellent study guide, one almost identical to the test, that makes them confident going into test day. In my experience, those guides are not produced for a grade or a course requirement; they are extra work the educator takes on to help the students willing to use them. Having a TA would ease that burden, and it would also encourage students to consider teaching as a profession at a time when there is a shortage of educators.

    There are many ways to teach and learn, but by far the best way to prepare for a test is to talk to someone who has recently taken it. Universities understand that courses are easier for students who can do so. In my opinion, high schools could adopt this practice and reduce teacher workload while improving student outcomes.


  • Changes in SAT Scores after Test-optional

    One of the intended consequences of test-optional admission policies at some institutions prior to the COVID-19 pandemic was to raise the test scores reported to U.S. News & World Report. It’s rare that you would see a proponent of test-optional admission like me admit that, but to deny it would be foolish.

    Because I worked at DePaul, which was an early adopter of the approach (at least among large universities), I fielded a lot of calls from colleagues who were considering it, some of whom were explicit in their reasons for doing so.  One person I spoke to came right out at the start of the call: She was only calling, she said, because her provost wanted to know how much they could raise scores if they went test-optional.

    If I sensed or heard that motivation, I advised people against it.  In those days, the vast majority of students took standardized admission tests like the SAT or ACT, but the percentage of students applying without tests was still relatively small; the needle would not move appreciably by going to test-optional admission.  

    On the other hand, of course, I knew the pressure admissions offices were under from trustees, presidents, provosts, and faculty, and as Campbell’s Law and its many variants tell us, what gets measured gets produced. DePaul, a private university with a public mission, was using a test-optional approach to ensure that the students who were part of our mission would not be left behind as applications grew. (I often say how lucky I was to work at a place where, in 17 years, I was never once asked how to increase test scores or selectivity, but heard frequently about the Pell percentage in the class.)

    If better numbers were what you wanted, I’d tell the callers, there were lots of other ways to manipulate admissions statistics to the same effect.

    Motivations, of course, were different in the summer of 2020, when it had become clear that test-optional admission was a necessary utilitarian decision that also carried with it good reputational benefits: Even if you were doing it to survive, you could at least look like you were being altruistic.  And, of course, you could learn something in the process.

    So what’s happened? About what you would expect. At the overwhelming majority of colleges, the mean SAT ERW+M score rose between fall 2019 and fall 2022. I used 2019 as the base because the data reported to IPEDS covers enrolling students, and the 2020 term was affected by COVID.

    It’s dangerous, of course, to try to figure out exactly why scores went up, beyond the expected sampling bias. It could be that the reputation that drives such things was already rising. It could be that a college took a lot more, or a lot fewer, chances in admission (either is possible). It could be location and migration: out on the West Coast, people seem to care a lot less about tests than they do in the Eastern Time Zone, and students who cross state lines to attend college tend to be wealthier, and wealthier students tend to score higher on tests.

    Or it could be all of those things, and others.  We’ll never really know.  But it’s still fun to look at.  So here we go, with just one view this time.

    On the left are mean SAT scores in 2019 and 2022, calculated from the reported 25th and 75th percentiles of the two sections; numbers are rounded. The gray bars show the 2019 figure and the purple bars the 2022 figure. On the right are the changes, and the chart is sorted on that value in descending order.
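
    As a minimal sketch of that calculation, the snippet below approximates each section’s mean as the midpoint of its reported 25th and 75th percentiles and sums the two sections before rounding. The function name and the example figures are hypothetical, not actual IPEDS field names.

    ```python
    def estimated_sat_total(erw_p25: int, erw_p75: int,
                            math_p25: int, math_p75: int) -> int:
        """Approximate the mean total SAT score as the sum of each
        section's percentile midpoint, rounded to the nearest point."""
        erw_mid = (erw_p25 + erw_p75) / 2
        math_mid = (math_p25 + math_p75) / 2
        return round(erw_mid + math_mid)

    # Hypothetical college reporting ERW 580-680 and math 560-670.
    print(estimated_sat_total(580, 680, 560, 670))  # -> 1245
    ```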

    There are four filters to get the view you want: at top left, you can use the control to limit the region; at top right, you can look at public and/or private four-year universities. You can also use the sliders to limit colleges by 2022 selectivity or class size.

    Again, this is interesting, but not necessarily instructive.  See if you can guess what your favorite college looks like before and after the pandemic.  Have fun.

    (Note: Some institutions that went test-optional stopped reporting test scores as a result, and they are not included here.)
