Tag: literacy

  • DOGE Education Cuts Hit Students with Disabilities, Literacy Research – The 74


    When teens and young adults with disabilities in California’s Poway Unified School District heard about a new opportunity to get extra help planning for life after high school, nearly every eligible student signed up.

    The program, known as Charting My Path for Future Success, aimed to fill a major gap in education research about what kinds of support give students nearing graduation the best shot at living independently, finding work, or continuing their studies.

    Students with disabilities finish college at much lower rates than their non-disabled peers, and often struggle to tap into state employment programs for adults with disabilities, said Stacey McCrath-Smith, a director of special education at Poway Unified, which had 135 students participating in the program. So the extra help, which included learning how to track goals on a tool designed for high schoolers with disabilities, was much needed.

    Charting My Path launched earlier this school year in Poway Unified and 12 other school districts. The salaries of 61 school staff nationwide, and the training they received to work with nearly 1,100 high schoolers with disabilities for a year and a half, were paid for by the U.S. Department of Education.

    Jessie Damroth’s 17-year-old son Logan, who has autism, attention deficit hyperactivity disorder, and other medical needs, had attended classes and met with his mentor through the program at Newton Public Schools in Massachusetts for a month. For the first time, he was talking excitedly about career options in science and what he might study at college.

    “He was starting to talk about what his path would look like,” Damroth said. “It was exciting to hear him get really excited about these opportunities. … He needed that extra support to really reinforce that he could do this.”

    Then the Trump administration pulled the plug.

    Charting My Path was among more than 200 Education Department contracts and grants terminated over the last two weeks by the Trump administration’s U.S. DOGE Service. DOGE has slashed spending it deemed to be wasteful, fraudulent, or in service of diversity, equity, inclusion, and accessibility goals that President Donald Trump has sought to ban. But in several instances, the decision to cancel contracts affected more than researchers analyzing data in their offices — it affected students.

    Many projects, like Charting My Path, involved training teachers in new methods, testing learning materials in actual classrooms, and helping school systems use data more effectively.

    “Students were going to learn really how to set goals and track progress themselves, rather than having it be done for them,” McCrath-Smith said. “That is the skill that they will need post-high school when there’s not a teacher around.”

    All of that work was abruptly halted — in some cases with nearly finished results that now cannot be distributed.

    Every administration is entitled to set its own priorities, and contracts can be canceled or changed, said Steven Fleischman, an education consultant who for many years ran one of the regional research programs that was terminated. He compared it to a homeowner deciding they no longer want a deck as part of their remodel.

    But the current approach reminds him more of construction projects started and then abandoned during the Great Recession, in some cases leaving giant holes that sat for years.

    “You can walk around and say, ‘Oh, that was a building we never finished because the funds got cut off,’” he said.

    DOGE drives cuts to education research contracts, grants

    The Education Department has been a prime target of DOGE, the chaotic cost-cutting initiative led by billionaire Elon Musk, now a senior adviser to Trump.

    So far, DOGE has halted 89 education projects, many of which were under the purview of the Institute of Education Sciences, the ostensibly independent research arm of the Education Department. The administration said those cuts, which included multi-year contracts, totaled $881 million. In recent years, the federal government has spent just over $800 million on the entire IES budget.

    DOGE has also shut down 10 regional labs that conduct research for states and local schools and shuttered four equity assistance centers that help with teacher training. The Trump administration also cut off funding for nearly 100 teacher training grants and 18 grants for centers that often work to improve instruction for struggling students.

    The total savings is up for debate. The Trump administration said the terminated Education Department contracts and grants were worth $2 billion. But some were near completion with most of the money already spent.

    An NPR analysis of all of DOGE’s reported savings found that the total was likely around $2 billion for the entire federal government — though the Education Department is a top contributor.

    On Friday, a federal judge issued an injunction that temporarily blocks the Trump administration from canceling additional contracts and grants that might violate the anti-DEIA executive order. It’s not clear whether the injunction would prevent more contracts from being canceled “for convenience.”

    Mark Schneider, who until recently served as IES director, said the sweeping cuts represent an opportunity to overhaul a bloated education research establishment. But even many conservative critics have expressed alarm at how wide-ranging and indiscriminate the cuts have been. Congress mandated many of the terminated programs, which also indirectly support state and privately funded research.

    The canceled projects include contracts that support maintenance of the Common Core of Data, a major database used by policymakers, researchers, and journalists, as well as work that supports updates to the What Works Clearinghouse, a huge repository of evidence-based practices available to educators for free.

    And after promising not to make any cuts to the National Assessment of Educational Progress, known as the nation’s report card, the department canceled an upcoming test for 17-year-olds that helps researchers understand long-term trends. On Monday, Peggy Carr, the head of the National Center for Education Statistics, which oversees NAEP, was placed on leave.

    The Education Department did not respond to questions about who decided which programs to cut and what criteria were used. Nor did the department respond to a specific question about why Charting My Path was eliminated. DOGE records estimate the administration saved $22 million by terminating the program early, less than half the $54 million in the original contract.

    The decision has caused mid-year disruptions and uncertainty.

    In Utah, the Canyons School District is trying to reassign the school counselor and three teachers whose salaries were covered by the Charting My Path contract.

    The district, which had 88 high schoolers participating in the program, is hoping to keep using the curriculum to boost its usual services, said Kirsten Stewart, a district spokesperson.

    Officials in Poway Unified, too, hope schools can use the curriculum and tools to keep up a version of the program. But that will take time and work because the program’s four teachers had to be reassigned to other jobs.

    “They dedicated that time and got really important training,” McCrath-Smith said. “We don’t want to see that squandered.”

    For Damroth, the loss of parent support meetings through Charting My Path was especially devastating. Logan has a rare genetic mutation that causes him to fall asleep easily during the day, so Damroth wanted help navigating which colleges might be able to offer extra scheduling support.

    “I have a million questions about this. Instead of just hearing ‘I don’t know’ I was really looking forward to working with Joe and the program,” she said, referring to Logan’s former mentor. “It’s just heartbreaking. I feel like this wasn’t well thought out. … My child wants to do things in life, but he needs to be given the tools to achieve those goals and those dreams that he has.”

    DOGE cuts labs that helped ‘Mississippi Miracle’ in reading

    The dramatic improvement in reading proficiency that Carey Wright oversaw as state superintendent in one of the nation’s poorest states became known as the “Mississippi Miracle.”

    Regional Educational Laboratory Southeast, based out of the Florida Center for Reading Research at Florida State University, was a key partner in that work, Wright said.

    When Wright wondered if state-funded instructional coaches were really making a difference, REL Southeast dispatched a team to observe, videotape, and analyze the instruction delivered by hundreds of elementary teachers across the state. Researchers reported that teachers’ instructional practices aligned well with the science of reading and that teachers themselves said they felt far more knowledgeable about teaching reading.

    “That solidified for me that the money that we were putting into professional learning was working,” Wright said.

    The study, she noted, arose from a casual conversation with researchers at REL Southeast: “That’s the kind of give and take that the RELs had with the states.”

    Wright, now Maryland state superintendent, said she was looking forward to partnering with REL Mid-Atlantic on a math initiative and on an overhaul of the school accountability system.

    But this month, termination letters went out to the universities and research organizations that run the 10 Regional Educational Laboratories, which were established by Congress in 1965 to serve states and school districts. The letters said the contracts were being terminated “for convenience.”

    The press release that went to news organizations cited “wasteful and ideologically driven spending” and named a single project in Ohio that involved equity audits as a part of an effort to reduce suspensions. Most of the REL projects on the IES website involve reading, math, career connections, and teacher retention.

    Jannelle Kubinec, CEO of WestEd, an education research organization that held the contracts for REL West and REL Northwest, said she never received a complaint or a request to review the contracts before receiving termination letters. Her team had to abruptly cancel meetings to go over results with school districts. In other cases, reports are nearly finished but cannot be distributed because they haven’t gone through the review process.

    REL West was also working with the Utah State Board of Education to figure out if the legislature’s investment in programs to keep early career teachers from leaving the classroom was making a difference, among several other projects.

    “This is good work and we are trying to think through our options,” she said. “But the cancellation does limit our ability to finish the work.”

    Given enough time, Utah should be able to find a staffer to analyze the data collected by REL West, said Sharon Turner, a spokesperson for the Utah State Board of Education. But the findings are much less likely to be shared with other states.

    The most recent contracts started in 2022 and were set to run through 2027.

    The Trump administration said it planned to enter into new contracts for the RELs to satisfy “statutory requirements” and better serve schools and states, though it’s unclear what that will entail.

    “The states drive the research agendas of the RELs,” said Sara Schapiro, the executive director of the Alliance for Learning Innovation, a coalition that advocates for more effective education research. If the federal government dictates what RELs can do, “it runs counter to the whole argument that they want the states to be leading the way on education.”

    Some terminated federal education research was nearly complete

    Some research efforts were nearly complete when they got shut down, raising questions about how efficient these cuts were.

    The American Institutes for Research, for example, was almost done evaluating the impact of the Comprehensive Literacy State Development program, which aims to improve literacy instruction through investments like new curriculum and teacher training.

    AIR’s research spanned 114 elementary schools across 11 states and involved more than 23,000 third, fourth, and fifth graders and their nearly 900 reading teachers.

    Researchers had collected and analyzed a massive trove of data from the randomized trial and presented their findings to federal education officials just three days before the study was terminated.

    “It was a very exciting meeting,” said Mike Garet, a vice president and institute fellow at AIR who oversaw the study. “People were very enthusiastic about the report.”

    Another AIR study that was nearing completion looked at the use of multi-tiered systems of support for reading among first and second graders. It’s a strategy that helps schools identify and provide support to struggling readers, with the most intensive help going to kids with the highest needs. It’s widely used by schools, but its effectiveness hasn’t been tested on a larger scale.

    The research took place in 106 schools and involved over 1,200 educators and 5,700 children who started first grade in 2021 and 2022. Much of the funding for the study went toward paying for teacher training and coaching to roll out the program over three years. All of the data had been collected and was nearly done being analyzed when DOGE made its cuts.

    Garet doesn’t think he and his team should simply walk away from unfinished work.

    “If we can’t report results, that would violate our covenant with the districts, the teachers, the parents, and the students who devoted a lot of time in the hope of generating knowledge about what works,” Garet said. “Now that we have the data and have the results, I think we’re duty-bound to report them.”

    This story was originally published by Chalkbeat. Chalkbeat is a nonprofit news site covering educational change in public schools. Sign up for their newsletters at ckbe.at/newsletters.



  • AI Support for Teachers


    Collaborative Classroom, a leading nonprofit publisher of K–12 instructional materials, announces the publication of SIPPS, a systematic decoding program. Now in a new fifth edition, this research-based program accelerates mastery of vital foundational reading skills for both new and striving readers.

    Twenty-Five Years of Transforming Literacy Outcomes

    “As educators, we know the ability to read proficiently is one of the strongest predictors of academic and life success,” said Kelly Stuart, President and CEO of Collaborative Classroom. “Third-party studies have proven the power of SIPPS. This program has a 25-year track record of transforming literacy outcomes for students of all ages, whether they are kindergarteners learning to read or high schoolers struggling with persistent gaps in their foundational skills.

    “By accelerating students’ mastery of foundational skills and empowering teachers with the tools and learning to deliver effective, evidence-aligned instruction, SIPPS makes a lasting impact.”

    What Makes SIPPS Effective?

    Aligned with the science of reading, SIPPS provides explicit, systematic instruction in phonological awareness, spelling-sound correspondences, and high-frequency words. 

    Through differentiated small-group instruction tailored to students’ specific needs, SIPPS ensures every student receives the necessary targeted support—making the most of every instructional minute—to achieve grade-level reading success.

    “SIPPS is uniquely effective because it accelerates foundational skills through its mastery-based and small-group targeted instructional design,” said Linda Diamond, author of the Teaching Reading Sourcebook. “Grounded in the research on explicit instruction, SIPPS provides ample practice, active engagement, and frequent response opportunities, all validated as essential for initial learning and retention of learning.”

    Personalized, AI-Powered Teacher Support

    Educators using SIPPS Fifth Edition have access to a brand-new feature: immediate, personalized responses to their implementation questions with CC AI Assistant, a generative AI-powered chatbot.

    Exclusively trained on Collaborative Classroom’s intellectual content and proprietary program data, CC AI Assistant provides accurate, reliable information for educators.

    Other Key Features of SIPPS, Fifth Edition

    • Tailored Placement and Progress Assessments: A quick, 3–8 minute placement assessment ensures each student starts exactly at their point of instructional need. Ongoing assessments help monitor progress, adjust pacing, and support grouping decisions.
    • Differentiated Small-Group Instruction: SIPPS maximizes instructional time by focusing on small groups of students with similar needs, ensuring targeted, effective teaching.
    • Supportive of Multilingual Learners: Best practices in multilingual learner (ML) instruction and English language development strategies are integrated into the design of SIPPS.
    • Engaging and Effective for Older Readers: SIPPS Plus and SIPPS Challenge Level are specifically designed for students in grades 4–12, offering age-appropriate texts and instruction to close lingering foundational skill gaps.
    • Multimodal Supports: Integrated visual, auditory, and kinesthetic-tactile strategies help all learners, including multilingual students.
    • Flexible, Adaptable, and Easy to Teach: Highly supportive for teachers, tutors, and other adults working in classrooms and expanded learning settings, SIPPS is easy to implement well. A wraparound system of professional learning support ensures success for every implementer.

    Accelerating Reading Success for Students of All Ages

    In small-group settings, students actively engage in routines that reinforce phonics and decoding strategies, practice with aligned texts, and receive immediate feedback—all of which contribute to measurable gains.

    “With SIPPS, students get the tools needed to read, write, and understand text that’s tailored to their specific abilities,” said Desiree Torres, ENL teacher and 6th Grade Team Lead at Dr. Richard Izquierdo Health and Science Charter School in New York. “The boost to their self-esteem when we conference about their exam results is priceless. Each and every student improves with the SIPPS program.” 


  • Indiana First Lady to Raise Money for Dolly Parton’s Library Program – The 74


    After slashing a popular reading program from the budget, Gov. Mike Braun said Friday he asked First Lady Maureen Braun to spearhead an initiative to keep Dolly Parton’s Imagination Library in Indiana.

    “She has agreed and she will work with philanthropic partners and in consultation with state leadership to identify funding opportunities for the book distribution program,” the governor said in a news release.

    The program gifts free, high-quality, age-appropriate books to children from birth to age five on a monthly basis, regardless of family income.

    Former Gov. Eric Holcomb included a statewide expansion of the program in his 2023 legislative agenda. The General Assembly earmarked $6 million for the program in the state’s last biennial budget — $2 million in the first year and $4 million in the second — to ensure that all Hoosier kids qualify to receive free books.

    But when Gov. Braun prepared his budget proposal in January he discontinued the funding as part of an overall effort to rein in state spending.

    “I am honored to lead this work to help ensure our youngest Hoosiers have as much exposure as possible to books and learning,” said First Lady Maureen Braun. “Indiana has many strong community partners and I am confident we will collaborate on a solution that grows children’s love of reading.”

    Jeff Conyers, president of The Dollywood Foundation, said he appreciates Braun’s commitment to early childhood literacy.

    “The Imagination Library brings the joy of reading to over 125,000 Hoosier children each month in all 92 counties across the state, and we are encouraged by Governor and First Lady Braun’s support to ensure its future in Indiana. We look forward to working with the Governor and First Lady, state leaders, and Local Program Partners to keep books in the hands of Indiana’s youngest learners and strengthen this foundation for a lifetime of success,” he said.

    Indiana Capital Chronicle is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. Indiana Capital Chronicle maintains editorial independence. Contact Editor Niki Kelly for questions: [email protected].



  • Earning Our AI Literacy License – Faculty Focus


  • The buzz around teaching facts to boost reading is bigger than the evidence for it


    Over the past decade, a majority of states have passed new “science of reading” laws or implemented policies that emphasize phonics in classrooms. Yet the 2024 results of an important national test, released last month, showed that the reading scores of elementary and middle schoolers continued their long downward slide, hitting new lows.

    The emphasis on phonics in many schools is still relatively new and may need more time to yield results. But a growing chorus of education advocates has been arguing that phonics isn’t enough. They say that being able to decode the letters and read words is critically important, but students also need to make sense of the words. 

    Some educators are calling for schools to adopt a curriculum that emphasizes content along with phonics. More schools around the country, from Baltimore to Michigan to Colorado, are adopting these content-filled lessons to teach geography, astronomy and even art history. The theory, which has been documented in a small number of laboratory experiments, is that the more students already know about a topic, the better they can understand a passage about it. For example, a passage on farming might make more sense if you know something about how plants grow. The brain gets overwhelmed by too many new concepts and unfamiliar words. We’ve all been there. 

    A ‘Knowledge Revival’

    A 2025 book by 10 education researchers in Europe and Australia, “Developing Curriculum for Deep Thinking: The Knowledge Revival,” makes the case that students cannot learn the skills of comprehension and critical thinking unless they know a lot of stuff first. These ideas have revived interest in E.D. Hirsch’s Core Knowledge curriculum, which gained popularity in the late 1980s. Hirsch, a professor emeritus of education and humanities at the University of Virginia, argues that democracy benefits when the citizenry shares a body of knowledge and history, which he calls cultural literacy. Now cognitive scientists are making the argument that a core curriculum is also good for our brains and facilitates learning.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    The idea of forcing children to learn a specific set of facts and topics is controversial. It runs counter to newer trends of “culturally relevant pedagogy,” or “culturally responsive teaching,” whose proponents contend that students’ identities should be reflected in what they learn. Others say learning facts is unimportant in the age of Google, where we can instantly look anything up, and that the focus should be on teaching skills. Content skeptics also point out that there’s never been a study to show that increasing knowledge of the world boosts reading scores.

    It would be nearly impossible for an individual teacher to create the kind of content-packed curriculum that this pro-knowledge branch of education researchers has in mind. Lessons need to be coordinated across grades, from kindergarten onward. It’s not just a random collection of encyclopedia entries or interesting units on, say, Greek myths or the planets in our solar system. The science and social studies topics should be sequenced so that the ideas build upon each other, and paired with vocabulary that will be useful in the future. 

    The big question is whether the theory that more knowledge improves reading comprehension applies to real schools where children are reading below grade level. Does a content-packed curriculum translate into higher reading achievement years later?

    Putting knowledge to the test

    Researchers have been testing content-packed lessons in schools to see how much they boost reading comprehension. A 2023 study of the Core Knowledge curriculum, which was not peer reviewed, received a lot of buzz. The students who attended nine schools that adopted the curriculum were stronger readers. But it was impossible to tell whether the Core Knowledge curriculum itself made the difference or if the boost to reading scores could be attributed to the fact that all nine schools were highly regarded charter schools and were doing something else that made a difference. Perhaps they had hired great teachers and trained them well, for example. Also, the students at these charter schools were largely from middle and upper middle class families. What we really want to know is whether knowledge building at school helps the poorest children, who are less likely to be exposed to the world through travel, live performances, and other experiences that money can buy.

    Another content-heavy curriculum developed by Harvard education professor James Kim produced a modest boost to reading scores in a randomized controlled trial, according to a paper published in 2024. Reading instruction was untouched, but the students received special science and social studies lessons that were intended to boost young children’s knowledge and vocabulary. Unfortunately, the pandemic hit in the middle of the experiment and many of the lessons had to be scrapped. 

    Related: Slightly higher reading scores when students delve into social studies, study finds

    Still, for the 1,000 students who had received some of the special lessons in first and second grades, their reading and math scores on the North Carolina state tests were higher not only in third grade, but also in fourth grade, more than a year after the knowledge-building experiment ended. Most of the students were Black and Hispanic. Forty percent were from poor families.

    The latest study

    The Core Knowledge curriculum was put to the test in another study by a team of eight researchers in two unidentified cities in the mid-Atlantic and the South, where the majority of children were Black and from low income families. More than 20 schools had been randomly assigned to give kindergarteners some lessons from the Core Knowledge curriculum. The schools continued with their usual phonics instruction, but “read aloud” time, when a teacher ordinarily reads a picture book to students, had been replaced with units on plants, farming and Native Americans, for example. More than 500 kindergarteners looked at pictures on a large screen, while a teacher discussed the topics and taught new vocabulary. Additional activities reinforced the lessons. 

    According to a paper published in the February 2025 issue of the Journal of Educational Psychology, the 565 children who received the Core Knowledge lessons did better on tests of the topics and words that were taught, compared with 626 children who had learned reading as usual and weren’t exposed to these topics. But they did no better in tests of general language, vocabulary development or listening comprehension. Reading itself was not evaluated. Unfortunately, the pandemic also interfered in the middle of this experiment and cut short the analysis of the students through first and second grades.

    Related: Inside the latest reading study that’s getting a lot of buzz

    Lead researcher Sonia Cabell, an associate professor at Florida State University, says she is looking at longer term achievement data from these students, who are now in middle school. But she said she isn’t seeing a clear “signal” that the students who had this Core Knowledge instruction for a few months in kindergarten are doing any better. 

    Glimmers of hope

    Cabell did see glimmers of hope. Students in the control group schools, who didn’t receive Core Knowledge instruction, also learned about plants. But the Core Knowledge students had much more to say when researchers asked them the question: “Tell me everything you know about plants.” The results of a test of general science knowledge came just shy of statistical significance, which would have demonstrated that the Core Knowledge students were able to transfer the specific knowledge they had learned in the lessons to a broader understanding of science. 

    “There are pieces of this that are promising and encouraging,” said Cabell, who says that it’s complicated to study the combination of conventional reading instruction, such as phonics and vocabulary, with content knowledge. “We need to better understand what the active ingredient is. Is it the knowledge?” 

    All the latest Core Knowledge study proves is that students are more likely to do well on a test of something they have been taught. Some observers mistakenly interpreted that as evidence that a knowledge-rich curriculum is beneficial.

    Related: Learning science might help kids read better

    “If your great new curriculum reads articles about penguins to the kids and your old stupid curriculum reads articles about walruses to them, one of these is going to look more successful when the kids are evaluated with a penguin test,” explained Tim Shanahan, a literacy expert and a professor emeritus at the University of Illinois at Chicago who was not involved in this research.

    Widening achievement gaps

    And distressingly, students who arrived at kindergarten with stronger language skills absorbed a lot more from these content-rich lessons than lower achieving students. Instead of helping low achieving kids catch up, achievement gaps widened.

    People with more knowledge tend to be better readers. That’s not proof that increasing knowledge improves reading. It could be that higher achieving kids like learning about the world and enjoy reading. And if you stuff a child with more knowledge, it’s possible that his reading skills may not improve.

    The long view

    Shanahan speculates that if knowledge building does improve reading comprehension, it would take many, many years for it to manifest. 

    “If these efforts aren’t allowed to elbow sound reading instruction aside, they cannot hurt and, in the long run, they might even help,” he wrote in a 2021 blog post.

    Researchers are still in the early stages of designing and testing the content students need to boost literacy skills. We are all waiting for answers.

    Contact staff writer Jill Barshay at 212-678-3595 or [email protected].

    This story about Core Knowledge was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


  • Colleges promote media literacy skills for students


    Young people today spend a large amount of time online, with a U.S. Department of Health and Human Services report noting that teens ages 12 to 17 had four or more hours of daily screen time from July 2021 to December 2023.

    This digital exposure can impact teens’ mental health, according to Pew Research, with four in 10 young people saying they’re anxious when they don’t have their smartphones and 39 percent saying they have cut back their time on social media. But time online can also shape how individuals process information, as well as their ability to distinguish among news, advertising, opinion and entertainment.

    A December Student Voice survey by Inside Higher Ed and Generation Lab found that seven out of 10 college students rate their current level of media literacy as somewhat or very high, but they judge their college peers’ literacy less favorably, with only 32 percent rating students as a whole as somewhat or very media literate.

    A majority of students (62 percent) also indicate they are at least moderately concerned about the spread of misinformation among their college peers, with 26 percent saying their concern was very high.

    To address students’ digital literacy, colleges and universities can provide education and support in a variety of ways. The greatest share of Student Voice respondents (35 percent) say colleges and universities should create digital resources to learn about media literacy. But few institutions offer this kind of service or refer students to relevant resources for self-education.

    Methodology

    Inside Higher Ed and Generation Lab polled 1,026 students at 181 two- and four-year institutions from Dec. 19 to 23. The margin of error is 3 percent.
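    For context, the stated 3 percent figure is consistent with the standard margin-of-error approximation for a sample of this size, assuming a simple random sample and a 95 percent confidence level (an assumption on our part, since the survey’s weighting and design are not described here):

$$
\text{MOE} \approx z\sqrt{\tfrac{p(1-p)}{n}} = 1.96\sqrt{\tfrac{0.5 \times 0.5}{1026}} \approx 0.031 \approx 3\%
$$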

    What is media literacy? Media literacy, as defined in the survey, is the ability or skills to critically analyze for accuracy, credibility or evidence of bias in the content created and consumed in sources including radio, television, the internet and social media.

    A majority of survey respondents indicate they use at least one measure regularly to check the accuracy of information they’re receiving, including thinking critically about the message delivered, analyzing the source’s perspective or bias, verifying information with other sources, or pausing to check information before sharing with others.

    A missing resource: While there are many groups that offer digital resources or online curriculum for teachers, particularly in the K-12 space, less common are self-guided digital resources tailored to young people in higher education.

    “Create digital resources for students” was the No. 1 response across respondent groups and characteristics and was even more popular among community college respondents (38 percent) and adult learners (42 percent), which may highlight students’ preferences for learning outside the classroom, particularly for those who may be employed or caregivers.

    Arizona State University’s Walter Cronkite School of Journalism offers a free self-directed media literacy course that includes webinars with journalism and media experts, as well as exercises for reflection. Similarly, Baylor University’s library offers a microcourse, lasting 10 minutes, that can be embedded into Canvas and that awards students a badge upon completion.

    The University of North Carolina at Charlotte provides a collection of resources on a Respectful Conversation website that includes information on free expression, media literacy, constructive dialogue and critical thinking. On this website, users can also identify online classes, many of which are free, that provide an overview or a deeper level look at additional topics such as misinformation and deepfakes.

    The American Library Association has a project, Media Literacy Education in Libraries for Adult Audiences, that is designed to assist libraries in their work to improve media literacy skills among adults in the community. The project includes webinars and a resource guide for practitioners.

    Does your college or university have a self-guided digital resource for students to engage in media literacy education? Tell us more.


  • Across All Ages & Demographics, Test Results Show Americans Are Getting Dumber – The 74


    There’s no way to sugarcoat it: Americans have been getting dumber.

    Across a wide range of national and international tests, grade levels and subject areas, American achievement scores peaked about a decade ago and have been falling ever since. 

    Will the new NAEP scores coming out this week show a halt to those trends? We shall see. But even if those scores indicate a slight rebound off the COVID-era lows, policymakers should seek to understand what caused the previous decade’s decline. 

    There’s a lot of blame to go around, from cellphones and social media to federal accountability policies. But before getting into theories and potential solutions, let’s start with the data.

    Until about a decade ago, student achievement scores were rising. Researchers at Education Next found those gains were broadly shared across racial and economic lines, and achievement gaps were closing. But then something happened, and scores started to fall. Worse, they fell faster for lower-performing students, and achievement gaps started to grow.

    This pattern shows up on test after test. Last year, we looked at eighth grade math scores and found growing achievement gaps in 49 of 50 states, the District of Columbia and 17 out of 20 large cities with sufficient data.

    But it’s not just math, and it’s not just NAEP. The American Enterprise Institute’s Nat Malkus has documented the same trend in reading, history and civics. Tests like NWEA’s MAP Growth and Curriculum Associates’ i-Ready are showing it too. And, as Malkus found in a piece released late last year, this is a uniquely American problem. The U.S. now leads the world in achievement gap growth.

    What’s going on? How can students here get back on track? Malkus addresses these questions in a new report out last week and makes the point that any honest reckoning with the causes and consequences of these trends must account for the timing, scope and magnitude of the changes.

    Theory #1: It’s accountability

    As I argued last year, my top explanation has been the erosion of federal accountability policies. In 2011 and 2012, the Obama administration began issuing waivers to release states from the most onerous requirements of the No Child Left Behind Act. Congress made those policies permanent in the 2015 Every Student Succeeds Act. That timing fits, and it makes sense that easing up on accountability, especially for low-performing students, led to achievement declines among those same kids.

    However, there’s one problem with this explanation: American adults appear to be suffering from similar achievement declines. In results that came out late last year, the average scores of Americans ages 16 to 65 fell in both literacy and numeracy on the globally administered Program for the International Assessment of Adult Competencies.

    And even among American adults, achievement gaps are growing. The exam’s results are broken down into six performance levels. On the numeracy portion, for example, the share of Americans scoring at the two highest levels rose two points, from 10% to 12%, while the percentage of those at the bottom two levels rose from 29% to 34%. In literacy, the percentage of Americans scoring at the top two levels fell from 14% to 13%, while the lowest two levels rose from 19% to 28%. 
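    Taken together, those tail shares imply a shrinking middle. A rough check using only the figures quoted above, and taking the “middle” to be the remaining two of the six levels, shows the pattern in both domains:

$$
\text{Numeracy middle: } 100 - 10 - 29 = 61\% \quad\rightarrow\quad 100 - 12 - 34 = 54\%
$$

$$
\text{Literacy middle: } 100 - 14 - 19 = 67\% \quad\rightarrow\quad 100 - 13 - 28 = 59\%
$$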

    These results caused Peggy Carr, the commissioner of the National Center for Education Statistics, to comment, “There’s a dwindling middle in the United States in terms of skills.” Carr could have made the same comment about K-12 education —  except that these results can’t be explained by school-related causes.

    Theory #2: It’s the phones

    The rise of smartphones and social media, and the decline in reading for pleasure, could be contributing to these achievement declines. Psychologist Jean Twenge pinpointed 2012 as the first year when more than half of Americans owned a smartphone, which is about when achievement scores started to decline. This theory also does a better job of explaining why Americans of all ages are scoring lower on achievement tests.

    But there are some holes in this explanation. For one, why are some of the biggest declines seen in the youngest kids? Are that many 9-year-olds on Facebook or Instagram? Second, why are the lowest performers suffering the largest declines in achievement? Attention deficits induced by phones and screens should affect all students in similar ways, and yet the pattern shows the lowest performers are suffering disproportionately large drops.

    But most fundamentally, why is this mostly a U.S. trend? Smartphones and social media are global phenomena, and yet scores in Australia, England, Italy, Japan and Sweden have all risen over the last decade. A couple of other countries have seen some small declines (like Finland and Denmark), but no one else has seen declines like we’ve had here in the States.

    Other theories: Immigration, school spending or the Common Core

    Other theories floating around have at least some kernels of truth. Immigration trends could explain some portion of the declines, although it’s not clear why those would be affecting scores only now. The Fordham Institute’s Mike Petrilli has partly blamed America’s “lost decade” on economic factors, but school spending has rebounded sharply in recent years without similar gains in achievement. Others, including historian Diane Ravitch and the Pioneer Institute’s Theodor Rebarber, blame the shift to the Common Core state standards, which was happening about the same time. But non-Common Core states suffered similar declines, and scores have also dropped in non-Common Core subjects.

    Note that COVID is not part of my list. It certainly exacerbated achievement declines and reset norms within schools, but achievement scores were already falling well before it hit America’s shores.

    Instead of looking for one culprit, it could be a combination of these factors. It could be that the rise in technology is diminishing Americans’ attention spans and stealing their focus from books and other long-form written content. Meanwhile, schools have been de-emphasizing basic skills, easing up on behavioral expectations and making it easier to pass courses. At the same time, policymakers in too many parts of the country have stopped holding schools accountable for the performance of all students.

    That’s a potent mix of factors that could explain these particular problems. It would be helpful to have more research to pinpoint problems and solutions, but if this diagnosis is correct, it means students, teachers, parents and policymakers all have a role to play in getting achievement scores back on track. 



  • Students on media literacy and how colleges can help


    Social media is a top source of news for nearly three in four students, and half at least somewhat trust platforms such as Instagram and TikTok to deliver that news and other critical information accurately. As for legacy media sources, namely newspapers, just two in 10 students indicate they regularly rely on them for news. That’s even as most students indicate they trust newspapers to convey accurate information.

    These are some of the findings from Inside Higher Ed’s new Student Voice flash survey with Generation Lab on media literacy, conducted last month. Some of the data seems grim in light of declining public trust in institutions and expertise, and the spread of misinformation—concerns that many of the survey’s 1,026 two-year and four-year respondents share: Some 62 percent express some or a lot of concern about the spread of misinformation among their college peers. (See also this month’s news that Meta is eliminating third-party fact-checkers.) And not quite half of respondents (46 percent) approve of the job colleges and universities as a whole are doing to promote students’ media literacy.

    At the same time, the data suggests that colleges and universities are at least somewhat effective in this area. One example: Just one in 10 students rates their level of media literacy prior to attending college as very high, compared to the quarter of students who rate their current level of media literacy as very high. Nearly all respondents, 98 percent, also indicate they regularly practice at least some basic media literacy skills to check the accuracy of the information they’re consuming. To some degree, this challenges ongoing skepticism about students’ critical thinking abilities and how helpful colleges are in developing them.

    When asked to highlight ways colleges and universities can help them build their awareness and skills, students ranked creating digital resources to learn about media literacy highest on a list of possible actions.

    Inside Higher Ed and Generation Lab defined media literacy in the survey as the ability or skills to critically analyze for accuracy, credibility or evidence of bias in the content created and consumed in sources including radio, television, the internet and social media. Read on for an overview of the findings in six charts, plus some additional analysis—and how colleges can help close some of these gaps.

    Students’ top sources for news are social media and friends and family/word of mouth. Relatively few students indicate they regularly get their news from sources such as newspapers, broadcast/network TV news, radio or magazines. This is relatively consistent across institution type (two-year/four-year and public/private nonprofit), though students at private nonprofits (n=259) are much more likely than their public counterparts (n=767) to indicate they read newspapers, at 38 percent versus 15 percent, respectively. By student type, those 25 and older (n=167) are much less likely than their peers 18 to 24 (n=842) to say they rely on friends and family/word of mouth for news, at 33 percent versus 52 percent, respectively.

    Most students aren’t turning to legacy media as a top source of news, though they generally express trust in sources such as newspapers and broadcast network/TV news to deliver news and other critical information accurately. But more than half also express some or a great deal of trust in social media to deliver accurate information. Same for friends and family/word of mouth.

    When engaging with media of different kinds, about two in three students say they regularly check the accuracy of the information by analyzing the source’s perspective and/or possible biases, thinking critically about the message delivered (such as distinguishing fact from opinion), and verifying the information using other sources.

    Approximately half of students also say they consider the algorithm that is pushing them certain content on websites and/or social media, pause to check the information before sharing with others or on social media, and identify who or what additional sources are being included in the content. While nearly all students indicate they practice some of these skills, some differences emerge by political affiliation, with self-identified Democrats more likely than self-identified Republicans to report analyzing the source’s perspective and/or possible biases, for example, at 68 percent versus 53 percent.

    Many students indicate that their level of media literacy has increased in college. Students also express more confidence in their own level of media literacy than that of their peers, on average: While 72 percent of students rate their own level of media literacy as somewhat or very high, just 32 percent rate their peers’ level of media literacy this way, on average. And students across a range of demographics express at least some concern about the spread of misinformation among their college peers. This includes 63 percent of both Democrats and Republicans. By age, respondents 25 and older are likelier to express a very high level of concern (37 percent of this group versus 24 percent of the 18-to-24 set).

    How are institutions doing when it comes to helping students build their media literacy? As with their own level of media literacy relative to their peers’, respondents have a rosier view of their own institution than they do of higher education as a whole. This is relatively consistent across institution types, though students at private nonprofits are less likely than their public counterparts to approve of the job colleges and universities in general are doing.

    As for how institutions can best help students improve their media literacy, the top pick from a list of options (up to two choices) is creating digital resources for students to learn about media literacy (35 percent). Another relatively popular option is embedding training on media literacy in a first-year seminar or program (31 percent). This option is more popular among four-year college students than it is among two-year students. But creating peer-to-peer education programs on media literacy is more popular among two-year students than it is among four-year students.

    Building Habits and Competencies

    Renee Hobbs, professor of communication studies and director of the Media Education Lab at the University of Rhode Island, says it’s “no surprise that college students rely on their family and friends and social networks for news, as do most Americans.” As one point of comparison, in an Intelligent survey of four-year college students following the 2024 election, respondents cited TikTok and Instagram as their top two news sources. The same survey found that students who voted for President-elect Donald Trump were twice as likely to get their news from podcasts as those who voted for Vice President Kamala Harris. In Inside Higher Ed’s survey, Democrats are somewhat more likely than Republicans to cite news podcasts as a top news source (12 percent versus 4 percent, respectively), but Republicans are somewhat more likely than Democrats to rely on opinion podcasts (12 percent versus 5 percent).

    Hobbs says it’s a “comfort” that even one in five Student Voice respondents relies heavily on newspapers. That the same, relatively small share expresses a very high level of trust in newspapers and broadcast news confirms national trends, she adds; a fall poll from Gallup, for example, found that confidence in mass media remained at a low. Noting the existence of active “news avoiders,” whose ranks are growing, according to data from the Reuters Institute, Hobbs says that her own media literacy students are required to read the newspaper. Turns out, many “appreciate the opportunity to take up the habit.”

    Regarding the ever-expanding space where media literacy overlaps with digital literacy, Hobbs’s own ongoing research suggests that teaching about algorithmic personalization is rare, at least in K-12 education. At the same time, many college students are digitally savvy, and Hobbs says some of her own students have significant followings on platforms such as Instagram, TikTok and Twitch.

    As for how colleges and universities can help, Hobbs says general education requirements—such as those suggested in the survey—“might be the best place for media literacy to thrive in a higher education context.” Learning outcomes from Hobbs’s own digital media literacy course satisfy gen ed requirements regarding effective communication and developing and engaging in civic knowledge and responsibilities.

    Hobbs adds that academic librarians are leaders in media and digital literacy initiatives on many campuses, and that “one of the best ways for college and university students to develop media literacy competencies” is by creating media themselves. Possibilities include creating websites, podcasts, videos for YouTube or other social media, or developing a community public service media campaign or outreach program. Other opportunities? Working at the college newspaper or radio station or managing social media for a college unit or organization.

    “Creating media is a great way to develop media literacy skills, and college faculty may be pleasantly surprised to see what their students can create without any special prompting.”

    What are you and/or your institution doing to promote students’ media literacy? Let us know by submitting one of the forms found here.


  • AI Literacy Resource for All – Sovorel


    There is no longer any way to deny that AI Literacy is a must for all people. Regardless of whether you are a student or a faculty member, young or old, all of us must continually develop our AI Literacy to function effectively and excel in our AI-infused world. The importance of everyone developing their AI Literacy has been expressed by virtually all nations and international organizations (UN, 2024; UN, 2024b). Additionally, many business organizations have stressed that, in order to be competitive in the workforce, AI Literacy is now an imperative employment skill (Marr, 2024).

    The following Sovorel video and infographic (in addition to the above infographic) provide key components of AI Literacy and specifics regarding prompt engineering and using an advanced prompt formula:

    AI Literacy: Prompt Engineering, Advanced Prompt Formula Infographic (this infographic, the main AI Literacy infographic, and many more are also available within the infographics section: https://sovorelpublishing.com/index.php/infographics)

     

    References

    Cisco. (2024, July 31). AI and the workforce: Industry report calls for reskilling and upskilling as 92 percent of technology roles evolve. Cisco. https://investor.cisco.com/news/news-details/2024/AI-and-the-Workforce-Industry-Report-Calls-for-Reskilling-and-Upskilling-as-92-Percent-of-Technology-Roles-Evolve/default.aspx

    Marr, B. (2024, October 24). The 5 most in-demand skills in 2025. Forbes. https://www.forbes.com/sites/bernardmarr/2024/10/14/the-5-most-in-demand-skills-in-2025/

    UN. (2024). Addendum on AI and Digital Government. United Nations. https://desapublications.un.org/sites/default/files/publications/2024-10/Addendum%20on%20AI%20and%20Digital%20Government%20%20E-Government%20Survey%202024.pdf

    UN. (2024b). Governing AI for humanity. United Nations. https://www.un.org/sites/un2.un.org/files/governing_ai_for_humanity_final_report_en.pdf


  • Gaps in sustainability literacy in non-STEM higher education programmes


    by Erika Kalocsányiová and Rania Hassan

    Promoting sustainability literacy in higher education is crucial for deepening students’ pro-environmental behaviour and mindset (Buckler & Creech, 2014; UNESCO, 1997), while also fostering social transformation by embedding sustainability at the core of the student experience. In 2022, our group received an SRHE Scoping Award to synthesise the literature on the development, teaching, and assessment of sustainability literacy in non-STEM higher education programmes. We conducted a multilingual systematic review of post-2010 publications from the European Higher Education Area (EHEA), with the results summarised in Kalocsányiová et al (2024).

    Out of 6,161 articles that we identified as potentially relevant, 92 studies met the inclusion criteria and are reviewed in the report. These studies involved a total of 11,790 participants and assessed 9,992 university programmes and courses. Our results suggest a significant growth in research interest in sustainability in non-STEM fields since 2017, with 75 studies published compared to just 17 in the preceding seven years. Our analysis also showed that Spain, the United Kingdom, Germany, Turkey, and Austria had the highest concentration of publications, with 25 EHEA countries represented in total. The 92 reviewed studies were characterised by high methodological diversity: nearly half employed quantitative methods (47%), followed by qualitative studies (40%) and mixed methods research (13%). Curriculum assessments using quantitative content analysis of degree and course descriptors were among the most common study types, followed by surveys and intervention or pilot studies. Curriculum assessments provided a systematic way to evaluate the presence or absence of sustainability concepts within curricula at both single HE institutions and in comparative frameworks. However, they often captured only surface-level indications of sustainability integration into undergraduate and postgraduate programmes, without providing evidence on actual implementation and/or the effectiveness of different initiatives. Qualitative methods, including descriptive case studies and interviews that focused on barriers, challenges, implementation strategies, and the acceptability of new sustainability literacy initiatives, made up 40% of the current research. Mixed methods studies accounted for 13% of the reviewed articles, often applying multiple assessment tools simultaneously, including quantitative sustainability competency assessment instruments combined with open-ended interviews or learning journals.

    In terms of disciplines, Economics, Business, and Administrative Studies held the largest share of reviewed studies (26%), followed by Education (23%). Studies spanning multiple disciplines accounted for 22% of the reviewed publications, reflecting the interconnected nature of sustainability. Finance and Accounting contributed only 6%, indicating a need for further research, and Language and Linguistics, Mass Communication and Documentation, and Social Sciences collectively represented just 12% of the reviewed studies. Creative Arts and Design, at only 2%, remained a niche area. Although caution should be exercised when drawing conclusions from these results, they highlight the need for more research within the underrepresented disciplines. Such research can in turn help promote awareness among non-STEM students, stimulate ethical discussions on the cultural dimensions of sustainability, and encourage creative solutions through interdisciplinary dialogue.

    Regarding the factors and themes explored, the studies focused primarily on the acquisition of sustainability knowledge and competencies (27%), curriculum assessment (23%), challenges and barriers to sustainability integration (10%), implementation and evaluation research (10%), changes in students’ mindset (9%), key competencies in sustainability literacy (5%), and active student participation in Education for Sustainable Development (5%). Among the studies examining acquisition processes, key focus areas included the teaching of the Sustainable Development Goals, awareness of macro-sustainability trends, and knowledge of local sustainability issues. Studies on sustainability competencies focused on systems thinking, critical thinking, problem-solving skills, ethical awareness, interdisciplinary knowledge, global awareness and citizenship, communication skills, and an action-oriented mindset. Such knowledge and competencies, which are generally considered crucial for addressing the multifaceted challenges of sustainability (Wiek et al., 2011), were often introduced to non-STEM students through stand-alone lectures, workshops, or pilot studies involving new cross-disciplinary curricula.

    Our review also highlighted the broad range of pedagogical approaches adopted for sustainability teaching and learning within non-STEM disciplines. These included case-based and project-based learning, experiential learning methods, problem-based learning, collaborative learning, reflection groups, pedagogical dialogue, flipped classroom approaches, game-based learning, and service learning. While there is strong research interest in documenting and implementing these pedagogical approaches, few studies have so far attempted to assess learning outcomes, particularly regarding discipline-specific sustainability expertise and real-world problem-solving skills.

    Many of the reviewed studies relied on single-method approaches, meaning valuable insights into sustainability-focused teaching and learning may have been missed. For instance, studies often failed to capture the complexities surrounding sustainability integration into non-STEM programmes, either by presenting positivist results that require further contextualisation or by offering rich context limited to a single course or study group, which cannot be generalised. The assessment tools currently in use also seemed to lack consistency, making it difficult to compare outcomes across programmes and institutions in order to promote best practices. More robust evaluation designs, such as longitudinal studies, controlled intervention studies, and mixed methods approaches (Gopalan et al., 2020; Ponce & Pagán-Maldonado, 2015), are needed to explore and demonstrate the pedagogical effectiveness of various sustainability literacy initiatives in non-STEM disciplines and their impact on student outcomes and societal change.

    In summary, our review suggests good progress in integrating sustainability knowledge and competencies into some core non-STEM disciplines, while also highlighting gaps. Based on these results, we have formulated some questions that may help steer future research:

    • Are there systemic barriers hindering the integration of sustainability themes, challenges, and competencies into specific non-STEM fields?
    • Are certain disciplines receiving disproportionate research attention at the expense of others?
    • How do different pedagogical approaches compare in terms of effectiveness for fostering sustainability literacy in and across HE fields?
    • What new educational practices are emerging, and how can we fairly assess them and evidence their benefits for students and the environment?

    We would also like to encourage other researchers to engage with knowledge produced in a variety of languages and educational contexts. The multilingual search and screening strategy implemented in our review enabled us to identify and retrieve evidence from 25 EHEA countries and 24 non-English publications. If reviews of education research remain monolingual (English-only), important findings and insights will go unnoticed, hindering knowledge exchange, creativity, and innovation in HE.

    Dr. Erika Kalocsányiová is a Senior Research Fellow with the Institute for Lifecourse Development at the University of Greenwich, with research centring on public health and sustainability communication, migration and multilingualism, refugee integration, and the implications of these areas for higher education policies.

    Rania Hassan is a PhD student and a research assistant at the University of Greenwich. Her research centres on exploring enterprise development activities within emerging economies. As a multidisciplinary and interdisciplinary researcher, Rania is passionate about advancing academia and promoting knowledge exchange in higher education.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education
