Tag: research

  • Education Department takes a preliminary step toward revamping its research and statistics arm

    Education Department takes a preliminary step toward revamping its research and statistics arm

    In his first two months in office, President Donald Trump ordered the closing of the Education Department and fired half of its staff. The department’s research and statistics division, called the Institute of Education Sciences (IES), was particularly hard hit. About 90 percent of its staff lost their jobs and more than 100 federal contracts to conduct its primary activities were canceled.

    But now there are signs that the Trump administration is partially reversing course and wants the federal government to retain a role in generating education statistics and evidence for what works in classrooms — at least to some extent. On Sept. 25, the department posted a notice in the Federal Register asking the public to submit feedback by Oct. 15 on reforming IES to make research more relevant to student learning. The department also asked for suggestions on how to collect data more efficiently.

    The timeline for revamping IES remains unclear, as is whether the administration will invest money into modernizing the agency. For example, it would take time and money to pilot new statistical techniques; in the meantime, statisticians would have to continue using current protocols.

    Still, the signs of rebuilding are adding up. 


    At the end of May, the department announced that it had temporarily hired a researcher from the Thomas B. Fordham Institute, a conservative think tank, to recommend ways to reform education research and development. The researcher, Amber Northern, has been “listening” to suggestions from think tanks and research organizations, according to department spokeswoman Madi Biedermann, and now wants more public feedback.  

    Biedermann said that the Trump administration “absolutely” intends to retain a role in education research, even as it seeks to close the department. Closure will require congressional approval, which hasn’t happened yet. In the meantime, Biedermann said the department is looking across the government to find where its research and statistics activities “best fit.”

    Other IES activities also appear to be resuming. In June, the department disclosed in a legal filing that it had reinstated, or plans to reinstate, 20 of the 101 terminated contracts. Among the activities slated to be restarted are 10 Regional Education Laboratories that partner with school districts and states to generate and apply evidence. It remains unclear how all 20 contracts can be restarted without federal employees to hold competitive bidding processes and oversee them.

    Earlier in September, the department posted eight new jobs to help administer the National Assessment of Educational Progress (NAEP), also called the Nation’s Report Card. These positions would be part of IES’s statistics division, the National Center for Education Statistics. Most of the work in developing and administering tests is handled by outside vendors, but federal employees are needed to award and oversee these contracts. After mass firings in March, employees at the board that oversees NAEP have been on loan to the Education Department to make sure the 2026 NAEP test is on schedule.

    Only a small staff remains at IES. Some education statistics have trickled out since Trump took office, including the agency’s first release of higher education data on Sept. 23. But the data releases have been late and incomplete.

    No new grants for education studies are believed to have been issued since March, according to researchers familiar with the federal grantmaking process who asked not to be identified for fear of retaliation. A big obstacle is that a contract to conduct peer review of research proposals was canceled, so new ideas cannot be properly vetted. The staff that remains is trying to make annual disbursements for older multi-year studies that haven’t been canceled.

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    With all these changes, it’s becoming increasingly difficult to figure out the status of federally funded education research. One potential source of clarity is a new project launched by two researchers from George Washington University and Johns Hopkins University. Rob Olsen and Betsy Wolf, who was an IES researcher until March, are tracking cancellations and keeping a record of research results for policymakers. 

    If it succeeds, the project will provide a much-needed light amid the chaos.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about reforming IES was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.


    Source link

  • Is research culture really too hard to assess?

    Is research culture really too hard to assess?

    Assessing research culture has always been seen as difficult – some would say too difficult.

    Yet as REF 2029 pauses for reflection, the question of whether and how culture should be part of the exercise is unavoidable. How we answer this has the potential to shape not only the REF, but also the value we place on the people and practices that define research excellence.

    The push to assess research culture emerged from recognition that thriving, well-supported researchers are themselves important outcomes of the research system. The Stern Review highlighted that sustainable research excellence depends not just on research outputs but on developing the people who produce them. The Harnessing the Metric Tide report built on this understanding, recommending that future REF cycles should reward progress towards better research cultures.

    A significant proportion of what we have learnt about assessing research culture came from the People, Culture and Environment indicators project, run by Vitae and Technopolis, and Research England’s subsequent REF PCE pilot exercise. Together with the broader consultation as part of the Future Research Assessment Programme, this involved considerable sector engagement over multiple years.

    Indicators

    Nearly 1,600 people applied to participate in the PCE indicators co-development workshops. Over 500 participated across 137 institutions, spanning all career stages and roles. Representatives from ARMA, NADSN, UKRN, BFAN, ITSS, FLFDN and NCCPE helped facilitate the discussions and synthesise messages.

    The workshops confirmed what many suspected about assessing research culture. It’s genuinely difficult. Nearly every proposed indicator proved problematic. Participants raised concerns about gaming and burden. Policies could become tick-box exercises. Metrics might miss crucial context. But participants saw that clusters of indicators used together and contextualised could allow institutions to tell meaningful stories about their approach and avoid the potentially misleading pictures painted by isolated indicators.

    A recurring theme was the need to focus on mechanisms, processes and impacts, not on inputs. Signing up for things, collecting badges, and writing policies isn’t enough. We need to make sure that we are doing something meaningful behind these. This doesn’t mean we cannot evidence progress, rather that the evidence needs contextualising. The process of developing evidence against indicators would incentivise institutions to think more carefully about what they’re doing, why, and for whom.

    The crucial point that seems to have been lost is that REF PCE never set out to measure culture directly. Instead, it aimed to assess credible indicators of how institutions enable and support inclusive, sustainable, high-quality research.

    REF PCE was always intended to be an evolution, not a revolution. Culture has long been assessed in the REF, including through the 2021 Environment criteria of vitality and sustainability. The PCE framework aimed to build on this foundation, making assessment more systematic and comprehensive.

    Finance and diversity

    Two criticisms levelled at PCE concern the sector’s current financial climate and the difficulty of assessing culture fairly across a diverse range of institutions. These are not new revelations. Both were anticipated and debated extensively in the PCE indicators project.

    Workshop participants stressed that the assessment must recognise that institutions operate with different resources and constraints, focusing on progress and commitment rather than absolute spending levels. There is no one-size-fits-all answer to what a good research culture looks like. Excellent research culture can look very different across the sector and even within institutions.

    This led to a key conclusion: fair assessment must recognise different starting points while maintaining meaningful standards. Institutions should demonstrate progress against a range of areas, with flexibility in how they approach challenges. Assessment needs to focus on ‘distance travelled’ rather than the destination reached.

    Research England developed the REF PCE pilot following these insights. This was deliberately experimental, testing more indicators than would ultimately be practical, as a unique opportunity to gather evidence about what works, what doesn’t, and what is feasible and equitable across the sector. Pilot panel members and institutions were co-designers, not assessors and assessees. The point was to develop evidence for a streamlined, proportionate, and robust approach to assessing culture.

    REF already recognises that publications and impact are important outputs of research. The PCE framework extended this logic: thriving, well-supported people working across all roles are themselves crucial outcomes that institutions should develop and celebrate.

    This matters because sustainable research excellence depends on the people who make it happen. Environments that support career development, recognise diverse contributions, and foster inclusion don’t just feel better to work in – they produce better research. The consultation revealed sophisticated understanding of this connection. Participants emphasised that research quality emerges from cultures that value integrity, collaboration, and support for all contributors.

    Inputs

    Some argue that culture is an input to the system that shouldn’t be assessed directly. Others suggest establishing baseline performance requirements as a condition for funding. However, workshop discussions revealed that setting universal standards low enough for all institutions to meet renders them meaningless as drivers of improvement. Baselines are important, but alone they are not sufficient. Research culture requires attention through assessment, incentivisation and reward that goes beyond minimum thresholds.

    Patrick Vallance and Research England now have unprecedented evidence about research culture assessment. Consultation has revealed sector priorities. The pilot has tested practical feasibility. The upcoming results, to be published in October, will show what approaches are viable and proportionate.

    Have we encountered difficulties? Yes. Do we have a perfect solution for assessing culture? No. But this REF is a huge first step toward better understanding and valuing of the cultures that underpin research in HE. We don’t need all the answers for 2029, but we shouldn’t discard the tangible progress made through national conversations and collaborations.

    This evidence base provides a foundation for informed decisions about whether and how to proceed. The question is whether policymakers will use it to build on promising foundations or retreat to assessment approaches that miss crucial dimensions of research excellence.

    The REF pause is a moment of choice. We can step back from culture as ‘too hard’, or build on the most substantial sector-wide collaboration ever undertaken on research environments. If we discard what we’ve built, we risk losing sight of the people and conditions that make UK research excellent.

    Source link

  • Saudi and Australia forge new paths in education and research

    Saudi and Australia forge new paths in education and research

    During the visit, Al-Benyan met with Australia’s minister of education, Jason Clare. Their discussions focused on expanding ties in higher education, scientific research, and innovation, with an emphasis on joint university initiatives, including twinning programs and faculty and student exchanges designed to build stronger academic links between the two countries.

    Research collaboration featured prominently on the agenda, with both sides highlighting opportunities in fields such as artificial intelligence, cybersecurity, renewable energy, and health sciences. The minister also discussed investment opportunities in Saudi Arabia’s evolving education sector under Vision 2030, with a view to establishing local branches and research centers.

    Australia’s expertise in technical and vocational training was another focal point, as Saudi Arabia looks to enhance human capital development and equip its young population with the skills needed to succeed in the future labor market. Both ministers underlined the importance of supporting Saudi students in Australia by strengthening academic pathways and ensuring a welcoming educational and social environment.

    As well as his meeting with Clare, Al-Benyan held talks with Professor Phil Lambert, a leading Australian authority on curriculum development. Their discussions centered on collaboration with Saudi Arabia’s National Curriculum Centre to develop learning programs that promote critical thinking, creativity, and innovation.

    The meeting reviewed best practices in student assessment, teacher training, and professional certification, aligning with global standards. Opportunities for joint research on performance evaluation and digital education methods were also explored with the aim of integrating advanced technologies into classrooms.

    Al-Benyan also took part in the Saudi-Australian Business Council meeting in Sydney, where he highlighted investment opportunities in the Kingdom’s education sector in line with Vision 2030.


    Conversations covered the launch of scholarship and exchange programs, the advancement of educational infrastructure and technologies, and the promotion of joint research in priority fields such as health, energy, and artificial intelligence. The two sides underscored the importance of developing programs to enhance academic qualifications and of supporting initiatives for persons with disabilities, while Saudi Arabia reaffirmed its commitment to supporting investors through regulatory incentives and strategic backing.

    “It was a pleasure to welcome the Minister of Education, His Excellency Yousef Al Benyan, as part of the official Ministry of Education, Saudi Arabia delegation from the Kingdom of Saudi Arabia to Australia,” said Sam Jamsheedi, president and chairman of the Australian Saudi Business Forum.

    “Education is a key pillar globally and a central focus of Saudi Arabia’s Vision 2030, which aims to create a world class education system that nurtures innovation and drives future ready skills.”

    “Our Council was proud to host a roundtable with leading Australian universities and training providers, giving Ministerial attendees first hand insights into Australia’s capabilities across higher education, vocational training, and research collaboration.”

    “Australian education already has a strong presence in the Kingdom, with a growing number of partnerships across early childhood education, schooling, technical training & university programs,” he added.

    Source link

  • 10+ Years of Lasting Impact and Local Commitment

    10+ Years of Lasting Impact and Local Commitment

    Over 60,000 students have benefited from the math program built on how the brain naturally learns

    A new analysis shows that students using ST Math at Phillips 66-funded schools are achieving more than twice the annual growth in math performance compared to their peers. The analysis, conducted by MIND Research Institute and covering 3,240 students in grades 3-5 across 23 schools, found that this accelerated growth gave these schools a 12.4 percentile point advantage in spring 2024 state math rankings.

    These significant outcomes are the result of a more than 10-year partnership between Phillips 66 and MIND Research Institute. Through this collaboration, Phillips 66 has fully funded ST Math, created by MIND Education and the only PreK–8 supplemental math program built on the science of how the brain learns, for 126 schools, 23 districts, and more than 60,000 students nationwide. ST Math empowers students to explore, make sense of, and build lasting confidence in math through visual problem-solving.

    “Our elementary students love JiJi and ST Math! Students are building perseverance and a deep conceptual understanding of math while having fun,” said Kim Anthony, Executive Director of Elementary Education, Billings Public Schools. “By working through engaging puzzles, students are not only fostering a growth mindset and resilience in problem-solving, they’re learning critical math concepts.”

    The initiative began in 2014 as Phillips 66 sought a STEM education partner that could deliver measurable outcomes at scale. Since then, the relationship has grown steadily, and now, Phillips 66 funds 100% of the ST Math program in communities near its facilities in California, Washington, Montana, Oklahoma, Texas, Illinois, and New Jersey. Once involved, schools rarely leave the program.

    To complement the in-class use of ST Math, Phillips 66 and MIND introduced Family Math Nights. These events, hosted at local schools, bring students, families, and Phillips 66 employee volunteers together for engaging, hands-on activities. The goal is to build math confidence in a fun, interactive setting and to equip parents with a deeper understanding of the ST Math program and new tools to support their child’s learning at home.

    “At Phillips 66, we believe in building lasting relationships with the communities we serve,” said Courtney Meadows, Manager of Social Impact at Phillips 66. “This partnership is more than a program. It’s a decade of consistent, community-rooted support to build the next generation of thinkers and improve lives through enriching educational experiences.”

    ST Math has been used by millions of students across the country and has a proven track record of delivering a fundamentally different approach to learning math. Through visual and interactive puzzles, the program breaks down math’s abstract language barriers to benefit all learners, including English Learners, Special Education students, and Gifted and Talented students.

    “ST Math offers a learning experience that’s natural, intuitive, and empowering—while driving measurable gains in math proficiency,” said Brett Woudenberg, CEO of MIND Education. “At MIND, we believe math is a gateway to brighter futures. We’re proud to partner with Phillips 66 in expanding access to high-quality math learning for thousands of students in their communities.”

    Explore how ST Math is creating an impact in Phillips 66 communities with this impact story: https://www.mindeducation.org/success-story/brazosport-isd-texas/

    About MIND Education
    MIND Education engages, motivates and challenges students towards mathematical success through its mission to mathematically equip all students to solve the world’s most challenging problems. MIND is the creator of ST Math, a pre-K–8 visual instructional program that leverages the brain’s innate spatial-temporal reasoning ability to solve mathematical problems; and InsightMath, a neuroscience-based K-6 curriculum that transforms student learning by teaching math the way every brain learns so all students are equipped to succeed. Since its inception in 1998, MIND Education has served millions of students across the country through ST Math. Visit MINDEducation.org.

    About Phillips 66
    Phillips 66 (NYSE: PSX) is a leading integrated downstream energy provider that manufactures, transports and markets products that drive the global economy. The company’s portfolio includes Midstream, Chemicals, Refining, Marketing and Specialties, and Renewable Fuels businesses. Headquartered in Houston, Phillips 66 has employees around the globe who are committed to safely and reliably providing energy and improving lives while pursuing a lower-carbon future. For more information, visit phillips66.com or follow @Phillips66Co on LinkedIn.


    Source link

  • The Shrinking Research University Business Model

    The Shrinking Research University Business Model

    For most of the past 30 or so years, big Canadian universities have all been working off more or less the same business model: find areas where you can make big profits and use those profits to make yourself more research-intensive.

    That’s it. That’s the whole model.

    International students? Big profit centres. Professional programs? You better believe those are money-makers. Undergraduate studies – well, they might not make that much money in toto but holy moly first-year students are taken advantage of quite hideously to subsidize other activities, most notably research-intensity.

    Just to be clear, when I talk about “research-intensity”, I am not really talking about laboratories or physical infrastructure. I am talking about the entire financial superstructure that allows profs to teach 2 courses per semester and to be paid at rates which are comparable to those at (generally better-funded) large public research universities in the US. It’s about compensation, staffing complements, the whole shebang – everything that allows our institutions to compete internationally for research talent. Governments don’t pay enough, directly, for institutions to do that. So, universities have found ways to offer new products, or re-arrange the products they offer, in such a way as to support these goals of competitive hiring.

    Small universities do not have quite the same imperatives with respect to research, but this business model affects them nonetheless. To the extent that they wish to compete for staff with the research-intensive institutions, they have to pay higher salaries as well. Maybe the most extreme outcome of that arms race occurred at Laurentian, whose financial collapse was at least in part due to the university implicitly trying to align itself to U15 universities’ pay scales rather than, say, the pay scale at Lakehead (unions, which like to write ambitious pay “comparables” into institutional collective agreements, are obviously also a factor here).

    Anyways, the issue is that for one reason or another, governments have been chipping away at these various sources of profit that have been used to cross-subsidize research-intensity. The situation with international students is an obvious one, but this is happening in other ways too. Professional master’s degrees are not generating the returns they used to as private universities, both foreign and domestic, begin to compete, particularly in the business sector. (A non-trivial part of the reason that Queen’s found itself in financial difficulty last year was that its business school didn’t turn a profit for the first time in years. I don’t know the ins and outs of this, but I would be surprised if Northeastern’s aggressive push into Toronto wasn’t eating into some of its executive education business.)

    Provincial governments – some of them, anyway – are also setting up colleges to compete with universities in a number of areas for undergraduate students. In Ontario, that has been going on for 20-25 years, but in other places like Nova Scotia it is just beginning. Some on the university side complain about these programs, primarily in polytechnics, being preferred by government because they are “cheap”, but they rarely get into specifics about quality. One reason college programs are often better on a per-dollar measure? The colleges aren’t building in a surplus to pay for research-intensity – this is precisely what allows them to do revolutionary things like not stuffing 300 first-year students in a single classroom.  

    In brief, then: the feds have taken away a huge source of cross-subsidy. Provinces, to varying degrees (most prominently in Ontario), have been introducing competition to chip away at other sources of surplus that allowed universities to cross-subsidize research-intensity. Together, these two processes are putting the long-standing business model of big Canadian universities at risk.

    The whole issue of cross-subsidization raises two policy questions which are not often discussed in polite company – in Canada, at least. The first has to do with cross-subsidization and whether it is the correct policy or not. I suspect there is a strong majority among higher education’s interested public that think it probably is a good policy; we just don’t know for sure because the policy emerged, as so many Canadian policies do, through a process of extreme passive-aggressiveness. Institutions were mad at governments for not directly funding what they wanted to do, so they went off and did their own thing. Governments, grateful not to be harassed for money, said nothing, which institutions took for approval whereas in fact it was just (temporary) non-disapproval. 

    (I should add here – precisely because of all the passive-aggressiveness – it is not 100% clear to me the extent to which provincial governments understand the implications of introducing competition. When they allow new private or college degree programs, they likely think “we are improving options for students” not “I wonder how this might degrade the ability of institutions to conduct research”. And, of course, the reason they don’t think that is precisely because Canadians achieve everything through passive-aggression rather than open policy debates which might illuminate choices and trade-offs. Yay, us.)

    The second policy question – which we really never ever raise – is whether or not research-intensity, as it is practiced in Canadian universities, is worth subsidizing in the first place. I know, you’re all reading that in shock and horror because what is a university if it is not about research? Well, that’s a pretty partial view, and historically, a pretty recent one. Even among the U15, there are several institutions whose commitment to being big research enterprises is less than 40 years old. And, of course, we already have plenty of universities (e.g. the Maple League) where research simply isn’t a focus – who’s to say the current balance of research-intensive to non-research-intensive universities is the correct one?

    Now add the following thought: if the country clearly doesn’t think that university research matters because the knowledge economy doesn’t matter and we should all be out there hewing wood and drawing water, and if the federal government not only chops the Budget 2024 promises on research but then also cuts deeply into existing budgets, what compelling policy reason is there to keep arranging our universities the way we do? Why not get off the cross-subsidization treadmill and think of ways of spending money on actually improving undergraduate education (which the sector always claims to be doing, but isn’t much, really)?

    I am not, of course, advocating this as a course of policy. But given the way both the politics of research universities and the economics of their business models are heading, we might need to start discussing this stuff. Maybe even openly, for a change.

    Source link

  • Lab Logos and Visual Branding for Your Research with Dr. Makella Coudray

    Lab Logos and Visual Branding for Your Research with Dr. Makella Coudray

    Dr. Makella Coudray cares about how her research shows up online. She knows that when your science is visually engaging, it can reach more people. We worked together on a number of elements that make up her visual online presence for the Sexual Health, Equity, and Empowerment Research Lab (SHEER Lab).

    In this conversation, we talk about branding for your research. Specifically, her logos, social media, and what it’s like to share her online presence with people. We also chat about what prompted her to create a personal academic website and her research lab website.

    A full text transcript will be added to this blog in the coming week, along with English captions for the YouTube live. Thank you!

    Mentioned in this episode:

    Makella Coudray, PhD, MPH, CPH is an Assistant Professor of Medicine in the Department of Population Health Sciences at the University of Central Florida (UCF). She is the Director and Principal Investigator of the Sexual Health, Equity, and Empowerment Research (SHEER) Lab.

    Dr. Makella Coudray

    She is an epidemiologist and implementation scientist whose work focuses on improving sexual and reproductive health outcomes for communities historically underserved by healthcare systems. Her research prioritizes STI prevention and access to care, especially in populations facing barriers to care. Through the SHEER Lab, she leads efforts to design practical, evidence-informed solutions. Her STRiP project explores innovative testing approaches to reduce STI burden and improve access.
    Dr. Makella Coudray has a PhD in Public Health (Epidemiology) from Florida International University. She holds the CPH (Certified in Public Health) credential from the National Board of Public Health Examiners (NBPHE). She earned her Master of Public Health (MPH) and Bachelor of Science (BSc) in Biology from St. George’s University in Grenada.
    She was born and raised in Trinidad and Tobago. Visit her website MakellaSCoudray.com

    Source link

  • New research highlights the importance and challenges of K-12 student engagement

    New research highlights the importance and challenges of K-12 student engagement

    This press release originally appeared online.

    Key points:

    While there is wide agreement that student engagement plays a vital role in learning, educators continue to face uncertainty about what engagement looks like, how best to measure it, and how to sustain it, according to a new study from Discovery Education.

    Education Insights 2025–2026: Fueling Learning Through Engagement captures prevailing attitudes and beliefs on the topic of engagement from 1,398 superintendents, teachers, parents, and students from across the United States. Survey data was collected in May 2025 by Hanover Research on behalf of Discovery Education.

    “Discovery Education conducted the Education Insights report to gain a deeper understanding of how engagement is defined, observed, and nurtured in K-12 classrooms nationwide, and we are thankful to the participants who shared their perspectives and insights with us,” said Brian Shaw, Discovery Education’s Chief Executive Officer. “One of the most important findings of this report is that engagement is seen as essential to learning, but is inconsistently defined, observed, and supported in K-12 classrooms. I believe this highlights the need for a more standardized approach to measuring student engagement and connecting it to academic achievement. Discovery Education has embarked on an effort to address those challenges, and we look forward to sharing more as our work progresses.”

    Key findings of the Education Insights 2025–2026: Fueling Learning Through Engagement report include: 

    Engagement is broadly recognized as a key driver of learning and success. Ninety-three percent of educators surveyed agreed that student engagement is a critical metric for understanding overall achievement, and 99 percent of superintendents polled believe student engagement is one of the top predictors of success at school. Finally, 92 percent of students said that engaging lessons make school more enjoyable. 

    But educators disagree on the top indicators of engagement. Seventy-two percent of teachers rated asking thoughtful questions as the strongest indicator of student engagement. However, 54 percent of superintendents identified performing well on assessments as a top engagement indicator, nearly twice the share of teachers, who ranked assessments among the lowest indicators of engagement.

    School leaders and teachers disagree on whether their schools have systems for measuring engagement. While 99 percent of superintendents and 88 percent of principals said their district has an intentional approach for measuring engagement, only 60 percent of teachers agreed. Further, nearly one-third of teachers said that a lack of clear, shared definitions of student engagement is a top challenge to measuring engagement effectively.

    Educators and students differ on their perceptions of engagement levels. While 63 percent of students agreed with the statement “Students are highly engaged in school,” only 45 percent of teachers and 51 percent of principals surveyed agreed with the same statement.  

    Students rate their own engagement much higher than their peers’. Seventy percent of elementary students perceived themselves as engaged, but only 42 percent perceived their peers as engaged. Fifty-nine percent of middle school students perceived themselves as engaged in learning, but only 36 percent perceived their peers as engaged. Finally, 61 percent of high school students perceived themselves as engaged, but only 39 percent described their peers as engaged.

    Proximity to learning changes impressions of AI. Two-thirds of students believe AI could help them learn faster, yet fewer than half of teachers report using AI themselves to complete tasks. Only 57 percent of teachers agreed with the statement “I frequently learn about positive ways students are using AI,” while 87 percent of principals and 98 percent of superintendents agree. Likewise, only 53 percent of teachers agreed with the statement “I am excited about the potential for AI to support teaching and learning,” while 83 percent of principals and 94 percent of superintendents agreed. 


    Source link

  • Prioritizing behavior as essential learning

    Prioritizing behavior as essential learning


    In classrooms across the country, students are mastering their ABCs, solving equations, and diving into science. But one essential life skill–behavior–is not in the lesson plan. For too long, educators have assumed that children arrive at school knowing how to regulate emotions, resolve conflict, and interact respectfully. The reality: Behavior–like math or reading–must be taught, practiced, and supported.

    Today’s students face a mounting crisis. Many are still grappling with anxiety, disconnection, and emotional strain following the isolation and disruption of the COVID pandemic. And it’s growing more serious.

    Teachers aren’t immune. They, too, are managing stress and emotional overload–while shouldering scripted curricula, rising expectations, and fewer opportunities for meaningful engagement and critical thinking. As these forces collide, disruptive behavior is now the leading cause of job-related stress and a top reason why 78 percent of teachers have considered leaving the profession.

    Further complicating matters are social media and device usage. Students and adults alike have become deeply reliant on screens. Social media and online socialization–where interactions are often anonymous and less accountable–have contributed to a breakdown in conflict resolution, empathy, and recognition of nonverbal cues. Widespread attachment to cell phones has significantly disrupted students’ ability to regulate emotions and engage in healthy, face-to-face interactions. Teachers, too, are frequently on their phones, modeling device-dependent behaviors that can shape classroom dynamics.

    It’s clear: students can’t be expected to know what they haven’t been taught. And teachers can’t teach behavior without real tools and support. While districts have taken well-intentioned steps to help teachers address behavior, many initiatives rely on one-off training without cohesive, long-term strategies. Real progress demands more–a districtwide commitment to consistent, caring practices that unify educators, students, and families.

    A holistic framework: School, student, family

    Lasting change requires a whole-child, whole-school, whole-family approach. When everyone in the community is aligned, behavior shifts from a discipline issue to a core component of learning, transforming classrooms into safe, supportive environments where students thrive and teachers rediscover joy in their work. And when these practices are reinforced at home, the impact multiplies.

    To help students learn appropriate behavior, teachers need practical tools rather than abstract theories. Professional development, tiered supports, targeted interventions, and strategies to build student confidence are critical. So is measuring impact to ensure efforts evolve and endure.

    Some districts are leading the way, embracing data-driven practices, evidence-based strategies, and accessible digital resources. And the results speak for themselves. Here are two examples of successful implementations.

    Evidence-based behavior training and mentorship yields 24 percent drop in infractions within weeks

    With more than 19,000 racially diverse students across 24 schools east of Atlanta, Newton County Schools prioritized embedded practices and collaborative coaching over rigid compliance. Newly hired teachers received stipends to complete curated, interactive behavior training before the school year began. They then expanded on these lessons during orientation with district staff, deepening their understanding.

    Once the school year started, each new teacher was partnered with a mentor who provided behavior and academic guidance, along with regular classroom feedback. District climate specialists also offered further support to all teachers to build robust professional learning communities.

    The impact was almost immediate. Within the first two weeks of school, disciplinary infractions fell by 24 percent compared to the previous year–evidence that providing the right tools, complemented by layered support and practical coaching, can yield swift, sustainable results.

    Pairing shoulder coaching with real-time data to strengthen teacher readiness

    With more than 300,000 students in over 5,300 schools spanning urban to rural communities, Clark County School District in Las Vegas is one of the largest and most diverse in the nation.

    Recognizing that many day-to-day challenges faced by new teachers aren’t fully addressed in college training, the district introduced “shoulder coaching.” This mentorship model pairs incoming teachers with seasoned colleagues for real-time guidance on implementing successful strategies from day one.

    This hands-on approach incorporates videos, structured learning sessions, and continuous data collection, creating a dynamic feedback loop that helps teachers navigate classroom challenges proactively. Rather than relying solely on reactive discipline, educators are equipped with adaptable strategies that reflect lived classroom realities. The district also uses real-time data and teacher input to evolve its behavior support model, ensuring educators are not only trained, but truly prepared.

    By aligning lessons with the school performance plan, Clark County School District was able to decrease suspensions by 11 percent and discretionary exclusions by 17 percent.  

    Starting a new chapter in the classroom

    Behavior isn’t a side lesson–it’s foundational to learning. When we move beyond discipline and make behavior a part of daily instruction, the ripple effects are profound. Classrooms become more conducive to learning. Students and families develop life-long tools. And teachers are happier in their jobs, reducing the churn that has grown post-pandemic.

    The evidence is clear. School districts that invest in proactive, strategic behavior supports are building the kind of environments where students flourish and educators choose to stay. The next chapter in education depends on making behavior essential. Let’s teach it with the same care and intentionality we bring to every other subject–and give every learner the chance to succeed.


    Source link

  • What’s Next for Concussion and CTE Research?

    What’s Next for Concussion and CTE Research?

    The Higher Education Inquirer is calling on both the National Collegiate Athletic Association (NCAA) and the U.S. Department of Defense (DoD) to explain the suspension of the Concussion Assessment, Research and Education (CARE) Consortium, the largest concussion study in U.S. history. Since 2014, CARE has sought to illuminate the effects of concussion and repetitive head impact exposure (HIE) on student-athletes and military service members.

    A Decade of Groundbreaking Work

    Funded through an initial $30 million “Grand Alliance,” CARE enrolled more than 53,000 athletes and cadets and tracked over 5,500 diagnosed concussions across more than two dozen universities and four service academies. Its successive phases—CARE 1.0 (acute effects), CARE 2.0 (cumulative impacts), and CARE-SALTOS Integrated (long-term outcomes)—provided unprecedented insights into how concussions affect recovery, cognition, mood, sleep, and overall well-being.

    The CARE study generated more than 90 peer-reviewed publications, influencing safety protocols, athletic training practices, and public health debates in both NCAA settings and the U.S. military.

    CTE and the Need for Decades-Long Research

    The suspension comes at a critical moment. Concerns about chronic traumatic encephalopathy (CTE)—a degenerative brain disease linked to repetitive head trauma—are rising. Because CTE’s symptoms often surface decades after injuries, researchers emphasize that only long-term, continuous studies can reveal who develops CTE and why.

    Pausing or dismantling CARE risks losing continuity in precisely the kind of data needed to connect the dots between adolescent or collegiate injuries and late-life neurodegenerative conditions.

    Collateral Damage: Workers Left Behind

    The disruption of CARE has already produced casualties beyond lost data. At the University of Michigan, one of the leading CARE sites, about two dozen research workers were abruptly laid off. Without union protections, they had little recourse. This underscores how fragile large research consortia can be—dependent not only on grants and institutional goodwill, but also on a workforce often treated as disposable.

    These layoffs raise troubling questions: If the workers who made CARE possible are discarded without warning, what does that say about the broader commitment to athlete and cadet safety?

    Outstanding Questions for NCAA and DoD

    The Higher Education Inquirer is pressing for answers:

    • Why was CARE suspended? Was this due to funding shortfalls, shifting priorities, or political pressure?

    • Will existing data remain accessible? The CARE Consortium has been a vital contributor to the Federal Interagency Traumatic Brain Injury Research (FITBIR) database.

    • What about the workforce? Why were employees terminated without protections, and what obligations do the NCAA, DoD, and participating universities have to them?

    • What is the long-term plan for concussion research? Without decades-long studies, the risks of CTE and other late-life conditions will remain poorly understood.

    Big Loss for Athletes

    If CARE is permanently suspended, the consequences will extend far beyond academia. Athletes and cadets will lose a vital source of protection, science will lose irreplaceable data, and workers will continue to bear the costs of institutional indifference.

    The Higher Education Inquirer urges the NCAA and DoD to clarify CARE’s future and recommit to the kind of decades-long research that brain science demands. Anything less is a betrayal—to athletes, to service members, and to the very workers who made this research possible.


    Sources

    • NCAA. NCAA-DOD Grand Alliance: CARE Consortium. ncaa.org

    • CARE Consortium. About the Consortium. careconsortium.net

    • NCAA. NCAA and Department of Defense expand concussion study with $22.5 million. (October 31, 2018). ncaa.org

    • U.S. Army Medical Research and Development Command. Research Supporting a Lifetime of Brain Injury. mrdc.health.mil

    • NIH. Concussion Assessment, Research, and Education Consortium (CARE) Study Data. ncbi.nlm.nih.gov

    Source link

  • Job Descriptions – Research Professionals

    Job Descriptions – Research Professionals

    Job Description Index

    Research Professionals

    Developed with the help of volunteer leaders and member institutions across the country, the Job Descriptions Index provides access to sample job descriptions for positions unique to higher education.

    Descriptions housed within the index are aligned with the annual survey data collected by the CUPA-HR research team. To aid in the completion of IPEDS and other reporting, all position descriptions are accompanied by a crosswalk section like the one below.

    Crosswalk Example

    Position Number: The CUPA-HR position number
    BLS SOC#: Bureau of Labor Statistics occupation classification code
    BLS Standard Occupational Code (SOC) Category Name: Bureau of Labor Statistics occupation category title
    US Census Code#: U.S. Census occupation classification code
    VETS-4212 Category: EEO-1 job category title used on VETS-4212 form

    Note: SOC codes are provided as suggestions only. Variations in the specific functions of a position may cause the position to better align with an alternate SOC code.

    Sample Job Descriptions

    Instructional Lab Manager

    Medical Sciences, Research Assistant

    Medical Sciences, Research Associate

    Physical Sciences, Research Associate


    Source link