Tag: Data

  • AI Runs on Data — And Higher Ed Is Running on Empty


    Let’s cut to it: Higher ed is sprinting toward the AI revolution with its shoelaces untied.

    Presidents are in boardrooms making bold declarations. Provosts are throwing out buzzwords like “machine learning” and “predictive modeling.” Enrollment and marketing teams are eager to automate personalization, deploy chatbots, and rewrite campaigns using tools like ChatGPT.

    The energy is real. The urgency is understandable. But there’s an uncomfortable truth institutions need to face: You’re not ready.

    Not because you’re not visionary. Not because your teams aren’t capable. But because your data is a disaster.

    AI is not an easy button

    Somewhere along the way, higher ed started treating AI like a miracle shortcut — a shiny object that could revolutionize enrollment, retention, and student services overnight.

    But AI isn’t a magic wand. It’s more like a magnifying glass, exposing what’s underneath.

    If your systems are fragmented, your records are outdated, and your departments are still hoarding spreadsheets like it’s 1999, AI will only scale the chaos. It won’t save you – it’ll just amplify your problems.

    When AI goes sideways

    Take the California State University system. They announced their ambition to become the nation’s first AI-powered public university system. But after the headlines faded, faculty across the system were left with more questions than answers. Where was the strategy? Who was in charge? What’s the plan?

    The disconnect between vision and infrastructure was glaring.

    Elsewhere, institutions have already bolted AI tools onto outdated systems, without first doing the foundational work. The result? Predictive models that misidentify which students are at risk. Dashboards that contradict themselves. Chatbots that confuse students more than they support them.

    This isn’t an AI failure. It’s a data hygiene failure.

    You don’t need hype — You need hygiene

    Before your institution invests another dollar in AI, ask these real questions:

    • Do we trust the accuracy of our enrollment, academic, and financial data?
    • Are we still manually wrangling CSVs each month just to build reports?
    • Do our systems speak the same language, or are they siloed and outdated?
    • Is our data governance robust enough to ensure privacy, security, and usefulness?
    • Have we invested in the unglamorous but essential work (e.g., integration pipelines, metadata management, and cross-functional alignment)?

    If the answer is “not yet,” then congratulations — you’ve found your starting point. That’s your AI strategy.

    Because institutions that are succeeding with AI, like Ivy Tech Community College, didn’t chase the trend. They built the infrastructure. They did the work. They cleaned up first.
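What does that clean-up look like at the nuts-and-bolts level? As a deliberately simplified illustration of the questions above – a sketch with hypothetical file and column names, not a prescription – even a short script can surface the duplicate records, missing fields, and system mismatches that quietly undermine an AI initiative:

```python
# Minimal, illustrative data-hygiene check. File and column names are hypothetical.
import pandas as pd

# Two exports that, in a fragmented institution, often disagree.
sis = pd.read_csv("sis_enrollments.csv")   # student information system export
crm = pd.read_csv("crm_contacts.csv")      # recruitment/CRM export

issues = {}

# 1. Duplicate student records within a single system.
issues["duplicate_sis_ids"] = int(sis["student_id"].duplicated().sum())

# 2. Missing values in fields any predictive model would depend on.
issues["missing_enrollment_term"] = int(sis["enrollment_term"].isna().sum())

# 3. Students one system knows about that the other does not.
issues["in_crm_not_sis"] = len(set(crm["student_id"]) - set(sis["student_id"]))
issues["in_sis_not_crm"] = len(set(sis["student_id"]) - set(crm["student_id"]))

for check, count in issues.items():
    print(f"{check}: {count}")
```

If a check this basic turns up surprises, that is exactly the starting point described above.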

    What true AI readiness looks like (a not-so-subtle sales pitch)

    Let’s be honest: there’s no shortage of vendors selling the AI dream right now. Slick demos, lofty promises, flashy outcomes. But most of them are missing the part that actually matters — a real, proven plan to get from vision to execution.

    This is where Collegis is different. We don’t just sell transformation. We deliver it. Our approach is grounded in decades of experience, built for higher ed, and designed to scale.

    Here’s how we help institutions clean up the mess and build a foundation that makes AI actually work:

    Connected Core®: Your data’s new best friend

    Our proprietary Connected Core solution connects systems, eliminates silos, and creates a single source of truth. It’s the backbone of innovation — powering everything from recruitment to reporting with real-time, reliable data.

    Strategy + AI alignment: Tech that knows where it’s going

    We don’t just implement tools. We align technology to your mission, operational goals, and student success strategy. And we help you implement AI ethically, with governance frameworks that prioritize transparency and accountability.

    Analytics that drive action

    We transform raw data into real insights. From integration and warehousing to dashboards and predictive models, we help institutions interpret what’s really happening — and act on it with confidence.

    Smarter resource utilization

    We help you reimagine how your institution operates. By identifying inefficiencies and eliminating redundancies, we create more agile, collaborative workflows that maximize impact across departments.

    Boosted conversion and retention

    Our solutions enable personalized student engagement, supporting the full lifecycle from inquiry to graduation. That means better conversion rates, stronger persistence, and improved outcomes.

    AI wins when the infrastructure works

    Clean data isn’t a project — it’s a prerequisite. It’s the thing that makes AI more than a buzzword. More than a dashboard. It’s what turns hype into help.

    And when you get it right, the impact is transformational.

    “The level of data mastery and internal talent at Collegis is some of the best-in-class we’ve seen in the EdTech market. When you pair that with Google Cloud’s cutting-edge AI innovation and application development, you get a partnership that can enable transformation not only at the institutional level but within the higher education category at large.”

    — Brad Hoffman, Director, State & Local Government and Higher Education, Google

    There are no shortcuts to smart AI

    AI can only be as effective as the foundation it’s built on. Until your systems are aligned and your data is trustworthy, you’re not ready to scale innovation.

    If you want AI to work for your institution — really work — it starts with getting your data house in order. Let’s build something that lasts. Something that works. Something that’s ready.

    Curious what that looks like? Let’s talk. We’ll help you map out a real, achievable foundation for AI in higher ed.

    You stuck with me to the end? I like you already! Let’s keep the momentum going. If your wheels are turning and you’re wondering where to start, our Napkin Sketch session might be the perfect next step. It’s a fast, collaborative way to map out your biggest data and tech challenges—no pressure, no sales pitch, just a conversation. Check it out!



  • Data shows growing GenAI adoption in K-12


    Key points:

    • K-12 GenAI adoption rates have grown–but so have concerns 
    • A new era for teachers as AI disrupts instruction
    • With AI coaching, a math platform helps students tackle tough concepts
    • For more news on GenAI, visit eSN’s AI in Education hub

More than half of K-12 educators (55 percent) have positive perceptions of GenAI, despite concerns and perceived risks around its adoption, according to updated data from Cengage Group’s “AI in Education” research series, which regularly evaluates AI’s impact on education.



  • Sexual misconduct data is coming – here’s what universities should do to prepare


    In 2024, the Office for Students (OfS) launched a pilot survey asking UK students about sexual misconduct during their time in higher education.

    For the first time, there is now a national attempt to capture data on how widespread such incidents are, and how effectively students are supported when they come forward.

    The release of the survey’s results will be a moment that reflects a growing reckoning within the sector: one in which the old tools and quiet handling of disclosures are no longer fit for purpose, and the need for culture change is undeniable.

    This new initiative – known as the Sexual Misconduct Survey (SMS) – ran as a supplement to the National Student Survey (NSS), which since 2005 has become a familiar, if evolving, feature of the higher education calendar.

    While the NSS focuses on broad measures of the student experience, the SMS attempts to delve into one of its most difficult and often under-reported aspects – sexual harassment, violence, and misconduct.

    Its arrival comes against the backdrop of high-profile criticisms of university handling of disclosures, including the misuse of non-disclosure agreements (NDAs), and a new OfS regulatory condition (E6) requiring institutions to take meaningful steps to tackle harassment.

    Understanding the SMS

    The Sexual Misconduct Survey collects both qualitative and quantitative data on students’ experiences. It examines the prevalence of misconduct, the extent to which students are aware of reporting mechanisms, and whether they feel able to use them. Its core aim is clear – to ensure students’ experiences are not just heard, but systematically understood.

    Previous, disparate studies — many led by the National Union of Students and grassroots campaigners — have long indicated that sexual misconduct in higher education is significantly under-reported. This is especially true for marginalised groups, including LGBTQ+ students, Black and disabled students, and students engaged in sex work. The SMS marks an attempt to reach further, with standardised questions asked at scale, across providers.

    Despite its intention, the SMS is not without issues. A key concern raised by student support professionals is the opt-out design. Students were automatically enrolled in the survey unless they actively declined – a move which risks retraumatising victim-survivors who may not have realised the nature of the questions until too late.

    Timing has also drawn criticism. Coming immediately after the exhaustive NSS — with its 26 questions and optional free-text fields — the SMS may suffer from survey fatigue, especially during an already intense period in the academic calendar. Low response rates could undermine the richness or representativeness of the data gathered.

    There are also complex ethical questions about the language used in the survey. In striving for clarity and precision, the SMS employs explicitly descriptive terminology. This can potentially open up difficult experiences unrelated to higher education itself, including childhood abuse or incidents beyond university campuses. Anonymous surveys, by nature, can surface trauma but cannot respond to it — and without parallel safeguarding or signposting mechanisms, the risk of harm increases.

    Lastly, the handling of disclosures matters. While survey responses are anonymous, students need to trust that institutions — and regulators — will treat the findings with sensitivity and respect. Transparency about how data will be used, how institutions will be supported to act on it, and how students will see change as a result is essential to building that trust.

    What to do next?

    The data from the pilot survey will be shared with institutions where response rates and anonymity thresholds allow. But even before the results arrive, universities have an opportunity — and arguably a duty — to prepare.

    Universities should start by preparing leadership and staff to anticipate that the results may reveal patterns or prevalence of sexual misconduct that are difficult to read or acknowledge. Institutional leaders must ensure they are ready to respond with compassion and commitment, not defensiveness or denial.

    Universities should be prepared to review support systems and communication now. Are reporting tools easy to find, accessible, and trauma-informed? Is the student community confident that disclosures will be taken seriously? These questions are important and there is potential for the survey to act as a prompt to review what is already in place as well as what might need urgent attention.

    Universities should also engage students meaningfully. Institutions must commit to involving students — especially survivor advocates and representative bodies — in analysing findings and shaping the response. The worst outcome would be seeing the SMS as a tick-box exercise. The best would be for it to spark co-produced action plans.

When the data is released, institutions should avoid the urge to benchmark or downplay. Instead, they should be ready to own the story the data tells and act on the issues it raises. A lower prevalence rate does not necessarily mean a safer campus; it may reflect barriers to disclosure or fear of speaking out. Each result will be different, and a patchwork of responses is no bad thing.

    Finally, it is important to look beyond the numbers and see the person. Qualitative insights from the SMS will be just as important as the statistics. Stories of why students did not report, or how they were treated when they did, offer vital direction for reform and should be something which university leaders and policy makers take time to think about.

    This is only the first year of the SMS, and it is not yet clear whether it will become a permanent feature alongside the NSS. That said, whether the pilot continues or evolves into something new, the challenge it presents is real and overdue.

    The sector cannot afford to wait passively for data. If the SMS is to be more than a compliance exercise, it must be the beginning of a broader culture shift – one that faces up to what students have long known, listens without defensiveness, and builds environments where safety, dignity, and justice are non-negotiable.

    Lasting change will not come from surveys alone. Asking the right questions — and acting with purpose on the answers — is a critical start.


  • Data breach affects 10,000 Western Sydney University students – Campus Review


Students from Western Sydney University (WSU) have had their data accessed, and likely posted to the dark web, in a data breach.



  • Effective academic support requires good data transparency


    Academic support for students is an essential component of their academic success. At a time when resources are stretched, it is critical that academic support structures operate as a well-oiled machine, where each component has a clearly defined purpose and operates effectively as a whole.

    We previously discussed how personal and pastoral tutoring, provided by academic staff, needs to be supplemented by specialist academic support. A natural next step is to consider what that specialist support could look like.

    A nested model

We’ve identified four core facets of effective academic support: personal tutoring (advising, coaching, mentoring and so on), the development of academic skills, the development of graduate competencies, and the relevant student engagement data that supports all three. The nested model below displays this framework.

    We also suggest two prerequisites to the provision of academic support.

Firstly, a student must have access to information about what academic support entails and how to access it. Secondly, a student’s wellbeing must be such that they can physically, mentally, emotionally and financially engage with their studies, including academic support opportunities.

    Figure 1: Academic support aspects within a student success nested model

    Focusing on academic support

    Personal tutoring has a central role to play within the curriculum and within academic provision more broadly in enabling student success.

    That said, “academic support” comprises much more than a personal tutoring system where students go for generic advice and support.

    Rather, academic support is an interconnected system with multiple moving parts tailored within each institution and comprising different academic, professional and third-space stakeholders.

Yet academics remain fundamental to the provision of academic support, given their subject matter expertise, industry knowledge and their proximity to students. This is why academics are traditionally personal tutors, and historically this is where the academic support model would have ended. Changes in student needs mean that personal tutoring has increasingly needed to be complemented by other forms of academic support.

    Skills and competencies

    Academic skills practitioners can offer rich insights in terms of how best to shape and deliver academic support.

    A broad conception of academic skills that is inclusive of academic literacies, maths, numeracy and stats, study skills, research and information literacy and digital literacy is a key aspect of student academic success. Student acquisition of these skills is complemented by integrated and purposeful involvement of academic skills practitioners across curriculum design, delivery and evaluation.

    Given regulatory focus on graduate outcomes, universities are increasingly expected to ensure that academic support prepares students for graduate-level employability or further study upon graduation. Much like academic skills practitioners, this emphasises the need to include careers and employability consultants in the design and delivery of integrated academic support aligned to the development of both transferable and subject-specific graduate competencies.

    Engaging data

    Data on how students are participating in their learning provides key insights for personal tutors, academic skills practitioners and colleagues working to support the development of graduate competencies.

    Platforms such as StREAM by Kortext enable a data-informed approach to working with students to optimise the provision of academic support. This holistic approach to the sharing of data alongside actionable insights further enables successful transition between support teams.

Knowing where the support need is situated means that these limited human and financial resources can be directed to where support is most required – whether delivered on an individual or cohort basis. Moreover, targeted provision can be concentrated at relevant points over the academic year. Using engagement data in this way contributes to efficiency drives by balancing the provision of information and guidance to all students with targeted support for those who need it most – an approach the evidence suggests is both required and likely to prove effective.
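To make the targeting idea concrete, here is a minimal, hypothetical sketch – not StREAM’s actual interface or model – assuming an institution can export weekly engagement scores per student. It simply flags students whose recent engagement sits well below their cohort average, the kind of signal a personal tutor or academic skills practitioner might then follow up on.

```python
# Illustrative only: flag students whose recent engagement sits well below their cohort.
# Assumes a hypothetical export with columns: student_id, cohort, week, engagement_score.
import pandas as pd

engagement = pd.read_csv("weekly_engagement.csv")

# Average each student's engagement over their last four recorded weeks.
recent = (
    engagement.sort_values("week")
    .groupby(["cohort", "student_id"])["engagement_score"]
    .apply(lambda s: s.tail(4).mean())
    .reset_index(name="recent_avg")
)

# Flag students more than one standard deviation below their cohort mean.
cohort_stats = recent.groupby("cohort")["recent_avg"].agg(["mean", "std"]).reset_index()
flagged = recent.merge(cohort_stats, on="cohort")
flagged = flagged[flagged["recent_avg"] < flagged["mean"] - flagged["std"]]

print(flagged[["cohort", "student_id", "recent_avg"]])
```

In practice the engagement measures, thresholds and follow-up routes would be set by the institution and the platform it uses – the point is simply that a shared, structured view of engagement makes this kind of targeting routine rather than ad hoc.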

    Academic support is increasingly complicated in terms of how different aspects overlap and interplay within a university’s student success ecosystem. Therefore, when adopting a systems-thinking approach to the design and delivery of academic support, universities must engage key stakeholders, primarily students, academic skills practitioners and personal tutors themselves.

    A priority should be ensuring varied roles of academic support providers are clearly defined both individually and in relation to each other.

    Similarly, facilitating the sharing of data at the individual student level about the provision of academic support should be prioritised to ensure that communication loops are closed and no students fall between service gaps.

    Given that academic support is evolving, we would welcome readers’ views of what additional aspects of academic support are necessary to student success.

    To find out more about how StREAM by Kortext can enable data-informed academic support at your institution, why not arrange a StREAM demonstration.


  • HESA has published full student data for 2023–24


The 2023–24 HESA student data release was delayed by three months this year, rather than six.

We’re clearly getting better at dealing with the output (and the associated errors) of data collected through the HESA Data Platform. While there are not as many identified issues as in last year’s cycle, the list is long and occasionally unnerving.

Some represent a winding back of last year’s errors (we found 665 distance learning students at the University of Buckingham that should have been there last year), some are surprisingly big (the University of East London should have marked an extra 2,550 students as active), and some are just perplexing: the University of South Wales apparently doesn’t know where 1,570 students came from, and the University of Portsmouth doesn’t actually have 470 students from the tiny Caribbean island of St Barthélemy.

    Access and participation

It is surprising how little access and participation data, in the traditional sense, HESA now publishes. We get an overview at sector level (the steady growth of undergraduate enrolments from the most deprived parts of England is notable and welcome), but there is nothing at provider level.


One useful proxy for non-traditional students is entry qualifications. We get these by subject area – we learn that in 2025 a quarter of full-time undergraduate business students do not have any formal qualifications at all.


With the UK’s four regulators far from agreement as to what should be monitored to ensure that participation reflects potential rather than privilege, it’s not really worth HESA publishing a separate set of UK-wide statistics. The closest we get is SEISA, which now has official statistics status. I look forward to seeing SEISA applied to UK-domiciled students at individual providers, and published by HESA.

    Student characteristics

We get disability data by subject area (at a very general “marker” level – known disability or not), which I have plotted by year. The axis here is the proportion of students with a known disability – the colours show the total number of students. For me the story here is that creative subjects appear to attract students who disclose disabilities – art and creative writing in particular.
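For readers who want to reproduce something similar from the published tables, a rough sketch of that encoding might look like the following – the file and column names are hypothetical stand-ins for however you have tidied the HESA figures.

```python
# Sketch of a similar chart from a hypothetical tidy table with columns:
# subject, year, students_total, students_known_disability.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("disability_by_subject.csv")
df["proportion_known_disability"] = (
    df["students_known_disability"] / df["students_total"]
)

fig, ax = plt.subplots(figsize=(9, 6))
points = ax.scatter(
    df["year"],
    df["proportion_known_disability"],
    c=df["students_total"],  # colour encodes the total number of students
    cmap="viridis",
)
fig.colorbar(points, ax=ax, label="Total students")
ax.set_xlabel("Academic year")
ax.set_ylabel("Proportion with a known disability")
ax.set_title("Known disability by subject area and year")
plt.show()
```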


I’ve also plotted ethnicity by provider, allowing you to see the ways in which the ethnic make-up of providers has changed over time.


    Student domicile (UK)

    UK higher education includes four regulated systems – one in each of the four nations. Although in the main students domiciled in a given nation study at providers based in that same nation, there is a small amount of cross-border recruitment.


    Notably nearly three in ten Welsh students study at English providers, including more than a third of Welsh postgraduates. And two in ten Northern Irish students study in England. The concern is that if you move to study, you are less likely to move home – so governments in Wales and Northern Ireland (where there are student number controls) will be thinking carefully about the attractiveness and capacity of their respective systems to avoid a “brain drain”.

Within these trends, some providers have proven particularly effective at cross-border recruitment. If you are a Northern Irish student studying in England, chances are you are at Liverpool John Moores University, Liverpool, or Northumbria. If you are Welsh and studying in England, your destination may well be UWE, Bristol, Chester – or again, Liverpool John Moores.

There is a proximity effect – where students cross the border, they are likely to stay close to it – but also (if we look at Northern Ireland-domiciled students studying at Glasgow or Newcastle) evidence of wider historic cultural links.


    Student domicile (international)

    Thinking about providers recruiting from Northern Ireland made me wonder about students from the Republic of Ireland – do we see similar links? As you might expect, the two larger providers in Northern Ireland recruit a significant share, but other winners include the University of Edinburgh and St Margaret’s. UCL has the biggest population among English providers.

    You can use this chart to look at where students from any country in the world end up when they study in the UK (I do insist you look at those St Barthélemy students – literally all at the University of Portsmouth apparently).


    An alternate view lets you look at the international population of your institution – the established pattern (China in the Russell Group, India elsewhere) still holds up.


What’s of interest to nervous institutional managers is the way international recruitment is changing over time. This is a more complicated dashboard that helps you see trends at individual providers for a given country, alongside how your recruitment sits within the sector (mouse over an institution to activate the time series at the bottom).



  • What the End of DoED Means for the EdTech Industry


The federal government’s influence over school districts had implications beyond just funding and data. Eliminating the Office of Educational Technology (OET) will create significant gaps in educational technology research, validation, and equity assurance. Kris Astle, Education Strategist for SMART Technologies, discusses how industry self-governance, third-party organizations, and increased vendor responsibility might fill these gaps, while emphasizing the importance of research-backed design and implementation to ensure effective technology deployment in classrooms nationwide. Have a listen:




Losing homeschool data


    The Trump administration says one of its primary goals in education is to expand school choice and put power back in the hands of parents. Yet it has killed the main way to track one of the most rapidly growing options — learning at home. 

The Education Department began counting the number of homeschooled children in 1999, when fewer than 2 percent of students were educated this way. Homeschooling rose by 50 percent in the first decade of the 2000s and then leveled off at around 3 percent.

    The most recent survey of families took place in 2023, and it would have been the first indication of the growth of homeschooling since the pandemic. The data collection was nearly finished and ready to be released to the public, but in February, Elon Musk’s Department of Government Efficiency (DOGE) terminated the contract for this data collection, which is part of the National Household Education Survey, along with 88 other education contracts. Then in March, the federal statisticians who oversee the data collection and could review the final figures were fired along with almost everyone else at the National Center for Education Statistics (NCES). As things stand now, this federal homeschool data is unlikely to ever be released. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    “Work on these data files has stopped and there are no current plans for that work to continue,” said a spokesman for the American Institutes for Research, a nonprofit research organization that had held the contract to collect and analyze the data before DOGE canceled it. 

    The loss of this data upset both avid supporters and watchdogs of school choice, particularly now that some states are expanding their Education Savings Account (ESA) programs to transfer public funds directly to families who homeschool their children. Angela Watson, a prominent Johns Hopkins University researcher who runs the Homeschool Research Lab, called it a “massive loss.” Robert Maranto, a professor in the department of education reform at the University of Arkansas, said that in the past, the federal statistics have helped “dispel some of the myths” that homeschooling is “overwhelmingly white,” when, in fact, a more diverse population is learning this way. Maranto also serves as the editor of the Journal of School Choice. The most recent issue was devoted to homeschooling and about half the articles in it cited NCES reports, he said. 

    “There is a certain irony that a pro-school choice administration would cut objective data that might help increase acceptance of homeschooling,” said Maranto. 

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    It is unclear what will happen to the unreleased 2023 homeschooling data or if the Education Department will ever collect homeschool statistics in the future. 

    In response to questions about the fate of the homeschooling data, Education Department spokeswoman Madison Biedermann said that its research arm, the Institute of Education Sciences, is in possession of the data and that it is “reviewing how all its contractual activities can best be used to meet its statutory obligations.”

    Last September, the Education Department released some preliminary statistics from the 2023 survey. It noted a small increase in traditional homeschooling since 2019 but a large increase in the number of students who were enrolled in an online virtual school and learning from home full time. Together, more than 5 percent of U.S. students were learning at home in one of these two ways. Fewer than 4 percent were learning at home in 2019. 

    Source: National Center for Education Statistics, September 2024 media briefing slide.

    Researchers were keen to dig into the data to understand the different flavors of homeschooling, from online courses to microschools, which are tiny schools that often operate in private homes or places of worship. Researchers also want to understand why more parents are opting for homeschooling and which subjects they are directly teaching their children, all questions that are included in the parent survey conducted by the Education Department. 

    Related: Tracking Trump: His actions on education

    Tracking homeschooling is notoriously difficult. Families who choose this option can be distrustful of government, but this was one of the few surveys that homeschool advocates cited to document the growth in their numbers and they advised the writers of the federal survey on how to phrase questions. 

Beginning in 2020, the U.S. Census Bureau also began collecting some data on homeschooling, but those statistics cannot be directly compared with the Education Department data, and without a historical record the census data is less useful, researchers said. It is also unclear if this census data will continue. Some states collect data on homeschooling, but researchers said they do it in different ways, making it impossible to compare homeschooling across states.

    Patrick Wolf, a professor of education policy who studies school choice at the University of Arkansas, was also dismayed by the loss of the Education Department’s statistics. 

    “A federal government agency has been collecting national statistics on education since 1867,” he said. “State and local policy makers and practitioners will be severely challenged in doing their work if they don’t have good data from the feds regarding public schooling, private schooling, and homeschooling. Sending education authority to the states only will work well if the federal government continues to collect and publish comprehensive data on schooling. Otherwise, state and local officials are being asked to fly blind.”

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about homeschool statistics was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.



  • Professional services staff need equal recognition – visibility in sector data would be a good start


    Achieving recognition for the significant contribution of professional services staff is a collaborative, cross-sector effort.

    With HESA’s second consultation on higher education staff statistics welcoming responses until 3 April, AGCAS has come together with a wide range of membership bodies representing professional services staff across higher education to release a statement warmly welcoming HESA’s proposal to widen coverage of the higher education staff record to include technical staff and professional and operational staff.

    By creating a more complete staff record, HESA aim to deliver better understanding of the diverse workforce supporting the delivery of UK higher education. AGCAS, together with AHEP, AMOSSHE, ASET, CRAC-Vitae, NADP and UMHAN, welcome these proposals. We have taken this collaborative approach because we have a common goal of seeking wider recognition for the outstanding contributions and work of our members in professional services roles, and the impact they make on their institutions, regions, graduates and students.

    A matter of visibility

    Since the 2019–20 academic year, higher education providers in England and Northern Ireland have had the option to return data on non-academic staff to HESA. However, this has led to a lack of comprehensive visibility for many professional services staff. In the 2023–24 academic year, out of 228 providers only 125 opted to return data on all their non-academic staff – leaving 103 providers opting out.

    This gap in data collection has raised concerns about the recognition and visibility of these essential staff members – and has not gone unnoticed by professional services staff themselves. As one AGCAS member noted:

    Professional service staff have largely remained invisible when reporting on university staff numbers. Professional services provide critical elements of student experience and outcomes, and this needs to be recognised and reflected better in statutory reporting.

    This sentiment underscores the importance of the proposed changes by HESA, and the reason for our shared response.

    Who is and is not

    A further element of the consultation considers a move away from the term “non-academic” to better reflect the roles and contributions of these staff members and proposes to collect data on staff employment functions.

Again, we collectively strongly support these proposed changes, which have the potential to improve understanding and recognition of the wide range of staff working to deliver outstanding higher education across the UK. The term non-academic has long been contentious across higher education. While continuing to separate staff into role types may cause issues for those in the third space, shifting away from a term and approach that defines professional services staff by othering them is a welcome change.

    As we move forward, it is essential to continue fostering collaboration and mutual respect between academic and professional services staff. Challenging times across higher education can create or enhance partnership working between academic and professional services staff, in order to tackle shared difficulties, increase collaboration and form strategic alliances.

    A better environment

    By working in this way, we can create a more inclusive and supportive environment that recognises the diverse contributions of all staff members, ultimately enhancing outcomes for all higher education stakeholders, particularly students.

    Due to the nature of our memberships, our shared statement focuses on professional services staff in higher education – but we also welcome the clear focus on operational and technical staff from HESA, who again make vital contributions to their institutions.

    We all know that representation matters to our members, and the higher education staff that we collectively represent. HESA’s proposed changes could help to start a move towards fully and equitably recognising the vital work of professional services staff across higher education. By expanding data collection to include wider staff roles and moving away from the term “non-academic”, we can better understand and acknowledge the wide range of contributions that support the higher education sector.

    This is just the first step towards better representation and recognition, but it is an important one.


  • Strengthening data and insights into our changing university research landscape by Jessica Corner


The UK continues to be a global leader in research and innovation, and our universities are uniquely strong contributors, among them some of the highest performing in the world. We have some of the highest-intensity innovation ecosystems in the world, with universities as the core driver. As a country, our invention record is well recognised. The UK, with its powerful life sciences effort, delivered one of the first COVID-19 vaccines, saving millions of lives around the world – only possible because of long-standing investment in research that became serendipitously essential. In cities across the UK, universities act as pillar institutions with positive and reinforcing effects on their local economies. We have a rich network of specialist institutions that excel in music, the arts, medicine and life sciences. Our universities continue to deliver discoveries, technologies, creative insights, talent for our industries and public services, and so much more. Many have the scale and reach to deliver across the full span of research and innovation through to enterprise and commercialisation.

    A unique feature, and underpinning this extraordinary record, is our dual support funding system. That system balances competitive grant funding from UKRI Research Councils, charities, business, and others with long-term stable underpinning funds to enable universities to pursue ambitious and necessary strategies, develop research strengths, foster talent, pivot towards new fields, collaborate and maintain research infrastructure.

However, the sector faces unprecedented challenges. Erosion of the value of student fees and the growing costs of delivering education, disruptions to anticipated income from international student fees, a slow erosion of the value of QR, rising costs of research and a mismatch between these costs and cost recovery from grants have created a perfect storm and unsustainable operating models for most institutions. The additional £5bn a year in funding from universities’ own surpluses towards research and innovation is no longer guaranteed. The sector has evolved and continues to evolve in response to a changing landscape, but consideration is needed about how best to support the sector to change.

Research England’s role is to support a healthy, dynamic, diverse and inclusive research and innovation system in universities in England. We work by facilitating and incentivising system coherence, acting as both champion and challenger. In partnership we aim to create and sustain the conditions for the system to continue delivering excellence and leverage resources far beyond the funding provided by government. We are working to enhance the data and evidence that support our role as an expert, evidence-based funder and that capture the outcomes the funding delivers. In fulfilling this role, and against the current context, Research England has two initiatives that we will be taking forward in the coming weeks.

Our ongoing programme to review the principles underpinning our funding and the mechanisms by which we allocate research funds to institutions has reached a point where we are seeking to increase the visibility and transparency of how these funds are deployed by institutions. We are developing an approach, designed to be light touch and low burden, that asks universities to report back on their use of strategic institutional research funding. We will begin testing the approach with a selection of institutions in the coming months and, subject to the outcomes of this initial engagement, aim to roll out a pilot with institutions in the 2025/26 academic year. We will be communicating with institutions directly about the pilot in the early autumn. In the second phase of this work, we intend to work with institutions to develop a forward-looking strategic element that will give insight into plans and how decisions are made about the deployment of funding. As part of the programme, we are also reviewing the effectiveness of the different unhypothecated and ring-fenced research funds provided to institutions. When fully implemented, the information we acquire will give Research England greater visibility of the role of institutions and the contribution of our formula-based research funding (including QR) to the research and innovation system, while also contributing to efforts to produce more systematic and timely data.

A second strand of work is our programme to monitor the implications for the sustainability of research in universities against the current financial context. We are seeking to better understand how challenges are impacting universities’ ability to deliver research and innovation and maintain research capabilities, capacity and facilities and, in turn, to further strengthen assurance with more robust data. In partnership with the Department for Science, Innovation and Technology, we have commissioned the Innovation Research Caucus with OMB Research Ltd to undertake a survey into how institutions are responding to current pressures with respect to research and innovation. The survey will provide important data that can support advice to government and others on the extent of universities’ financial challenges, how these issues are being managed, and how this impacts their investment and planning in the research and innovation space. The approach is to provide insights that are not currently available at an aggregate level or in a timely way through national data sets. Additionally, Research England will be asking institutions to report on material changes they are making to research and innovation capabilities and capacity, or in relation to wider changes in institutional form or organisation, when these may affect the basis on which our funding is awarded.

    We continue to see our role as facilitator, enabler and partner and believe we have a strong reputation for having timely and robust insights into the conditions underpinning our great research and innovation system. These two programmes of work are being taken forward in support of universities and, against the current backdrop, will strengthen Research England’s fundamental role in the research and innovation system. We look forward to working in close partnership with universities as we take these critical work programmes forward.
