Tag: Data

  • Trump Orders Colleges to Supply Data on Race in Admissions

    President Donald Trump issued an executive action Thursday afternoon mandating that colleges and universities submit data to verify that they are not unlawfully considering race in admissions decisions.

    The order also requires the Department of Education to update the Integrated Postsecondary Education Data System to make its data more legible to students and parents and to “increase accuracy checks for data submitted by institutions through IPEDS,” penalizing them for late, incomplete or inaccurate data. 

    Opponents of race-conscious admissions have hailed the mandate as a victory for transparency in college admissions, but others in the sector have criticized its vague language and questioned who at the department is left to collect and analyze the data.

    “American students and taxpayers deserve confidence in the fairness and integrity of our Nation’s institutions of higher education, including confidence that they are recruiting and training capable future doctors, engineers, scientists, and other critical workers vital to the next generations of American prosperity,” the order reads. “Race-based admissions practices are not only unfair, but also threaten our national security and well-being.”

    It’s now up to the secretary of education, Linda McMahon, to determine what new admissions data institutions will be required to report. The administration’s demands of Columbia and Brown Universities in their negotiations to reinstate federal funding could indicate what the requirements will be. In its agreement with Brown, the government ordered the university to submit annual data “showing applicants, admitted students, and enrolled students broken down by race, color, grade point average, and performance on standardized tests.” Colleges will be expected to submit their admissions data for the 2025–26 academic year, according to the order.

    What resources are in place to enforce the new requirements remains to be seen. Earlier this year the administration gutted the Department of Education staff who historically collected and analyzed institutional data. Only three staff members remain in the National Center for Education Statistics, which operates IPEDS.

    ‘It’s Not Just as Easy as Collecting Data’

    Since taking office, the Trump administration has launched a crusade against diversity, equity and inclusion in higher education, often using the Supreme Court’s 2023 ruling against race-conscious admissions as a weapon in the attacks.

    Students for Fair Admissions, the anti–affirmative action advocacy group that was the plaintiff in the 2023 cases, called the action a “landmark step” toward transparency and accountability for students, parents and taxpayers.

    “For too long, American colleges and universities have hidden behind opaque admissions practices that often rely on racial preferences to shape their incoming classes,” Edward Blum, SFFA president and longtime opponent of race-conscious admissions, said in a press release.

    But college-equity advocates sounded the alarm, arguing that the order—which also claims that colleges have been using diversity and other “overt and hidden racial proxies” to continue race-conscious admissions post-SFFA—aims to intimidate colleges into recruiting fewer students of color.

    “I will say something that my members in the higher education community cannot say. What the Trump administration is really saying is that you will be punished if you do not admit enough white students to your institution,” Angel B. Pérez, CEO of the National Association for College Admission Counseling, told Inside Higher Ed.

    Like many of Trump’s other orders targeting DEI, this mandate relies on unclear terms and instructions. It does not define “racial proxies”—although a memo by the Department of Justice released last week provides examples—nor does it outline what data would prove an institution is or is not considering race in its admissions process.

    In an interview with Inside Higher Ed, Paul Schroeder, the executive director of the Council of Professional Associations on Federal Statistics, questioned the government’s capacity to carry out the president’s order.

    “Without NCES, who’s going to actually look at this data? Who’s going to understand this data? Are we going to have uniform reporting or is it going to be just a mess coming in from all these different colleges?” Schroeder said.

    “It’s not just as easy as collecting data. It’s not just asking a couple questions about the race and ethnicity of those who were admitted versus those who applied. It’s a lot of work. It’s a lot of hours. It’s not going to be fast.”

  • OfS Outcomes (B3) data, 2025

    The Office for Students’ release of data relating to Condition of Registration B3 is the centerpiece of the English regulator’s quality assurance approach.

    There’s information on three key indicators: continuation (broadly, the proportion of students who move from year one to year two), completion (pretty much the proportion who complete the course they sign up for), and progression (the proportion who end up in a “good” destination – generally high skilled employment or further study).

    Why B3 data is important

    The power comes from the ability to view these indicators for particular populations of students – everything from those studying a particular subject and those with a given personal characteristic, through to how a course is delivered. The thinking goes that this level of resolution allows OfS to home in on particular problems – for example a dodgy business school (or franchise delivery operation) in an otherwise reasonable quality provider.

    The theory goes that OfS uses these B3 indicators – along with other information such as notifications from the public, Reportable Event notifications from the provider itself, or (seemingly) comment pieces in the Telegraph – to decide when and where to intervene in the interests of students. Most interventions are informal, and are based around discussions between the provider and OfS about the identified problem and what is being done to address it. There have been some more formal investigations too.

    Of course, providers themselves will be using similar approaches to identify problems in their own provision – in larger universities this will be built into a sophisticated data-driven learner analytics approach, while some smaller providers will rely primarily on what is in this release (and this is partly why I take the time to build interactives that I feel are more approachable and readable than the OfS versions).

    Exploring B3 using Wonkhe’s interactive charts

    These charts are complicated because the data itself is complicated, so I’ll go into a bit of detail about how to work them. Let’s start with the sector as a whole:

    [Interactive chart: sector-level B3 indicators]

    First choose your indicator: continuation, completion, or progression.

    Mode (whether students are studying full time, part time, or on an apprenticeship) and level (whether students are undergraduate, postgraduate, and so on) are linked: there are more options for full- and part-time study (including first degree, taught postgraduate, and PhD) and fewer for apprenticeships (where you can see either all undergraduates or all postgraduates).

    The chart shows various splits of the student population in question – the round marks show the actual value of the indicator, the crosses show the current numeric threshold (which is what OfS has told us is the point below which it would start getting stuck in to regulating).

    Some of the splits are self-explanatory, others need a little unpacking. The Index of Multiple Deprivation (IMD) is a standard national measure of how socio-economically deprived a small area is – quintile 1 is the most deprived, quintile 5 is the least deprived. Associations Between Characteristics of Students (ABCs) is a proprietary measure developed by OfS which is a whole world of complexity: here all you need to know is that quintile 5 is most likely to have good outcomes on average, and quintile 1 is least likely.

    If you mouse over any of the marks you will get some more information: the year(s) of data involved in producing the indicator (by definition most of this data refers to a number of years ago and shouldn’t really be taken as an indication of a problem that is happening right now), and the proportion of the sample that is above or below the threshold. The denominator is simply the number of students involved in each split of the population.
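As a way of thinking about what the round marks and crosses encode, here is a minimal sketch in Python – all figures and the threshold value are made up for illustration, and this is not OfS’s actual methodology or data:

```python
# Hypothetical continuation figures by split: (continuers, denominator).
splits = {
    "IMD quintile 1": (780, 1000),
    "IMD quintile 5": (950, 1000),
}

THRESHOLD = 0.80  # assumed numeric threshold, for illustration only

# The indicator is simply the proportion; OfS flags values below the threshold.
rates = {name: c / n for name, (c, n) in splits.items()}
below = [name for name, r in rates.items() if r < THRESHOLD]

for name, r in rates.items():
    flag = " (below threshold)" if name in below else ""
    print(f"{name}: {r:.1%}, n={splits[name][1]}{flag}")
```

The denominator shown in the mouse-over corresponds to the second number in each tuple here – small denominators are why some splits are suppressed or should be read with caution.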

    There’s also a version of this chart that allows you to look at an individual provider: choose that via the drop down in the middle of the top row.

    [Interactive chart: provider-level B3 indicators]

    You’ll note you can select your population: taught or registered includes students taught by the provider and students who are registered with the provider but taught elsewhere (subcontracted out); taught only is just those students taught by the provider (so, no subcontractual stuff); partnership includes only students where teaching is contracted out or validated (the student is both registered and taught elsewhere, but the qualification is validated by this provider).

    On the chart itself, you’ll see a benchmark marked with an empty circle: this is what OfS has calculated (based on the characteristics of the students in question) the value of the indicator should be – the implication being that the difference from the benchmark is entirely the fault of the provider. In the mouse-over I’ve also added the proportion of students in the sample above and below the benchmark.
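One way to picture a characteristics-based benchmark is as a sector rate weighted by the provider’s own student mix – a deliberately simplified stand-in for OfS’s actual benchmarking methodology, with all numbers invented:

```python
# Made-up sector-average continuation rates by student group.
sector_rates = {"IMD quintile 1": 0.84, "IMD quintile 5": 0.93}

# This provider's (hypothetical) student mix across the same groups.
provider_mix = {"IMD quintile 1": 0.70, "IMD quintile 5": 0.30}

# The benchmark is what the sector would achieve with this mix of students.
benchmark = sum(provider_mix[g] * sector_rates[g] for g in provider_mix)

observed = 0.82  # the provider's actual indicator value (also made up)
print(f"benchmark {benchmark:.1%}, observed {observed:.1%}, "
      f"gap {observed - benchmark:+.1%}")
```

A provider heavy on groups with weaker average outcomes gets a lower benchmark, which is what lets the empty circle sit below (or above) the sector-wide threshold.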

    OfS take great pains to ensure that B3 measures can’t be seen as a league table, as this would make their quality assurance methodology look simplistic and context-free. Of course, I have built a league table anyway just to annoy them: the providers are sorted by the value of the indicator, with the other marks shown as above (note that not all options have a benchmark value). Here you can select a split indicator type (the group of characteristics you are interested in) and then the split indicator (specific characteristic) you want to explore using the menus in the middle of the top row – the two interact and you will need to set them both.

    You can find a provider of interest using the highlighter at the bottom, or just mouse over a mark of interest to get the details on the pop-up.

    [Interactive chart: provider ranking by indicator]

    With so much data going on there is bound to be something odd somewhere – I’ve tried to spot everything but if there’s something I’ve missed please let me know via an email or a comment. A couple of things you may stumble on – OfS has suppressed data relating to very small numbers of students, and if you ever see a “null” value for providers it refers to the averages for the sector as a whole.

    Yes, but does it regulate?

    It is still clear that white and Asian students have generally better outcomes than those from other ethnicities, that a disadvantaged background makes you less likely to do well in higher education, and that students who studied business are less likely to have a positive progression outcome than those who studied the performing arts.

    You might have seen The Times running with the idea that the government is contemplating restrictions on international student visas linked to the completion rates of international students. It’s not the best idea for a number of reasons, but should it be implemented a quick look at the ranking chart (domicile: non-UK) will let you know which providers would be at risk in that situation: for first degree it’s tending towards the MillionPlus end of things; for taught master’s provision we are looking at smaller non-traditional providers.

    Likewise, the signs are clear that a crackdown on poorly performing validated provision is incoming – using the ranking chart again (population type: partnership, splits: type of partnerships – only validated) shows us a few places that might have completion problems when it comes to first degree provision.

    If you are exploring these (and I bet you are!) you might note some surprisingly low denominator figures – surely there has been an explosion in this type of provision recently? This demonstrates the Achilles heel of the B3 data: completion data relates to pre-pandemic years (2016–2019), continuation to 2019–2022. Using four years of data to find an average is useful when provision isn’t changing much – but given the growth of validation arrangements in recent years, what we see here tells us next to nothing about the sector as it currently is.

    Almost to illustrate this point, the Office for Students today announced an investigation into the sub-contractual arrangement between Buckinghamshire New University and the London School of Science and Technology. You can examine these providers in B3 and if you look at the appropriate splits you can see plenty of others that might have a larger problem – but it is what is happening in 2025 that has an impact on current students.

  • More comprehensive EDI data makes for a clearer picture of staff social mobility

    Asking more granular EDI questions of their PGRs and staff should be a sector priority. It would enable universities to assess the diversity of their academic populations in the same manner they have done for their undergraduate bodies – but with the addition of a valuable socio-economic lens.

    It would equip us more effectively to answer basic questions regarding how far the diversity in our undergraduate community leads through to our PGT, PGR and academic populations, as well as see where ethnicity and gender intersect with socio-economic status and caring responsibilities to contribute to individuals falling out of (or choosing to leave) the “leaky” academic pipeline.

    One tool to achieve this is the Diversity and Inclusion Survey (DAISY), a creation of Equality, Diversity and Inclusion in Science and Health (EDIS) and the Wellcome Trust. This toolkit outlines how funders and universities can collect more detailed diversity monitoring data of their staff and PGRs as well as individuals involved in research projects.

    DAISY suggests questions regarding socio-economic background and caring responsibilities that nuance or expand upon those already in “equal opportunities”-type application forms that exist in the sector. DAISY asks, for example, whether one has children and/or adult dependents, and how many of each, rather than the usual “yes” or “no” to “do you have caring responsibilities?” Other questions include the occupation of your main household earner when aged 14 (with the option to pick from categories of job type), whether your parents attended university before you were 18, and whether you qualified for free school meals at the age of 14.
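The difference between the usual binary question and the more granular DAISY-style questions can be sketched as two record shapes – the field names below are illustrative, not DAISY’s actual schema, and all fields are optional because disclosure is voluntary:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BasicEDIRecord:
    """The usual yes/no question found on equal-opportunities forms."""
    has_caring_responsibilities: Optional[bool]


@dataclass
class GranularEDIRecord:
    """DAISY-style granularity (hypothetical field names)."""
    child_dependants: Optional[int]        # how many, rather than yes/no
    adult_dependants: Optional[int]
    main_earner_occupation_at_14: Optional[str]   # picked from job-type categories
    parents_attended_university: Optional[bool]
    free_school_meals_at_14: Optional[bool]


# A respondent with two children, no adult dependants, from a
# routine/semi-routine occupational background, first in family, FSM-eligible.
r = GranularEDIRecord(2, 0, "Routine or semi-routine manual", False, True)
```

The granular record supports intersectional analysis (for example, caring load crossed with occupational background) that the single yes/no field cannot.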

    EDI data journeys across the sector

    As part of an evolving data strategy, UCAS already collects several DAISY data points on its applicants, such as school type and eligibility for free school meals, with the latter gaining traction across the university sector and policy bodies as a meaningful indicator of disadvantage.

    Funders are interested in collecting more granular EDI data. The National Institute for Health and Care Research (NIHR), for example, invested around £800 million in the creation of Biomedical Research Centres in the early 2020s. The NIHR encouraged the collection of DAISY data specifically on both the researchers each centre would employ and the individuals who would take part in their research, in the belief (see theme four of their research inclusion strategy) that a diverse researcher workforce will make medical science more robust.

    The diversity monitoring templates attached to recent UKRI funding schemes similarly highlight the sector’s desire for more granular EDI data. UKRI’s Responsive Mode Scheme, for example, requires institutions to benchmark their applicants against a range of protected characteristics, including ethnicity, gender, and disability, set against the percentage of the “researcher population” at the institution holding those characteristics. The direction of travel in the sector is clear.

    What can universities do?

    Given the data journeys of UCAS and funding bodies, it is sensible and proportionate for universities to ask more granular EDI questions of their PGRs and their staff. Queen Mary began doing so for its staff and PGRs in October 2024, using the DAISY toolkit as a guide, alongside work to capture similar demographic data in the patient population involved in clinical trials supported by Queen Mary and Barts NHS Health Trust.

    While we have excellent diversity in our undergraduate community, we see less in our PGR and staff communities, and embedding more granular data collection into our central HR processes for staff and admissions processes for PGRs allows us to assess (eventually, at least, given adequate disclosure rates) how far the diversity in our undergraduate population leads through to our PGT, PGR and academic population.

    Embedding the collection of more granular EDI data into central HR and admissions systems required collaboration across Queen Mary’s Research Culture, EDI, and HR teams, creating new information forms and systems to collect the data while ensuring it could be linked to other datasets. The process was also accelerated by a clinical trials unit in our Faculty of Medicine & Dentistry, which had already piloted the collection of this data on a smaller scale, providing a proof of concept for our colleagues in HR.

    EDI data and the PGR pipeline

    Securing the cooperation of our HR and EDI colleagues was made easier thanks to our doctoral college, who had already incorporated the collection of more granular EDI data into an initiative aimed at increasing the representation of Black British students in our PGR community: the STRIDE programme.

    Standing for “Summer Training Research Initiative to Support Diversity and Equity”, STRIDE gives our BAME undergraduate students the opportunity to undertake an eight-week paid research project over the summer, alongside a weekly soft skills programme including presentation and leadership training. Although the programme has run annually since 2020 with excellent outcomes (almost 70 per cent of the first cohort successfully applied to funded research programmes), incorporating more granular EDI questions into the application form for the 2024 cohort of 425 applicants highlighted intersectional barriers to postgraduate study faced by our applicants that would have been obscured had we only collected basic EDI data.

    Among other insights, 47 per cent of applicants to STRIDE had been eligible at some point for free school meals. This contrasts with our broader undergraduate community, 22 per cent of whom were eligible for free school meals. Some 55 per cent of applicants reported that neither of their parents went to university, and 27 per cent reported that their parents had routine or semi-routine manual jobs. Asking questions beyond the usual suite of EDI questions allows us here to picture more clearly the socio-economic and cultural barriers that intersect with ethnicity to make entry into postgraduate study more difficult for members of underrepresented communities.

    The data chimed with internal research we conducted in 2021, where we discovered that many of the key barriers to our undergraduates engaging in postgraduate research were the same as those faced by students who were first in their family to go to university, namely a lack of family understanding of what a further degree involves and a lack of understanding regarding the financial benefits of completing a postgraduate research degree.

    Collecting more granular EDI data will allow us to understand and support diversity that is intersectional, while enabling more effective assessment of whether Queen Mary is moving in the right direction in terms of making research degrees (and research careers) accessible to traditionally underrepresented communities at our universities. But collecting such data on our STRIDE applicants makes little sense without equivalent data from our PGR and academic community – hence Queen Mary’s broader decision to embed DAISY data collection into its systems.

    The potential of DAISY

    As Queen Mary’s experience with STRIDE demonstrates, nuancing our collection of EDI data comes with clear potential. Given adequate disclosure rates, collecting more granular EDI data makes possible more effective intersectional analyses of our PGRs and staff, and helps us understand their social mobility with more nuance, leading to a clearer image of the journey that those from less privileged social backgrounds and/or those with caring responsibilities face across our sector.

    More broadly, universities will always be crucial catalysts of social mobility, and collecting more granular data on socio-economic background alongside the personal data they already collect – such as gender, ethnicity, religion and other protected characteristics – is a logical and necessary next step.

  • Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?

    The answer, in many cases, is no.

    The rise of shadow AI

    According to CoSN’s May 2025 State of EdTech District Leadership report, 43 percent of districts lack formal policies or guidance for AI use. With 80 percent of districts reporting generative AI initiatives underway, this policy gap is a major concern. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI highlights that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring both the broad adoption and the potential risks of generative AI.

    This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”

    Data protection: The first pillar under pressure

    The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. FERPA requires that if third-party vendors handle student data on behalf of the institution, they must comply with FERPA. This includes ensuring data is not used for unintended purposes or retained for AI training.

    Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.

    Data processing and the consent gap

    Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. This creates a scenario where data is being processed without explicit consent, potentially in violation of COPPA (Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. However, the onus remains on schools to understand and manage these consent implications, especially with the rule’s new amendments becoming effective June 21, 2025, which strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.

    Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.
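The quasi-identifier problem can be made concrete with a toy example – the records, field names, and school names below are entirely invented, but the mechanism is real:

```python
# Three hypothetical student records. No single field names a student,
# yet a combination of fields can narrow the pool to one person.
records = [
    {"grade": 7, "disability_status": "IEP",  "school": "North MS"},
    {"grade": 7, "disability_status": "none", "school": "North MS"},
    {"grade": 7, "disability_status": "IEP",  "school": "South MS"},
]


def matches(records, **attrs):
    """Return every record consistent with the given attribute values."""
    return [r for r in records if all(r[k] == v for k, v in attrs.items())]


# Grade alone is harmless: it matches every record.
everyone = matches(records, grade=7)

# But grade + disability status + school narrows it to a single student.
unique = matches(records, grade=7, disability_status="IEP", school="North MS")
print(len(everyone), len(unique))
```

The smaller the district, the fewer fields it takes to reach a match of one, which is why disability status or a writing sample pasted into a prompt can count as PII.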

    Cybersecurity risks multiply

    AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.

    Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.

    Building a responsible framework

    To mitigate these risks, school leaders need to:

    • Audit tool usage using platforms like Lightspeed Digital Insight to identify AI tools being accessed without approval. Districts should maintain a living inventory of all digital tools. Lightspeed Digital Insight, for example, is vetted by 1EdTech for data privacy.
    • Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
    • Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
    • Vet all third-party apps through standards like the 1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
    • Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.
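The first step – maintaining a living inventory and flagging unapproved tools – reduces to a simple set comparison. A minimal sketch, with hypothetical tool names and data (a real district would pull usage from its monitoring platform rather than a hard-coded list):

```python
# District's approved-tool inventory (hypothetical names).
approved = {"ReadingApp", "MathTutor"}

# Tools actually observed in use on the network (also hypothetical).
observed_tools = [
    {"name": "ReadingApp",  "users": 1200},
    {"name": "FreeEssayAI", "users": 85},   # unapproved "shadow AI" extension
    {"name": "MathTutor",   "users": 640},
]

# Anything in use but not on the approved list is shadow IT to investigate.
shadow = [t for t in observed_tools if t["name"] not in approved]

for t in shadow:
    print(f"Unapproved tool in use: {t['name']} ({t['users']} users)")
```

Running this kind of diff regularly, and routing the flagged tools into the vetting process described above, is what keeps the inventory “living” rather than a one-off audit.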

    Trust starts with transparency

    In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.

    AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.

  • Beyond the Latest Data from the National Student Clearinghouse

    EducationDynamics Transforms Insights into Action for Higher Ed Leaders

    The higher education landscape is in constant motion. To truly thrive, institutions committed to student success must not just keep pace but anticipate what’s next. The National Student Clearinghouse (NSC) recently released two crucial reports in June 2025—one on “some college, no credential” (SCNC) undergraduates and another on overall undergraduate student retention and persistence. These aren’t just statistics. They are the roadmap for strategic action.

    At EducationDynamics, we don’t merely react to these insights. We proactively integrate them into data-driven solutions that empower our partners to excel. Our deep understanding of the higher education market, sharpened by years of proprietary research, allows us to translate these macro trends into micro-level strategies that deliver tangible results for your institution.

    Strategic Implications from the NSC June Update

    The latest NSC findings highlight several critical areas demanding immediate attention from higher education leaders:

    Persistence and Retention Gaps

    While overall persistence is at 78% and retention at 70%, a significant disparity exists. Bachelor’s and certificate-seeking students show much higher rates than those pursuing associate degrees. Generalized support isn’t enough. Tailored academic and financial aid advising, particularly for associate-degree pathways, is essential to prevent attrition at critical junctures.

    The Part-Time Student Paradox

    Persistence and retention rates for part-time students are a staggering 30% lower than those of their full-time peers. Part-time learners often juggle work and family. Institutions must design flexible and accessible support systems, including asynchronous learning, evening/weekend advising, and re-evaluating traditional program structures.

    Sectoral Disparities

    For-profit institutions demonstrate significantly lower retention and persistence rates compared to not-for-profit counterparts. Regardless of sector, consistent and proactive communication focused on evolving student needs is crucial. This means dedicated engagement strategies, not just reactive responses.

    Equity in Outcomes

    White and Asian students continue to exhibit the highest persistence and retention rates. Achieving equitable outcomes demands meticulously analyzing data by affinity group, identifying specific barriers faced by underserved populations, and then designing targeted, culturally competent support programs.

    The Power of Re-Engagement

    The share of re-enrollees earning a credential in their first year has increased by nearly five percent, with students who have at least two full years of credits being most likely to re-enroll and persist. Notably, 36% re-enroll at the same school. Your “stopped out” student population is a goldmine for re-enrollment. Proactive, personalized outreach, highlighting clear paths to completion, is a win-win for both institutions seeking to boost enrollment and students aiming to achieve their academic aspirations.

    The Online Advantage

    In almost all cases, a plurality of re-enrolling students chose primarily online schools. Even if your institution isn’t primarily online, a robust and well-promoted suite of online program options is vital. Flexibility in format and delivery is critical to meet the diverse needs of today’s learners.

    Certificate Pathways as Catalysts

    Nearly half of re-enrolled SCNC students who earned a credential in their first year attained an undergraduate certificate. Expanding and actively promoting undergraduate certificate programs, especially those aligning with in-demand skills or acting as stepping stones to degrees, can significantly boost completion rates among the SCNC population.

    How EducationDynamics Turns Insights into Action for Our Partners

    Tailored Support for the Modern Learner

    We partner with institutions to develop AI-powered communication workflows and personalized engagement platforms that proactively address the specific needs of part-time, non-traditional, and diverse student populations. For instance, our work with one regional university saw a 15% increase in part-time student retention within two semesters by implementing automated check-ins and flexible advising scheduling based on our Engaging the Modern Learner report findings.

    Optimizing Re-Engagement Pipelines

    Our “Education Reengagement Report: Inspiring Reenrollment in Some College No Credential Students” anticipated the NSC’s findings on the SCNC population. We’ve since refined our “Lost Student Analysis” methodology, which identifies high-potential stopped-out students and crafts targeted re-enrollment campaigns. For a recent partner, this resulted in re-enrolling over 200 SCNC students in a single academic year, directly contributing to enrollment growth.

    Strategic Program Portfolio Development

    Understanding the demand for online and certificate options, we guide institutions in developing and promoting flexible program offerings. This includes comprehensive market research to identify in-demand certificate programs and optimizing their visibility through targeted marketing. Our expertise helps institutions strategically align their offerings with what NSC data shows students are seeking.

    Equity-Driven Enrollment & Retention

    We help institutions implement data segmentation and predictive analytics to identify students at risk of stopping out based on various demographic and academic factors. This enables early intervention and the allocation of resources to underserved groups, fostering a more equitable and supportive learning environment.
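    To make the idea concrete, here is a minimal, purely illustrative sketch of rules-based risk segmentation. The field names, thresholds and weights are hypothetical (a production approach would use a fitted predictive model on institutional data, not hand-picked rules), but the shape of the logic is the same: combine demographic and academic signals into a score that flags students for early intervention.

    ```python
    # Illustrative stop-out risk segmentation (hypothetical fields and thresholds).
    from dataclasses import dataclass

    @dataclass
    class StudentRecord:
        enrollment_status: str   # "full-time" or "part-time"
        gpa: float               # 0.0-4.0 scale
        credits_completed: int

    def risk_score(s: StudentRecord) -> int:
        """Count simple risk factors; a higher score means higher stop-out risk."""
        score = 0
        if s.enrollment_status == "part-time":
            score += 1           # part-time persistence runs well below full-time
        if s.gpa < 2.0:
            score += 1           # academic-standing risk
        if s.credits_completed < 24:
            score += 1           # early in program, before momentum builds
        return score

    def risk_tier(s: StudentRecord) -> str:
        """Map a raw score to an intervention tier for advising outreach."""
        return {0: "low", 1: "moderate"}.get(risk_score(s), "high")

    print(risk_tier(StudentRecord("part-time", 1.8, 12)))   # all three factors -> "high"
    print(risk_tier(StudentRecord("full-time", 3.5, 60)))   # no factors -> "low"
    ```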

    Proactive Market Intelligence

    Our partners gain an unparalleled advantage with early access to our market research reports and bespoke analyses. These reports, often preceding or complementing national findings like the NSC’s, provide actionable recommendations that allow institutions to adapt their strategies ahead of the curve, rather than playing catch-up.

    Your Partner in Data-Driven Student Success

    EducationDynamics is more than a service provider. We are a strategic partner dedicated to empowering higher education leaders with the insights and tools needed to navigate an evolving landscape and maximize student success. We combine cutting-edge market intelligence with proven strategies, transforming data into actionable plans that boost retention, drive re-enrollment and foster a truly student-centric institution.

    Source link

  • The Unignorable Data on AI in Higher Ed Marketing and Enrollment Management

    The Unignorable Data on AI in Higher Ed Marketing and Enrollment Management

    Just a few years ago, AI in higher education was largely a topic for innovation labs and speculative white papers. Today, it has moved from the periphery to the absolute core of institutional viability, particularly in the critical areas of marketing and enrollment management. Leaders who still view AI as a future investment, rather than an immediate operational imperative, risk being outmaneuvered by a competitive landscape that is already embracing this transformative power.

    The global AI software market is projected to hit an astounding $126 billion by the end of 2025. From healthcare to transportation, AI is now an integral part of daily operations, with a significant 78% of organizations reporting AI usage in 2024—a sharp increase from 55% in 2023. Generative AI specifically saw its usage in at least one business function jump from 33% in 2023 to a staggering 71% in 2024.

    The critical question is no longer if AI should be used, but how quickly institutions can integrate it to avoid not just falling behind but becoming irrelevant in a rapidly evolving landscape. The recent Marketing and Enrollment Management AI Readiness Report 2025, produced by UPCEA, the Online and Professional Education Association, and EducationDynamics, the only higher education agency building revenue and reputation that drives results, provides an in-depth look at institutional perceptions and AI readiness.

    Here’s the uncomfortable truth: while your most proactive staff are already leveraging AI to drive results, many institutions are held back by analysis paralysis and strategic inaction. This is a direct threat to talent retention and competitive advantage.

    While the general sentiment toward AI is increasingly positive, the report highlights that individual university staff are often far more receptive to using emerging technologies than their institutions. This leads to a significant gap between receptivity to AI in marketing and enrollment management and organization-changing operationalization of AI at an institutional level.

    In 2025, 65% of survey respondents reported actively using emerging technologies like AI in their marketing and enrollment efforts, a substantial increase from 40% in 2024. However, this leaves over one-third of higher education marketing and enrollment management professionals on the outside of the AI revolution, falling further behind by the day. More troubling, only 61% indicated their institution is open to using these technologies. While the evidence suggests a growing openness to adopting critical technology, only 56% of institutions have a plan for upskilling staff in AI-driven tools.

    Many respondents recognize a gap in their institutional AI readiness. A striking 56% of respondents don’t consider their institution a leader in implementing AI for marketing and enrollment management functions. When compared to peer institutions, 38% felt they were on pace, but 36% believed they were behind, with only 21% considering themselves ahead. This sentiment underscores a growing urgency to adopt AI, coupled with a pervasive feeling of being “behind the curve.”

    AI is a core component embedded directly in the recruitment, engagement and conversion platforms institutions already rely on. This widespread integration is transforming daily operations, as the 2025 survey highlights:

    • Nearly two-thirds (65%) of institutions utilize AI-enhanced creative and design tools.
    • Over half (51%) use social media management tools with embedded AI.
    • Customer relationship management (CRM) systems and data analytics platforms with AI features are each used by 31% of institutions.

    The perceived effectiveness of these AI-powered tools is on the rise. Content generation, the most widely used AI application, was rated most effective, with 47% deeming it “very effective” or “effective.” Other applications like content optimization (41% effective) and customized ad and message delivery (39% effective) also showed strong results.

    Moving beyond perceived effectiveness, AI integration is yielding direct, quantifiable improvements across marketing and enrollment operations:

    • 69% of respondents reported improved efficiency in their workflows due to AI.
    • More than half (52%) observed an increase in the quality of their work.
    • Nearly half (48%) believe AI tool integration has positively impacted their enrollment funnel.

    The study identified key areas where AI is delivering the strongest return on investment (ROI) including customized ad messaging, lead generation and creative content development. Content optimization also stood out, with 36% of respondents noting a “very high” or “high” ROI. If nearly 70% see efficiency gains and almost half see a positive impact on enrollment, why aren’t more institutions fully embracing this?

    Student engagement is AI-dependent. For Modern Learners, artificial intelligence is a fundamental tool in their college search, essential for information discovery. This profound shift in how the next generation interacts with information demands that institutions meet this baseline expectation. Otherwise, they risk being perceived as outdated, irrelevant or having their reputation pre-determined by AI itself.

    Modern Learners are using AI to seek information on:

    • Tuition fees (57%)
    • Course offerings (51%)
    • Admission requirements (43%)
    • Campus facilities (37%)
    • Student reviews (35%)

    This highlights the imperative for institutions to ensure their AI-accessible content, whether via chatbots or search optimization, directly aligns with what students are actively seeking.

    Looking ahead, institutional leaders envision even greater potential for AI-driven tools. Within the next two years, such innovations are expected to have a significant transformative impact on higher education marketing and enrollment management. These tools promise to address persistent challenges like the need for personalized outreach, improved insights into student behaviors and increased efficiency with limited resources.

    Despite the growing enthusiasm and proven benefits, institutions continue to face significant barriers to full AI integration. The top challenges cited by respondents include:

    • Budget constraints (76%)
    • Technical infrastructure readiness (64%)
    • Data privacy and security concerns (52%)
    • Staff readiness (50%)

    Notably, these barriers have become even more pronounced since 2024, underscoring the urgent need for strategic investment and institutional alignment. Alarmingly, 44% of respondents reported their institution lacks a plan to upskill or support staff in adopting AI-driven technologies. This is a leadership failure, not a staff deficiency. Your most valuable asset, your people, are signaling a readiness for growth, yet nearly half of institutions are failing to provide the essential support.

    The findings from the UPCEA and EducationDynamics study present clear implications for higher education leaders. The time for passive observation is over. Decisive action is required.

    • Invest Where Impact Is Proven
      Focus resources on AI applications already delivering proven ROI, starting with content creation, personalized ads and lead generation. Strategic allocation maximizes every dollar in a constrained environment, accelerates returns and frees up capacity for further innovation.
    • Upskill Teams
      Investing in targeted training for both technical skills and change management is crucial to empower staff to effectively use AI tools and build confidence. Providing clear growth pathways tied to AI fluency can significantly improve staff engagement and retention, especially given that 34% of staff now report that their institution’s stance on AI impacts their likelihood of staying at that institution—a dramatic jump from just 1% in 2024. Furthermore, 90% of respondents view AI as a useful tool for their own professional growth. Failing to invest in AI fluency for your teams is effectively disarming them in a rapidly escalating competitive battle.
    • Align Leadership with Operational Readiness
      The nearly doubling of “lack of alignment with strategic priorities” as a major barrier (from 18% in 2024 to 33% in 2025) is an indictment of existing leadership structures. Institutional leaders must move beyond passive support and commit to actionable strategies for AI integration at an institutional level. This involves benchmarking adoption progress, embedding AI into strategic plans and allocating necessary resources to scale effective tools.
    • Establish Institutional AI Governance
      Without robust governance, AI adoption will be chaotic, risky and unsustainable. Creating governance structures that include marketing, enrollment, IT and data privacy leaders is essential. These groups should collaborate to develop responsible AI use policies, establish ethical guidelines and transparently communicate data privacy practices to prospective students. Only 49% of institutions currently have measures in place for ensuring student data security and privacy when using AI tools, though this is an improvement from 30% in 2024. Protect your institution’s reputation, ensuring ethical practice and safeguarding student data in an increasingly scrutinized environment.

    The 2025 study is a revelation of present realities. AI is the operational backbone of competitive higher education marketing and enrollment management. Institutions that have adopted AI are reporting measurable gains in effectiveness, efficiency and ROI. The report unequivocally reinforces that delaying implementation means facing the significant risk of falling permanently behind, not only compared to AI-embracing peers but also in meeting the evolving expectations of students and staff.

    For higher education, the challenge now lies in converting receptivity into decisive action, and scattered AI adoption into a cohesive institutional strategy. EducationDynamics provides the expertise, data-driven strategies and solutions to help institutions navigate the complexities of AI integration, meet the expectations of Modern Learners and secure a competitive edge in marketing and enrollment management. The future of higher education is AI-expected, and with EducationDynamics, your institution can lead the charge.

    Source link

  • Reflections on the demand for higher education – and what UCAS data reveal ahead of Results Day 2025

    Reflections on the demand for higher education – and what UCAS data reveal ahead of Results Day 2025


    This HEPI blog was kindly authored by Maggie Smart, UCAS Director of Data and Analysis

    As we pass the 30 June deadline for this year’s undergraduate admissions cycle, UCAS’ data offers an early view of applicant and provider behaviour as we head into Confirmation and Clearing. It also marks a personal milestone for me, as it’s my first deadline release since rejoining UCAS. I wanted to take a deeper look at the data to reflect on how much things have changed since I worked here 10 years ago.

    Applicant demand has always been shaped by two key elements: the size of the potential applicant pool, and their propensity to apply. Since I last worked at UCAS in 2016, these two factors have continuously interchanged over the better part of the past decade – sometimes increasing or decreasing independently but often counterbalancing each other. Let’s take a look at how things are shaping up this year.

    Overall, by the 30 June deadline there have been 665,070 applicants (all ages, all domiciles) this year, compared to 656,760 (+1.3%) in 2024. This is an increase of over 64,000 applicants since UCAS last reported in January, although the profile of these additional applicants is very different. At the January Equal Consideration Deadline (ECD), over half of the total number of applicants were UK 18-year-olds, who are the most likely group to have applied by that stage in the cycle. They represent just 8% of the additional applicants since January, among a much larger proportion of UK mature and international students.

    As we saw at January, the differences in demand for places between young people from the most advantaged (POLAR4 Quintile 5) and most disadvantaged (POLAR4 Quintile 1) areas at June remain broadly the same as last year – with the most advantaged 2.15 times more likely to apply to HE than those from the least advantaged backgrounds, compared to 2.17 last year.

    UK 18-year-old demand

    Demand for UK higher education (HE) has long been shaped by the 18-year-old population – the largest pool of applicants. Despite the well-known challenges facing the HE sector at present, at the 30 June deadline we see record numbers of UK 18-year-old applicants, with 328,390 applicants this year – up from 321,410 (+2%) in 2024. This trend was almost entirely locked in by the January deadline, given the vast majority of UK 18-year-old applicants have applied at this stage in the cycle.

    During my previous tenure at UCAS, the size of the UK 18-year-old population had been falling year on year but from 2020, it began to increase. This continued growth drives the increase in UK 18-year-old applicant numbers we have observed in recent cycles. But when we look at their overall application rate to understand the strength of demand among this group, the data shows a marginal decline again this year – down to 41.2% from 41.9% in 2024. The historically strong growth in the propensity of UK 18-year-olds to apply for HE, which we’ve observed across the last decade, has clearly plateaued.

    This could be due to a range of factors, such as young people choosing to take up work or an apprenticeship, or financial barriers. We know that cost of living is increasingly influencing young people’s decisions this year, with pre-applicants telling us that financial support – such as scholarships or bursaries – ranks as the second most important consideration for them (46%), followed closely by universities’ specific cost-of-living support (34%).

    Interesting to note is the number of UK 19-year-old applicants. When separating the data to distinguish 19-year-olds applying for the first time (as opposed to those reapplying), there has been a decent increase – from 46,680 last year to 48,890 this year (+4.7%). For many years, the number of first-time UK 19-year-old applicants had been falling year on year, but since 2023 this trend has started to reverse. This suggests that demand among young people may be holding up as they decide to take a year out before applying to university or college.

    Mature students

    For UK mature students (aged 21+), the picture looks very different. The number of mature students applying to university or college ebbs and flows depending on the strength of the job market, so since I was last at UCAS, we have typically seen applications decrease when employment opportunities are strong and vice versa. Alongside fluctuations linked to the employment market, rising participation at age 18 means there is a smaller pool of potential older applicants who have not already entered HE. The falling demand from mature students continues in 2025, although in recent years there have been small but significant increases in the volume of mature applicants applying after the 30 June deadline and directly into Clearing. 

    As of this year’s 30 June deadline there have been 86,310 UK mature (21+) applicants, compared to 89,690 (-3.8%) in 2024 – a fall in demand at this point in the cycle for the fourth year in a row. However, whereas at the January deadline mature applicants were down 6.4% compared to the same point last year, at June the figure is only 3.8% down, showing some recovery in the numbers. This is another indication that mature students are applying later in the cycle. While it remains too early to say whether we will see continued growth in mature direct to Clearing applicants in 2025, last year 9,390 UK mature students who applied direct to Clearing were accepted at university or college, an increase of 7.4% on 2023 and 22.7% higher than 2022.
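    The year-on-year percentages quoted here follow directly from the applicant counts. As a quick arithmetic check, using only the mature-applicant figures given above:

    ```python
    # Check the quoted year-on-year change in UK mature (21+) applicants
    # against the counts given in the text.
    mature_2025 = 86_310
    mature_2024 = 89_690

    change = (mature_2025 - mature_2024) / mature_2024 * 100
    print(f"{change:+.1f}%")  # -> -3.8%, matching the figure quoted above
    ```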

    International students

    When looking at the UCAS data through the lens of international students, the landscape has changed significantly since 2016. Brexit led to a sharp decline in EU applicants, offset by strong growth elsewhere; the pandemic disrupted international student mobility; and intensified global competition, shifting market dynamics and geopolitics are increasingly influencing where students choose to study. This year we’re seeing growth once more, with 138,460 international applicants compared to 135,460 in 2024 (+2.2%) – although this stood at +2.7% at January. It should be noted that UCAS only sees a partial view of undergraduate international admissions (we tend to get a more complete picture by the end of the cycle) and we don’t capture data on postgraduate taught and research pathways.

    Interest among Chinese students in UK education has held firm since my time at UCAS, and this year we’re seeing a record number of applicants from China – 33,870, up from 30,860 (+10%) in 2024. This year’s data also shows increases in applicants from Ireland (6,060 applicants, +15%), Nigeria (3,170 applicants, +23%) and the USA (7,930 applicants, +14%). 

    Offer-making

    We are releasing a separate report on offer-making this year, alongside the usual data dashboard for applications. This additional data covers offers and offer rates over the past three years, from the perspective of applicants according to their age and where they live, and from the perspective of providers by UK nation and tariff group.

    What we’re seeing as the natural consequence of increased applications this year is an uplift in offers. Universities have made more offers than ever before, with 2.0 million main scheme offers to January deadline applicants overall, largely driven by the rise in UK 18-year-old applicants (who are the most likely to use their full five choices when applying). This record high surpasses the previous peak of 1.9 million offers set last year (+3.8%).

    While the main scheme offer rate has increased across all provider tariff groups, the most notable uplift is for higher tariff providers – up 3.2 percentage points to 64.4% this year. Despite the increase, offer rates at higher tariff providers still remain the lowest, partly because they are the most selective institutions. Offer rates at medium and lower tariff providers have also increased, by 0.9 percentage points to 77.0% among medium tariff providers, and by 1.5 percentage points to 81.7% among lower tariff providers. This means that, among those who applied by the Equal Consideration Deadline in January, 72.5% of main scheme applications received an offer this year, also a record high, and 1.8 percentage points higher than in 2024.

    It’s worth noting that we’ll be updating our provider tariff groupings in time for the 2026 cycle, to reflect changes in the higher education landscape.

    Looking ahead

    For students who are intent on going to university or college, this is a very good year, with more opportunities than ever before. A record 94.5% of students who applied by the January deadline will be approaching the critical summer period having received at least one offer. High levels of offer-making by universities and colleges typically translate into more acceptances, which should give applicants plenty of confidence heading into results day.

    I’m delighted to be back at UCAS, and my team will continue to dig further into the data as Confirmation and Clearing draws nearer to see how demand translates into accepted places come results day.

    UCAS

    UCAS, the Universities and Colleges Admissions Service, is an independent charity, and the UK’s shared admissions service for higher education.

    UCAS’ services support young people making post-18 choices, as well as mature learners, by providing information, advice, and guidance to inspire and facilitate educational progression to university, college, or an apprenticeship.

    UCAS manages almost three million applications, from around 700,000 people each year, for full-time undergraduate courses at over 380 universities and colleges across the UK.

    UCAS is committed to delivering a first-class service to all our beneficiaries — they’re at the heart of everything we do.

    Source link

  • Europe Must Do More to Protect Data Under Trump

    Europe Must Do More to Protect Data Under Trump

    Europe “needs to do more” to protect scientific data threatened by the Trump administration, the president of the European Research Council has said.

    Speaking at the Metascience 2025 conference in London, Maria Leptin said such data is in a “very precarious” position. Since Donald Trump began his second term as U.S. president, researchers have raced to archive or preserve access to U.S.-hosted data sets and other resources at risk of being taken down as the administration targets research areas including public health, climate and fields considered to be related to diversity.

    “We’ve heard the situation from the U.S. where some data are disappearing, where databases are being stopped, and this is really a wake-up call that we as a community need to do more about this and Europe needs to do more about it,” Leptin said.

    The ERC president highlighted the Global Biodata Coalition, which aims to “safeguard the world’s open life science, biological and biomedical reference data in perpetuity,” noting that the European Commission recently published a call to support the initiative.

    “Medical research critically depends on the maintenance and the availability of core data resources, and that is currently at risk. Some of these resources may disappear,” she said. “I really encourage all policymakers and funders to join the coalition.”

    “Right now is the worst time to not have access to data in view of the power of AI and the advances in computing, large language models, et cetera,” Leptin told the conference, noting that the Trump administration is not the only threat to accessible data. “The value of the data that are held across Europe is unfortunately massively reduced because of fragmentation, siloing, and uneven access.”

    A recent ERC workshop involving researchers, policymakers, industry representatives and start-ups raised some “shocking” concerns about health data, she added. “Even in the same town where researchers wanted to access the huge numbers of data that the hospitals in that town had, it was impossible because the hospitals couldn’t even share data with each other, because they used totally different data formats.”

    Boosting access to data will require “a huge effort,” Leptin acknowledged. “We of course need technical, legal and financial frameworks that make this possible and practical, [as well as] interoperable formats and common standards.”

    While not a data infrastructure in itself, the ERC “has a role to play” in improving accessibility, she said. “What we try to do is to set expectations around good data practices.”

    “We do need European-level solutions,” Leptin stressed. “The scientific questions we face, whether in climate or health or technology or [other fields], don’t stop at national borders—in fact, they are global.”

    Source link

  • The Digital Twin: How to Connect and Enable Your Student Data for Outreach, Personalization, and Predictive Insights [Webinar]

    The Digital Twin: How to Connect and Enable Your Student Data for Outreach, Personalization, and Predictive Insights [Webinar]

    You’re sitting on mountains of student data scattered across CRMs, SIS, LMS, and advising tools. Systems don’t talk. Dashboards are disconnected. And AI? Not even close. Without connection, context, or clarity, that data is nothing more than a headache and a barrier to impact. 

    The Digital Twin: How to Connect and Enable Your Student Data for Outreach, Personalization, and Predictive Insights 
    Thursday, July 24 
    2:00 pm ET / 1:00 pm CT 

    In this webinar, Bryan Chitwood, Director of Data Enablement, breaks down how you can start building your students’ Digital Twin and turn your fragmented data into real-time, actionable intelligence. We’ll show you how unified student data profiles fuel more innovative outreach, personalized engagement, and predictive insights across the student lifecycle. 

    You’ll walk away knowing: 

    • How to connect siloed data sources into a unified, reliable student profile 
    • What a Digital Twin is and how it differs from your CRM or SIS data 
    • Use-cases for personalization, predictive engagement, and lifecycle outreach 
    • Real examples of how institutions are putting Digital Twins to work right now 
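    As one possible illustration of the first point (a minimal sketch only — every CRM and SIS schema differs, so the field names and the shared student-ID key here are assumptions, not a reference implementation), joining records from two systems into one profile might look like:

    ```python
    # Minimal sketch: merging CRM and SIS records into unified student profiles,
    # keyed on a shared student ID (all field names here are hypothetical).
    crm = {
        "S001": {"email": "ana@example.edu", "last_outreach": "2025-06-01"},
        "S002": {"email": "ben@example.edu", "last_outreach": "2025-05-15"},
    }
    sis = {
        "S001": {"program": "BS Nursing", "credits": 45},
        "S003": {"program": "AA General Studies", "credits": 12},
    }

    def unify(crm: dict, sis: dict) -> dict:
        """Outer-join both systems so no student record is silently dropped."""
        profiles = {}
        for sid in crm.keys() | sis.keys():
            profiles[sid] = {**crm.get(sid, {}), **sis.get(sid, {})}
        return profiles

    profiles = unify(crm, sis)
    print(profiles["S001"])  # CRM and SIS fields combined in one record
    ```

    The outer join is the design choice that matters: a student known only to advising (or only to admissions) still gets a profile, which is what makes the unified view reliable for outreach rather than just convenient.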

    If your campus is drowning in data but starving for strategy, this is the conversation you need. 

    Who Should Attend: 

    If you are a data-minded decision-maker in higher ed or a cabinet-level leader being asked to do more with less, this webinar is for you. 

    • Presidents and Provosts 
    • VPs of Enrollment, Marketing, and Student Success 
    • Leaders charged with driving digital transformation and data-enabled decision making 

    Meet Your Presenter

    Bryan Chitwood

    Director of Data Enablement, Collegis Education

    Complete the form on the right to reserve your spot! We look forward to seeing you on Thursday, July 24. 

    Source link

  • IPEDS Data Collection Schedule (US Department of Education)

    IPEDS Data Collection Schedule (US Department of Education)

    The IPEDS data collection calendar for 2025-26 has now been posted and is available within the Data Collection System’s (DCS) Help menu, and on the DCS login page at: https://surveys.nces.ed.gov/ipeds/public/data-collection-schedule

    What is IPEDS?

    IPEDS is the Integrated Postsecondary Education Data System. It is a system of interrelated surveys conducted annually by the U.S. Department of Education’s National Center for Education Statistics (NCES). IPEDS gathers information from every college, university, and technical and vocational institution that participates in the federal student financial aid programs. The Higher Education Act of 1965, as amended, requires that institutions that participate in federal student aid programs report data on enrollments, program completions, graduation rates, faculty and staff, finances, institutional prices, and student financial aid. These data are made available to students and parents through the College Navigator college search Web site and to researchers and others through the IPEDS Data Center. To learn more about IPEDS Survey components, visit https://nces.ed.gov/Ipeds/use-the-data/survey-components.

    How is IPEDS Used?

    IPEDS provides basic data needed to describe — and analyze trends in — postsecondary education in the United States, in terms of the numbers of students enrolled, staff employed, dollars expended, and degrees earned. Congress, federal agencies, state governments, education providers, professional associations, private businesses, media, students and parents, and others rely on IPEDS data for this basic information on postsecondary institutions.

    IPEDS forms the institutional sampling frame for other NCES postsecondary surveys, such as the National Postsecondary Student Aid Study.

    Which Institutions Report to IPEDS?

    The completion of all IPEDS surveys is mandatory for institutions that participate in or are applicants for participation in any federal student financial aid program (such as Pell grants and federal student loans) authorized by Title IV of the Higher Education Act of 1965, as amended (20 USC 1094, Section 487(a)(17) and 34 CFR 668.14(b)(19)).

    Institutions that complete IPEDS surveys each year include research universities, state colleges and universities, private religious and liberal arts colleges, for-profit institutions, community and technical colleges, non-degree-granting institutions such as beauty colleges, and others.

    To find out if a particular institution reports to IPEDS, go to College Navigator and search by the institution name.

    What Data are Collected in IPEDS?

    IPEDS collects data on postsecondary education in the United States in eight areas: institutional characteristics; institutional prices; admissions; enrollment; student financial aid; degrees and certificates conferred; student persistence and success; and institutional resources, including human resources, finance, and academic libraries.

    Source link