Tag: Data

  • Keep talking about data | Wonkhe


    How’s your student data this morning?

    Depending on how close you sit to your institutional student data systems, your answer may range from a bemused shrug to an anguished yelp.

    For the most part, we remain blissfully unaware of how much work it currently takes to derive useful and actionable insights from the various data traces our students leave behind them. We’ve all seen the advertisements promising seamless systems integration and a tangible improvement in the student experience, but in most cases the reality is far different.

    James Gray’s aim is to start a meaningful conversation about how we get there and what systems need to be in place to make it happen – at a sector as well as a provider level. As he says:

    There is a genuine predictive value in using data to design future solutions to engage students and drive improved outcomes. We now have the technical capability to bring content, data, and context together in a way that simply has not been possible before now.

    All well and good, but just because we have the technology doesn’t mean we have the data in the right place or the right format – the problem is, as Helen O’Sullivan has already pointed out on Wonkhe, silos.

    Think again about your student data.

    Some of it is in your student information system (assessment performance, module choices), which may or may not link to the application tracking systems that got students on to courses in the first place. You’ll also have information about how students engage with your virtual learning environment, what books they are reading in the library, how they interact with support services, whether and how often they attend in person, and their (disclosed) underlying health conditions and specific support needs.

    The value of this stuff is clear – but without a whole-institution strategic approach to data it remains just a possibility. James notes that:

    We have learned that a focus on the student digital journey and institutional digital transformation means that we need to bring data silos together, both in terms of use and collection. There needs to be a coherent strategy to drive deployment and data use.

    But how do we get there? From what James has seen overseas, at the big online US providers like Georgia Tech and Arizona State, data is managed strategically at the highest levels of university leadership. It’s perhaps a truism that if you really care about something it needs ownership at a senior level, but that level of buy-in unlocks the resource and momentum a big project like this needs.

    We also talked about the finer-grained aspects of implementation – James felt that the way to bring students and staff on board is to clearly demonstrate the benefits, and to listen (and respond) to concerns. The latter is essential because “you will annoy folks”.

    Is it worth this annoyance to unlock gains in productivity and effectiveness? Ideally, we’d all be focused on getting the greatest benefit from our resources – but often processes and common practices are arranged in sub-optimal ways for historical reasons, and rewiring large parts of someone’s role is a big ask. The hope is that the new way will prove simpler and less arduous, so it absolutely makes sense to focus on potential benefits and their realisation – and bringing in staff voices at the design stage can make for gains in autonomy and job satisfaction.

    The other end of the problem concerns procurement. Many providers have updated their student records systems in recent years in response to the demands of the Data Futures programme. The trend has been away from bespoke and customised solutions and towards commercial off-the-shelf (COTS) procurement: the thinking here being that updates and modifications are easier to apply consistently with a standard install.

    As James outlines, providers are looking at a “buy, build, or partner” decision – and institutions with different goals (and at different stages of data maturity) may choose different options. There is, though, enormous value in senior leaders talking across institutions about decisions such as these. “We had to go through the same process,” James outlined. “In the end we decided to focus on our existing partnership with Microsoft to build a cutting edge data warehouse, and data ingestion, hierarchy and management process leveraging Azure and MS Fabric with direct connectivity to Gen AI capabilities to support our university customers with their data, and digital transformation journey.” There is certainly both knowledge and hard-won experience out there about the different trade-offs – but what university leader wants to tell a competitor about the time they spent thousands of pounds on a platform that didn’t communicate with the rest of their data ecosystem?

    As Claire Taylor recently noted on Wonkhe, there is power in the relationships and networks among senior leaders that exist to share learning for the benefit of many. It is becoming increasingly clear that higher education is a data-intensive sector – so every provider should feel empowered to make one of the most important decisions they will make in the light of a collective understanding of the landscape.

    This article is published in association with Kortext. Join us at an upcoming Kortext LIVE event in London, Manchester and Edinburgh in January and February 2025 to find out more about Wonkhe and Kortext’s work on leading digital capability for learning, teaching and student success.


  • Common App data shows 5% jump in first-year college applicants



     Dive Brief:

    • First-year Common Applications are up 5% year over year, with over 1.2 million prospective students submitting the forms for the 2024-25 application cycle as of Jan. 1, the company said Thursday.
    • First-year applications ticked up across both institution types and student demographics, but some groups saw accelerated growth. Common App found disproportionate increases among students believed to be from low-income households and those who identified as underrepresented minorities. 
    • Applications to public institutions grew by 11% year over year, outpacing the 3% growth seen at private colleges, Thursday’s report said. 

    Dive Insight:

    Applications from prospective first-year students have steadily increased since the 2020-21 application cycle, Common App found. 

    That’s despite the challenges that have thrown aspects of college admissions into tumult, including the botched rollout of the updated Free Application for Federal Student Aid during the 2024-25 cycle and the U.S. Supreme Court’s June 2023 ban on race-conscious admissions.

    Roughly 960,000 students used the Common App portal to submit over 4.8 million applications during the 2020-21 cycle. In the 2024-25 cycle, over 1.2 million users submitted just under 6.7 million applications.

    Prospective students can continue to apply to colleges through the month and beyond. But a majority of applications for the following fall semester are traditionally submitted by the end of December. 

    The number of colleges first-year prospects applied to ticked up slightly between 2020-21 and 2024-25, but remained between five and six institutions. 

    Common App found disproportionate application growth among students from low-income households. The portal does not directly collect household income from applicants, but researchers used students who were eligible for fee waivers as a proxy. Application rates for that group increased by 10%, compared to 2% for their counterparts who weren’t eligible for the waivers.
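    Common App has not published the details of its methodology, but the proxy-group comparison described above can be sketched in a few lines: bucket applicants by fee-waiver eligibility and compute year-over-year growth per bucket. The counts below are invented for illustration and are not Common App's figures.

```python
# Illustrative sketch of a proxy-group comparison: group applicant counts
# by fee-waiver eligibility (a proxy for low household income) and compute
# year-over-year growth for each group. All numbers are hypothetical.

def yoy_growth_by_group(prev_counts, curr_counts):
    """Return percentage growth per group between two application cycles."""
    return {
        group: round(100 * (curr_counts[group] - prev_counts[group]) / prev_counts[group], 1)
        for group in prev_counts
    }

# Hypothetical applicant counts for two consecutive cycles
prev = {"fee_waiver_eligible": 300_000, "not_eligible": 850_000}
curr = {"fee_waiver_eligible": 330_000, "not_eligible": 867_000}

growth = yoy_growth_by_group(prev, curr)
print(growth)  # {'fee_waiver_eligible': 10.0, 'not_eligible': 2.0}
```

    The point of the proxy is that eligibility is observed at application time, so no self-reported income is needed – at the cost of a coarser, binary income signal.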

    Moreover, applications from students in ZIP codes where median incomes fall below the national average grew 9% since the 2023-24 cycle, compared to 4% growth from those in above-median income areas, Common App found.

    The company also saw more applications from minority groups underrepresented in higher education, classified by researchers as those who identify as Black or African American, Latinx, Native American or Alaska Native, or Native Hawaiian or other Pacific Islander.

    As of Jan. 1, 367,000 underrepresented applicants had used Common App to submit first-year applications, and their numbers are growing at a faster rate than their counterparts’.

    Among students in underrepresented groups, first-year applications grew by 13% since last year, compared to the 2% growth for the others. 

    Latinx and Black or African American candidates drove much of that growth, showing year-over-year increases of 13% and 12%, respectively.

    However, it appears that students are reconsidering their application materials following the 2023 Supreme Court decision. In June, separate Common App research found a decrease in the number of Asian, Black, Latinx and White students referencing race or ethnicity in their college essays.

    Thursday’s report also found more first-year students including standardized test scores in their applications, up 10% since last year. The number of applicants leaving them out remained unchanged year over year.

    “This marks the first time since the 2021–22 season that the growth rate of test score reporters has surpassed that of non-reporters, narrowing the gap between the two groups,” the report said.

    That’s despite interest slowing in highly selective colleges, the type of institutions that have historically most used standardized test scores in the admissions process.

    Applications to colleges with acceptance rates below 25% grew just 2% in 2024-25, Common App found. That’s compared to increases of between 8% and 9% at institutions of all other selectivity levels.

    Just 5% of the colleges on Common App required test scores in the 2024-25 application cycle, a slight uptick from the 4% that did so the previous year. 

    COVID-19 pushed many institutions with test requirements to temporarily waive this mandate, and some ultimately made the change permanent.

    But others returned to their original rules. And reversal announcements continue to trickle in, including one from the highly selective University of Miami just this past Friday. 


  • This week in numbers: Clearinghouse retracts first-year enrollment data


    We’re rounding up recent stories, including a methodology mea culpa and billions of dollars in discharged loan debt.


  • Freshman enrollment up this fall; data error led to miscount


    Freshman enrollment did not decline this fall, as previously reported in the National Student Clearinghouse Research Center’s annual enrollment report in October. On Monday, the NSC acknowledged that a methodological error led to a major misrepresentation of first-year enrollment trends, and that first-year enrollment appears to have increased.

    The October report showed first-year enrollments fell by 5 percent, in what would have been the largest decline since the COVID-19 pandemic—and appeared to confirm fears that last year’s bungled rollout of a new federal aid form would curtail college access. Inside Higher Ed reported on that data across multiple articles, and it was featured prominently in major news outlets like The New York Times and The Washington Post.

    According to the clearinghouse, the error was a methodological one, caused by mislabeling many first-year students as dual-enrolled high school students. This also led to artificially inflated numbers on dual enrollment; the October report said the population of dually enrolled students grew by 7.2 percent.

    “The National Student Clearinghouse Research Center acknowledges the importance and significance of its role in providing accurate and reliable research to the higher education community,” Doug Shapiro, the center’s executive director, wrote in a statement. “We deeply regret this error and are conducting a thorough review to understand the root cause and implement measures to prevent such occurrences in the future.”

    On Jan. 23, the clearinghouse will release another annual enrollment report based on current term estimates that use different research methodologies.

    The Education Department had flagged a potential issue in the data this fall when its financial aid data showed a 5 percent increase in students receiving federal aid. In a statement, Under Secretary James Kvaal said the department was “encouraged and relieved” by the clearinghouse’s correction.


  • The data dark ages | Wonkhe


    Is there something going wrong with large surveys?

    We asked a bunch of people but they didn’t answer. That’s been the story of the Labour Force Survey (LFS) and the Annual Population Survey (APS) – two venerable fixtures in the Office for National Statistics (ONS) arsenal of data collections.

    Both have just lost their accreditation as official statistics. A statement from the Office for Statistical Regulation highlights just how much of the data we use to understand the world around us is at risk as a result: statistics about employment are affected by the LFS concerns, whereas APS covers everything from regional labour markets, to household income, to basic stuff about the population of the UK by nationality. These are huge, fundamental, sources of information on the way people work and live.

    The LFS response rate has historically been around 50 per cent, but it had fallen to 40 per cent by 2020 and is now below 20 per cent. The APS is an additional sample using the LFS approach – current advice suggests that response rates have deteriorated to the extent that it is no longer safe to use APS data at local authority level (the resolution it was designed to be used at).

    What’s going on?

    With so much of our understanding of social policy issues coming through survey data, problems like these feel almost existential in scope. Online survey tools have made it easier to design and conduct surveys, and often build in the kind of good survey design practices that used to be the domain of specialists. Theoretically, it should be easier to run good quality surveys than ever before – certainly we see more of them (we even run them ourselves).

    Is it simply a matter of survey fatigue? Or are people less likely to (less willing to?) give information to researchers for reasons of trust?

    In our world of higher education, we have recently seen the Graduate Outcomes response rate drop below 50 per cent for the first time, casting doubt on its suitability as a regulatory measure. The survey still has accredited official statistics status, and there has been important work done on understanding the impact of non-response bias – but it is a concerning trend. The National Student Survey (NSS) is an outlier here – it had a 72 per cent response rate last time round (so you can be fairly confident in validity right down to course level), but it enjoys an unusually good level of survey population awareness, despite the removal of a requirement for providers to promote the survey to students. And of course, many of the more egregious issues with HESA Student have been founded on student characteristics – the kind of thing gathered during enrolment or entry surveys.

    A survey of the literature

    There is a literature on survey response rates in published research. A meta-analysis by Wu et al (Computers in Human Behavior, 2022) found that the average online survey response rate was 44.1 per cent – and found benefits in using (as NSS does) a clearly defined and refined population, pre-contacting participants, and using reminders. A meta-analysis by Daikeler et al (Journal of Survey Statistics and Methodology, 2020) found that, in general, online surveys yield lower response rates (on average, 12 percentage points lower) than other approaches.

    Interestingly, Holton et al (Human Relations, 2022) show an increase in response rates over time in a sample of 1014 journals, and do not find a statistically significant difference linked to survey modes.

    ONS itself works with the ESRC-funded Survey Futures project, which:

    aims to deliver a step change in survey research to ensure that it will remain possible in the UK to carry out high quality social surveys of the kinds required by the public and academic sectors to monitor and understand society, and to provide an evidence base for policy

    It feels like timely stuff. Nine strands of work in the first phase included work on mode effects, and on addressing non-response.

    Fixing surveys

    ONS have been taking steps to repair LFS – implementing some of the recontact and reminder approaches that have been successfully implemented and documented in the academic literature. There’s a renewed focus on households that include young people, and a return to the larger sample sizes we saw during the pandemic (when the whole survey had to be conducted remotely). Reweighting has brought a number of tweaks to the way samples are chosen and non-responses are accounted for.
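    ONS’s actual reweighting is considerably more sophisticated, but the basic idea behind non-response adjustment can be sketched simply: respondents from under-responding groups are given larger weights so that weighted group totals match known population counts (post-stratification). The group labels and numbers below are invented for illustration.

```python
# Minimal post-stratification sketch (not ONS's actual method): each
# respondent in group g gets weight N_g / n_g, where N_g is the known
# population count for the group and n_g the number of respondents observed.
from collections import Counter

def poststratify_weights(population_counts, respondent_groups):
    """Return one weight per respondent so weighted group totals match the population."""
    n = Counter(respondent_groups)  # respondents observed per group
    return [population_counts[g] / n[g] for g in respondent_groups]

# Hypothetical age-band population counts, with an under-responding "16-24" band
population = {"16-24": 1000, "25-49": 2000, "50+": 1500}
respondents = ["16-24"] * 10 + ["25-49"] * 40 + ["50+"] * 50

weights = poststratify_weights(population, respondents)
# Each 16-24 respondent now stands for 100 people; each 50+ respondent for 30.
assert weights[0] == 100.0 and weights[-1] == 30.0
```

    The trade-off is visible in the numbers: when only ten people answer for a group of a thousand, each response carries a weight of 100, and any quirk of those ten respondents is magnified accordingly – which is why falling response rates cannot be fully repaired by weighting alone.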

    Longer term, the Transformed Labour Force Survey (TLFS) is already being trialled, though the initial March 2024 plan for full introduction has been revised to allow for further testing – important given a bias towards responses from older age groups, and an increased level of partial responses. Yes, there’s a lessons learned review. The old LFS and the new, online-first TLFS will run together at least until early 2025 – with a knock-on impact on APS.

    But it is worth bearing in mind that, even given the changes made to drive up responses, trial TLFS response rates have been hovering just below 40 per cent. This is a return to 2020 levels, addressing some of the recent damage, but it is still a long way from the historic norm.

    Survey fatigue

    More usually the term “survey fatigue” is used to describe the impact of additional questions on completion rate – respondents tire during long surveys (as Jeong et al observe in the Journal of Development Economics) and deliberately choose not to answer questions to hasten the end of the survey.

    But it is possible to consider the idea of a civilisational survey fatigue. Arguably, large parts of the online economy are propped up by the collection and reuse of personal data, which can then be used to target advertisements and reminders. Increasingly, you have to pay to opt out of targeted ads on websites – assuming you can view the website at all without paying. After a period of abeyance, concerns around data privacy are beginning to re-emerge. Forms of social media that rely on a constant drive to share personal information are unexpectedly beginning to struggle – for younger generations participatory social media is more likely to be a group chat or Discord server, while formerly participatory services like YouTube and TikTok have become platforms for media consumption.

    In the world of public opinion research the struggle with response rates has partially been met by a switch from randomised phone or in-person sampling to pre-vetted online panels. This (as with the rise of focus groups) has created a new cadre of “professional respondents” – with huge implications for the validity of polling even when weighting is applied.

    Governments and industry are moving towards administrative data – the most recognisable example in higher education being the LEO dataset of graduate salaries. But this brings problems of its own – LEO tells us how much income graduates pay tax on from their main job, but deals poorly with the portfolio careers that many graduates now expect. LEO never cut it as a policymaking tool precisely because it is so broad-brush.

    In a world where everything is data driven, what happens when the quality of data drops? If we were ever making good, data-driven decisions, a problem with the raw material suggests a problem with the end product. There are methodological and statistical workarounds, but the trend appears to be shifting away from people being happy to give out personal information without compensation. User interaction data – the traces we create as we interact with everything from ecommerce to online learning – are for now unaffected, but are necessarily limited in scope and explanatory value.

    We’ve lived through a generation where data seemed unlimited. What tools do we need to survive a data dark age?


  • NM Vistas: What’s New in State Student Achievement Data?


    By Mo Charnot
    For NMEducation.com

    The New Mexico Public Education Department has updated its student achievement data reporting website — NM Vistas — with a renovated layout and school performance data from the 2023-2024 academic year, with expectations for additional information to be released in January 2025. 

    NM Vistas is crucial to informing New Mexicans about school performance and progress at the school, district and state levels through yearly report cards. The site displays student reading, math and science proficiency rates taken from state assessments, as required by the federal Every Student Succeeds Act. Districts and schools receive scores between 0 and 100 based on performance, and schools also receive designations indicating the level of support the school requires to improve.

    Other information on the site includes graduation rates, attendance and student achievement growth. Data also shows rates among specific student demographics, including race, gender, disability, economic indicators and more. 

    PED Deputy Secretary of Teaching, Learning and Innovation Amanda DeBell told NM Education in an interview that this year’s recreation of the NM Vistas site came from a desire to go beyond the state’s requirements for school performance data.

    “We knew that New Mexico VISTAs had a ton of potential to be a tool that our communities could use,” DeBell said. 

    One new data point added to NM Vistas this year is early literacy rates, which measure the percentage of students in grades K-2 who are reading proficiently at their grade level. Currently, federal law only requires proficiency rates for grades 3-8 to be published; New Mexico also publishes 11th grade SAT scores. In the 2023-2024 school year, 34.6% of students in grades K-2 were proficient in reading, the data shows.

    DeBell said several advisory groups encouraged the PED to report early literacy data through NM Vistas.

    “We were missing some key data-telling opportunities by not publishing the early literacy [rates] on our website, so we made a real effort to get those early literacy teachers the kudos that they deserve by demonstrating the scores,” DeBell said.

    The PED also added data on individual schools through badges indicating specific programs and resources the school offers. For example, Ace Leadership High School in Albuquerque has two badges: one for being a community school offering wraparound services to students and families, and another for qualifying for the career and technical education-focused Innovation Zone program.

    “What we are really trying to do is provide a sort of one-stop shopping for families and community members to highlight all of the work that schools are doing,” DeBell said.

    The updated NM Vistas website has removed a few things as well, most notably the entire 2021-2022 NM Vistas data set. DeBell said this was because the PED changed the way it measured student growth data, which resulted in the 2021-2022 school year’s data being incomparable to the most recent two years. 

    “You could not say that the schools in 2021-2022 were doing the same as 2022-2023 or 2023-2024, because the mechanism for calculating their scores was different,” DeBell said.

    However, this does leave NM Vistas with less data overall, allowing viewers to compare the latest data set only with last year’s.

    In January 2025, several new indicators are expected to be uploaded to the site, including:

    • Student performance levels: Reports the percentage of students who are novices, nearing proficiency, proficient and advanced in reading, math and science at each school, rather than only separating between proficient and not proficient.
    • Results for The Nation’s Report Card (also known as NAEP): Compares student proficiencies between US states.
    • Educator qualifications: DeBell said this would include information on individual schools’ numbers of newer teachers, substitute teachers covering vacancies and more.
    • College enrollment rates: Initially statewide numbers only, indicating the percentage of New Mexico students attending college after graduating, but DeBell said she hopes the PED can later narrow the data down to each K-12 school.
    • Per-pupil spending: How much money each school, district and the state spends per-student on average. 
    • School climate: Links the viewer to results of school climate surveys asking students, parents and teachers how they feel about their school experience.
    • Alternate assessment participation: Percentage of students who take a different assessment in place of the NM-MSSA or SAT.

    “We want VISTAs to be super, super responsive, and we want families to be able to use this and get good information,” DeBell said. “We will continue to evolve this until it’s at its 100th iteration, if it takes that much.”

    This year, the PED released statewide assessment results for the 2023-2024 school year to NM Vistas on Nov. 15. Results show 39% of New Mexico students are proficient in reading, 23% are proficient in math and 38% are proficient in science. Compared to last year’s scores, reading proficiency increased by one percentage point, math proficiency decreased by one percentage point and science proficiency increased by four percentage points.


  • Prioritizing Mental Health Support in Community Colleges: Key Data from 2023


    Title: Supporting Minds, Supporting Learners: Addressing Student Mental Health to Advance Academic Success

    Source: Center for Community College Student Engagement

    The 2023 Community College Survey of Student Engagement (CCSSE) and Survey of Entering Student Engagement (SENSE) gathered essential data to guide community colleges in supporting student mental health and well-being. The surveys collected responses from 61,085 students at 149 community colleges in spring 2023 and 13,950 students at 61 community colleges in fall 2023, respectively.

    Key findings include:

    • Mental health concerns are prevalent among CCSSE and SENSE respondents. In the two weeks before taking the survey, half of CCSSE students and 47 percent of SENSE students reported feeling down, depressed, or hopeless for at least several days. Additionally, 66 percent of students in both groups felt nervous, anxious, or on edge for at least several days.
    • Approximately 26 percent of CCSSE respondents and 23 percent of SENSE respondents likely have a depressive disorder. Over half (53 percent) of students who identify with a gender identity other than man or woman have a probable depressive disorder, compared with 28 percent of women and 25 percent of men. Traditional college-age students (31 percent) and those with a GPA of C or lower (39 percent) are more likely to have a depressive disorder, compared with 19 percent of nontraditional-age students and 23 percent of students with a GPA of B or higher.
    • Overall, 32 percent of CCSSE respondents and 29 percent of SENSE respondents likely have generalized anxiety disorder. Among CCSSE students, 62 percent of those identifying with another gender likely have an anxiety disorder, in contrast to 36 percent of female and 25 percent of male students. Students identifying with two or more races saw the highest levels of generalized anxiety disorder, at 36 percent. Among SENSE respondents, traditional-age students were more likely to have generalized anxiety disorder, at 30 percent, compared to 23 percent of nontraditional-age students.
    • Over half of CCSSE respondents (56 percent) reported that emotional or mental health challenges affected their academic performance in the previous four weeks, and 30 percent noted these issues impacted their performance for three or more days. Nearly two-thirds of women (63 percent) and almost half of men (47 percent) reported performance declines due to mental health issues, while 85 percent of students identifying with another gender faced academic impacts. Lower GPA students were more likely to report that mental health issues affected their academic performance.
    • Students with likely generalized anxiety disorder are twice as likely, and those with a depressive disorder are almost twice as likely, to report academic performance declines due to emotional or mental difficulties compared to students likely without these disorders.
    • 63 percent of students identifying with another gender reported that mental health challenges could lead them to withdraw from classes, compared to 39 percent of women and 29 percent of men. More than half of students with a GPA of C or lower (53 percent) stated they were at least somewhat likely to consider withdrawal due to mental health concerns, in contrast to 33 percent of students with a GPA of B or higher.
    • High percentages of students felt their college prioritizes mental health, yet about three in 10 CCSSE respondents and slightly more SENSE respondents said they wouldn’t know where to seek help if needed. Hispanic or Latino students were most likely among racial/ethnic groups to report not knowing where to turn for mental health support.
    • Over one-third of students with likely depressive or generalized anxiety disorders reported not knowing where to find professional mental health assistance if needed. Among CCSSE respondents who needed mental health support in the past year, 42 percent never sought help, with Hispanic or Latino students and men more likely than other groups to indicate they hadn’t pursued support. Approximately one-third of students with probable depressive or generalized anxiety disorders reported never seeking help. Many students cited limited resources as the main barrier to seeking mental health support. Students, especially traditional-age students and men, also frequently mentioned concerns about others’ perceptions and uncertainty about what kind of help they need.
    • Across all groups, students expressed a strong preference for in-person individual counseling or therapy over teletherapy and other support options.
    • Only 16 percent of CCSSE respondents considered it essential that their mental health provider understands their cultural background. However, students with another gender identity and Black or African American students were more likely to value culturally informed mental health support.

    Check out the full report on the CCSSE website.

    —Nguyen DH Nguyen




  • It’s mid October, PED. Where is the data?


    Last month, on September 19, the Public Education Department presented to the Legislative Education Study Committee slide decks showing preliminary, high-level results of the state’s spring assessments, promising that detailed data would follow soon thereafter.

    A month has passed and the PED has released nothing further. No district- or school-level data files or presentations. Not even a press release. The school year is one-quarter over, and the public is being kept in the dark about the state of New Mexico’s schools.

    Repeated outreach by New Mexico Education to the PED has been met with silence. The one PED data slide presented showed incremental statewide improvement in reading (from 38 percent proficient in 2023 to 39 percent in 2024), a decline in math (from 24 percent proficient to 23 percent), and a three-percentage-point increase in science (from 34 percent to 37 percent).

    A companion presentation by LESC staff contained richer data, but also showed different proficiency rates than the PED deck – reading at 38 percent proficient and math down to 22 percent proficient.

    The LESC deck also contained some graphs that merit a deeper dive, which is impossible unless and until PED releases the data files. For example, the achievement gaps between economically disadvantaged and non-economically disadvantaged students in English Language Arts closed significantly statewide. 

    This occurred both because economically disadvantaged students’ ELA scores increased by five percentage points (nine percentage points since 2022) and because more affluent students’ ELA scores declined by five percentage points.
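    The decomposition described above can be sketched in a few lines. The baseline proficiency rates below are illustrative assumptions (PED has not released the underlying files); only the five-point changes come from the text:

    ```python
    # Illustrative ELA percent-proficient figures. The baselines (28 and 53)
    # are invented for illustration; the +5 and -5 point changes are the
    # movements described in the LESC presentation.
    disadvantaged_before, disadvantaged_after = 28, 33   # +5 points
    advantaged_before, advantaged_after = 53, 48         # -5 points

    gap_before = advantaged_before - disadvantaged_before
    gap_after = advantaged_after - disadvantaged_after

    # The gap closed by 10 points, but only half came from gains at the
    # lower end; the other half came from declines at the top.
    from_gains = disadvantaged_after - disadvantaged_before
    from_declines = advantaged_before - advantaged_after
    print(gap_before - gap_after, from_gains, from_declines)   # -> 10 5 5
    ```

    The point of the decomposition is that a shrinking gap is only an unambiguously good sign when the second term is small relative to the first.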

    Ideally, gaps should close because those on the lower end are making big gains, not because those at the higher end are dropping.

    Without detailed data, it is not possible for researchers to dive in to determine why and where these changes are occurring.

    National education researcher Chad Aldeman recently wrote on his Substack blog that this practice of hiding or delaying data has become a nationwide trend among state education departments.

    “Here we are in pumpkin spice / decorative gourd season, and half the states still have not released their results yet,” Aldeman wrote. “To put it colloquially, this is too damn slow! Summer is the key here—it’s the time when parents and educators could actually do something about the results. By the time fall rolls around, kids are already back in school and they’ve moved on to the next grade. Teachers have already written their lesson plans for the year….

    …“When it comes to releasing their results, too many states are putting parents last,” Aldeman wrote. “This game of telephone is also unnecessary in today’s modern world. Most state assessments are now administered on computers and can be scored instantaneously. Private testing companies like the ACT and SAT promise to deliver results in 2-4 weeks.”

    According to Aldeman’s research, last year the PED didn’t release its data until mid-November, which ranked New Mexico 45th out of the 50 states.

    Will this year be any different?


  • NCES Data Show Modern Learners Want Career Focused Degrees

    NCES Data Show Modern Learners Want Career Focused Degrees

    Brief

    The 2023 NCES completion data points to some interesting – and impactful – student trends that continue to paint a picture of fundamentally changing priorities for the Modern Learner. Specifically, more students are moving toward degrees with firm career outcomes, whether to further a current career or to start a new one.

    Institutions need to pay attention to these trends in order to prepare themselves for a radically different higher education market in the next 5-7 years. This includes prioritizing programs that align with the market’s appetite, as well as re-investing in the value proposition of programs that are currently declining in popularity.

    Other Highlights

    • Associate degree completions saw a marked decline, which is notable given the growth of Undergraduate Certificate completions. Students appear to prefer certificates that can lead directly to employment opportunities.
    • STEM programs continue to either grow or remain stable, depending on the level of the degree. This was most notable at the Graduate level. As more jobs continue to require advanced degrees, this trend is set to only grow in importance.
    • Liberal Arts programs across all levels experienced significant YoY reductions in completions. Schools that are defined by their Liberal Arts programs will need to assess ways in which they continue to project relevance as the market shifts.
    • Undergraduate Health Profession programs also saw a decline, which runs counter to the commonly held belief that these programs, like the healthcare labor market itself, are continuing to grow. This trend should be monitored and evaluated to ensure that institutions do not over-invest in a sector that may be slowing.

    2023 NCES Completions Data and the Changing Priorities of the Modern Learner

    The 2023 National Center for Education Statistics (NCES) completions data offers a rich and complex tapestry of insights into the path the Modern Learner is taking through higher education. As enrollment managers and marketers, it is our imperative to move beyond surface-level observations and delve into the intricate patterns and implications woven within these numbers. This data serves not merely as a historical record but as a powerful compass, guiding us toward a deeper understanding of the Modern Learner’s market demands and the strategic decisions that will chart the course for institutions in the years to come.

    This year’s data unveils a series of significant shifts in student choices, reflecting both the evolving needs of the labor market and the lingering reverberations of the COVID-19 pandemic. We observe a notable decline in associate degree completions, particularly in general studies and humanities, while undergraduate certificates continue their upward trajectory. At the bachelor’s level, STEM fields remain stable, while other areas, especially those associated with traditional liberal arts programs, face headwinds. Graduate programs, particularly in STEM disciplines, are experiencing a surge in completions, and both undergraduate and graduate certificates continue to gain popularity.

    In this analysis, we will dive deep into the data, exploring the specific programs experiencing growth or decline, examining the multifaceted factors driving these trends, and discussing the profound implications for higher ed. We will delve into the remarkable growth in graduate programs and certificates, highlighting the increasing demand for advanced credentials in the labor market. We will also confront the undergraduate decline, exploring the potential impact of the COVID-19 pandemic and the looming 2025 enrollment cliff, with a particular focus on the challenges facing private non-profit liberal arts schools. By understanding these multifaceted trends and their interconnectedness, we can proactively adapt our strategies, ensuring that our institutions not only remain relevant and competitive but also thrive amidst a landscape in flux.

    Associate Degree: Trade Focused

    The 5% decline in associate degree completions is notable both for which programs dropped and which continue to grow. The most significant drop comes from Liberal Arts and Sciences, General Studies, and Humanities, programs that have historically served as a bridge to further education or a broad foundation for diverse career paths; 61% of the YoY decline was in this category. The decline in these areas, coupled with the simultaneous rise in undergraduate certificates, suggests a growing preference among students for more focused, career-oriented pathways that offer a faster and more tangible return on investment.
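    The 61% share quoted above is a simple ratio of category decline to total decline. A minimal sketch, using invented completion counts (only the 61% share comes from the analysis):

    ```python
    # Hypothetical year-over-year drops in associate degree completions.
    # The counts are invented for illustration; only the resulting 61%
    # share reflects the NCES figure cited in the text.
    total_decline = 10_000           # total YoY drop across all programs
    liberal_arts_decline = 6_100     # drop in Liberal Arts/General Studies/Humanities

    share = liberal_arts_decline / total_decline
    print(f"{share:.0%} of the YoY decline")   # -> 61% of the YoY decline
    ```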

    This shift in student preferences is not surprising in the context of a rapidly changing labor market that increasingly values specialized skills and knowledge. Students are seeking educational pathways that provide them with a clear and direct route to employment and career advancement. In this environment, the perceived value of broad-based, general education programs may be diminishing.

    However, amidst this overall decline, we observe encouraging signs of growth in fields directly aligned with high-demand skills. Programs such as Construction Trades, Mechanic and Repair Technologies/Technicians, and Computer and Information Sciences and Support Services have all witnessed increases in completions. This trend underscores the enduring value of associate degrees that equip students with tangible, marketable skills, enabling them to seamlessly transition into the workforce and meet the demands of employers seeking skilled talent.

    Bachelor’s Degree: Value Proposition Challenges

    At the bachelor’s level, we encounter a mixed bag of stability and change. While STEM fields remain a stronghold, with only a negligible 0.07% dip, other areas, particularly those associated with traditional liberal arts programs, are facing challenges. The most pronounced decline occurs in Health Professions, a field traditionally associated with strong job prospects and stable growth. This decline, juxtaposed with the increase in master’s level completions in Health Professions, suggests a potential shift towards requiring advanced degrees for certain healthcare roles. This mirrors a broader trend of “graduate degree bloat” in the labor market, where employers increasingly demand advanced credentials for positions that previously required only a bachelor’s degree.

    The COVID-19 pandemic has undoubtedly exacerbated the challenges facing undergraduate programs. The disruption to traditional learning models, coupled with economic uncertainty and concerns about the value of a college degree, has led many students to reconsider their educational plans. The looming 2025 enrollment cliff, which predicts a significant drop in the number of high school graduates, further compounds these challenges, creating a perfect storm for undergraduate enrollment.

    Private non-profit liberal arts schools are particularly vulnerable in this environment. The growth areas in the undergraduate space are mainly concentrated in STEM programs, leaving liberal arts institutions grappling with declining enrollments and a need to reimagine their value proposition. Adapting to this changing landscape will require innovative approaches to curriculum design, student support, and career services. Tuition-driven liberal arts institutions must demonstrate the relevance and value of their programs in today’s world, not only highlighting the critical thinking, communication, and problem-solving skills that their graduates possess (which has always been their particular promise) but also the career opportunities open to their undergraduates.

    Graduate Studies: Career Growth and Specialization

    Graduate programs, especially those in STEM disciplines, are experiencing a period of remarkable growth. The 51% and 25% surges in Computer and Information Science and Support Services and Engineering master’s completions, respectively, echo the trends at the bachelor’s level and underscore the premium placed on advanced technical expertise. The overall 30% rise in STEM master’s completions further solidifies this trend, reflecting the insatiable demand for skilled professionals in these fields.

    This surge in graduate completions aligns with the broader trend of graduate degree bloat (others might more favorably describe it as “expansion”) in the labor market. As certain industries and professions increasingly require advanced degrees for career advancement, we can anticipate continued growth in graduate programs, particularly in fields that offer a clear pathway to high-demand, well-paying jobs. This presents a significant opportunity for institutions to expand their graduate offerings and cater to the growing population of working professionals seeking to upskill and advance their careers.

    Graduate certificates are also experiencing growth, with a 2% increase in completions. The growth in fields like Computer and Information Technology, Psychology, and Engineering/Engineering-related Technologies/Technicians demonstrates the appeal of these focused credentials for professionals seeking to enhance their skill sets or transition into new careers. The flexibility and shorter duration of graduate certificates make them an attractive option for busy professionals who may not have the time or resources to pursue a full master’s degree – especially if the certificates are tied to a degree later.

    The flourishing graduate landscape presents a wealth of opportunities for institutions. Expanding graduate program offerings, enhancing online and hybrid learning options, and strategically marketing to working professionals are all essential strategies for capitalizing on this growth. The increasing popularity of graduate certificates also underscores the need for institutions to develop a diverse portfolio of graduate programs that cater to the varied needs and preferences of learners.

    Navigating the Data’s Implications for Engaging with the Modern Learner

    The 2023 NCES completions data provides a roadmap for navigating the complexities of the higher education landscape. The trends we’ve observed highlight the growing preference for career-focused programs, specialized credentials, and flexible learning options. They also underscore the challenges facing undergraduate programs, particularly in the liberal arts, in the wake of the COVID-19 pandemic and the approaching enrollment cliff.

    To thrive in this environment, institutions must be proactive, agile, and data-driven. The Modern Learner is looking for clear career outcomes – not just in program availability but in the flexibility that comes with balancing work and further education. They want to know exactly what they can expect from their investment of time and money in a program. Schools must also reimagine their programs, enhance student support services, and strategically market offerings to meet the evolving needs of learners and the demands of the labor market. They need to embrace innovation and explore new models of education that provide students with the skills and knowledge they need to succeed in the 21st century.

    For associate degree programs, this may involve a greater emphasis on career-focused pathways, stackable credentials, and partnerships with employers. Bachelor’s degree programs, especially in the liberal arts, may need to re-articulate their value proposition, highlighting the transferable skills and lifelong learning benefits that their graduates acquire. Graduate programs should continue to expand and innovate, offering a mix of traditional degrees and flexible certificates to meet the diverse needs of working professionals.

    Above all else, if this data speaks to troubling realities on campus, the most important takeaway is this: the same strategies that are producing tepid enrollment results will not be the solution going forward. If you are seeing challenging enrollment numbers at any program level, consider how your institution can more readily adapt to these changing trends, whether by introducing multiple starts per term, reworking tuition costs, or making better strategic use of marketing and enrollment processes for priority programs.

    Is Your Institution Ready for the Modern Learner?

    We regularly help schools that have been trying to fit a square peg into a round hole, and often the solution comes from an outside perspective that can create a vision for the future. The time to act is now: the window for securing a flourishing future for your institution is closing quickly. In fact, for many schools, it is already too late. The Modern Learner is moving at a swift pace, and universities that do not keep up will quickly be left behind.


  • More Gender Breakouts of Admission Data

    More Gender Breakouts of Admission Data

    I’ve written a lot about yield rates over time, and I’ve also written about differences in admission patterns among male and female applicants here and here; I’ve decided to take a fresh look at both based on some continuing discussions I’ve heard recently. 

    You have, of course, heard about the crisis of male enrollment in American colleges, which, if you look at the data, is really a crisis of enrollment at community colleges. Far be it from me to insist on data, however.

    Here is the same data for women, just to point out that there are differences.  Whether we should celebrate increasing attainment among young women or decry the inability of young men to keep up is your choice. 

    Regardless, here is a detailed breakout of these patterns as they show up in admissions over time. There are four views here: a summary on tab one (using the tabs across the top); ratios of women to men at all stages of the process and estimated applications per student; gender-specific admission rates at the highly rejectives over time; and, for anyone who wants to download the data using the little icon at the bottom, a spreadsheet format. (Note: IPEDS just started collecting application data on non-binary students, so it will be a while before any trend analysis is possible. For 2022, I only included students who self-identified as male or female.)

    Rather than explain the interactivity, I’ve put two buttons on the first view: Hover over the Orange Plus Sign to read some caveats about the data; and hover over the lightbulb for information about how to interact.

    As always, I’d love to hear what you see.
