

Following years of financial challenges, NJCU found a lifeline in Kean after a state-appointed monitor ordered the university to find a financial partner.
The $10 million state allocation — a small fraction of the $3.1 billion New Jersey is set to spend on higher education in fiscal 2026 — will go toward “feasibility studies, planning and legal work tied to the merger” between NJCU and Kean. But it’s unlikely to cover the full cost of the process.
In 2020, a University System of Georgia regent estimated that just changing the name of an institution — updating everything from signage to stationery — cost over $3 million.
Under Kean and NJCU’s letter of intent, the former would assume the latter’s assets and liabilities and NJCU’s campus would be renamed Kean Jersey City.
As the two universities go through the merger process, Kean is also to receive state funding for over 1,100 NJCU jobs in the form of a loan, per the state’s budget. If the merger falls through, the funded positions will return to NJCU.
A 2019 working paper found that, on average, a merger between two nonprofit colleges raised students’ tuition prices by between 5% and 7%.
But Kean appears to be poised to buck that trend with its elimination of out-of-state tuition. Under the new plan, the university will drop out-of-state tuition for current and new undergraduate and graduate students.
“Kean’s outstanding academics, proximity to New York City and growing research programs make the University appealing to students outside of New Jersey,” Michael Salvatore, Kean’s executive vice president for academic and administrative operations, said in a Tuesday statement. “This will enable us to tap into expanded markets while bringing students into the state.”
In the 2025-26 academic year, full-time students from New Jersey paid $7,649.80 per semester in tuition and fees, while their out-of-state counterparts paid $12,008.58. In-state and out-of-state graduate students paid $1,019.54 and $1,206.64 per credit, respectively.
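As a rough back-of-the-envelope comparison (assuming a standard two-semester year, an assumption not spelled out above), those per-semester figures imply the following annual gap for full-time undergraduates:

```python
# Back-of-the-envelope annual comparison using the 2025-26 per-semester figures above.
# Assumes two semesters per academic year (an illustrative assumption).
IN_STATE_PER_SEMESTER = 7_649.80
OUT_OF_STATE_PER_SEMESTER = 12_008.58

in_state_annual = 2 * IN_STATE_PER_SEMESTER
out_of_state_annual = 2 * OUT_OF_STATE_PER_SEMESTER

print(f"In-state annual tuition and fees:      ${in_state_annual:,.2f}")
print(f"Out-of-state annual tuition and fees:  ${out_of_state_annual:,.2f}")
print(f"Annual gap removed under the new plan: ${out_of_state_annual - in_state_annual:,.2f}")
```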

Key points:
While most teachers are eager to implement the science of reading, many lack the time and tools to connect these practices to home-based support, according to a new national survey from Lexia, a Cambium Learning Group brand.
The 2025 Back-to-School Teacher Survey, with input from more than 1,500 K–12 educators nationwide, points to an opportunity for district leaders to work in concert with teachers to provide families with the science of reading-based literacy resources they need to support student reading success.
Key insights from the survey include:
“Teachers know that parental involvement can accelerate literacy and they’re eager for ways to strengthen those connections,” said Lexia President Nick Gaehde. “This data highlights how districts can continue to build on momentum in this new school year by offering scalable, multilingual, and flexible family engagement strategies that align with the science of reading.”
Teachers also called for:
Gaehde concluded, “Lexia’s survey reflects the continued national emphasis on Structured Literacy and shows that equipping families is essential to driving lasting student outcomes. At Lexia we’re committed to partnering with districts and teachers to strengthen the school-to-home connection. By giving educators practical tools and data-driven insights, we help teachers and families work together–ensuring every child has the literacy support they need to thrive.”
The complete findings are available in a new report, From Classroom to Living Room: Exploring Parental Involvement in K–12 Literacy. District leaders can also download the accompanying infographic, “What District Leaders Need To Know: 5 Key Findings About Family Engagement and Literacy,” which highlights the most pressing data points and strategic opportunities for improving school-to-home literacy connections.

As generative AI tools become more common, a growing number of young people turn first to chatbots when they have questions. A survey by the Associated Press found that, among young Americans who use AI, 70 percent use the tools to search for information.
For colleges and universities, this presents a new opportunity to reach students with curated, institution-specific resources via chatbots.
In the most recent episode of Voices of Student Success, Jeanette Powers, executive director of the student hub at Western New England University, discusses the university’s chatbot, Spirit, powered by EdSights, and how the technology helps staff intervene when students are in distress.
An edited version of the podcast appears below.
Q: Can you give us the backstory—how Spirit got to campus and what need you all were looking to fulfill?
A: Sure, Western New England, we are the Golden Bears, and our mascot’s name is Spirit. So, Spirit is behind the scenes of our chatbot.
In 2023–24, we were looking at ways to put student voices at the center of what we’re doing. Western New England’s philosophy and core values really are about student-centered learning and support. We wanted to find a way to engage students earlier than when our typical reports come out, and we really wanted to hear the student voice.
Over the course of the year, we did some research and [looked] at different AI platforms that would provide some resources for us. And we landed on EdSights, which is an amazing company that has helped us really bring Spirit to life, where students are using the chatbot on a regular basis to get questions answered, to get resources to know where to go on campus and to also give us information so that we can better support them. We really wanted our chatbot to be reflective of our community, which is why we use our mascot as kind of behind the scenes to reach out to students.
Q: Yeah, it probably seems a little less scary to talk to your mascot than maybe an anonymous administrator.
A: Exactly, especially for our first-year students. When they’re coming on campus, they’ve met the mascot at many open house services and orientation, so they have that connection right away.
Q: You mentioned that this was a semirecent addition to your campus. For some people, AI can still be kind of scary. Was there a campus culture around AI? Or, how would you describe the landscape at WNE when it comes to embracing AI or having skepticism around using AI, especially in a student-facing way like this?
A: AI is so new, and it’s changing rapidly. Western New England has really embraced it. I think one of the biggest things that we looked at was just to make sure that there’s a human side to this AI system. And that’s, I think, one of the most powerful pieces about our AI chatbot … yes, it’s a chatbot, but we also have human helpers, myself and a colleague, who are monitoring and able to reach out to students when there’s any concern.
There are a lot of systems in place, I think, to protect students. If there’s something going on or they share something with the chatbot, we’re here to help, and we let them know that there are humans behind the chatbot. I think that was probably one of the wider concerns before we started: How do we make sure we don’t miss anything that might be reported to a chatbot?
It also really helps with managing time. Students can ask the chatbot questions about WNE 24-7. The Student Hub is open Monday through Friday, 8:30 a.m. to 4:30 p.m., but we’re not around on the weekends and at night. Students still have questions at those times, so they can reach out [to Spirit]. It’s an extension of the Student Hub. We’ve really been able to get students resources and information right away.
That’s been really helpful for them to know where to go and who to connect with. A lot of our first-year students are the main users, but all of our students are using the chatbot. The system’s been really great to be able to support students and get information from them but also give them information.
Q: I wonder if you can talk us through how you all customized it to make it campus-specific and really ensure that students know what’s available to them and how this is their community and their college experience?
A: That’s so key, because it’s not an external chatbot—it’s not ChatGPT, where you can google how to do your homework. I’ve had students ask [Spirit], “Help me with this math problem,” and Spirit’s like, “I’m really sorry, but I can’t do that.” It’s really an internal system, and students only have access to it because they are students, and we give them information directly there.
What we did with the program is the company sets you up with, here are the main questions that this chatbot typically gets, and then we back-feed it with all this information. Each department took a look at these questions, so we filled it all in. It’s called the knowledge base. In the knowledge base, we have all these different things, like, when are things open? Who to contact about this? All sorts of options that students can get.
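As a rough illustration of how a knowledge base like this might be organized (a sketch only: the field names, example entry and lookup logic below are hypothetical, not EdSights’ actual schema):

```python
# Hypothetical sketch of a campus chatbot knowledge base: each entry pairs common
# question phrasings with a vetted answer owned by a specific department.
from dataclasses import dataclass

@dataclass
class KnowledgeBaseEntry:
    question_variants: list[str]   # phrasings students commonly use
    answer: str                    # vetted answer maintained by the owning department
    department: str                # office responsible for keeping it current
    contact: str                   # where to send students if the answer isn't enough

KNOWLEDGE_BASE = [
    KnowledgeBaseEntry(
        question_variants=["when is the dining hall open", "dining hall hours"],
        answer="The dining hall hours are posted by Dining Services.",  # placeholder answer
        department="Dining Services",
        contact="dining@example.edu",  # hypothetical address
    ),
]

def answer_question(question: str) -> str:
    """Very simple keyword matching; a production chatbot would use intent detection."""
    q = question.lower()
    for entry in KNOWLEDGE_BASE:
        if any(variant in q for variant in entry.question_variants):
            return entry.answer
    # Mirror the behavior described later in the interview: if the bot doesn't know,
    # it points the student to a human resource instead of guessing.
    return "I'm not sure about that one, but the Student Hub can point you in the right direction."

print(answer_question("Hey, when is the dining hall open on weekends?"))
```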
One piece is students use it almost like a Siri or Alexa, where you get that quick answer. We really wanted to meet students where they were and wanted to make sure that, you know, it was real-time information for them.
We have really filled it with information about Western New England that students can access right away. So that’s the one piece of the chatbot that’s really powerful. It helps save time, keeps students from having to wait in line or make appointments, and directs them to the right place.
The other piece of the chatbot, which is really a more powerful piece that this individual chatbot has, is a proactive approach. We have a system that the company has developed, based on research, [with] certain questions we ask students throughout the year.
Depending on the time of the year and what’s going on, we may be asking them about academics, finances, personal wellness and health, mental health, as well as engagement on campus. When we ask those questions, we’re hearing the student’s voice right away. Those questions start early; in early September we have the first questions going out. Typically, you may get a report from faculty or staff almost midsemester. We’re getting it really early so that we can intervene right away.
Intervening is that human helper side. We have that chatbot who’s going to be there to answer your questions. But when the chatbot reaches out, make sure you respond, because now as a staff, we can say, this group of students, or these individual students, need something more, and how can we connect with them? It really enhances the relationship.
I think sometimes there’s a fear that AI takes away from a relationship, but it truly enhances the relationship, because once a student is willing to talk to the chatbot, they’re more likely to talk to the staff who reaches out to them because of what they said to the chatbot.
Q: When you are setting up those prompts, looking at those early alerts or things that you might want to know from students, what are you all asking and what have you found is important to identify early on?
A: The first question that goes out is “How do you feel so far about the term?” Students respond with a number: one for great, two for neutral, three for not so great. The chatbot will then follow up if the answer is neutral or not so great: Why? Is it finances? Is it belonging and connections, academics? Then the students respond there. If students are willing to keep chatting, Spirit will ask why and whether they can share any more information.
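A minimal sketch of that branching check-in flow, using the one-to-three scale described above (the follow-up wording and flagging rule are illustrative assumptions, not the vendor’s actual logic):

```python
# Illustrative sketch of the early-term pulse check: a numeric first question,
# with a follow-up and a staff flag only when the answer signals concern.
def pulse_check(rating: int, reason: str = "") -> dict:
    """rating: 1 = great, 2 = neutral, 3 = not so great."""
    record = {"rating": rating, "reason": reason, "flag_for_outreach": False}
    if rating in (2, 3):
        # The chatbot follows up on neutral or not-so-great answers.
        record["follow_up_prompt"] = (
            "Thanks for sharing. Is it finances, belonging and connections, or academics?"
        )
        # Human helpers are flagged when a student says things aren't going well.
        record["flag_for_outreach"] = rating == 3 or bool(reason)
    return record

# Example: a first-year student answers "not so great" and cites belonging.
print(pulse_check(3, reason="belonging and connections"))
```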
So last year was the first year that we really implemented it for a full year, and that first question is so powerful because my colleague and I were able to jump in right away and connect with students, specifically first-year students who, in those first two or three weeks of classes, are feeling stuck and lost and not quite sure how to move forward.
That’s been really powerful, because not only are they telling us they need help, they’re telling us why they need help and in what direction, and then our job is to reach out and say, “Thanks so much for connecting with Spirit. Now here we are. What can we do to help? Come on in and meet us in the Student Hub, and then we can help you navigate the various offices on campus.”
Q: We’re seeing more students reach out to these third-party services online, trying to look for help and support. Now you all are providing a service for them that is safe, secure and run by staff members who are really looking out for their best interests and trying to make sure that they get plugged in and that they don’t stay online.
A: That’s really important. I think the biggest thing is putting it out there and saying, “Here’s how I’m feeling, who’s going to do anything about it?” And knowing that there’s staff that are going to get you connected if students are feeling like they are not involved on campus—we have so many different clubs and organizations, and just having that conversation with a staff member of, like, what’s your interest? We have a club for that. Or, we have a professor who is an expert in this field, and it really helps us tailor and personalize the student experience. That’s information we wouldn’t know otherwise.
As educators, we get a ton of information about students, and we don’t always get that student voice, and that’s what this system does. It allows us to get the voice and allows us to get it early. And we do have that safeguard in place, where students may be having struggles, but they get resources right away, and there are alert systems set up on the back end, so if there are any issues, faculty and staff are able to respond.
Q: What kind of data have you all looked at when it comes to understanding the student experience as a whole? Have there been any insights or trends that have surprised you or driven change on campus?
A: The data is fascinating. I think the biggest thing for looking at this data is, yes, you can do the individual outreach and the individual support, but we can look across the board. We can look at first-generation students. We can look at athletes. We can look at first-year students versus seniors. So there’s a lot of data based on what we have in the system.
Over the past 12 months, we’ve had 17,000 texts back and forth between Spirit and the students, which is phenomenal. We have a 98 percent opt-in rate. So students get a text from Spirit in the beginning of the year, and they can opt out, but 98 percent of students are using it. During the year, our engagement fluctuates between 64 and 70 percent.
The other thing we’ve been able to see, and this is more recent … is we have a higher retention rate for students who are engaged with the chatbot than students who aren’t. Just recently, we got a report from EdSights that 90.6 percent of students who actually engaged with the chatbot persisted from fall 2024 to fall 2025, compared with 75.3 percent of students who didn’t engage. So we are seeing growth.
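Stated as a simple difference, the persistence figures above work out as follows (a raw comparison, not a causal estimate, since students who engage with the chatbot may differ from those who don’t in other ways):

```python
# The fall 2024 to fall 2025 persistence figures quoted above, as a percentage-point gap.
engaged_persistence = 90.6      # % of chatbot-engaged students who persisted
non_engaged_persistence = 75.3  # % of non-engaged students who persisted

gap = engaged_persistence - non_engaged_persistence
print(f"Persistence gap: {gap:.1f} percentage points in favor of engaged students")
```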
I think the reason that that’s so important is because retention and persistence are all about connection and belonging and feeling like you have someone, even if it’s a chatbot, who is connecting with you and making sure that you’re feeling [like] a valued member of our campus community.
We’ve been able to connect with hundreds of students that we may not have been able to connect with or [who we] didn’t even know were struggling because of this chatbot.
We did a huge marketing campaign last year to really get students to use it. This fall, we have the largest freshman class we’ve ever had, and so encouraging them to use this chatbot as a resource has been amazing.
I did a comparison to last year. We don’t ask any questions in the first week of classes, but the chatbot is available if students have questions. In the first week of classes last year [fall 2024], students sent 72 texts to Spirit. This year, in the first week of classes, it was 849.
Q: Wow.
A: So students are using the chatbot. Now, it’s the second year, so we’ve got returning students who also are engaged and understand what it’s all about. It’s showing that students have those questions. Think about all the different questions they got answered that they might not otherwise have gone somewhere to ask, or wouldn’t have had time to get answered.
They’re not going to get perfect answers, either. They may ask a question and the chatbot may say, “I’m not sure I exactly know that answer, but here’s who on campus will,” and it gives them the website. It gives them the contact, it gives them the phone number, so if the chatbot doesn’t know the exact answer, it gives them resources right away, so that they can then follow up on their own.
Q: When it comes to staff capacity, have you seen any impact on the amount of redundant emails students are sending?
A: I think that’s been really helpful, because students can ask the chatbot right away. The other amazing piece about this tool is that we can add information pretty quickly. For example, we have a student involvement fair coming up tomorrow, and I had a student ask me a question. I’m like, “Well, let’s ask the chatbot.” And it wasn’t in [the knowledge base]. So I was like, “Well, you’re probably not the only student [with this question].”
So I went in and I added it on the back end, and then I said, “All right, let’s try it again.” Five minutes later, he got the answer for the question from the chatbot.
The system is set up so that we can customize it. There are over 500 questions with answers in the system. We went over those this summer to make sure they’re accurate. We use some of the common language, like, instead of dining hall, you know, we said “D Hall”; we added the common language that students are using, so that the chatbot is even smarter and students are going to get responses even quicker.
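A minimal sketch of that kind of common-language aliasing (illustrative only; “D Hall” is the one term taken from the interview):

```python
# Illustrative sketch: normalize campus slang to canonical terms before matching a
# question against the knowledge base, so "D Hall" still finds the dining hall entry.
ALIASES = {
    "d hall": "dining hall",  # slang mentioned in the interview
    # additional campus-specific terms would be added here
}

def normalize(question: str) -> str:
    q = question.lower()
    for slang, canonical in ALIASES.items():
        q = q.replace(slang, canonical)
    return q

print(normalize("What time does D Hall open?"))  # -> "what time does dining hall open?"
```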
I do think it saves time, and hopefully it keeps that redundancy away, because if a student gets an answer, they’re going to tell their classmate or their roommate or their peer, “Hey, just ask [Spirit]” or “Let’s ask together,” which again saves time on the staff’s end. Handling those little questions automatically frees staff up to delve into the meatier things they need to deal with for students.
Q: For a peer at a different institution who’s considering implementing a chatbot or experimenting with their own, what lessons have you learned or what advice would you give?
A: The biggest thing I can think of is you have to put in the time and the effort to build the back end. You can add questions really easily, but if you don’t have that robust answer back in the system, it doesn’t give students what they need, or it gives them an OK answer, and they’re less likely to use the chatbot again.
I think the time and the energy you put into the back end and the setup is really important before launching, so that you ensure that students are getting the most accurate information and the simplest. We’re trying to save them from having to google the answer or go onto the website to find it.
I think the other thing is not every student is going to respond, and that’s OK. We have a 98 percent opt-in rate, which means that people are getting those messages from Spirit. That doesn’t mean they’re always responding when we reach out to them. Your engagement is going to be lower than your opt-in, because sometimes students are just going to ignore the text, and that’s OK.
We hope that if a question resonates in that moment, whether it’s about academics, struggling with finances or feeling homesick, students respond. But just be aware that not every student is going to use it as a tool. Some students will use the chatbot more than they want to come see you.
We’ve reached out to students after they get flagged on our system, and sometimes they ignore us. And so just making sure you have another way to check in on that student or bring them up at a meeting, so that you can say, “I’ve reached out, and the student isn’t coming back and wanting to meet with me,” and that’s OK. Are they still using the chatbot? They still have resources, and they’re getting that information.
I think the biggest thing we’re trying to improve this year, in our second year of implementation, is how we make this data more relevant and shareable across the institution as a whole. This past year, the data has really been sitting within Student Life … Let’s make it available to faculty and staff so they can get a sense of what our students are feeling and think, how can I maybe change or implement something that’s going to help? We also want to share it with our student leadership so that students get a sense of how people are feeling. That’s our next step.
We’re still going to do the individual outreach and the whole group support and programming. But how do we use this data now as a larger institution that really wants to focus in on student support?
Q: You mentioned a little bit about what’s next, but is there anything else on the horizon that we should know about as you all move into year two of Spirit?
A: I think the biggest thing is really emphasizing the blended AI-human interaction. The system gives us a number of risk factors and measures how students are doing, and we want to use that information as a proactive approach to support students. Whether it’s programming for specific needs or for specific groups of students, whatever it may be to get proactive, so that we know, in a sense, what students are doing and what their needs are.
The other thing we’re going to see over the next year or two is hopefully we’ll start to see some trends and patterns of how students are responding. Going into year two, I assume that we’re going to have some similar responses. But who knows? Every class is different and every year is different, so trying to see, what are some trends? We can use that data to be proactive and plan what students may need, before they even know they need it, in a way. Using this information and making it actionable so it’s not just data that’s sitting in a system is so important to us.

Universities offer a wide range of support to students – lecturers’ office hours, personal tutors, study skills advisors, peer-mentoring officers, mental health and wellbeing specialists, and more.
But even with these services in place, some students still feel they are falling through the cracks.
Why? One of the most common pieces of student feedback might offer a clue – “I wish I had known you and come to you earlier”.
Within the existing system, most forms of support rely on students to take the first step – to reach out, refer themselves, or report a problem.
But not all students can or will: some are unsure who to turn to, others worry about being judged, and many feel too overwhelmed to even begin. These are the students who often disappear from view – not because support does not exist, but because they cannot access it in time.
Meanwhile, academics are stretched thin by competing research and teaching demands, and support teams – brilliant though they are – can only respond once a student enters this enquiry-response support system.
As a result, students struggling silently often go unnoticed: for those “students in the dark”, there is often no obvious red flag for support services to act on until it is too late.
NSS data in recent years reveal a clear pattern of student dissatisfaction with support around feedback and independent study, indicating a growing concern and demand for help outside the classroom.
While the existing framework works well for those confident and proactive students, without more inclusive and personalised mechanisms in place, we risk missing the very group who would benefit most from early, student-centred support.
This is where academic coaching comes in. One of its most distinctive features is that it uses data not as an outcome, but as a starting point. At Buckinghamshire New University, Academic Coaches work with an ecosystem of live data – attendance patterns, assessment outcomes, and engagement time with the VLE – collaborating closely with data intelligence and student experience teams to turn these signals into timely action.
While our academic coaching model is still in its early phase, we have developed simulated student personae based on common disengagement patterns and feedback from colleagues. These hypothetical profiles help us shape our early intervention strategies and continuously polish our academic coaching model.
For example, “Joseph”, a first-year undergraduate (level 4) commuter student, stops logging into the VLE midway through the term. Their engagement drops from above the cohort average to zero and stays that way for a week. In the current system, this might pass unnoticed.
But through live data monitoring, we can spot this shift and reach out – not to reprimand but to check in with empathy. Having been through the student years, many of us know, and even still remember, what it is like to feel overwhelmed, isolated, or simply lost in a new environment. The academic coaching model allows us to offer a gentle point of re-entry with either academic or pastoral support.
One thing to clarify: data alone does not diagnose the problem, but it does help identify when something has changed. It flags patterns that suggest a student might be struggling silently, giving us the opportunity to intervene before there is a formal cause for concern. From there, we Academic Coaches reach out with an attentive touch: not with a warning, but with an invitation.
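As a sketch of the kind of signal this describes, using the “Joseph” pattern above (the threshold, window and field names below are hypothetical illustrations, not our production rules):

```python
# Hypothetical sketch of a "silent disengagement" flag like the Joseph example:
# a student whose VLE engagement was at or above the cohort average drops to zero
# for a sustained period, prompting a friendly check-in rather than a warning.
def flag_for_checkin(daily_minutes: list[float], cohort_avg_daily: float,
                     quiet_days: int = 7) -> bool:
    """daily_minutes: recent VLE engagement per day, most recent day last."""
    if len(daily_minutes) < quiet_days * 2:
        return False  # not enough history to judge a change
    earlier, recent = daily_minutes[:-quiet_days], daily_minutes[-quiet_days:]
    was_engaged = sum(earlier) / len(earlier) >= cohort_avg_daily
    now_silent = all(minutes == 0 for minutes in recent)
    return was_engaged and now_silent

# "Joseph": above-average engagement, then a week of nothing -> flagged for outreach.
history = [45, 50, 60, 55, 40, 52, 48] + [0] * 7
print(flag_for_checkin(history, cohort_avg_daily=35))  # True
```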
This is what makes the model both scalable and targeted. Instead of waiting for students to self-refer or relying on word of mouth, we can direct time and support where it is likely to matter most – early, quietly, and personally.
Most importantly, academic coaching does not reduce students to data points. It uses data to ask the right questions and to guide an appropriate response. Why has this student disengaged? Perhaps something in their life has changed.
Our role is to notice this change and offer timely and empathetic support, or simply a listening ear, before the struggle becomes overwhelming. It is a model that recognises the earlier we notice and act, the greater the impact will be. Sometimes, the most effective student support begins not with a request, but with a well-timed email in the student’s inbox.
The academic coaching model is not just about individual students – it is about rethinking how this sector approaches student support at a time of mounting pressure. As UK higher education institutions face financial constraints, rising demand, and increasing complexity in students’ needs, academic coaching offers a student-centred and cost-effective intervention.
It does not replace personal tutors or other academic or wellbeing services – instead, it complements them by stepping in earlier and guiding students toward appropriate support before a crisis hits.
This model also helps relieve pressure on overstretched academic staff by providing a clearly defined, short-term role focused on proactive engagement – shifting the approach from reactive firefighting to preventative care.
Fundamentally, academic coaching addresses a structural gap: some students start their university life already at a disadvantage – unsure how to fit into this new learning environment or make use of available support services to become independent learners – and the current system often makes it harder for them to catch up.
While the existing framework tends to favour confident and well-connected students, academic coaching helps rebalance the system by creating a more equitable pathway into support – one that is data-driven yet recognises and respects each student’s uniqueness. In a sector that urgently needs to do more with less, academic coaching is not just a compassionate gesture, but a future-facing venture.
That said, academic coaching is not a silver bullet and it will not solve every problem or reach every student. From our discussions with colleagues and institutional counterparts, one of the biggest challenges identified – after using data to flag students – is actually getting them on board with the conversation.
Like all interventions, academic coaching needs proper investment, training, interdepartmental cooperation, clear role boundaries, and a scalable framework for evaluating impact.
But it is a timely, student-centred response to a gap that traditional structures often miss – a role designed to notice what is not being said, to act on early warning signs, and to offer students a safe place to re-engage.
As resources tighten and expectations grow, university leadership must invest in smarter, more sensible forms of support. Academic coaching offers not just an added layer – it is a reimagining of how we gently guide students back on track before they drift too far from it.

Title: 2025 U.S. Student Wellbeing Survey
Source: Studiosity in partnership with YouGov
The higher education landscape is undergoing a profound transformation shaped by rapid technological advancements and shifting student expectations. The 2025 U.S. Student Wellbeing Survey, conducted by Studiosity in partnership with YouGov, offers in-depth insights into student behavior, particularly their growing reliance on AI tools for academic support.
The report states that 82 percent of U.S. students have used AI for assignments or study tasks. This trend is even more pronounced among international students, with 40 percent reporting regular AI use compared with 24 percent of domestic students. The findings make clear: AI is no longer emerging—it’s central to the student academic experience.
While student use of AI is high, only 58 percent of respondents feel their universities are adapting quickly enough to provide institution-approved AI tools, a figure that shows minimal improvement from 2024 (57 percent). Furthermore, 55 percent of students now expect their institution to provide AI support, reflecting shifting priorities among students. This year, “confidence” overtook “speed” as the main reason students prefer institution-provided AI tools, underscoring the demand for reliable and ethical solutions.
The data also highlight heightened stress levels linked to AI use, with 66 percent of students reporting some level of anxiety about incorporating AI into their studies. Students voiced concerns about academic integrity, accidental plagiarism, and cognitive offloading. One student said, “AI tools usually need a well-detailed prompt. Most times AI gets outdated data. Most importantly, the more reliable AI tools require payment, which makes things unnecessarily hard.” This highlights an equity issue in AI use, as some students reported paying for a premium AI tool to get better results. Those experiencing constant academic stress were more likely to report regular AI use, suggesting a need for support systems that integrate human connection with technological assistance.
The research emphasizes actionable strategies for universities:
As students navigate an increasingly AI-driven academic environment, universities must step into a leadership role. Providing ethical, institution-approved AI tools isn’t just about keeping pace with technology; it’s about safeguarding learning, reducing stress, and fostering confidence in academic outcomes. The 2025 survey makes one thing clear: students are ready for universities to meet them where they are in their AI use, but they are asking for guidance and assurance in doing so.

The Trump administration’s aggressive stance toward higher education institutions is contributing to a precipitous drop in support among college-educated voters, with new polling data revealing the president’s approval rating among graduates has fallen to historic lows.
The administration’s education policies have taken aim at what Trump characterizes as liberal bias and antisemitism on college campuses. Harvard University has faced the most severe federal intervention, with the White House canceling approximately $100 million in federal contracts and freezing $3.2 billion in research funding. The administration has also moved to block international student enrollment and threatened to revoke the institution’s tax-exempt status while demanding sweeping reforms to admissions processes and curricular oversight.
Similar measures have been enacted against Columbia University, the University of Pennsylvania, and Cornell University over issues ranging from pro-Palestinian campus activism to policies regarding transgender athletes in women’s sports. Harvard officials have characterized these interventions as an unprecedented assault on academic freedom and institutional autonomy.
The crackdown has generated significant campus unrest and drawn comparisons to Cold War-era loyalty investigations, raising questions about the federal government’s appropriate role in higher education governance.
The polling data reflects broader dissatisfaction with the administration’s educational approach. Only 26% of college graduates approve of Trump’s handling of education policy, while 71% disapprove. A separate AP-NORC survey from May found that 56% of Americans nationwide disapprove of the president’s higher education agenda.
However, the policies resonate strongly within Trump’s Republican base, with roughly 80% of Republicans approving his higher education approach—a higher approval rate than his economic policies garner. About 60% of Republicans express significant concern about perceived liberal bias on college campuses, aligning with the administration’s framing of universities as ideologically compromised institutions.
The Republican coalition shows some internal division on enforcement mechanisms, with approximately half supporting federal funding cuts for non-compliant institutions while a quarter oppose such measures and another quarter remain undecided.
While political controversies dominate headlines, economic concerns remain the primary driver of public opinion on higher education. Sixty percent of Americans express deep concern about college costs, a bipartisan worry that transcends ideological divisions around campus politics.
Current data from the College Board and Bankrate show average annual costs of $29,910 for in-state public university students, $49,080 for out-of-state students, and approximately $61,990 for private nonprofit institutions when including room, board, and additional expenses. Financial aid reduces these figures to average net prices of $20,800 at public universities and $36,150 at private colleges.
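Restated as an effective average discount, the sticker and net figures above work out roughly as follows (a simple derivation from the numbers already quoted, not additional College Board data):

```python
# Sticker vs. net prices quoted above, restated as an effective average aid discount.
sticker = {"public in-state": 29_910, "private nonprofit": 61_990}
net_price = {"public in-state": 20_800, "private nonprofit": 36_150}

for sector, list_price in sticker.items():
    discount = 1 - net_price[sector] / list_price
    print(f"{sector}: sticker ${list_price:,}, net ${net_price[sector]:,} "
          f"(aid covers about {discount:.0%} on average)")
```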
These costs reflect decades of sustained increases. EducationData.org reports that public in-state college costs have risen from $2,489 in 1963 to $89,556 in 2022-23 (adjusted for inflation). Over the past decade alone, in-state public tuition has increased by nearly 58%, while out-of-state and private tuition have risen by 30% and 27% respectively.
The economic pressures extend beyond college costs to post-graduation employment prospects. While overall unemployment among adults with bachelor’s degrees remains low at 2.3%, recent graduates face significant challenges. Bureau of Labor Statistics data shows that only 69.6% of bachelor’s degree recipients aged 20-29 were employed in late 2024, with unemployment among 23-27-year-olds reaching nearly 6%—substantially above the 4.2% national average.
These employment difficulties contribute to broader economic anxiety, with 39% of college graduates describing national economic conditions as “poor” and 64% reporting job search struggles.
The confluence of political and economic pressures creates a challenging landscape for Republicans heading into the 2026 midterms. College-educated voters represent a growing and increasingly decisive demographic, particularly in suburban areas that often determine control of swing seats.

New research has found that rural LGBTQ+ teens experience significant challenges in their communities and turn to the internet for support.
The research from Hopelab and the Born This Way Foundation looked at what more than 1,200 LGBTQ+ teens faced and compared the experiences of those in rural communities with those of teens in suburban and urban communities. The research found that rural teens are more likely to give and receive support through their online communities and friends than via their in-person relationships.
“The rural young people we’re seeing were reporting having a lot less support in their homes, in their communities, and their schools,” Mike Parent, a principal researcher at Hopelab, said in an interview with the Daily Yonder. “They weren’t doing too well in terms of feeling supported in the places they were living, though they were feeling supported online.”
However, the research found that rural LGBTQ+ teens had the same sense of pride in who they were as suburban and urban teens.
“The parallel, interesting finding was that we didn’t see differences in their internal sense of pride, which you might kind of expect if they feel all less supported,” he said. “What was surprising, in a very good way, was that indication of resilience or being able to feel a strong sense of their internal selves despite this kind of harsh environment they might be in.”
Researchers recruited young people between the ages of 15 and 24 who identified as LGBTQ+ through targeted ads on social media. After surveying the respondents during August and September of last year, the researchers followed up on some of the surveys with interviews, Parent said.
According to the study, rural teens were more likely than their urban and suburban counterparts to find support online: 56% of rural young people reported receiving support from others online several times a month, compared with 51% of urban and suburban respondents, and 76% reported giving support online, compared with 70%.
Conversely, only 28% of rural respondents reported feeling supported by their schools, compared to 49% of urban and suburban respondents, the study found, and 13% of rural respondents felt supported by their communities, compared to 35% of urban and suburban respondents.
Rural LGBTQ+ young people are significantly more likely to suffer mental health issues because of the lack of support where they live, researchers said. Rural LGBTQ+ young people were more likely to meet the threshold for depression (57% compared to 45%) and less likely to report flourishing than their suburban and urban counterparts (43% compared to 52%).
The study found that those LGBTQ+ young people who received support from those they lived with, regardless of where they live, are more likely to report flourishing (50% compared to 35%) and less likely to meet the threshold for depression (52% compared to 63%).
One respondent said the lack of support affected every aspect of their life.
“Not being able to be who you truly are around the people that you love most or the communities that you’re in is going to make somebody depressed or give them mental issues,” they said in survey interviews, according to Hopelab. “Because if you can’t be who you are around the people that you love most and people who surround you, you’re not gonna be able to feel the best about your well-being.”
Respondents said connecting with those online communities saved their lives.
“Throughout my entire life, I have been bullied relentlessly. However, when I’m online, I find that it is easier to make friends… I met my best friend through role play [games],” one teen told researchers. “Without it, I wouldn’t be here today. So, in the long run, it’s the friendships I’ve made online that have kept me alive all these years.”
Having support in rural areas, especially, can provide rural LGBTQ+ teens with a feeling of belonging, researchers said.
“Our findings highlight the urgent need for safe, affirming in-person spaces and the importance of including young people in shaping the solutions,” Claudia-Santi F. Fernandes, vice president of research and evaluation at Born This Way Foundation, said in a statement. “If we want to improve outcomes, especially for LGBTQ+ young people in rural communities, their voices–and scientific evidence–must guide the work.”
Parent said the survey respondents stressed the importance of having safe spaces for LGBTQ+ young people to gather in their own communities.
“I think most of the participants recognize that you can’t do a lot to change your family if they’re not supportive,” he said. “What they were saying was that finding ways for schools to be supportive and for communities to be supportive in terms of physical spaces (that allowed them) to express themselves safely (and) having places where they can gather and feel safe were really important to them.”
Hopelab seeks to address mental health in young people through evidence-based innovation, according to its organizers. The Born This Way Foundation was co-founded by Lady Gaga and her mother, West Virginia native Cynthia Bisset Germanotta.
The foundation is focused on ending bullying and building up communities, using research, programming, grants, and partnerships to engage young people and connect them to mental health resources, according to its website.
This article first appeared on The Daily Yonder and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.

The Health and Human Services Department has terminated the Minority Biomedical Research Support program, which provided colleges and universities grants to increase the number of minority faculty, students and investigators conducting biomedical research.
In a notice published Monday in the Federal Register, HHS secretary Robert F. Kennedy Jr. said the cancellation is to comply with two anti–diversity, equity and inclusion executive orders President Trump signed in January on his first two days back in office, plus the 2023 U.S. Supreme Court decision banning affirmative action in college admissions decisions. The change is effective Sept. 25.
“The MBRS program prioritizes racial classifications in awarding federal funding,” including by relying on “‘minority student enrollment’ to determine applicant eligibility,” Kennedy wrote. And, though the Supreme Court ruling focused on university admissions, Kennedy wrote that “the principles identified in Students for Fair Admissions also apply to the federal government and require repeal of the MBRS program.”
STAT reported the move earlier. Rochelle Newman, a University of Maryland psychologist who used the grant to pay undergraduate researchers and train them, told STAT that “cutting of these programs means that an entire generation of students will end up being lost to science.”

A recent announcement from the Department for Education promised “radical skills reforms” and focused the government’s sights on developing the “next generation” of home-grown talent.
It included eye-catching offerings to sectors in need of rejuvenation such as construction and healthcare – and a refocusing of funding away from older learners on level 7 apprenticeships. This is significant as, although the number of young people not in education, employment or training (NEET) has fallen slightly of late, ONS statistics still record half a million economically inactive young people in the UK.
The revised strategy points to purposeful investment in the country’s youth, which should encourage further green shoots of economic recovery. For a young generation constrained by coronavirus restrictions and economic stagnation, securing their future will be vital to economic prosperity.
Given this shift in government narrative, we wanted to explore how age impacts apprentices’ learning experiences.
Our research is based on experiences of the Chartered Management Degree Apprenticeship, a cornerstone of skills development in leadership and management, where employed apprentices learn both at work, and with a higher education institution for one day a week. Our data includes interviews with both apprentices and their line managers supporting their learning in the workplace.
Our findings show very different approaches to ownership of learning depending on prior workplace experience. While apprenticeship alumni acknowledge the benefits of a degree apprenticeship programme and its worth to them and their careers, we found distinct differences in the way that learners connect with their studies and the amount of support they require.
Weighing up apprenticeships as an alternative option to traditional university study is now well-trodden ground for young people, their families, and careers advisers in schools and colleges. We found, however, that starting an apprenticeship straight out of school presents unique challenges for younger learners.
Prior research has shown that older workers have also benefited from apprenticeship initiatives and parity of opportunity. These learners – whom we term “upskillers” – have typically been mature learners requiring a degree to progress with their existing employers. Our research shows that upskillers, in contrast to younger apprentices, lean into the challenges of degree apprenticeships, bolstered by the personal agency and independence that experience brings.
We found much positivity amongst younger learners undertaking degree apprenticeships as an alternative to enrolling in a traditional degree. For them, having “a job secured” provided a strong rationale for the apprenticeship route, with individuals valuing the opportunity to gain experience at such a young age. They noted that it was “very, very, beneficial”, and emphasised that “campus is not the only way to start your career”.
However, one young alum noted the programme was “not an easy ask”, going on to comment:
If you put in all the work, and you’re inclined to really work hard at age 18, 19, you’ll reap the rewards… [yet] once you package the entire full picture of a young person’s life and then you’re asking for this on top… it becomes a tough ask.
Others highlighted downsides and stresses of starting an apprenticeship straight from school, rather than after at least a brief experience of working life:
You’d need at least a year before doing it… you need that context… you don’t even know what a business is, what it entails, how it runs… you don’t know the real-life workings.
Employer respondents could also see the benefit of apprentices having at least some work experience and organisational understanding before commencing an apprenticeship. They argued that apprentices needed a “baseline of knowledge” to be able to “give it your all”, in terms of “managing people [and] managing situations”.
Young people’s experiences contrasted with those of work-experienced apprentices, who grasped opportunities with both hands, including evaluating the pros and cons of different universities and the qualification on offer. One older apprentice talked about the freedom to “go and have a look to see what else I could find” when the existing workplace scheme recommended by his employer didn’t meet his needs. The travelling nature of his job meant he was keen to do his degree apprenticeship remotely, rather than having to spend “time on campus every week”.
Reliance on programme structure and planning was also less important for more mature learners. Two took time to reflect on their ability to be proactive in managing their learning: “I have to negotiate with the team… and plan my own time”. Another spoke of having both the organisational understanding and the skill to choose his own final-year project, ensuring it was relevant and useful to both him and his organisation. This made the qualification more valuable than having someone else direct his study.
Wonkhe analysis has noted that older degree apprentices are more likely to complete their studies. This fits with the sentiment of seizing a chance later in life, with one of our upskillers commenting that “the older you are… you’ll just get it done, whatever.”
If funding switches to younger people, providers will need to call on their expertise to support changing learner demographics if they are to retain high completion rates.
What works in one situation might not be right for another. If “national renewal” is to be achieved through developing young talent, implementation must account for the unique needs of young apprentices.
We hope and believe, however, that – despite the myriad challenges of national economic renewal – continued collaboration between government, higher education institutions, and business will enable us to find a productive way forward in the degree apprenticeship arena.