Tag: Student

  • Personalized Communication Strategies That Drive Student Engagement in Higher Education

    Key Takeaways:

    • Personalized, timely, and relevant communication is key to engaging prospective students and meeting enrollment goals in higher education.



    • Effective strategies rely on immediacy, relevance, automation, and trackability, ensuring impactful and consistent interactions.



    • Omnichannel outreach, using a mix of email, SMS, print, and digital platforms, enhances visibility and builds trust by meeting students where they are.



    • Data-driven tools enable tailored, personalized communication, real-time adjustments, and sustainable strategies.

     

Connecting with prospective undergraduate students in meaningful ways requires a thoughtful blend of strategy, immediacy, and personalization. Gone are the days when generic messaging could effectively spark interest or drive engagement. Today’s prospective students expect communications that reflect an understanding of their individual needs, aspirations, and priorities, as well as their value to your institution.

    Institutions aiming to enhance their enrollment strategies must adopt a more data-informed and strategic approach to communication. This means reaching out with the right message, at the right time, and through the right channels.

     

    Laying the Foundation for Communication Success

    Effective communication with students is built on four key principles: immediacy, relevance, automation, and trackability. Each element plays a critical role in ensuring that interactions resonate with students and influence their decision-making process.

• Immediacy: Quick and timely responses that adapt as students’ behaviors change demonstrate attentiveness and can make a significant impression on prospective students. Delays in following up on inquiries or campus visits risk the loss of momentum and interest. Research suggests that the first institution to respond to an inquiry is more likely to enroll that student.



    • Relevance: Tailored, personalized communication should go beyond basic name inclusion. Students expect messages that address their specific interests. Misaligned content, such as sending information unrelated to a student’s expressed major, can quickly undermine trust.



    • Automation: Streamlined, automated workflows keep communication consistent and dependable, even during staff transitions or times of high demand. Manual processes, such as college fair follow-ups that sit unprocessed for long periods, can derail engagement. Automation prevents these bottlenecks, enabling timely responses even when staff are unavailable.



    • Trackability: Monitoring communication effectiveness helps institutions refine their strategies and optimize ROI.

    By integrating these principles, higher education institutions can deliver a cohesive and impactful communication strategy that strengthens student engagement and builds trust.

     

    The Importance of Omnichannel Outreach

    While email has long been—and remains—a cornerstone of communication, relying on it exclusively is no longer sufficient. The sheer volume of emails students receive daily makes it easy for even the most well-crafted messages to be overlooked. To stand out, institutions must adopt an omnichannel approach with campaigns that combine email with print materials, SMS messaging, voice blasts, digital ads, social media engagement, and microsites, all tailored to student interests.

    Each channel serves a unique purpose for student engagement in higher education. Print materials, for example, are particularly effective at involving families in the decision-making process. A well-designed brochure placed on a kitchen table can spark conversations among family members, especially parents, who are often key influencers in the college selection process.

    Similarly, integrating consistent, tailored messaging across multiple channels ensures that students receive a seamless experience. Whether they encounter an institution on social media, via a targeted ad, by SMS message, or through an email campaign, the message should feel cohesive and tailored to their interests. Omnichannel strategies, timed appropriately through the enrollment timeline, not only improve visibility but also demonstrate an institution’s commitment to meeting students where they are, thus building trust and rapport.

     

    Leveraging Data for Personalization

    Modern communication strategies must be rooted in data. By analyzing student preferences and behaviors, institutions can craft messages that resonate on an individual level. With data-informed insights, institutions can identify what matters most to prospective students—whether that’s career outcomes, financial aid, or specific academic opportunities—and address those priorities directly.

    For example, students interested in STEM programs may be more receptive to communications highlighting research opportunities and faculty expertise, while first-generation students may appreciate messages emphasizing affordability and support services.

    To further maximize impact, institutions can use surveys and initial engagement data to tailor their outreach strategies, which allows them to deploy resources efficiently while maintaining relevance. For example, expensive print materials can be reserved for students who show strong interest in particular programs, while a social media campaign may be more appropriate for inquiries earlier in the enrollment cycle.

    Real-time data tracking lets institutions segment their strategies dynamically. If a particular campaign underperforms across the board or for certain cohorts of students, modifications can be made immediately to better align with student preferences. This agility is essential for maintaining relevance and impact throughout the recruitment cycle.

     

    Building a Sustainable Communication Infrastructure

    Sustainable communication strategies rely on the integration of advanced tools and technologies. While a customer relationship management (CRM) system lays a strong foundation, institutions often need more specialized solutions to elevate their outreach efforts. Liaison offers a suite of products designed to enhance and streamline communication and enrollment strategies, including:

    • Enrollment Marketing (EM): Liaison’s EM software and marketing services help institutions manage and analyze personalized, automated omnichannel campaigns, ensuring consistent and effective messaging across multiple channels.



    • Othot: This AI-driven tool leverages predictive and prescriptive analytics to optimize communication strategies and enrollment decisions, tailoring outreach to align with student behavior and institutional goals.



    • Centralized Application Service (CAS): By simplifying the admissions process for students and providing institutions with tools for marketing, data management, and application processing, CAS supports efficient communication with applicants.

By incorporating these technologies, along with Liaison’s CRMs, institutions can maintain a seamless and unified communication flow so that prospective students receive timely, relevant, and personalized messages. These solutions also allow institutions to monitor campaign performance and adjust strategies in real time, maximizing the effectiveness of resources and making messaging more impactful for target audiences. This integration reduces reliance on fragmented workflows, preventing gaps or delays caused by disconnected platforms.

    Aligning tools and strategies across departments using Liaison’s technologies keeps messaging consistent and impactful, even as prospective students engage with multiple touchpoints throughout their journey.

     

    Achieving Long-Term Engagement

    Effective communication with students is about building relationships that extend beyond the initial stages of recruitment. Institutions that invest in understanding and addressing the unique needs of their prospective students position themselves as partners in their academic journey.

    By delivering personalized, timely, and relevant messages through multiple channels, institutions can foster deeper connections and enhance student engagement in higher education. As the competitive landscape of enrollment continues to shift, adopting a strategic and data-informed approach to communication will remain essential for success.

    Ready to elevate your communication strategies? Discover how Liaison’s advanced tools and technologies can transform how you connect with prospective students. From personalized, omnichannel campaigns to data-driven insights, our solutions help you engage students meaningfully and meet your enrollment goals. Contact us today to learn more.

    About the Author

    Craig Cornell is the Vice President for Enrollment Strategy at Liaison. In that capacity, he oversees a team of enrollment strategists and brings best practices, consultation, and data trends to campuses across the country in all things enrollment management. Craig also serves as the dedicated resource to NASH (National Association of Higher Education Systems) and works closely with the higher education system that Liaison supports. Before joining Liaison in 2023, Craig served for over 30 years in multiple higher education executive enrollment management positions. During his tenure, the campuses he served often received national recognition for enrollment growth, effective financial aid leveraging, marketing enhancements, and innovative enrollment strategies.

    Source link

  • First-year student enrollment spiked 5.5% in fall 2024

    Dive Brief: 

• Enrollment of first-year students grew 5.5% in fall 2024 compared to the year before, representing an increase of about 130,000 students, according to a final tally from the National Student Clearinghouse Research Center.
    • The figure is a striking reversal from the clearinghouse’s preliminary findings in October, which erroneously reported a decline in first-year students. Earlier this month, the clearinghouse said the early data contained a research error and suspended its preliminary enrollment reports, which use different methodologies to determine first-year student counts than the research center’s reports on final enrollment figures. 
    • College enrollment overall grew 4.5% in fall 2024 compared to the year before, according to the final data, rebounding to levels seen before the coronavirus pandemic caused widespread declines. 

    Dive Insight: 

    The new data is promising for higher education institutions, many of which have weathered steep enrollment declines in the wake of the pandemic. 

    “It is encouraging to see the total number of postsecondary students rising above the pre-pandemic level for the first time this fall,” Doug Shapiro, the research center’s executive director, said in a Wednesday statement. 

    Undergraduate enrollment surged 4.7% this fall, representing an increase of about 716,000 students. Graduate enrollment likewise spiked 3.3%, representing an uptick of about 100,000 students. 

    All sectors enjoyed enrollment increases. For-profit, four-year institutions had the largest enrollment growth, with headcounts rising 7.5% in fall 2024 compared to the year before. Public two-year institutions and public primarily associate-degree granting baccalaureate institutions, or PABs, saw similar levels of growth — 5.8% and 6.3%, respectively. 

    Enrollment also increased at four-year nonprofits. Overall headcounts grew 3.8% at private colleges and 3.1% at public institutions. 

Older students largely drove the growth in first-year students. Enrollment of first-year students ages 21 to 24 surged 16.7% in fall 2024, while headcounts of students 25 and older spiked by a whopping 19.7%.

    Enrollment of younger first-year students also increased, though the growth was more muted. 

    Headcounts of 18-year-old students grew 3.4%. However, this group of first-year students has still not recovered to pre-pandemic levels, Shapiro said in a statement.

    Similarly, enrollment of first-year students ages 19 to 20 increased 4.5%. 

    Two-year public colleges and public PABs enjoyed strong increases in their first-year student population, with 6.8% and 8.4% growth, respectively. However, for-profit, four-year colleges saw the largest increase, 26.1%, according to the new data. 

    Headcounts of first-year students also spiked at four-year nonprofits, rising 3.3% at public institutions and 2.8% at private colleges. 

    Shapiro addressed the research center’s methodological error during a call Wednesday with reporters. The erroneous preliminary report found that first-year enrollment had declined by 5% — over 10 percentage points lower than what the final data showed. 

    “I think our sensitivity to abnormally large changes was somewhat reduced because we had a host of kind of ready explanations for why we might be seeing these declines,” Shapiro said, citing issues with the federal student aid form, growing concerns with student debt and changes in the labor market.

    The research center staff has been investigating its other publications to see if the issue crept into them. 

    So far, they discovered that the flawed methodology also impacted a February 2024 report on transfer students. The clearinghouse will correct that data when it issues its next transfer report in February. 

    The research center previously announced that the error affected other reports in its “Stay Informed” series, which shares preliminary enrollment data. It has halted those reports — which launched at the height of the pandemic — until it vets a new methodology.

    Source link

  • Leverage Student Ambassadors and UGC in Education Marketing

    Authenticity has become a cornerstone of successful education marketing campaigns. Nothing speaks louder to prospective students than real experiences shared by current students. That’s why we recommend the combined use of two powerful tools: student ambassador programs and user-generated content (UGC). 

    These strategies harness the voices of your students to create compelling, authentic narratives that resonate. In this blog, we’ll explore the enrollment-boosting potential of student ambassadors and UGC for education marketing, the benefits they offer, and actionable steps to integrate them into your strategy. Let’s get started!

    Struggling with enrollment?

    Our expert digital marketing services can help you attract and enroll more students!

    Understanding the Role of a Student Ambassador

    What is a student ambassador? A student ambassador is a current student who represents your institution in various capacities, from marketing and recruitment to campus events. These individuals are typically chosen for their enthusiasm, communication skills, and ability to connect with diverse audiences. 

    What do student ambassadors do? As the face of your school, student ambassadors embody its culture and values, offering prospective students and their families an authentic glimpse into campus life. 

    The roles of student ambassadors are varied. They may host campus tours, participate in Q&A sessions during open houses, or even create content for your social media platforms. By sharing their personal experiences, they help humanize your institution, breaking down barriers and building trust.

    Source: University of Waterloo

    Example: On its website, the University of Waterloo has a dedicated page for members of its community who are interested in its student ambassador program. This page details the role of a student ambassador, the requirements for candidates, their workload, and compensation. When you launch your student ambassador program, use site content to provide vital information to potential candidates and the students they’ll support in their roles. Use social media to keep your audience updated on the application process and involve student ambassadors in content creation to establish a relationship between them and the rest of your student body. 

    Reach out for help implementing effective enrollment-boosting digital marketing strategies! 

    What Is User-Generated Content (UGC)?

    User-generated content (UGC) refers to any content created by your students, alumni, or even staff, rather than your marketing team. This can include photos, videos, testimonials, social media posts, or blogs that showcase their authentic experiences. Unlike polished advertising campaigns, UGC is often raw and unfiltered, making it highly relatable and trustworthy.

    Now that audiences are bombarded with promotional material, UGC stands out. It delivers a level of authenticity that professionally crafted content simply cannot replicate. For prospective students, seeing someone “just like them” thriving at your institution can be the deciding factor in their enrollment journey.

    Source: University of Oxford | TikTok

Example: Take a look at the comments on this TikTok video. The bottom one shows that many prospective students are turning to current students for advice and insights into their journey with your institution. This “day in the life” video from a University of Oxford student offers a glimpse into campus life from a personal perspective. Videos shared on a student’s personal page often feel more genuine since they don’t come across as promotional content.

    That’s not to say your school shouldn’t engage with these posts! Use hashtags, like #universityofoxford, to find UGC created by your community and reshare it on your school’s profile. To encourage more of this content, promote specific hashtags and even run contests or challenges to inspire creativity and engagement.

    The Benefits of Student Ambassadors and UGC

Though their methods differ, both student ambassador programs and UGC help tell your school’s unique story authentically.

    These methods are particularly effective at humanizing your school’s brand. Discover some more of the unique benefits you can see when you combine these strategies correctly.

    • Authenticity and Trust: Both student ambassadors and UGC provide unfiltered insights into your institution. Prospective students are more likely to trust the words of a peer than a marketing brochure. When real students share their stories, it creates a sense of transparency and trust.
    • Increased Engagement: Content created by student ambassadors and peers often performs better on social media platforms. Audiences are more likely to engage with posts that feel genuine and relatable. This increased engagement can translate to higher visibility for your institution.
    • Cost-Effectiveness: Leveraging the voices of your students can reduce the need for extensive advertising budgets. While there may be costs associated with training or compensating ambassadors, the return on investment through increased applications and enrollment often outweighs the initial expenditure.
    • Community Building: By involving students in your marketing efforts, you foster a sense of pride and belonging. Ambassadors feel more connected to your institution, and their enthusiasm is infectious, positively influencing both their peers and prospective students.

    How to Build a Successful Student Ambassador Program

    Building a student ambassador program involves creating a structured initiative that aligns with your school’s marketing goals and fosters authentic engagement. A successful program requires careful planning, clear objectives, and ongoing support to empower ambassadors as true representatives of your institution. Here, we’ll walk you through the essential steps to design and implement a program that connects with prospective students and amplifies your school’s story.

    Define Clear Objectives

Clear objectives are the cornerstone of a student ambassador program, aligning with your marketing goals and guiding ambassadors toward success. Start by clearly outlining the program’s purpose, such as increasing applications, enhancing campus tour experiences, or boosting social media engagement.

This clarity of intent should be paired with measurable goals to help ambassadors understand what success looks like. Measurable goals could include increasing tour attendance by 20% or generating a set number of social media posts each month.

    Tailor these objectives to match the unique strengths of each ambassador, assigning roles that play to their talents, such as public speaking for campus tours or storytelling for blog posts and videos. Providing a clear role description that details their responsibilities, tasks, and time commitments is equally critical to avoid confusion and set expectations. 

    To foster motivation, explain the “why” behind their tasks, helping them see how their efforts impact prospective students, build trust in the institution, and contribute to enrollment goals. Regular check-ins or feedback sessions can also ensure ambassadors stay on track, allowing for adjustments and maintaining engagement. With clearly defined objectives and the right support, ambassadors can confidently represent your institution and drive meaningful results.

    Recruit the Right Ambassadors

    Select ambassadors who reflect the diversity and values of your institution. Look for individuals who are enthusiastic, articulate, and comfortable sharing their experiences. Peer recommendations, faculty referrals, and application processes can help identify the best candidates.

    Foster Collaboration

    Facilitate collaboration between ambassadors and your marketing team. Regular meetings can help align their content with your broader campaigns while maintaining authenticity. Ambassadors should feel supported but not micromanaged.

    Source: University of Windsor

    Example: The University of Windsor demonstrates trust in its student ambassadors with a unique feature on its website. It allows current and prospective students to select an ambassador to chat with for answers to their school-related questions. To replicate this success, implement a comprehensive training program to ensure consistency and quality. Clear expectations enable your ambassadors to take on key responsibilities confidently, delivering a strong return on your investment.

    Provide Comprehensive Training

    • Familiarize Ambassadors with Your Institution’s Key Messaging and Values
      Begin by familiarizing ambassadors with your institution’s key messaging and values. This includes providing them with a clear understanding of your school’s mission, vision, and what sets it apart from competitors. Equip them with talking points about academics, extracurricular offerings, campus facilities, and student life, ensuring consistency in how they communicate your brand. Role-playing exercises can be particularly effective here, helping ambassadors practice delivering messages in a variety of scenarios, such as open houses, campus tours, or online Q&A sessions.
    • Train Ambassadors on Social Media Best Practices
      Training should also include social media best practices, especially if ambassadors are creating content for your platforms. Teach them how to craft posts that are engaging and aligned with your school’s tone and style. Provide guidelines on appropriate language, photo and video quality, and compliance with privacy policies.
    • Develop Public Speaking Skills
      Since many ambassadors will engage with prospective students and families in person, public speaking training is invaluable. Help them refine their communication skills with workshops that focus on clarity, confidence, and storytelling. Encourage them to share personal anecdotes about their experiences at your school, as these authentic stories are often the most memorable. Practice sessions with constructive feedback can significantly boost their comfort in delivering presentations or handling impromptu questions.
• Build Soft Skills for Diverse Audiences
  Effective training also involves building soft skills like empathy, adaptability, and cultural awareness, especially for ambassadors interacting with diverse audiences. Include scenarios that challenge them to navigate different cultural perspectives or address sensitive questions tactfully. By fostering these skills, you ensure ambassadors can create welcoming and inclusive experiences for prospective students and their families.

    • Incorporate Interactive Training Methods
      To make training engaging and practical, use a mix of interactive methods such as role-playing, group discussions, and hands-on activities. Incorporate real-world examples and success stories from past ambassadors to inspire new recruits and show them what’s possible. Providing a training manual or digital resource hub can also serve as a handy reference for ambassadors as they grow into their roles.
    • Provide Ongoing Support and Refreshers
      Finally, ongoing support and refreshers are critical. Schedule periodic check-ins to provide additional guidance, address challenges, and celebrate successes. The more prepared they are, the more effectively they’ll represent your school.

    Empower Ambassadors to Create

    Empowering student ambassadors to create their own content is one of the most effective ways to showcase the authentic, lived experiences that resonate with prospective students. By trusting ambassadors with creative freedom, you enable them to craft content that feels genuine and relatable—qualities that polished marketing campaigns often struggle to replicate.

    Start by encouraging ambassadors to focus on their personal experiences and unique perspectives. Heartfelt testimonials are another powerful form of content. Whether it’s a written story, a video, or a social media post, ambassadors sharing their personal journeys—why they chose your school, how it’s impacted their lives, and what they’ve learned—can create an emotional connection with viewers. 

    To provide inspiration and structure, consider giving student ambassadors a content calendar – a detailed content plan that outlines the where, what, and when of your posts. Highlighting diverse voices within your ambassador team ensures a broad range of experiences and perspectives are represented, appealing to a wider audience.

    Celebrate Their Contributions

    Recognize and reward your ambassadors for their efforts. This can range from financial compensation to exclusive perks like access to networking events or career development opportunities. Publicly celebrating their work reinforces their value and motivates others to get involved.

    Source: New York University

    Example: Here, New York University’s School of Global Public Health welcomes a new student ambassador, celebrating her accomplishments in the field, describing her role in the NYU community, and directing the audience to her student blog post. In addition to monetary rewards, student ambassadors appreciate public acknowledgments of their contributions. 

    Measure Success

    Track the impact of your ambassador program using metrics such as social media engagement, website traffic, and application rates. Use this data to refine your approach, ensuring continuous improvement.

    Incorporating UGC into Your Marketing Strategy

    A UGC marketing campaign can be a goldmine for schools looking to leverage their communities to tell their story. By encouraging students to share their experiences, you tap into a wealth of relatable and engaging material that resonates with prospective students. Let’s explore how to integrate UGC into your marketing strategy for maximum impact.

    Create Opportunities for UGC

    Encourage your students to share their experiences by hosting contests, themed hashtag campaigns, or student takeovers on social media. The more accessible you make the process, the more likely students are to participate.

    Source: Caleontwins | TikTok

Example: Here, Humber College has paid well-known influencers to promote a contest called Humber Bring It. The aim was to showcase all the unique skills students brought to their community. In their video, the Caleon twins shared all the essential details of the contest, such as the deadline, the prizes for winners (a $5,000 tuition credit or a laptop), and the hashtag that each contestant should use. Contests like this are the perfect way to create a UGC buzz around your institution.

    Showcase UGC Across Platforms

To maximize the impact of user-generated content (UGC), feature it prominently across your marketing platforms. Incorporate student stories, photos, and videos on your website’s homepage, within program pages, and in blog posts to provide a genuine glimpse into campus life. Social media channels are another natural home for UGC, where it can drive engagement and create relatable touchpoints with prospective students. Consider integrating this content into admissions brochures, emails, and campus tour presentations to ensure consistent messaging.

    Before sharing any UGC, prioritize student consent. Always seek permission from contributors, clearly explaining where and how their content will be used. Providing written guidelines and gaining explicit agreement ensures transparency and builds trust. By celebrating your students’ experiences respectfully and prominently, you showcase your school’s vibrant community and also create a foundation of authenticity and ethical storytelling that resonates with your audience.

    Maintain Quality Control

    While UGC is inherently less polished, maintaining a level of quality ensures it aligns with your institution’s values and messaging. Begin by establishing clear guidelines for students contributing content. 

    These guidelines should outline your school’s tone, branding, and expectations for appropriateness, while still encouraging creativity and individuality. For example, provide tips on photography and video basics, such as lighting and framing, to enhance visual appeal without compromising authenticity.

    Review content before publication to ensure it represents your school positively. This doesn’t mean heavily editing or sanitizing the material—rather, it’s about ensuring the content reflects your institution’s culture, is free of inappropriate language or imagery, and avoids unintentional misrepresentation.

    Offering feedback to students can also be a valuable learning experience, helping them refine their work while staying true to their voice. By balancing authenticity with quality, you showcase the best of your community in a way that’s both relatable and professional.

    Engage with UGC Creators

    Show appreciation for students who contribute content by engaging with their posts, sharing their work, or even spotlighting them in dedicated campaigns. This not only boosts their morale but also encourages others to participate.

    Use UGC to Tell Stories

    Go beyond individual posts by weaving UGC into cohesive narratives. For example, compile videos and testimonials into a series showcasing different aspects of campus life. Storytelling adds depth and emotional resonance to your campaigns.

    Bringing It All Together

    Student ambassador programs and UGC are avenues for building authentic connections with your audience. By leveraging the voices of your students, you showcase your institution’s unique story in a way that resonates deeply with prospective students and their families.

    At Higher Education Marketing, we specialize in helping schools like yours unlock the potential of these strategies and many others. Whether you’re just starting or looking to refine your approach, our expertise ensures your campaigns drive meaningful engagement and results.

    Your students are your greatest storytellers. Let their voices elevate your brand and inspire the next generation to join your community.

    Struggling with enrollment?

    Our expert digital marketing services can help you attract and enroll more students!

    Frequently Asked Questions 

    What is a student ambassador?

    A student ambassador is a current student who represents your institution in various capacities, from marketing and recruitment to campus events.

    What do student ambassadors do? 

    As the face of your school, student ambassadors embody its culture and values, offering prospective students and their families an authentic glimpse into campus life.


  • Q&A with retiring National Student Clearinghouse CEO

    Q&A with retiring National Student Clearinghouse CEO

    Ricardo Torres, the CEO of the National Student Clearinghouse, is retiring next month after 17 years at the helm. His last few weeks on the job have not been quiet.

    On Jan. 13, the clearinghouse’s research team announced they had found a significant error in their October enrollment report: Instead of freshman enrollment falling by 5 percent, it actually seemed to have increased; the clearinghouse is releasing its more complete enrollment report tomorrow. In the meantime, researchers, college officials and policymakers are re-evaluating their understanding of how 2024’s marquee events, like the bungled FAFSA rollout, influenced enrollment; some are questioning their reliance on clearinghouse research.

    It’s come as a difficult setback at the end of Torres’s tenure. He established the research center in 2010, two years after becoming CEO, and helped guide it to prominence as one of the most widely used and trusted sources of postsecondary student data.

    The clearinghouse only began releasing the preliminary enrollment report, called the “Stay Informed” report, in 2020 as a kind of “emergency measure” to gauge the pandemic’s impact on enrollment, Torres told Inside Higher Ed. The methodological error in October’s report, which the research team discovered this month, had been present in every iteration since. And a spokesperson for the clearinghouse said that, after reviewing the methodology, they found their “Transfer and Progress” report, which they’ve released every February since 2023, was also affected by the miscounting error; the 2025 report will be corrected, but the last two were skewed.

    Torres said the clearinghouse is exploring discontinuing the “Stay Informed” report entirely.

    Such a consequential snafu would put a damper on anyone’s retirement and threaten to tarnish their legacy. But Torres is used to a little turbulence: He oversaw the clearinghouse through a crucial period of transformation, from an arm of the student lending sector to a research powerhouse. He said the pressure on higher ed researchers is only going to get more intense in the years ahead, given the surging demand for enrollment and outcomes data from anxious college leaders and ambitious lawmakers. Transparency and integrity, he cautioned, will be paramount.

    His conversation with Inside Higher Ed, edited for length and clarity, is below.

    Q: You’ve led the clearinghouse since 2008, when higher ed was a very different sector. How does it feel to be leaving?

    A: It’s a bit bittersweet, but I feel like we’ve accomplished something during my tenure that can be built upon. I came into the job not really knowing about higher ed; it was a small company, a $13 million operation serving the student lending industry. We were designed to support their fundamental need to understand who’s enrolled and who isn’t, for the purposes of monitoring student loans. As a matter of fact, the original name of the organization was the National Student Loan Clearinghouse. When you think about what happened when things began to evolve and opportunities began to present themselves, we’ve done a lot.

    Q: Tell me more about how the organization has changed since the days of the Student Loan Clearinghouse.

    A: Frankly, the role and purpose of the clearinghouse and its main activities have not changed in about 15 years. The need was to have a trusted, centralized location where schools could send their information that then could be used to validate loan status based on enrollments. The process, prior to the clearinghouse, was loaded with paperwork. The registrars that are out there now get this almost PTSD effect when they go back in time before the clearinghouse. If a student was enrolled in School A, transferred to School B and had a loan, by the time everybody figured out that you were enrolled someplace else, you were in default on your loan. We were set up to fix that problem.

    What made our database unique at that time was that when a school sent us enrollment data, they had to send all of the learners because they actually didn’t know who had a previous loan and who didn’t. That allowed us to build a holistic, comprehensive view of the whole lending environment. So we began experimenting with what else we could do with the data.

    Our first observation was how great a need there was for this data. Policy formulation at almost every level—federal, state, regional—for improving learner outcomes lacked the real-time data to figure out what was going on. Still, democratizing the data alone was insufficient because you need to convert that insight into action of some kind that is meaningful. What I found as I was meeting schools and individuals was that the ability and the skill sets required to convert data to action were mostly available in the wealthiest institutions. They had all the analysts in the world to figure out what the hell was going on, and the small publics were just scraping by. That was the second observation, the inequity.

    The third came around 2009 to 2012, when there was an extensive effort to make data an important part of decision-making across the country. The side effect of that, though, was that not all the data sets were created equal, which made answering questions about what works and what doesn’t that much more difficult.

    The fourth observation, and I think it’s still very relevant today, is that the majority of our postsecondary constituencies are struggling to work with the increasing demands they’re getting from regulators: from the feds, from the states, from their accreditors, the demand for reports is increasing. The demand for feedback is increasing. Your big institutions, your flagships, might see this as a pain in the neck, but I would suggest that your smaller publics and smaller private schools are asking, “Oh my gosh, how are we even going to do this?” Our data helps.

    Q: What was the clearinghouse doing differently in terms of data collection?

    A: From the postsecondary standpoint, our first set of reports that we released in 2011 focused on two types of learners that at most were anecdotally referred to: transfer students and part-time students. The fact that we included part-time students, which [the Integrated Postsecondary Education Data System] did not, was a huge change. And our first completion report, I believe, said that over 50 percent of baccalaureate recipients had some community college in their background. That was eye-popping for the country to see and really catalyzed a lot of thinking about transfer pathways.

    We also helped spur the rise of these third-party academic-oriented organizations like Lumina and enabled them to help learners by using our data. One of our obligations as a data aggregator was to find ways to make this data useful for the field, and I think we accomplished that. Now, of course, demand is rising with artificial intelligence; people want to do more. We understand that, but we also think we have a huge responsibility as a data custodian to do that responsibly. People who work with us realize how seriously we take that custodial relationship with the data. That has been one of the hallmarks of our tenure as an organization.

    Q: Speaking of custodial responsibility, people are questioning the clearinghouse’s research credibility after last week’s revelation of the data error in your preliminary enrollment report. Are you worried it will undo the years of trust building you just described? How do you take accountability?

    A: No. 1: The data itself, which we receive from institutions, is reliable, current and accurate. We make best efforts to ensure that it accurately represents what the institutions have within their own systems before any data is merged into the clearinghouse data system.

    When we first formed the Research Center, we had to show how you can get from the IPEDS number to the clearinghouse number and show people our data was something they could count on. We spent 15 years building this reputation. The key to any research-related error like this is, first, you have to take ownership of it and hold yourself accountable. As soon as I found out about this we were already making moves to [make it public]—we’re talking 48 hours. That’s the first step in maintaining trust.

    That being said, there’s an element of risk built into this work. Part of what the clearinghouse brings to the table is the ability to responsibly advance the dialogue of what’s happening in education and student pathways. There are things that are happening out there, such as students stopping out and coming back many years later, that basically defy conventional wisdom. And so the risk in all of this is that you shy away from that work and decide to stick with the knitting. But your obligation is, if you’re going to report those things, to be very transparent. As long as we can thread that needle, I think the clearinghouse will play an important role in helping to advance the dialogue.

    We’re taking this very seriously and understand the importance of the integrity of our reports considering how the field is dependent on the information we provide. Frankly, one of the things we’re going to take a look at is, what is the need for the preliminary report at the end of the day? Or do we need to pair it with more analysis—is it just enough to say that total enrollments are up X or down Y?

    Q: Are you saying you may discontinue the preliminary report entirely?

    A: That’s certainly an option. I think we need to assess the field’s need for an early report—what questions are we trying to answer and why is it important that those questions be answered by a certain time? I’ll be honest; this is the first time something like this has happened, where it’s been that dramatic. That’s where the introspection starts, saying, “Well, this was working before; what the heck happened?”

    When we released the first [preliminary enrollment] report [in 2020], we thought it’d be a one-time thing. Now, we’ve issued other reports that we thought were going to be one-time and ended up being a really big deal, like “Some College, No Credential.” We’re going to continue to look for opportunities to provide those types of insights. But I think any research entity needs to take a look at what you’re producing to make sure there’s still a need or a demand, or maybe what you’re providing needs to pivot slightly. That’s a process that’s going to be undertaken over the next few months as we evaluate this report and other reports we do.

    Q: How did this happen, exactly? Have you found the source of the imputation error?

    A: The research team is looking into it. In order to ensure for this particular report that we don’t extrapolate this to a whole bunch of other things, you just need to make sure that you know you’ve got your bases covered analytically.

    There was an error in how we imputed a particular category of dual-enrolled students versus freshmen. But if you look at the report, the total number of learners wasn’t impacted by that. These preliminary reports were designed to meet a need after COVID, to understand what the impact was going to be. We basically designed a report on an emergency basis, and by default, when you don’t have complete information, there’s imputation. There’s been a lot of pressure on getting the preliminary fall report out. That being said, you learn your lesson—you gotta own it and then you keep going. This was very unfortunate, and you can imagine the amount of soul searching to ensure that this never happens again.

    Q: Do you think demand for more postsecondary data is driving some irresponsible analytic practices?

    A: I can tell you that new types of demands are going to be put out there on student success data, looking at nondegree credentials, looking at microcredentials. And there’s going to be a lot of spitballing. Just look at how ROI is trying to be calculated right now; I could talk for hours about the ins and outs of ROI methodology. For example, if a graduate makes $80,000 after graduating but transferred first from a community college, what kind of attribution does the community college get for that salary outcome versus the four-year school? Hell, it could be due to a third-party boot camp done after earning a degree. Research on these topics is going to be full of outstanding questions.

    Q: What comes next for the clearinghouse’s research after you leave?

    A: I’m excited about where it’s going. I’m very excited about how artificial intelligence can be appropriately leveraged, though I think we’re still trying to figure out how to do that. I can only hope that the clearinghouse will continue its journey of support. Because while we don’t directly impact learner trajectories, we can create the tools that help people who support learners every year impact those trajectories. Looking back on my time here, that’s what I’m most proud of.


  • Student Booted from PhD Program Over AI Use (Derek Newton/The Cheat Sheet)

    Student Booted from PhD Program Over AI Use (Derek Newton/The Cheat Sheet)


    This one is going to take a hot minute to dissect. Minnesota Public Radio (MPR) has the story.

    The plot contours are easy. A PhD student at the University of Minnesota was accused of using AI on a required pre-dissertation exam and removed from the program. He denies that allegation and has sued the school — and one of his professors — for due process violations and defamation respectively.

    Starting the case.

    The coverage reports that:

    all four faculty graders of his exam expressed “significant concerns” that it was not written in his voice. They noted answers that seemed irrelevant or involved subjects not covered in coursework. Two instructors then generated their own responses in ChatGPT to compare against his and submitted those as evidence against Yang. At the resulting disciplinary hearing, Yang says those professors also shared results from AI detection software. 

    Personally, when I see that four members of the faculty unanimously agreed his work was not authentic, I am out. I trust teachers.

    I know what a serious thing it is to accuse someone of cheating; I know teachers do not take such things lightly. When four go on the record to say so, I’m convinced. Barring some personal grievance or prejudice, which could happen, it’s hard for me to believe that all four subject-matter experts were just wrong here. Also, if there was bias or petty politics at play, it probably would have shown up before the student’s third year, not just before starting his dissertation.

    Moreover, at least as far as the coverage is concerned, the student does not allege bias or program politics. His complaint is based on due process and inaccuracy of the underlying accusation.

    Let me also say quickly that asking ChatGPT for answers you plan to compare to suspicious work may be interesting, but it’s far from convincing — in my opinion. ChatGPT makes stuff up. I’m not saying that answer comparison is a waste, I just would not build a case on it. Here, the university didn’t. It may have added to the case, but it was not the case. Adding also that the similarities between the faculty-created answers and the student’s — both are included in the article — are more compelling than I expected.

    Then you add detection software, which the article later shares showed high likelihood of AI text, and the case is pretty tight. Four professors, similar answers, AI detection flags — feels like a heavy case.

    Denied it.

    The article continues that Yang, the student:

    denies using AI for this exam and says the professors have a flawed approach to determining whether AI was used. He said methods used to detect AI are known to be unreliable and biased, particularly against people whose first language isn’t English. Yang grew up speaking Southern Min, a Chinese dialect. 

    Although it’s not specified, it is likely that Yang is referring to the research from Stanford that has been — or at least ought to be — entirely discredited (see Issue 216 and Issue 251). For the love of research integrity, the paper has invented citations — sources that go to papers or news coverage that are not at all related to what the paper says they are.

    Does anyone actually read those things?

    Back to Minnesota, Yang says that as a result of the findings against him and being removed from the program, he lost his American study visa. Yang called it “a death penalty.”

    With friends like these.

    Also interesting is that, according to the coverage:

    His academic advisor Bryan Dowd spoke in Yang’s defense at the November hearing, telling panelists that expulsion, effectively a deportation, was “an odd punishment for something that is as difficult to establish as a correspondence between ChatGPT and a student’s answer.” 

    That would be a fair point except that the next paragraph is:

    Dowd is a professor in health policy and management with over 40 years of teaching at the U of M. He told MPR News he lets students in his courses use generative AI because, in his opinion, it’s impossible to prevent or detect AI use. Dowd himself has never used ChatGPT, but he relies on Microsoft Word’s auto-correction and search engines like Google Scholar and finds those comparable. 

    That’s ridiculous. I’m sorry, it is. The dude who lets students use AI because he thinks AI is “impossible to prevent or detect,” the guy who has never used ChatGPT himself, and thinks that Google Scholar and auto-complete are “comparable” to AI — that’s the person speaking up for the guy who says he did not use AI. Wow.

    That guy says:

    “I think he’s quite an excellent student. He’s certainly, I think, one of the best-read students I’ve ever encountered”

    Time out. Is it not at least possible that professor Dowd thinks student Yang is an excellent student because Yang was using AI all along, and our professor doesn’t care to ascertain the difference? Also, mind you, as far as we can learn from this news story, Dowd does not even say Yang is innocent. He says the punishment is “odd,” that the case is hard to establish, and that Yang was a good student who did not need to use AI. Although, again, I’m not sure how good professor Dowd would know.

    As further evidence of Yang’s scholastic ability, Dowd also points out that Yang has a paper under consideration at a top academic journal.

    You know what I am going to say.

    To me, that entire Dowd diversion is mostly funny.

    More evidence.

    Back on track, we get even more detail, such as that the exam in question was:

    an eight-hour preliminary exam that Yang took online. Instructions he shared show the exam was open-book, meaning test takers could use notes, papers and textbooks, but AI was explicitly prohibited. 

    Exam graders argued the AI use was obvious enough. Yang disagrees. 

    Weeks after the exam, associate professor Ezra Golberstein submitted a complaint to the U of M saying the four faculty reviewers agreed that Yang’s exam was not in his voice and recommending he be dismissed from the program. Yang had been in at least one class with all of them, so they compared his responses against two other writing samples. 

    So, the exam expressly banned AI. And we learn that, as part of the determination of the professors, they compared his exam answers with past writing.

    I say all the time, there is no substitute for knowing your students. If the initial four faculty who flagged Yang’s work had him in classes and compared suspicious work to past work, what more can we want? It does not get much better than that.

    Then there’s even more evidence:

    Yang also objects to professors using AI detection software to make their case at the November hearing.  

    He shared the U of M’s presentation showing findings from running his writing through GPTZero, which purports to determine the percentage of writing done by AI. The software was highly confident a human wrote Yang’s writing sample from two years ago. It was uncertain about his exam responses from August, assigning 89 percent probability of AI having generated his answer to one question and 19 percent probability for another. 

    “Imagine the AI detector can claim that their accuracy rate is 99%. What does it mean?” asked Yang, who argued that the error rate could unfairly tarnish a student who didn’t use AI to do the work.  

    First, GPTZero is junk. It’s reliably among the worst available detection systems. Even so, 89% is a high number. And most importantly, the case against Yang is not built on AI detection software alone, as no case should ever be. It’s confirmation, not conviction. Also, Yang, who the paper says already has one PhD, knows exactly what an accuracy rate of 99% means. Be serious.
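    For readers wondering what an accuracy claim actually implies on its own, a quick base-rate sketch makes the point that detection should be confirmation, not conviction. The numbers below (1,000 exams, 10% genuinely AI-written, a detector that is 99% accurate in both directions) are purely illustrative assumptions, not figures from this case:

    ```python
    # Hypothetical base-rate sketch: what a "99% accurate" AI detector implies.
    # All numbers here are illustrative assumptions, not data from the Yang case.
    total = 1000                 # exams scanned
    ai_written = 100             # assume 10% genuinely used AI
    human_written = total - ai_written

    true_positives = ai_written * 0.99       # AI-written work correctly flagged
    false_positives = human_written * 0.01   # honest work wrongly flagged

    flagged = true_positives + false_positives
    innocent_share = false_positives / flagged  # share of flags that hit innocent students

    print(f"{false_positives:.0f} innocent students flagged "
          f"({innocent_share:.1%} of all flags)")
    ```

    Even under these generous assumptions, roughly one flag in twelve lands on an innocent student, which is exactly why a detector score alone should never carry a case.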

    A pattern.

    Then we get this, buried in the news coverage:

    Yang suggests the U of M may have had an unjust motive to kick him out. When prompted, he shared documentation of at least three other instances of accusations raised by others against him that did not result in disciplinary action but that he thinks may have factored in his expulsion.  

    He does not include this concern in his lawsuits. These allegations are also not explicitly listed as factors in the complaint against him, nor letters explaining the decision to expel Yang or rejecting his appeal. But one incident was mentioned at his hearing: in October 2023, Yang had been suspected of using AI on a homework assignment for a graduate-level course. 

    In a written statement shared with panelists, associate professor Susan Mason said Yang had turned in an assignment where he wrote “re write it, make it more casual, like a foreign student write but no ai.”  She recorded the Zoom meeting where she said Yang denied using AI and told her he uses ChatGPT to check his English.

    She asked if he had a problem with people believing his writing was too formal and said he responded that he meant his answer was too long and he wanted ChatGPT to shorten it. “I did not find this explanation convincing,” she wrote. 

    I’m sorry — what now?

    Yang says he was accused of using AI in academic work in “at least three other instances.” For which he was, of course, not disciplined. In one of those cases, Yang literally turned in a paper with this:

    “re write it, make it more casual, like a foreign student write but no ai.” 

    He said he used ChatGPT to check his English and asked ChatGPT to shorten his writing. But he did not use AI. How does that work?

    For that one where he left in the prompts to ChatGPT:

    the Office of Community Standards sent Yang a letter warning that the case was dropped but it may be taken into consideration on any future violations. 

    Yang was warned, in writing.

    If you’re still here, we have four professors who agree that Yang’s exam likely used AI, in violation of exam rules. All four had Yang in classes previously and compared his exam work to past hand-written work. His exam answers had similarities with ChatGPT output. An AI detector said, in at least one place, his exam was 89% likely to be generated with AI. Yang was accused of using AI in academic work at least three other times, by a fifth professor, including one case in which it appears he may have left in his instructions to the AI bot.

    On the other hand, he did say he did not do it.

    Findings, review.

    Further:

    But the range of evidence was sufficient for the U of M. In the final ruling, the panel — comprised of several professors and graduate students from other departments — said they trusted the professors’ ability to identify AI-generated papers.

    Several professors and students agreed with the accusations. Yang appealed and the school upheld the decision. Yang was gone. The appeal officer wrote:

    “PhD research is, by definition, exploring new ideas and often involves development of new methods. There are many opportunities for an individual to falsify data and/or analysis of data. Consequently, the academy has no tolerance for academic dishonesty in PhD programs or among faculty. A finding of dishonesty not only casts doubt on the veracity of everything that the individual has done or will do in the future, it also causes the broader community to distrust the discipline as a whole.” 

    Slow clap.

    And slow clap for the University of Minnesota. The process is hard. Doing the review, examining the evidence, making an accusation — they are all hard. Sticking by it is hard too.

    Seriously, integrity is not a statement. It is action. Integrity is making the hard choice.

    MPR, spare me.

    Minnesota Public Radio is a credible news organization. Which makes it difficult to understand why they chose — as so many news outlets do — to not interview one single expert on academic integrity for a story about academic integrity. It’s downright baffling.

    Worse, MPR, for no specific reason whatsoever, decides to take prolonged shots at AI detection systems such as:

    Computer science researchers say detection software can have significant margins of error in finding instances of AI-generated text. OpenAI, the company behind ChatGPT, shut down its own detection tool last year citing a “low rate of accuracy.” Reports suggest AI detectors have misclassified work by non-native English writers, neurodivergent students and people who use tools like Grammarly or Microsoft Editor to improve their writing. 

    “As an educator, one has to also think about the anxiety that students might develop,” said Manjeet Rege, a University of St. Thomas professor who has studied machine learning for more than two decades. 

    We covered the OpenAI deception — and it was deception — in Issue 241, and in other issues. We covered the non-native English thing. And the neurodivergent thing. And the Grammarly thing. All of which MPR wraps up in the passive and deflecting “reports suggest.” No analysis. No skepticism.

    That’s just bad journalism.

    And, of course — anxiety. Rege, who please note has studied machine learning and not academic integrity, is predictable, but not credible here. He says, for example:

    it’s important to find the balance between academic integrity and embracing AI innovation. But rather than relying on AI detection software, he advocates for evaluating students by designing assignments hard for AI to complete — like personal reflections, project-based learnings, oral presentations — or integrating AI into the instructions. 

    Absolute joke.

    I am not sorry — if you use the word “balance” in conjunction with the word “integrity,” you should not be teaching. Especially if what you’re weighing against lying and fraud is the value of embracing innovation. And if you needed further evidence for his absurdity, we get the “personal reflections and project-based learnings” buffoonery (see Issue 323). But, again, the error here is MPR quoting a professor of machine learning about course design and integrity.

    MPR also quotes a student who says:

    she and many other students live in fear of AI detection software.  

    “AI and its lack of dependability for detection of itself could be the difference between a degree and going home,” she said. 

    Nope. Please, please tell me I don’t need to go through all the reasons that’s absurd. Find me one single case in which an AI detector alone sent a student home. One.

    Two final bits.

    The MPR story shares:

    In the 2023-24 school year, the University of Minnesota found 188 students responsible of scholastic dishonesty because of AI use, reflecting about half of all confirmed cases of dishonesty on the Twin Cities campus. 

    Just noteworthy. Also, it is interesting that 188 were “responsible.” Considering how rare it is to be caught, and for formal processes to be initiated and upheld, 188 feels like a real number. Again, good for U of M.

    The MPR article wraps up that Yang:

    found his life in disarray. He said he would lose access to datasets essential for his dissertation and other projects he was working on with his U of M account, and was forced to leave research responsibilities to others at short notice. He fears how this will impact his academic career.

    Stating the obvious, like the University of Minnesota, I could not bring myself to trust Yang’s data. And I do actually hope that being kicked out of a university for cheating would impact his academic career.

    And finally:

    “Probably I should think to do something, selling potatoes on the streets or something else,” he said. 

    Dude has a PhD in economics from Utah State University. Selling potatoes on the streets. Come on.


  • FIRE to University of Texas at Dallas: Stop censoring the student press

    FIRE to University of Texas at Dallas: Stop censoring the student press

    The University of Texas at Dallas has a troubling history of trying to silence students. Now those students are fighting back.

    Today, the editors of The Retrograde published their first print edition, marking a triumphant return for journalism on campus in the face of administrative efforts to quash student press.

    Headlines above the fold of the first issue of The Retrograde, a new independent student newspaper at UT Dallas.

    Why call the newspaper The Retrograde? Because it’s replacing the former student newspaper, The Mercury, which ran into trouble when it covered the pro-Palestinian encampments on campus and shed light on UT Dallas’s use of state troopers (the same force that broke up UT Austin’s encampment just one week prior) and other efforts to quash even peaceful protest. As student journalists reported, their relationship with the administration subsequently deteriorated. University officials demoted the newspaper’s advisor and even removed copies of the paper from newsstands. At the center of this interference were Lydia Lum, director of student media, and Jenni Huffenberger, senior director of marketing and student media, whose titles reflect the university’s resistance to editorial freedom.

    The conflict between the paper and the administration came to a head when Lum called for a meeting of the Student Media Oversight Board, a university body which has the power to remove student leaders, accusing The Mercury’s editor-in-chief, Gregorio Olivares Gutierrez, of violating student media bylaws by having another form of employment, exceeding printing costs, and “bypassing advisor involvement.” Yet rather than follow those same bylaws, which offer detailed instructions for removing a student editor, Lum told board members from other student media outlets not to attend the meeting. A short-handed board then voted to oust Gutierrez. Adding insult to injury, Huffenberger unilaterally denied Gutierrez’s appeal, again ignoring the bylaws, which require the full board to consider any termination appeals.

    In response, The Mercury’s staff went on strike, demanding Gutierrez’s reinstatement. To help in that effort, FIRE and the Student Press Law Center joined forces to pen a Nov. 12, 2024 letter calling for UT Dallas to honor the rights of the student journalists. We also asked them to pay the students the money they earned for the time they worked prior to the strike.

    UT Dallas refused to listen. Instead of embracing freedom of the press, the administration doubled down on censorship, ignoring both the students’ and our calls for justice.

    FIRE took out a full-page ad in support of The Retrograde at UT Dallas; the headline reads: “FIRE Supports Student Journalism.”

    In our letter, we argued that the university’s firing of Gutierrez was retaliation for The Mercury’s unflattering coverage of how administrators had handled the encampments. This is not even the first time UT Dallas has chosen censorship as the “best solution”: look no further than late 2023, when it removed the “Spirit Rocks” students used to express themselves. Unfortunately, the university ignored both the students’ exhortations and FIRE’s demands, leaving UT Dallas without its newspaper.

    But FIRE’s Student Press Freedom Initiative is here to make sure censorship never gets the last word.

    Students established The Retrograde, a fully independent newspaper. Without university resources, they have had to crowdfund and source their own equipment, working spaces, a new website, and everything else necessary to provide quality student-led journalism to the UT Dallas community. They succeeded, and FIRE is proud to support their efforts, placing a full-page ad in this week’s inaugural issue of The Retrograde.

    The fight for press freedom at UT Dallas is far from over — but we need your help to make a difference.

    Demand accountability from UT Dallas. The student journalists of The Retrograde have shown incredible spirit. With your help, we can ensure their efforts — and the rights of all student journalists — are respected.

  • UKVI is tightening the rules on international student attendance

    Back in April you’ll recall that UKVI shared a draft “remote delivery” policy with higher education providers for consultation.

    That process is complete – and now it’s written to providers to confirm the detail of the new arrangements.

    Little has changed in the proposal from last Spring – there are some clarifications on how it will apply, but the main impact is going to be on providers and students who depend, one way or another, on some of their teaching not being accessed “in person”.

    The backstory here is that technically, all teaching for international students right now is supposed to be in person. That was relaxed during the pandemic for obvious reasons – and since then, rapid innovation in the ways students can access teaching (either synchronously or asynchronously) has raised questions about how realistic and desirable that position remains.

    Politics swirls around this too – the worry/allegation is that students arrive and then disappear, and with a mixture of relaxed attendance regulation (UKVI stopped demanding a specific number of contact points a few years ago for universities) and a worry that some students are faking or bypassing some of the attendance systems that are in place, the time has come, it seems, to tighten a little – “formalising the boundaries in which institutions can use online teaching methods to deliver courses to international students”, as UKVI puts it.

    Its recent burst of compliance monitoring (with now public naming and shaming of universities “subject to an action plan”) seems to have been a factor too – with tales reaching us of officials asking often quite difficult questions about both how many students a provider thinks are on campus, and then how many actually are, on a given day or across a week.

    The balance being struck is designed, says UKVI, to “empower the sector to utilise advances in education technology” by delivering elements of courses remotely whilst setting “necessary thresholds” to provide clarity and ensure there is “no compromise” of immigration control.

    Remote or “optional”?

    The policy that will be introduced is broadly as described back in April – first, that two types of “teaching delivery” are to be defined as follows:

    • Remote delivery is defined as “timetabled delivery of learning where there is no need for the student to attend the premises of the student sponsor or partner institution which would otherwise take place live in-person at the sponsor or partner institution site.”
    • Face-to-face delivery is defined as “timetabled learning that takes place in-person and on the premises of the student sponsor or a partner institution.”

    You’ll see that that difference isn’t (necessarily) between teaching designed as in-person or designed as remote – it’s between hours that a student is required to be on campus for, and hours that they either specifically aren’t expected to come in for, or have the option to not come in for. That’s an important distinction:

    Where the student has an option of online or in-person learning, this should count as a remote element for this purpose.

    Then with those definitions set, we get a ratio.

    As a baseline, providers (with a track record of compliance) will be allowed to deliver up to 20 per cent of the taught elements of any degree level and above course remotely.

    Then if a provider is able to demonstrate how the higher usage is consistent with the requirements of the relevant educational quality standards body (OfS in England, QAA in Wales and Scotland) and remains consistent with the principles of the student route, they’ll be able to have a different ratio – up to 40 per cent of the teaching will be allowed to be in that “remote” category.

    Providers keen to use that higher limit will need to apply to do so via the annual CAS allocation process – and almost by definition will attract additional scrutiny as a result, if only to monitor how the policy is panning out. They’ll also have to list all courses provided to sponsored students that include remote delivery within that higher band – and provide justification for the higher proportion of remote learning based on educational value.

    (For those not immersed in immigration compliance, a CAS (Confirmation of Acceptance for Studies) is an electronic document issued by a UK provider to an international student that serves as proof of admission, and is required when applying for a student visa. The CAS includes a unique reference number, details of the course, tuition fees, and the institution’s sponsorship license information – and will soon have to detail if an international agent is involved too.)

    One question plenty of people have asked is whether this changes things for disabled students – UKVI makes clear that, by exception, remote delivery can be permitted on courses of any academic level studied at a student sponsor in circumstances where requiring face-to-face delivery would constitute discrimination on the basis of a student’s protected characteristics under the Equality Act 2010.

    A concern about that was that providers might not know if a student needs that exception in advance – UKVI says that it will trust providers to judge individual students’ extenuating circumstances and to justify those judgements during audits. The requirement to state protected characteristics on the CAS will be withdrawn.

    Oh – and sponsors will also be permitted to use remote delivery where continuity of education provision would otherwise be interrupted by unforeseen circumstances – things like industrial action, extreme weather, periods of travel restriction and so on.

    Notably, courses at levels 4 and 5 won’t be able to offer “remote delivery” at all – UKVI reckons they are “more vulnerable to abuse” from “non-genuine students”, so it’s resolved to link the more limited freedoms provided by Band 1 of the existing academic engagement policy to this provision of “remote” elements – degree level and above.

    Yes but what is teaching?

    A head-scratcher when the draft went out for consultation was what “counts” as teaching. Some will still raise questions with the answer – but UKVI says that activities like writing dissertations, conducting research, undertaking fieldwork, carrying out work placements and sitting exams are not “taught elements” – and are not therefore in scope.

    Another way of looking at that is basically – if it’s timetabled, it probably counts.

    Some providers have also been confused about modules – given that students on most courses are able to routinely choose elective modules (which themselves might contain different percentages of teaching in the two categories) after the CAS is assigned.

    UKVI says that sponsors should calculate the remote delivery percentage on the assumption that the student will elect to attend all possible remote elements online. So where elective modules form part of the course delivery, the highest possible remote delivery percentage will have to be stated(!). And where hours in the timetable are optional, providers will have to calculate remote delivery by assuming that students will participate in all optional remote elements online.

    The good news when managing all of that is that the percentage won’t have to be calculated on the basis of module or year – it’s the entire course that counts. And where the course is a joint programme with a partner institution based overseas, only elements of the course taking place in the UK will be taken into account.
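    As a sketch of how those rules combine, a provider working out a course’s remote-delivery percentage would count optional-attendance hours as remote, assess electives at their highest possible remote share, and take the percentage over the entire course. The code below is illustrative only: the data structure and the example hours are assumptions for demonstration, not anything UKVI specifies.

```python
# Illustrative sketch only, not an official UKVI tool. It applies the rules
# described above: hours a student is not required to attend in person count
# as "remote", electives are assessed at their highest possible remote share,
# and the percentage is calculated across the entire course.

from dataclasses import dataclass

@dataclass
class Element:
    face_to_face_hours: float  # timetabled hours requiring on-campus attendance
    remote_hours: float        # timetabled remote OR optional-attendance hours

def remote_share(e: Element) -> float:
    total = e.face_to_face_hours + e.remote_hours
    return e.remote_hours / total if total else 0.0

def course_remote_percentage(core: list[Element],
                             elective_groups: list[list[Element]]) -> float:
    """Whole-course remote percentage, assuming the student picks the elective
    with the highest remote share in each group (the worst case UKVI requires)."""
    f2f = sum(e.face_to_face_hours for e in core)
    remote = sum(e.remote_hours for e in core)
    for group in elective_groups:
        worst = max(group, key=remote_share)  # "most remote" elective option
        f2f += worst.face_to_face_hours
        remote += worst.remote_hours
    return 100.0 * remote / (remote + f2f)

# Example: 200 required hours and 30 remote/optional hours in core modules,
# plus one elective slot where the "most remote" option has 20 f2f / 10 remote.
pct = course_remote_percentage(
    core=[Element(200, 30)],
    elective_groups=[[Element(20, 10), Element(30, 0)]],
)
# pct comes out at roughly 15.4, inside the 20 per cent baseline band
```

    Note how the elective with 10 remote hours is chosen over the fully in-person one, even though a given student might never pick it: the stated percentage has to reflect the worst case.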

    What’s next

    There’s no specific date yet on implementation – IT changes to the sponsor management system are required, and new fields will be added to the CAS and annual CAS allocation request forms first. The “spring” is the target, and there’s also a commitment to reviewing the policy after 12 months.

    In any event, any university intending to utilise (any) remote delivery will need to have updated their internal academic engagement (ie attendance) policy ahead of submitting their next annual CAS allocation request – and UKVI may even require the policy to be submitted before deciding on the next CAS allocation request, and definitely by September 2025.

    During the consultation, a number of providers raised the issue of equity – how would one justify international and home students being treated differently? UKVI says that distinctions are reasonable because international students require permission to attend a course in the UK:

    If attendance is no longer necessary, the validity of holding such permission must be reassessed.

    There’s no doubt that – notwithstanding that providers are also under pressure to produce (in many cases for the first time) home student attendance policies because of concerns about attendance and student loan entitlements – the new policy will cause some equity issues between home and international students.

    In some cases those will be no different to the issues that exist now – some providers in some departments simply harmonise their requirements, some apply different regs by visa status, and some apply different rules for home students to different dept/courses depending on the relative proportion of international students in that basket. That may all have to be revisited.

    The big change – for some providers, but not all – is those definitions. The idea of a student never turning up for anything until they “cram” for their “finals” is built into many an apocryphal student life tale – that definitely won’t be allowed for international students, and it’s hard to see a provider getting away with that in their SFE/SFW/SAAS demanded home student policy either.

    Some providers won’t be keen to admit as such, but the idea of 100 per cent attendance to hours of teaching in that 80 per cent basket is going to cause a capacity problem in some lecture theatres and teaching spaces that will now need to be resolved. Module choice (and design) is also likely to need a careful look.

    And the wider questions of the way in which students use “optional” attendance and/or recorded lectures to manage their health and time – with all the challenges relating to part-time work and commuting/travelling in the mix – may result in a need to accelerate timetable reform to reduce the overall number of now very-much “required” visits to campus.

    One other thing not mentioned in here is the reality that UKVI is setting a percentage of a number of hours that is not specified – some providers could engage in reducing the number of taught hours altogether to make the percentages add up. Neither in the domestic version of this agenda nor in this international version do we have an attempt at defining what “full-time” really means in terms of overall taught hours – perhaps necessarily given programme diversity – but it’ll be a worry for some.

    Add all of this up – mixing in UKVI stepping up compliance monitoring and stories of students sharing QR codes for teaching rooms on WhatsApp to evade attendance monitoring systems – and for some providers and some students, the change will be quite dramatic.

    The consultation on the arrangements has been carried out quite confidentially so far – I’d tentatively suggest here that any revision to arrangements implemented locally should very much aim to switch that trend away from “UKVI said so” towards detailed discussion with (international) student representatives, with a consideration of wider timetabling, housing, travel and other support arrangements in the mix.

  • student debt relief progress and new fact sheets (SBPC)

    The fight for student loan borrowers continues! In the last remaining days of the Biden-Harris Administration, the U.S. Department of Education (ED) is pushing some final relief through for student loan borrowers, new Income-Driven Repayment (IDR) Account Adjustment payment counts are live, and we have new fact sheets shedding light on the impact of the student debt crisis on borrowers.

    Here’s a roundup of the latest:

    Over 5 million borrowers have been freed from student debt.

    In a major win for borrowers, ED announced that the Biden-Harris Administration has now approved $183.6 billion in student debt discharges via various student debt relief fixes and programs. This relief has now reached over 5 million borrowers and includes new approvals for Public Service Loan Forgiveness (PSLF) relief, borrower defense relief, and Total and Permanent Disability Discharge relief.

    This relief is life-changing for millions of families, proving the power of bold, decisive action on student debt. Yet, there is much more work to do. Every step toward relief underscores the need to continue fighting for policies that reduce the burden of student debt and ensure affordable access to higher education.

    Final phase of the IDR Account Adjustment is underway—take screenshots!

    In tandem with the latest cancellation efforts, ED has also finally started updating borrower payment counts on the Federal Student Aid dashboard. Providing official payment counts will help borrowers receive the credit they have earned towards cancellation under IDR, and ensure that all borrowers who have been forced to pay for 20 years or longer are automatically able to benefit from relief they are entitled to under federal law. If you are a borrower with federal student loans, we recommend that you check your dashboard on studentaid.gov, screenshot your new count, and save it in your records.

    Previously, many borrowers—including those who work in public service jobs and low-income borrowers struggling to afford payments—were steered into costly deferments and forbearance, preventing them from reaching the 20 years or longer for IDR relief or the 120 payments necessary for PSLF cancellation. Under the IDR Account Adjustment, these periods are now counted, even if borrowers were mistakenly placed in the wrong repayment plan or faced servicing errors. 

  • WEEKEND READING: Why Scotland’s student funding system is “unfair, unsustainable, unaffordable” and needs to be replaced with a graduate contribution model

    • These are the remarks by Alison Payne, Research Director at Reform Scotland, at the HEPI / CDBU event on funding higher education, held at Birkbeck, University of London, on Thursday of this week.
    • We are also making available Johnny Rich’s slides on ‘Making graduate employer contributions work’ from the same event, which are available to download here.

    Thanks to the CDBU and to HEPI for the invitation to attend and take part in today’s discussion. 

    My speech today has been titled ‘A graduate contribution model’. Of course, for UK graduates not from Scotland, I’m sure they would make the point that they very much do contribute through their fees, but the situation is very different in Scotland and I’m really grateful that I have the opportunity to feed the Scottish situation into today’s discussion.

    I thought it may be helpful if I gave a quick overview of the Scottish situation, as it differs somewhat from the overview Nick gave this morning covering the rest of the UK.

    Although tuition fees were introduced throughout the UK in 1998, the advent of devolution in 1999 and the passing of responsibility for higher education to Holyrood began the period of diverging funding policies.

    The then Labour / Lib Dem Scottish Executive, as it was then known, scrapped tuition fees and replaced them with a graduate endowment from 2001-02, with the first students becoming liable to pay the fee from April 2005. The scheme called for students to pay back £2,000 once they started earning over £10,000. 

    The graduate endowment was then scrapped by the SNP in February 2008. A quirk of EU law meant that students from EU countries could not be charged tuition fees if Scottish students were not paying them but students from England, Wales and Northern Ireland could be charged. This meant that from 2008 to 2021/22 EU students did not need to pay fees to attend Scottish universities, though students from the rest of the UK did. 

    We’re used to politics in Scotland being highly polarised and often toxic with few areas of commonality, but for the most part the policy of ‘free’ higher education has been supported by all of the political parties. Indeed at the last Scottish election in 2021 all parties committed to maintaining the policy in their manifestos. It is only recently that the Scottish Tories have suggested a move away from this following the election of their new leader, Russell Findlay.

    But behind this unusual political consensus, the ‘free’ policy is becoming increasingly unsustainable and unaffordable. Politicians will privately admit this, but politics, and a rock with an ill-advised slogan, have made it harder to have the much needed debate.

    The Cap

    While we don’t have tuition fees, we do have a cap on student numbers. And while more Scots are going to university, places are unable to keep up with demand. Since 2006 there has been a 56% increase in applicants, but an 84% increase in the number refused entry. 

    It is increasingly the case that students from the rest of the UK or overseas are accepted on to courses in Scotland while their Scottish counterparts are denied. For example, when clearing options are posted, often those places at Scotland’s top universities are only available to students from the rest of the UK and not to Scottish students, even if the latter have better grades. As a result, Scots can feel that they are denied access to education on their doorstep that those from elsewhere can obtain. Indeed, there are growing anecdotes about those who can afford it buying or renting property elsewhere in the UK so that they can attend a Scottish university, pay the higher fee and get around the cap.

    Basically, more people want to go to university, but the fiscal arrangements are holding that ambition back. This problem was highlighted by the Scottish Affairs Select Committee’s report on universities from 2021.

    Some commentators in Scotland have blamed the lack of places on widening access programmes, but I would challenge this. It is undoubtedly a good thing that more people from non-traditional backgrounds are getting into university; it is the cap that is limiting Scottish places, not access programmes. This is a point that has been backed by individuals such as the Principal of St Andrews, Professor Dame Sally Mapstone [who also serves as HEPI’s Chair].

    Financial Woes

    The higher education sector in Scotland, as elsewhere in the UK, is not in great financial health. Audit Scotland warned back in 2019 that half of our institutions were facing growing deficits. Pressures including pension contributions, Brexit and estate maintenance have all played a role, yet in the face of this decline nothing has changed, and we’re now seeing crises like the one at Dundee emerge. Against this backdrop, income from those students who pay higher fees is an important revenue stream.

    There is obviously a huge variation in what the fees are to attend a Scottish university, considerably more so than in the rest of the UK.

    For example, to study Accounting and Business as an undergraduate at Edinburgh University, the cost for a full-time new student for 2024/25 is £1,820 per year for a Scottish-domiciled student (met by the Scottish Government), £9,250 per year for someone from the rest of the UK and £26,500 for an international student. 

    It is clear why international students and UK students from outside Scotland are therefore so much more attractive than Scottish students.

    However, there is by no means an equal distribution of higher fee paying students among our institutions.

    For example, at St Andrews about one-third of undergraduate full-time students were Scots, with one-third from the rest of the UK and one-third international. The numbers for Edinburgh are similar.  

    At the other end of the scale, at the University of the Highlands and Islands and Glasgow Caledonian, around 90% of students are Scottish, with only around 1% being international.

    So it is clear that institutions’ ability to raise money from fee-paying students varies very dramatically, increasing the financial pressures on those with low fee income.

    However, when looking at the issue, it is important to recognise that it is not just our universities that are struggling; Scotland’s colleges are facing huge financial pressures as well.

    The current proposed Scottish budget would leave colleges struggling with a persistent, real-terms funding cut of 17 per cent since 2021/22. Our college sector is hugely important in terms of the delivery of skills, working with local economies and as a route to university for so many, but for too long colleges have been treated like the Cinderella service in Scotland. The prioritising of ‘free’ university tuition over the college sector is adding to this problem.

    Regardless of who wins the Holyrood election next year, money is, and will remain, tight for some time. It would be lovely to be able to have lots of taxpayer funded ‘free’ services, but that is simply unsustainable and difficult choices need to be made. 

    This is why we believe that the current situation is unfair, unsustainable, unaffordable and needs to change.

    Reform Scotland would offer an alternative solution. We believe that there needs to be a better balance between the individual graduate and Scottish taxpayers in the contribution towards higher education.

    One way this could be achieved is through a fee after graduation, to be repaid once the graduate earns more than the Scottish average salary. This would not be a fee incurred on starting university and deferred until after graduation; rather, the fee would be incurred on graduation.

    In terms of what that fee could be, the Cubie report over 25 years ago suggested a graduate fee of £3,000, which would be about £5,500 today.  This could perhaps be the starting point for consideration.  

    Any figure should take account of different variations in terms of the true cost of the course and potential skill shortages. 

    However, introducing a graduate fee would not necessarily mean an end to ‘free’ tuition. 

    Rather it provides an opportunity to look at the skills gaps that exist in Scotland and the possibility of developing schemes which cut off or scrap repayments for graduates who work in specific geographic areas or sectors of Scotland for set periods of time. 

    Such schemes could also look to incorporate students from elsewhere, for Scotland is facing a demographic crisis. Our population is set to become older and smaller, and we are the only part of the UK projected to have a smaller population by 2045.

    We desperately need to retain and attract more working-age people. Perhaps such graduate repayment waiver schemes could also be offered to students from the rest of the UK who choose to study in Scotland – stay here and work after graduation and we will pay a proportion of your fee. A wide range of different schemes could be considered and linked into the wider policy issues facing Scotland. 

    According to the Higher Education Statistics Agency (HESA), there were 3,370 graduates from the rest of the UK who attended a Scottish institution in 2020/21. Of those, only 990 chose to remain in Scotland for work after graduation. Could we encourage more people to stay after studying?

    Conclusion

    A graduate fee is only one possible solution, but I would argue that it is also one with a short shelf life. As graduates would not incur the fee until they graduated, there would be a four-year delay between the change in policy and revenue beginning to be received. Our institutions are facing very real fiscal problems and there is a danger of a university going to the wall. 

    If we get to the 2026 election and political parties refuse to shift the dial and at least recognise that the current system is unsustainable, then there is a danger that nothing will change for another Parliamentary term. I don’t think we can afford to wait until 2031.

    There is another interesting dynamic now as well. Labour in Scotland currently, publicly at least, oppose tuition fees. However, there are now 37 Scottish Labour MPs at Westminster who are backing the increase of fees on students from outside Scotland, or Scottish students studying down south. Given the unpopularity of the Labour government as well as the tight contest between the SNP and Labour for Holyrood, it seems unlikely that position can be maintained.

    All across the UK there are increasing signs of the stark financial situation we are facing. Against that backdrop, along with the restrictions placed on the number being able to attend, free university tuition is unsustainable and unaffordable. People outside Scottish politics seem to be able to see this reality; privately, so do many of our politicians. We need to shift this debate into the public domain in Scotland and develop a workable solution.

  • Social Security Offsets and Defaulted Student Loans (CFPB)

    Executive Summary

    When borrowers default on their federal student loans, the U.S. Department of Education (“Department of Education”) can collect the outstanding balance through forced collections, including the offset of tax refunds and Social Security benefits and the garnishment of wages. At the beginning of the COVID-19 pandemic, the Department of Education paused collections on defaulted federal student loans. This year, collections are set to resume, and almost 6 million student loan borrowers with loans in default will again be subject to the Department of Education’s forced collection of their tax refunds, wages, and Social Security benefits. Among the borrowers who are likely to experience forced collections are an estimated 452,000 borrowers ages 62 and older with defaulted loans who are likely receiving Social Security benefits.

    This spotlight describes the circumstances and experiences of student loan borrowers affected by the forced collection of Social Security benefits. It also describes how forced collections can push older borrowers into poverty, undermining the purpose of the Social Security program.

    Key findings

    • The
      number of Social Security beneficiaries experiencing forced collection
      grew by more than 3,000 percent in fewer than 20 years; the count is
      likely to grow as the age of student loan borrowers trends older.

      Between 2001 and 2019, the number of Social Security beneficiaries
      experiencing reduced benefits due to forced collection increased from
      approximately 6,200 to 192,300. This exponential growth is likely driven
      by older borrowers who make up an increasingly large share of the
      federal student loan portfolio. The number of student loan borrowers
      ages 62 and older increased by 59 percent from 1.7 million in 2017 to
      2.7 million in 2023, compared to a 1 percent decline among borrowers
      under the age of 62.
    • The total amount
      of Social Security benefits the Department of Education collected
      between 2001 and 2019 through the offset program increased from $16.2
      million to $429.7 million
      . Despite the exponential increase in
      collections from Social Security, the majority of money the Department
      of Education has collected has been applied to interest and fees and has
      not affected borrowers’ principal amount owed. Furthermore, between
      2016 and 2019, the Department of the Treasury’s fees alone accounted for
      nearly 10 percent of the average borrower’s lost Social Security
      benefits.
    • More than one in three
      Social Security recipients with student loans are reliant on Social
      Security payments, meaning forced collections could significantly
      imperil their financial well-being.
      Approximately 37 percent of the
      1.3 million Social Security beneficiaries with student loans rely on
      modest payments, an average monthly benefit of $1,523, for 90 percent of
      their income. This population is particularly vulnerable to reduction
      in their benefits especially if benefits are offset year-round. In 2019,
      the average annual amount collected from individual beneficiaries was
      $2,232 ($186 per month).
    • The physical well-being of half of Social Security beneficiaries with student loans in default may be at risk.
      Half of Social Security beneficiaries with student loans in default and
      collections skipped a doctor’s visit or did not obtain prescription
      medication due to cost.
    • Existing minimum income protections fail to protect student loan borrowers with Social Security against financial hardship.
      Currently, only $750 per month of Social Security income—an amount that
      is $400 below the monthly poverty threshold for an individual and has
      not been adjusted for inflation since 1996—is protected from forced
      collections by statute. Even if the minimum protected income were
      adjusted for inflation, beneficiaries would likely still experience
      hardship, such as food insecurity and problems paying utility bills. A
      higher threshold could protect borrowers against hardship more
      effectively. The CFPB found that for 87 percent of student loan
      borrowers who receive Social Security, their benefit amount is below 225
      percent of the federal poverty level (FPL), an income level at which
      people are as likely to experience material hardship as those with
      incomes below the federal poverty level.
    • Large
      shares of Social Security beneficiaries affected by forced collections
      may be eligible for relief or outright loan cancellation, yet they are
      unable to access these benefits, possibly due to insufficient
      automation or borrowers’ cognitive and physical decline.
      As many as
      eight in ten Social Security beneficiaries with loans in default may be
      eligible to suspend or reduce forced collections due to financial
      hardship. Moreover, one in five Social Security beneficiaries may be
      eligible for discharge of their loans due to a disability. Yet these
      individuals are not accessing such relief because the Department of
      Education’s data matching process insufficiently identifies those who
      may be eligible.

    Taken together,
    these findings suggest that the Department of Education’s forced
    collections of Social Security benefits increasingly interfere with
    Social Security’s longstanding purpose of protecting its beneficiaries
    from poverty and financial instability.

    Introduction

    When
    borrowers default on their federal student loans, the Department of
    Education can collect the outstanding balance through forced
    collections, including the offset of tax refunds and Social Security
    benefits, and the garnishment of wages. At the beginning of the COVID-19
    pandemic, the Department of Education paused collections on defaulted
    federal student loans. This year, collections are set to resume, and
    almost 6 million student loan borrowers with loans in default will again
    be subject to the Department of Education’s forced collection of their
    tax refunds, wages, and Social Security benefits.

    Among
    the borrowers who are likely to experience the Department of
    Education’s renewed forced collections are an estimated 452,000
    borrowers with defaulted loans who are ages 62 and older and who are
    likely receiving Social Security benefits.
    Congress created the Social Security program in 1935 to provide a basic
    level of income that protects insured workers and their families from
    poverty due to situations including old age, widowhood, or disability.
    The Social Security Administration calls the program “one of the most
    successful anti-poverty programs in our nation’s history.”
    In 2022, Social Security lifted over 29 million Americans from poverty,
    including retirees, disabled adults, and their spouses and dependents.
    Congress has recognized the importance of securing the value of Social
    Security benefits and on several occasions has intervened to protect
    them.

    This
    spotlight describes the circumstances and experiences of student loan
    borrowers affected by the forced collection of their Social Security
    benefits.
    It also describes how the purpose of Social Security is being
    increasingly undermined by the limited and deficient options the
    Department of Education has to protect Social Security beneficiaries
    from poverty and hardship.

    The forced collection of Social Security benefits has increased exponentially.

    Federal
    student loans enter default after 270 days of missed payments and
    transfer to the Department of Education’s default collections program
    after 360 days. Borrowers with a loan in default face several
    consequences: (1) their credit is negatively affected; (2) they lose
    eligibility to receive federal student aid while their loans are in
    default; (3) they are unable to change repayment plans and request
    deferment and forbearance; and (4) they face forced collections of tax refunds, Social Security benefits, and wages among other payments.
    To conduct its forced collections of federal payments like tax refunds
    and Social Security benefits, the Department of Education relies on a
    collection service run by the U.S. Department of the Treasury called the
    Treasury Offset Program.

    Between
    2001 and 2019, the number of student loan borrowers facing forced
    collection of their Social Security benefits increased from at least
    6,200 to 192,300.
    That is a more than 3,000 percent increase in fewer than 20 years. By
    comparison, the number of borrowers facing forced collections of their
    tax refunds increased by about 90 percent from 1.17 million to 2.22
    million during the same period.

    This exponential growth of Social Security offsets between 2001 and 2019 is likely driven by multiple factors, including:

    • Older
      borrowers accounted for an increasingly large share of the federal
      student loan portfolio due to increasing average age of enrollment and
      length of time in repayment.
      Data from the Department of Education
      (which are only available since 2017) show that the number of student
      loan borrowers ages 62 and older increased 24 percent, from 1.7 million
      in 2017 to 2.1 million in 2019, compared to growth of less than 1 percent
      among borrowers under the age of 62.
    • A larger number of borrowers, especially older borrowers, had loans in default.
      Data from the Department of Education show that the number of student
      loan borrowers with a defaulted loan increased by about 130 percent, from 3.8
      million in 2006 to 8.8 million in 2019. Compounding these trends is the fact that older borrowers are twice as likely to have a loan in default as younger borrowers.

    Due to these factors, the total annual amount of Social Security
    benefits the Department of Education collected through the offset
    program increased from $16.2 million in 2001 to $429.7 million in 2019
    (when adjusted for inflation).
    This increase occurred even though the average monthly amount the
    Department of Education collected from individual beneficiaries remained
    roughly constant for most years, at approximately $180 per month.

    Figure 1: Number of Social Security beneficiaries and total amount collected for student loans (2001-2019)

    Source: CFPB analysis of public data from U.S. Treasury’s Fiscal Data portal. Amounts are presented in 2024 dollars.
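    For illustration, the aggregate and per-borrower figures reported in this spotlight can be cross-checked with quick arithmetic. The numbers below are taken directly from the text; the variable names are purely illustrative:

    ```python
    # Figures reported in this spotlight (2019 dollar values are inflation-adjusted)
    borrowers_2001 = 6_200           # borrowers facing Social Security offset, 2001
    borrowers_2019 = 192_300         # borrowers facing Social Security offset, 2019
    total_collected_2019 = 429.7e6   # dollars collected via Social Security offsets, 2019
    avg_annual_per_borrower = 2_232  # dollars per beneficiary ($186/month x 12), 2019

    # "More than 3,000 percent increase" in offset borrowers in under 20 years
    growth_pct = 100 * (borrowers_2019 - borrowers_2001) / borrowers_2001
    assert growth_pct > 3_000

    # Borrowers x average annual offset roughly reproduces the reported total
    implied_total = borrowers_2019 * avg_annual_per_borrower
    assert abs(implied_total / total_collected_2019 - 1) < 0.01  # within 1%
    ```

    The two reported statistics are mutually consistent: the per-borrower average multiplied by the borrower count reproduces the aggregate total to within one percent.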

    While the total collected from
    Social Security benefits has increased exponentially, the majority of
    money the Department of Education collected has not been applied to
    borrowers’ principal amount owed. Specifically, nearly three-quarters of
    the money the Department of Education collects through offsets is
    applied to interest and fees, not toward paying down principal
    balances.
    Between 2016 and 2019, the U.S. Department of the Treasury charged the
    Department of Education between $13.12 and $15.00 per Social Security
    offset, or approximately between $157.44 and $180 for 12 months of
    Social Security offsets per beneficiary with defaulted federal student
    loans. As a matter of practice, the Department of Education often passes these fees on directly to borrowers.
    Furthermore, these fees accounted for nearly 10 percent of the average
    borrower’s monthly lost Social Security benefits, which were $183 during
    this time.
    Interest and fees not only reduce beneficiaries’ monthly benefits, but
    also prolong the period that beneficiaries are likely subject to forced
    collections.

    Forced collections are compromising Social Security beneficiaries’ financial well-being.

    Forced
    collection of Social Security benefits affects the financial well-being
    of the most vulnerable borrowers and can exacerbate any financial and
    health challenges they may already be experiencing. The CFPB’s analysis
    of the Survey of Income and Program Participation (SIPP) pooled data for
    2018 to 2021 finds that Social Security beneficiaries with student
    loans receive an average monthly benefit of $1,524.
    The analysis also indicates that approximately 480,000 (37 percent) of
    the 1.3 million beneficiaries with student loans rely on these modest
    payments for 90 percent or more of their income,
    thereby making them particularly vulnerable to reductions in their
    benefits, especially if benefits are offset year-round. In 2019, the
    average annual amount collected from individual beneficiaries was $2,232
    ($186 per month).

    A
    recent survey from The Pew Charitable Trusts found that more than nine
    in ten borrowers who reported experiencing wage garnishment or Social
    Security payment offsets said that these penalties caused them financial
    hardship.
    Consequently, for many, meeting their basic needs, including accessing
    healthcare, became more difficult. According to our
    analysis of the Federal Reserve’s Survey of Household Economic and
    Decision-making (SHED), half of Social Security beneficiaries with
    defaulted student loans skipped a doctor’s visit and/or did not obtain
    prescription medication due to cost.
    Moreover, 36 percent of Social Security beneficiaries with loans in
    delinquency or in collections report fair or poor health. Over half of
    them have medical debt.

    Figure 2: Selected financial experiences and hardships among subgroups of loan borrowers

    Bar graph showing that borrowers who receive Social Security benefits and are delinquent or in collections are more likely to report that their spending is same or higher than their income, they are unable to pay some bills, have fair or poor health, and skip medical care than borrowers who receive Social Security benefits and are not delinquent or in collections.

    Source: CFPB analysis of the Federal Reserve Board Survey of Household Economic and Decision-making (2019-2023).

    Social Security recipients
    subject to forced collection may not be able to access key public
    benefits that could help them mitigate the loss of income. This is
    because Social Security beneficiaries must list the unreduced amount of
    their benefits, prior to collections, when applying for other
    means-tested benefits programs such as Supplemental Security Income
    (SSI), the Supplemental Nutrition Assistance Program (SNAP), and the
    Medicare Savings Programs.
    Consequently, beneficiaries subject to forced collections must report
    an inflated income relative to what they are actually receiving. As a
    result, these beneficiaries may be denied public benefits that provide
    food, medical care, prescription drugs, and assistance with paying for
    other daily living costs.

    Consumers’
    complaints submitted to the CFPB describe the hardship caused by forced
    collections on borrowers reliant on Social Security benefits to pay for
    essential expenses.
    Consumers often describe difficulty paying for expenses such as rent
    and medical bills. In one complaint, a consumer noted difficulty paying
    their rent because their Social Security benefit, which usually covered
    that expense, was being offset.
    In another complaint, a caregiver described that the money was being
    withheld from their mother’s Social Security, which was the only source
    of income used to pay for their mother’s care at an assisted living
    facility.
    As forced collections threaten the housing security and health of
    Social Security beneficiaries, they also create a financial burden on
    non-borrowers who help address these hardships, including family members
    and caregivers.

    Existing minimum income protections fail to protect student loan borrowers with Social Security against financial hardship.

    The Debt Collection Improvement Act set a minimum floor of income below
    which the federal government cannot offset Social Security benefits, and
    subsequent Treasury regulations established a cap on the percentage of
    income above that floor.
    Specifically, these statutory guardrails limit collections to 15
    percent of Social Security benefits above $750. The minimum threshold
    was established in 1996 and has not been updated since. As a result, the
    amount protected by law alone does not adequately protect beneficiaries
    from financial hardship and in fact no longer protects them from
    falling below the federal poverty level (FPL). In 1996, $750 was nearly
    $100 above the monthly poverty threshold for an individual.
    Today that same protection is $400 below the threshold. If the
    protected amount of $750 per month ($9,000 per year) set in 1996 were
    adjusted for inflation, it would total $1,450 per month ($17,400 per
    year) in 2024 dollars.

    Figure
    3: Comparison of monthly FPL threshold with the current protected
    amount established in 1996 and the amount that would be protected with
    inflation adjustment

    Image with a bar graph showing the difference in monthly amounts for different thresholds and protections, from lowest to highest: (a) existing protections ($750), (b) the federal poverty level in 2024 ($1,255), (c) the amount set in 1996 if it had been CPI adjusted ($1,450), and (d) 225% of the FPL under the SAVE Plan ($2,824).

    Source: Calculations by the CFPB. Notes: Inflation adjustments based on the consumer price index (CPI).
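    The statutory guardrails described above can be sketched in code. This is a minimal illustration of the rule as commonly described for the Treasury Offset Program — the monthly offset is the least of the outstanding debt, 15 percent of the monthly benefit, and the amount by which the benefit exceeds the $750 protected floor. The function name and structure are illustrative, not the Department of Education's actual implementation:

    ```python
    def monthly_ssa_offset(monthly_benefit: float, debt_balance: float) -> float:
        """Sketch of the DCIA/Treasury guardrails: the offset is the least of
        (a) the outstanding debt, (b) 15% of the monthly benefit, and
        (c) the amount by which the benefit exceeds the $750 floor set in 1996."""
        PROTECTED_FLOOR = 750.0  # unchanged since 1996, not inflation-adjusted
        above_floor = max(0.0, monthly_benefit - PROTECTED_FLOOR)
        return max(0.0, min(debt_balance, 0.15 * monthly_benefit, above_floor))

    # A beneficiary receiving the average $1,524 monthly benefit:
    # the 15% cap binds, giving an offset of about $228.60 per month.
    print(monthly_ssa_offset(1_524, 25_000))

    # A benefit just above the 1996 floor: the floor binds, offset is $50.
    print(monthly_ssa_offset(800, 25_000))
    ```

    Note how a beneficiary at the average benefit level loses roughly $229 per month under the cap, consistent with the $186 average monthly collection reported for 2019, while a benefit at or below $750 is fully protected regardless of the debt owed.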

    Even if the minimum protected income of $750 were adjusted for
    inflation, beneficiaries would likely still experience hardship as a
    result of their reduced benefits. Consumers with incomes above the
    poverty line also commonly experience material hardship, which suggests
    that a threshold higher than the poverty level would protect against
    hardship more effectively.
    Indeed, in determining an income threshold for $0 payments under the
    SAVE plan, the Department of Education researchers used material
    hardship (defined as being unable to pay utility bills and reporting
    food insecurity) as their primary metric, and found similar levels of
    material hardship among those with incomes below the poverty line and
    those with incomes up to 225 percent of the FPL.
    Similarly, the CFPB’s analysis of a pooled sample of SIPP respondents
    finds the same levels of material hardship for Social Security
    beneficiaries with student loans with incomes below 100 percent of the
    FPL and those with incomes up to 225 percent of the FPL.
    The CFPB found that for 87 percent of student loan borrowers who
    receive Social Security, their benefit amount is below 225 percent of
    the FPL.
    Accordingly, all of those borrowers would be removed from forced
    collections if the Department of Education applied the same income
    metrics it established under the SAVE program to an automatic hardship
    exemption program.

    Existing options for relief from forced collections fail to reach older borrowers.

    Borrowers
    with loans in default remain eligible for certain types of loan
    cancellation and relief from forced collections. However, our analysis
    suggests that these programs may not be reaching many eligible
    consumers. When borrowers do not benefit from these programs, their
    hardship includes, but is not limited to, unnecessary losses to their
    Social Security benefits and negative credit reporting.

    Borrowers who become disabled after reaching full retirement age may miss out on Total and Permanent Disability discharge

    The
    Total and Permanent Disability (TPD) discharge program cancels federal
    student loans and effectively stops all forced collections for disabled
    borrowers who meet certain requirements. After recent revisions to the
    program, this form of cancellation has become common for borrowers
    receiving Social Security who became disabled prior to full retirement
    age. In 2016, a GAO study documented the significant barriers to TPD
    discharge that Social Security beneficiaries faced.
    To address GAO’s concerns, the Department of Education in 2021 took a
    series of mitigating actions, including entering into a data-matching
    agreement with the Social Security Administration (SSA) to automate the
    TPD eligibility determination and discharge process.
    This process was broadened further by new final rules, implemented July
    1, 2023, that expanded the categories of borrowers eligible for
    automatic TPD cancellation. In total, these changes resulted in loan
    cancellations for approximately 570,000 borrowers.

    However,
    the automation and other regulatory changes did not significantly
    change the application process for consumers who become disabled after
    they reach full retirement age or who have already claimed Social
    Security retirement benefits. For these beneficiaries, because they are
    already receiving retirement benefits, SSA does not need to determine
    disability status. Likewise, SSA does not track disability status for
    those individuals who become disabled after they start collecting their
    Social Security retirement benefits.

    Consequently,
    SSA does not transfer information on disability to the Department of
    Education once the beneficiary begins collecting Social Security
    retirement benefits.
    These individuals therefore will not automatically get a TPD discharge
    of their student loans, and they must be aware and physically and
    mentally able to proactively apply for the discharge.

    The
    CFPB’s analysis of the Census survey data suggests that the population
    that is excluded from the TPD automation process could be substantial.
    More than one in five (22 percent) Social Security beneficiaries with
    student loans are receiving retirement benefits and report a disability
    such as a limitation with vision, hearing, mobility, or cognition.
    People with dementia and other cognitive disabilities are among those
    with the greatest risk of being excluded, since they are more likely to
    be diagnosed after age 70, which is the maximum age for claiming
    retirement benefits.

    These
    limitations may also help explain why older borrowers are less likely
    to rehabilitate their defaulted student loans. Specifically, 11 percent
    of student loan borrowers ages 50 to 59 facing forced collections
    successfully rehabilitated their loans, while only 5 percent of borrowers over the age of 75 did so.

    Figure
    4: Number of student loan borrowers ages 50 and older in forced
    collection, borrowers who signed a rehabilitation agreement, and
    borrowers who successfully rehabilitated a loan by selected age groups

    Age Group | Borrowers in Offset | Signed a Rehabilitation Agreement | Percent Signed | Successfully Rehabilitated | Percent Rehabilitated
    50 to 59 | 265,200 | 50,800 | 14% | 38,400 | 11%
    60 to 74 | 184,900 | 24,100 | 11% | 18,500 | 8%
    75 and older | 15,800 | 1,000 | 6% | 800 | 5%

    Source: CFPB analysis of data provided by the Department of Education.

    Shifting demographics of
    student loan borrowers suggest that the current automation process may
    become less effective at protecting Social Security benefits from forced
    collections as more and more older adults carry student loan debt. The
    fastest growing segment of student loan borrowers is adults ages 62 and
    older. These individuals are generally eligible for retirement
    benefits, not disability benefits, because they cannot receive both
    classifications at the same time. Data from the Department of Education
    reflect that the number of student loan borrowers ages 62 and older
    increased by 59 percent from 1.7 million in 2017 to 2.7 million in 2023.
    In comparison, the number of borrowers under the age of 62 remained
    unchanged at 43 million in both years.
    Furthermore, additional data provided to the CFPB by the Department of
    Education show that nearly 90,000 borrowers ages 81 and older hold an
    average amount of $29,000 in federal student loan debt, a substantial
    amount despite facing an estimated average life expectancy of less than
    nine years.

    Existing exceptions to forced collections fail to protect many Social Security beneficiaries

    In
    addition to TPD discharge, the Department of Education offers reduction
    or suspension of Social Security offset where borrowers demonstrate
    financial hardship.
    To show hardship, borrowers must provide documentation of their income
    and expenses, which the Department of Education then uses to make its
    determination.
    Unlike the Debt Collection Improvement Act’s minimum protections, the
    eligibility for hardship is based on a comparison of an individual’s
    documented income and qualified expenses. If the borrower has eligible
    monthly expenses that exceed or match their income, the Department of
    Education then grants a financial hardship exemption.

    The
    CFPB’s analysis suggests that the vast majority of Social Security
    beneficiaries with student loans would qualify for a hardship
    protection. According to CFPB’s analysis of the Federal Reserve Board’s
    SHED, eight in ten (82 percent) of Social Security beneficiaries with
    student loans in default report that their expenses equal or exceed
    their income.
    Accordingly, these individuals would likely qualify for a full
    suspension of forced collections. Yet the GAO found that in 2015 (the
    most recent year for which data were available), less than ten percent
    of Social Security beneficiaries with forced collections applied for a
    hardship exemption or a reduction of their offset.
    A possible reason for the low uptake rate is that many beneficiaries or
    their caregivers never learn about the hardship exemption or the
    possibility of a reduction in the offset amount.
    For those who do apply, only a fraction obtain relief. The GAO study
    found that, at the time of their initial offset, only about 20 percent
    of Social Security beneficiaries ages 50 and older facing forced
    collections who applied were approved for a financial hardship exemption
    or a reduction of the offset amount.

    Conclusion

    As
    hundreds of thousands of student loan borrowers with loans in default
    face the resumption of forced collection of their Social Security
    benefits, this spotlight shows that the forced collection of Social
    Security benefits causes significant hardship among affected borrowers.
    The spotlight also shows that the basic income protections aimed at
    preventing poverty and hardship among affected borrowers have become
    increasingly ineffective over time. While the Department of Education
    has made some improvements to expand access to relief options,
    especially for those who initially receive Social Security due to a
    disability, these improvements are insufficient to protect older adults
    from the forced collection of their Social Security benefits.

    Taken
    together, these findings suggest that forced collections of Social
    Security benefits increasingly interfere with Social Security’s
    longstanding purpose of protecting its beneficiaries from poverty and
    financial instability. These findings also suggest that alternative
    approaches are needed to address the harm that forced collections cause
    to beneficiaries and to compensate for the declining effectiveness of
    existing remedies. One potential solution may be found in the Debt
    Collection Improvement Act, which provides that when forced collections
    “interfere substantially with or defeat the purposes of the payment
    certifying agency’s program” the head of an agency may request from the
    Secretary of the Treasury an exemption from forced collections.
    Given the data findings above, such a request for relief from the
    Commissioner of the Social Security Administration on behalf of Social
    Security beneficiaries who have defaulted student loans could be
    justified. Unless the toll of forced collections on Social Security
    beneficiaries is considered alongside the program’s stated goals, the
    number of older adults facing these challenges is only set to grow.

    Data and Methodology

    To
    develop this report, the CFPB relied primarily upon original analysis
    of public-use data from the U.S. Census Bureau Survey of Income and
    Program Participation (SIPP), the Federal Reserve Board’s Survey
    of Household Economics and Decision-making (SHED), the U.S. Department
    of the Treasury’s Fiscal Data portal, consumer complaints received by the
    Bureau, and administrative data on borrowers in default provided by the
    Department of Education. The report also leverages data and findings
    from other reports, studies, and sources, and cites to these sources
    accordingly. Readers should note that estimates drawn from survey data
    are subject to measurement error resulting, among other things, from
    reporting biases and question wording.

    Survey of Income and Program Participation

    The
    Survey of Income and Program Participation (SIPP) is a nationally
    representative survey of U.S. households conducted by the U.S. Census
    Bureau. The SIPP collects data from about 20,000 households (40,000
    people) per wave. The survey captures a wide range of characteristics
    and information about these households and their members. The CFPB
    relied on a pooled sample of responses from 2018, 2019, 2020, and 2021
    waves for a total number of 17,607 responses from student loan borrowers
    across all waves, including 920 respondents with student loans
    receiving Social Security benefits. The CFPB’s analysis relied on the
    public use data. To capture student loan debt, the survey asked all
    respondents (variable EOEDDEBT) whether they owed any money for student
    loans or educational expenses in their own name only during the
    reference period. To capture receipt of Social Security benefits, the
    survey asked all respondents (variable ESSSANY): “Did … receive
    Social Security benefits for himself/herself at any time during the
    reference period?” To capture the amount of Social Security benefits,
    the survey asked all respondents (variable TSSSAMT): “How much did …
    receive in Social
    Security benefit payment in this month (1-12), prior to any deductions
    for Medicare premiums?”

    The public-use version of the survey dataset, and the survey documentation can be found at: https://www.census.gov/programs-surveys/sipp.html

    Survey of Household Economics and Decision-making

    The
    Federal Reserve Board’s Survey of Household Economics and
    Decision-making (SHED) is an annual web-based survey of households. The
    survey captures information about respondents’ financial situations. The
    CFPB relied on a pooled sample of responses from 2019 through 2023
    waves for a total number of 1,376 responses from student loan borrowers
    in collection across all waves. The CFPB analysis relied on the public
    use data. To capture default and collection, the survey asked all
    respondents with student loans (variable SL6): “Are you behind on
    payments or in collections for one or more of the student loans from
    your own education?” To capture receipt of Social Security benefits, the
    survey asked all respondents (variable I0_c): “In the past 12
    months, did you (and/or your spouse or partner) receive any income from
    the following sources: Social Security (including old age and DI)?”

    The public-use version of the survey dataset, and the survey documentation can be found at https://www.federalreserve.gov/consumerscommunities/shed_data.htm  

    Appendix
    A: Number of student loan borrowers ages 60 and older, total
    outstanding balance, and average balance by age group, August 2024

    Age Group | Borrower Count (in thousands) | Balance (in billions) | Average balance
    60 to 65 | 1,951.4 | $87.49 | $44,834
    66 to 70 | 909.8 | $39.47 | $43,383
    71 to 75 | 457.5 | $18.95 | $41,421
    76 to 80 | 179.0 | $6.80 | $37,989
    81 to 85 | 59.9 | $1.90 | $31,720
    86 to 90 | 20.1 | $0.51 | $25,373
    91 to 95 | 7.0 | $0.14 | $20,000
    96+ | 2.8 | $0.05 | $17,857

    Source: Data provided by the Department of Education.
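    The average-balance column in Appendix A follows directly from the first two columns (total balance divided by borrower count). A quick illustrative check, with the table's published figures hard-coded:

    ```python
    # (age group, borrower count in thousands, balance in $ billions,
    #  reported average balance in dollars) — values from Appendix A
    rows = [
        ("60 to 65", 1_951.4, 87.49, 44_834),
        ("66 to 70",   909.8, 39.47, 43_383),
        ("71 to 75",   457.5, 18.95, 41_421),
        ("76 to 80",   179.0,  6.80, 37_989),
        ("81 to 85",    59.9,  1.90, 31_720),
        ("86 to 90",    20.1,  0.51, 25_373),
        ("91 to 95",     7.0,  0.14, 20_000),
        ("96+",          2.8,  0.05, 17_857),
    ]
    for group, count_k, balance_b, reported_avg in rows:
        implied_avg = balance_b * 1e9 / (count_k * 1e3)  # dollars per borrower
        # Each implied average matches the reported figure to within the
        # rounding of the published count and balance columns (<1%).
        assert abs(implied_avg - reported_avg) / reported_avg < 0.01, group
    ```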

    The endnotes for this report are available here
