Tag: Student

  • Q&A with retiring National Student Clearinghouse CEO

    Ricardo Torres, the CEO of the National Student Clearinghouse, is retiring next month after 17 years at the helm. His last few weeks on the job have not been quiet.

    On Jan. 13, the clearinghouse’s research team announced they had found a significant error in their October enrollment report: Instead of freshman enrollment falling by 5 percent, it actually seemed to have increased; the clearinghouse is releasing its more complete enrollment report tomorrow. In the meantime, researchers, college officials and policymakers are re-evaluating their understanding of how 2024’s marquee events, like the bungled FAFSA rollout, influenced enrollment; some are questioning their reliance on clearinghouse research.

    It’s come as a difficult setback at the end of Torres’s tenure. He established the research center in 2010, two years after becoming CEO, and helped guide it to prominence as one of the most widely used and trusted sources of postsecondary student data.

    The clearinghouse only began releasing the preliminary enrollment report, called the “Stay Informed” report, in 2020 as a kind of “emergency measure” to gauge the pandemic’s impact on enrollment, Torres told Inside Higher Ed. The methodological error in October’s report, which the research team discovered this month, had been present in every iteration since. And a spokesperson for the clearinghouse said that, after reviewing the methodology, the research team found their “Transfer and Progress” report, which they’ve released every February since 2023, was also affected by the miscounting error; the 2025 report will be corrected, but the last two were skewed.

    Torres said the clearinghouse is exploring discontinuing the “Stay Informed” report entirely.

    Such a consequential snafu would put a damper on anyone’s retirement and threaten to tarnish their legacy. But Torres is used to a little turbulence: He oversaw the clearinghouse through a crucial period of transformation, from an arm of the student lending sector to a research powerhouse. He said the pressure on higher ed researchers is only going to get more intense in the years ahead, given the surging demand for enrollment and outcomes data from anxious college leaders and ambitious lawmakers. Transparency and integrity, he cautioned, will be paramount.

    His conversation with Inside Higher Ed, edited for length and clarity, is below.

    Q: You’ve led the clearinghouse since 2008, when higher ed was a very different sector. How does it feel to be leaving?

    A: It’s a bit bittersweet, but I feel like we’ve accomplished something during my tenure that can be built upon. I came into the job not really knowing about higher ed; it was a small company, a $13 million operation serving the student lending industry. We were designed to support their fundamental need to understand who’s enrolled and who isn’t, for the purposes of monitoring student loans. As a matter of fact, the original name of the organization was the National Student Loan Clearinghouse. When you think about what happened when things began to evolve and opportunities began to present themselves, we’ve done a lot.

    Q: Tell me more about how the organization has changed since the days of the Student Loan Clearinghouse.

    A: Frankly, the role and purpose of the clearinghouse and its main activities have not changed in about 15 years. The need was to have a trusted, centralized location where schools could send their information that then could be used to validate loan status based on enrollments. The process, prior to the clearinghouse, was loaded with paperwork. The registrars out there now get an almost-PTSD effect when they think back to the time before the clearinghouse. If a student was enrolled in School A, transferred to School B and had a loan, by the time everybody figured out that you were enrolled someplace else, you were in default on your loan. We were set up to fix that problem.

    What made our database unique at that time was that when a school sent us enrollment data, they had to send all of the learners because they actually didn’t know who had a previous loan and who didn’t. That allowed us to build a holistic, comprehensive view of the whole lending environment. So we began experimenting with what else we could do with the data.

    Our first observation was how great a need there was for this data. Policy formulation at almost every level—federal, state, regional—for improving learner outcomes lacked the real-time data to figure out what was going on. Still, democratizing the data alone was insufficient because you need to convert that insight into action of some kind that is meaningful. What I found as I was meeting schools and individuals was that the ability and the skill sets required to convert data to action were mostly available in the wealthiest institutions. They had all the analysts in the world to figure out what the hell was going on, and the small publics were just scraping by. That was the second observation, the inequity.

    The third came around 2009 to 2012, when there was an extensive effort to make data an important part of decision-making across the country. The side effect of that, though, was that not all the data sets were created equal, which made answering questions about what works and what doesn’t that much more difficult.

    The fourth observation, and I think it’s still very relevant today, is that the majority of our postsecondary constituencies are struggling to work with the increasing demands they’re getting from regulators: from the feds, from the states, from their accreditors, the demand for reports is increasing. The demand for feedback is increasing. Your big institutions, your flagships, might see this as a pain in the neck, but I would suggest that your smaller publics and smaller private schools are asking, “Oh my gosh, how are we even going to do this?” Our data helps.

    Q: What was the clearinghouse doing differently in terms of data collection?

    A: From the postsecondary standpoint, our first set of reports that we released in 2011 focused on two types of learners that had, at most, been discussed anecdotally: transfer students and part-time students. The fact that we included part-time students, which [the Integrated Postsecondary Education Data System] did not, was a huge change. And our first completion report, I believe, said that over 50 percent of baccalaureate recipients had some community college in their background. That was eye-popping for the country to see and really catalyzed a lot of thinking about transfer pathways.

    We also helped spur the rise of these third-party academic-oriented organizations like Lumina and enabled them to help learners by using our data. One of our obligations as a data aggregator was to find ways to make this data useful for the field, and I think we accomplished that. Now, of course, demand is rising with artificial intelligence; people want to do more. We understand that, but we also think we have a huge responsibility as a data custodian to do that responsibly. People who work with us realize how seriously we take that custodial relationship with the data. That has been one of the hallmarks of our tenure as an organization.

    Q: Speaking of custodial responsibility, people are questioning the clearinghouse’s research credibility after last week’s revelation of the data error in your preliminary enrollment report. Are you worried it will undo the years of trust building you just described? How do you take accountability?

    A: No. 1: The data itself, which we receive from institutions, is reliable, current and accurate. We make best efforts to ensure that it accurately represents what the institutions have within their own systems before any data is merged into the clearinghouse data system.

    When we first formed the Research Center, we had to show how you can get from the IPEDS number to the clearinghouse number and show people our data was something they could count on. We spent 15 years building this reputation. The key to any research-related error like this is, first, you have to take ownership of it and hold yourself accountable. As soon as I found out about this we were already making moves to [make it public]—we’re talking 48 hours. That’s the first step in maintaining trust.

    That being said, there’s an element of risk built into this work. Part of what the clearinghouse brings to the table is the ability to responsibly advance the dialogue of what’s happening in education and student pathways. There are things that are happening out there, such as students stopping out and coming back many years later, that basically defy conventional wisdom. And so the risk in all of this is that you shy away from that work and decide to stick with the knitting. But your obligation is, if you’re going to report those things, to be very transparent. As long as we can thread that needle, I think the clearinghouse will play an important role in helping to advance the dialogue.

    We’re taking this very seriously and understand the importance of the integrity of our reports considering how the field is dependent on the information we provide. Frankly, one of the things we’re going to take a look at is, what is the need for the preliminary report at the end of the day? Or do we need to pair it with more analysis—is it just enough to say that total enrollments are up X or down Y?

    Q: Are you saying you may discontinue the preliminary report entirely?

    A: That’s certainly an option. I think we need to assess the field’s need for an early report—what questions are we trying to answer and why is it important that those questions be answered by a certain time? I’ll be honest; this is the first time something like this has happened, where it’s been that dramatic. That’s where the introspection starts, saying, “Well, this was working before; what the heck happened?”

    When we released the first [preliminary enrollment] report [in 2020], we thought it’d be a one-time thing. Now, we’ve issued other reports that we thought were going to be one-time and ended up being a really big deal, like “Some College, No Credential.” We’re going to continue to look for opportunities to provide those types of insights. But I think any research entity needs to take a look at what you’re producing to make sure there’s still a need or a demand, or maybe what you’re providing needs to pivot slightly. That’s a process that’s going to be undertaken over the next few months as we evaluate this report and other reports we do.

    Q: How did this happen, exactly? Have you found the source of the imputation error?

    A: The research team is looking into it. In order to ensure for this particular report that we don’t extrapolate this to a whole bunch of other things, you just need to make sure that you know you’ve got your bases covered analytically.

    There was an error in how we imputed a particular category of dual-enrolled students versus freshmen. But if you look at the report, the total number of learners wasn’t impacted by that. These preliminary reports were designed to meet a need after COVID, to understand what the impact was going to be. We basically designed a report on an emergency basis, and by default, when you don’t have complete information, there’s imputation. There’s been a lot of pressure on getting the preliminary fall report out. That being said, you learn your lesson—you gotta own it and then you keep going. This was very unfortunate, and you can imagine the amount of soul searching to ensure that this never happens again.
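
    To make the mechanics concrete, here is a minimal, hypothetical sketch of how an imputation rule that routes too few missing records into the dual-enrollment category inflates the freshman count while leaving the total untouched. The figures and the single-share rule are invented for illustration; this is not the clearinghouse’s actual methodology.

    ```python
    # Hypothetical sketch of a category-imputation error (illustrative only).
    # 1,000 records: some labeled, some missing the student-type field.
    records = (
        [{"type": "freshman"}] * 500
        + [{"type": "dual_enrolled"}] * 300
        + [{"type": None}] * 200  # missing: category must be imputed
    )

    def impute_counts(records, assumed_dual_share):
        """Tally categories, assigning missing records by an assumed share."""
        counts = {"freshman": 0, "dual_enrolled": 0}
        missing = 0
        for r in records:
            if r["type"] is None:
                missing += 1
            else:
                counts[r["type"]] += 1
        imputed_dual = round(missing * assumed_dual_share)
        counts["dual_enrolled"] += imputed_dual
        counts["freshman"] += missing - imputed_dual
        return counts

    low = impute_counts(records, assumed_dual_share=0.10)   # understated share
    high = impute_counts(records, assumed_dual_share=0.60)  # better estimate
    print(low, sum(low.values()))    # {'freshman': 680, 'dual_enrolled': 320} 1000
    print(high, sum(high.values()))  # {'freshman': 580, 'dual_enrolled': 420} 1000
    ```

    The two runs disagree sharply on how many freshmen there are yet report the same total, which mirrors how the preliminary report’s headline enrollment figure could be right while its freshman number was badly wrong.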

    Q: Do you think demand for more postsecondary data is driving some irresponsible analytic practices?

    A: I can tell you that new types of demands are going to be put out there on student success data, looking at nondegree credentials, looking at microcredentials. And there’s going to be a lot of spitballing. Just look at how ROI is trying to be calculated right now; I could talk for hours about the ins and outs of ROI methodology. For example, if a graduate makes $80,000 after graduating but transferred first from a community college, what kind of attribution does the community college get for that salary outcome versus the four-year school? Hell, it could be due to a third-party boot camp done after earning a degree. Research on these topics is going to be full of outstanding questions.

    Q: What comes next for the clearinghouse’s research after you leave?

    A: I’m excited about where it’s going. I’m very excited about how artificial intelligence can be appropriately leveraged, though I think we’re still trying to figure out how to do that. I can only hope that the clearinghouse will continue its journey of support. Because while we don’t directly impact learner trajectories, we can create the tools that help people who support learners every year impact those trajectories. Looking back on my time here, that’s what I’m most proud of.

  • Student Booted from PhD Program Over AI Use (Derek Newton/The Cheat Sheet)

    This one is going to take a hot minute to dissect. Minnesota Public Radio (MPR) has the story.

    The plot contours are easy. A PhD student at the University of Minnesota was accused of using AI on a required pre-dissertation exam and removed from the program. He denies that allegation and has sued the school — and one of his professors — for due process violations and defamation respectively.

    Starting the case.

    The coverage reports that:

    all four faculty graders of his exam expressed “significant concerns” that it was not written in his voice. They noted answers that seemed irrelevant or involved subjects not covered in coursework. Two instructors then generated their own responses in ChatGPT to compare against his and submitted those as evidence against Yang. At the resulting disciplinary hearing, Yang says those professors also shared results from AI detection software. 

    Personally, when I see that four members of the faculty unanimously agreed that the work was not authentically his, I am out. I trust teachers.

    I know what a serious thing it is to accuse someone of cheating; I know teachers do not take such things lightly. When four go on the record to say so, I’m convinced. Barring some personal grievance or prejudice, which could happen, it’s hard for me to believe that all four subject-matter experts were just wrong here. Also, if there was bias or petty politics at play, it probably would have shown up before the student’s third year, not just before starting his dissertation.

    Moreover, at least as far as the coverage is concerned, the student does not allege bias or program politics. His complaint is based on due process and inaccuracy of the underlying accusation.

    Let me also say quickly that asking ChatGPT for answers you plan to compare to suspicious work may be interesting, but it’s far from convincing — in my opinion. ChatGPT makes stuff up. I’m not saying that answer comparison is a waste, I just would not build a case on it. Here, the university didn’t. It may have added to the case, but it was not the case. Adding also that the similarities between the faculty-created answers and the student’s — both are included in the article — are more compelling than I expected.

    Then you add detection software, which the article later shares showed high likelihood of AI text, and the case is pretty tight. Four professors, similar answers, AI detection flags — feels like a heavy case.

    Denied it.

    The article continues that Yang, the student:

    denies using AI for this exam and says the professors have a flawed approach to determining whether AI was used. He said methods used to detect AI are known to be unreliable and biased, particularly against people whose first language isn’t English. Yang grew up speaking Southern Min, a Chinese dialect. 

    Although it’s not specified, it is likely that Yang is referring to the research from Stanford that has been — or at least ought to be — entirely discredited (see Issue 216 and Issue 251). For the love of research integrity, the paper has invented citations — sources that go to papers or news coverage that are not at all related to what the paper says they are.

    Does anyone actually read those things?

    Back to Minnesota, Yang says that as a result of the findings against him and being removed from the program, he lost his American study visa. Yang called it “a death penalty.”

    With friends like these.

    Also interesting is that, according to the coverage:

    His academic advisor Bryan Dowd spoke in Yang’s defense at the November hearing, telling panelists that expulsion, effectively a deportation, was “an odd punishment for something that is as difficult to establish as a correspondence between ChatGPT and a student’s answer.” 

    That would be a fair point except that the next paragraph is:

    Dowd is a professor in health policy and management with over 40 years of teaching at the U of M. He told MPR News he lets students in his courses use generative AI because, in his opinion, it’s impossible to prevent or detect AI use. Dowd himself has never used ChatGPT, but he relies on Microsoft Word’s auto-correction and search engines like Google Scholar and finds those comparable. 

    That’s ridiculous. I’m sorry, it is. The dude who lets students use AI because he thinks AI is “impossible to prevent or detect,” the guy who has never used ChatGPT himself and thinks that Google Scholar and auto-correction are “comparable” to AI — that’s the person speaking up for the guy who says he did not use AI. Wow.

    That guy says:

    “I think he’s quite an excellent student. He’s certainly, I think, one of the best-read students I’ve ever encountered”

    Time out. Is it not at least possible that professor Dowd thinks student Yang is an excellent student because Yang was using AI all along, and our professor doesn’t care to ascertain the difference? Also, mind you, as far as we can learn from this news story, Dowd does not even say Yang is innocent. He says the punishment is “odd,” that the case is hard to establish, and that Yang was a good student who did not need to use AI. Although, again, I’m not sure how the good professor would know.

    As further evidence of Yang’s scholastic ability, Dowd also points out that Yang has a paper under consideration at a top academic journal.

    You know what I am going to say.

    To me, that entire Dowd diversion is mostly funny.

    More evidence.

    Back on track, we get even more detail, such as that the exam in question was:

    an eight-hour preliminary exam that Yang took online. Instructions he shared show the exam was open-book, meaning test takers could use notes, papers and textbooks, but AI was explicitly prohibited. 

    Exam graders argued the AI use was obvious enough. Yang disagrees. 

    Weeks after the exam, associate professor Ezra Golberstein submitted a complaint to the U of M saying the four faculty reviewers agreed that Yang’s exam was not in his voice and recommending he be dismissed from the program. Yang had been in at least one class with all of them, so they compared his responses against two other writing samples. 

    So, the exam expressly banned AI. And we learn that, as part of the determination of the professors, they compared his exam answers with past writing.

    I say all the time, there is no substitute for knowing your students. If the initial four faculty who flagged Yang’s work had him in classes and compared suspicious work to past work, what more can we want? It does not get much better than that.

    Then there’s even more evidence:

    Yang also objects to professors using AI detection software to make their case at the November hearing.  

    He shared the U of M’s presentation showing findings from running his writing through GPTZero, which purports to determine the percentage of writing done by AI. The software was highly confident a human wrote Yang’s writing sample from two years ago. It was uncertain about his exam responses from August, assigning 89 percent probability of AI having generated his answer to one question and 19 percent probability for another. 

    “Imagine the AI detector can claim that their accuracy rate is 99%. What does it mean?” asked Yang, who argued that the error rate could unfairly tarnish a student who didn’t use AI to do the work.  

    First, GPTZero is junk. It’s reliably among the worst available detection systems. Even so, 89% is a high number. And most importantly, the case against Yang is not built on AI detection software alone, as no case should ever be. It’s confirmation, not conviction. Also, Yang, who the paper says already has one PhD, knows exactly what an accuracy rate of 99% means. Be serious.
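
    As an aside, Yang’s rhetorical question does have a concrete answer, and it cuts both ways. A detector’s headline accuracy means little without the base rate of actual AI use in the pool being scanned; the number an accused student cares about is the chance that a flag is wrong. A quick Bayes’ rule sketch, with invented numbers purely for illustration:

    ```python
    # What "99% accurate" can mean in practice (illustrative numbers only).
    sensitivity = 0.99  # P(flagged | writing was AI-generated)
    specificity = 0.99  # P(not flagged | writing was human)
    base_rate = 0.10    # assumed share of submissions actually using AI

    # Total probability of a flag, then Bayes' rule for P(AI | flagged).
    p_flag = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
    p_ai_given_flag = (sensitivity * base_rate) / p_flag

    print(f"P(flagged)      = {p_flag:.3f}")           # 0.108
    print(f"P(AI | flagged) = {p_ai_given_flag:.3f}")  # ~0.917

    # Under these assumptions, roughly 1 flag in 12 lands on an innocent
    # writer, which is exactly why a detector score should corroborate
    # other evidence rather than carry a case on its own.
    ```

    That arithmetic supports the point above: detection output is confirmation, never the whole case.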

    A pattern.

    Then we get this, buried in the news coverage:

    Yang suggests the U of M may have had an unjust motive to kick him out. When prompted, he shared documentation of at least three other instances of accusations raised by others against him that did not result in disciplinary action but that he thinks may have factored in his expulsion.  

    He does not include this concern in his lawsuits. These allegations are also not explicitly listed as factors in the complaint against him, nor in the letters explaining the decision to expel Yang or rejecting his appeal. But one incident was mentioned at his hearing: in October 2023, Yang had been suspected of using AI on a homework assignment for a graduate-level course. 

    In a written statement shared with panelists, associate professor Susan Mason said Yang had turned in an assignment where he wrote “re write it, make it more casual, like a foreign student write but no ai.”  She recorded the Zoom meeting where she said Yang denied using AI and told her he uses ChatGPT to check his English.

    She asked if he had a problem with people believing his writing was too formal and said he responded that he meant his answer was too long and he wanted ChatGPT to shorten it. “I did not find this explanation convincing,” she wrote. 

    I’m sorry — what now?

    Yang says he was accused of using AI in academic work in “at least three other instances.” For which he was, of course, not disciplined. In one of those cases, Yang literally turned in a paper with this:

    “re write it, make it more casual, like a foreign student write but no ai.” 

    He said he used ChatGPT to check his English and asked ChatGPT to shorten his writing. But he did not use AI. How does that work?

    For that one where he left in the prompts to ChatGPT:

    the Office of Community Standards sent Yang a letter warning that the case was dropped but it may be taken into consideration on any future violations. 

    Yang was warned, in writing.

    If you’re still here, we have four professors who agree that Yang’s exam likely used AI, in violation of exam rules. All four had Yang in classes previously and compared his exam work to past hand-written work. His exam answers had similarities with ChatGPT output. An AI detector said, in at least one place, his exam was 89% likely to be generated with AI. Yang was accused of using AI in academic work at least three other times, by a fifth professor, including one case in which it appears he may have left in his instructions to the AI bot.

    On the other hand, he did say he did not do it.

    Findings, review.

    Further:

    But the range of evidence was sufficient for the U of M. In the final ruling, the panel — comprised of several professors and graduate students from other departments — said they trusted the professors’ ability to identify AI-generated papers.

    Several professors and students agreed with the accusations. Yang appealed and the school upheld the decision. Yang was gone. The appeal officer wrote:

    “PhD research is, by definition, exploring new ideas and often involves development of new methods. There are many opportunities for an individual to falsify data and/or analysis of data. Consequently, the academy has no tolerance for academic dishonesty in PhD programs or among faculty. A finding of dishonesty not only casts doubt on the veracity of everything that the individual has done or will do in the future, it also causes the broader community to distrust the discipline as a whole.” 

    Slow clap.

    And slow clap for the University of Minnesota. The process is hard. Doing the review, examining the evidence, making an accusation — they are all hard. Sticking by it is hard too.

    Seriously, integrity is not a statement. It is action. Integrity is making the hard choice.

    MPR, spare me.

    Minnesota Public Radio is a credible news organization. Which makes it difficult to understand why they chose — as so many news outlets do — to not interview one single expert on academic integrity for a story about academic integrity. It’s downright baffling.

    Worse, MPR, for no specific reason whatsoever, decides to take prolonged shots at AI detection systems such as:

    Computer science researchers say detection software can have significant margins of error in finding instances of AI-generated text. OpenAI, the company behind ChatGPT, shut down its own detection tool last year citing a “low rate of accuracy.” Reports suggest AI detectors have misclassified work by non-native English writers, neurodivergent students and people who use tools like Grammarly or Microsoft Editor to improve their writing. 

    “As an educator, one has to also think about the anxiety that students might develop,” said Manjeet Rege, a University of St. Thomas professor who has studied machine learning for more than two decades. 

    We covered the OpenAI deception — and it was deception — in Issue 241, and in other issues. We covered the non-native English thing. And the neurodivergent thing. And the Grammarly thing. All of which MPR wraps up in the passive and deflecting “reports suggest.” No analysis. No skepticism.

    That’s just bad journalism.

    And, of course — anxiety. Rege, who please note has studied machine learning and not academic integrity, is predictable, but not credible here. He says, for example:

    it’s important to find the balance between academic integrity and embracing AI innovation. But rather than relying on AI detection software, he advocates for evaluating students by designing assignments hard for AI to complete — like personal reflections, project-based learnings, oral presentations — or integrating AI into the instructions. 

    Absolute joke.

    I am not sorry — if you use the word “balance” in conjunction with the word “integrity,” you should not be teaching. Especially if what you’re weighing against lying and fraud is the value of embracing innovation. And if you needed further evidence for his absurdity, we get the “personal reflections and project-based learnings” buffoonery (see Issue 323). But, again, the error here is MPR quoting a professor of machine learning about course design and integrity.

    MPR also quotes a student who says:

    she and many other students live in fear of AI detection software.  

    “AI and its lack of dependability for detection of itself could be the difference between a degree and going home,” she said. 

    Nope. Please, please tell me I don’t need to go through all the reasons that’s absurd. Find me one single case in which an AI detector alone sent a student home. One.

    Two final bits.

    The MPR story shares:

    In the 2023-24 school year, the University of Minnesota found 188 students responsible for scholastic dishonesty because of AI use, reflecting about half of all confirmed cases of dishonesty on the Twin Cities campus. 

    Just noteworthy. Also, it is interesting that 188 were “responsible.” Considering how rare it is to be caught, and for formal processes to be initiated and upheld, 188 feels like a real number. Again, good for U of M.

    The MPR article wraps up that Yang:

    found his life in disarray. He said he would lose access to datasets essential for his dissertation and other projects he was working on with his U of M account, and was forced to leave research responsibilities to others at short notice. He fears how this will impact his academic career. 

    Stating the obvious, like the University of Minnesota, I could not bring myself to trust Yang’s data. And I do actually hope that being kicked out of a university for cheating would impact his academic career.

    And finally:

    “Probably I should think to do something, selling potatoes on the streets or something else,” he said. 

    Dude has a PhD in economics from Utah State University. Selling potatoes on the streets. Come on.

  • FIRE to University of Texas at Dallas: Stop censoring the student press

    The University of Texas at Dallas has a troubling history of trying to silence students. Now those students are fighting back.

    Today, the editors of The Retrograde published their first print edition, marking a triumphant return for journalism on campus in the face of administrative efforts to quash student press.

    Headlines above the fold of the first issue of The Retrograde, a new independent student newspaper at UT Dallas.

    Why call the newspaper The Retrograde? Because it’s replacing the former student newspaper, The Mercury, which ran into trouble when it covered the pro-Palestinian encampments on campus and shed light on UT Dallas’s use of state troopers (the same force that broke up UT Austin’s encampment just one week prior) and other efforts to quash even peaceful protest. As student journalists reported, their relationship with the administration subsequently deteriorated. University officials demoted the newspaper’s advisor and even removed copies of the paper from newsstands. At the center of this interference were Lydia Lum, director of student media, and Jenni Huffenberger, senior director of marketing and student media, whose titles reflect the university’s resistance to editorial freedom.

    The conflict between the paper and the administration came to a head when Lum called for a meeting of the Student Media Oversight Board, a university body which has the power to remove student leaders, accusing The Mercury’s editor-in-chief, Gregorio Olivares Gutierrez, of violating student media bylaws by having another form of employment, exceeding printing costs, and “bypassing advisor involvement.” Yet rather than follow those same bylaws, which offer detailed instructions for removing a student editor, Lum told board members from other student media outlets not to attend the meeting. A short-handed board then voted to oust Gutierrez. Adding insult to injury, Huffenberger unilaterally denied Gutierrez’s appeal, again ignoring the bylaws, which require the full board to consider any termination appeals.

    In response, The Mercury’s staff went on strike, demanding Gutierrez’s reinstatement. To help in that effort, FIRE and the Student Press Law Center joined forces to pen a Nov. 12, 2024 letter calling for UT Dallas to honor the rights of the student journalists. We also asked them to pay the students the money they earned for the time they worked prior to the strike.

    UT Dallas refused to listen. Instead of embracing freedom of the press, the administration doubled down on censorship, ignoring both the students’ and our calls for justice.

    FIRE took out a full page ad in support of The Retrograde at UT Dallas.

    In our letter, we argued that the university’s firing of Gutierrez was in retaliation for The Mercury’s unflattering coverage of the way administrators had handled the encampments. This is not even the first time UT Dallas has chosen censorship as the “best solution”; look no further than late 2023, when it removed the “Spirit Rocks” students used to express themselves. Unfortunately, the university ignored both the students’ exhortations and FIRE’s demands, leaving UT Dallas without its newspaper. 

    But FIRE’s Student Press Freedom Initiative is here to make sure censorship never gets the last word.

    Students established The Retrograde, a fully independent newspaper. Without university resources, they have had to crowdfund and source their own equipment, working spaces, a new website, and everything else necessary to provide quality student-led journalism to the UT Dallas community. They succeeded, and FIRE is proud to support their efforts, placing a full-page ad in this week’s inaugural issue of The Retrograde.

    The fight for press freedom at UT Dallas is far from over — but we need your help to make a difference.

    Demand accountability from UT Dallas. The student journalists of The Retrograde have shown incredible spirit. With your help, we can ensure their efforts — and the rights of all student journalists — are respected.

  • UKVI is tightening the rules on international student attendance

    Back in April you’ll recall that UKVI shared a draft “remote delivery” policy with higher education providers for consultation.

    That process is complete – and now it’s written to providers to confirm the detail of the new arrangements.

    Little has changed in the proposal from last Spring – there are some clarifications on how it will apply, but the main impact is going to be on providers and students who depend, one way or another, on some of their teaching not being accessed “in person”.

    The backstory here is that technically, all teaching for international students right now is supposed to be in-person. That was relaxed during the pandemic for obvious reasons – and since then, rapid innovation in the ways students can access teaching (either synchronously or asynchronously) has raised questions about how realistic and desirable that position remains.

    Politics swirls around this too – the worry/allegation is that students arrive and then disappear. With a mixture of relaxed attendance regulation (UKVI stopped demanding a specific number of contact points from universities a few years ago) and a worry that some students are faking or bypassing the attendance systems that are in place, the time has come, it seems, to tighten a little – “formalising the boundaries in which institutions can use online teaching methods to deliver courses to international students”, as UKVI puts it.

    Its recent burst of compliance monitoring (with now public naming and shaming of universities “subject to an action plan”) seems to have been a factor too – with tales reaching us of officials asking often quite difficult questions about both how many students a provider thinks are on campus, and then how many actually are, on a given day or across a week.

    The balance being struck is designed, says UKVI, to “empower the sector to utilise advances in education technology” by delivering elements of courses remotely whilst setting “necessary thresholds” to provide clarity and ensure there is “no compromise” of immigration control.

    Remote or “optional”?

    The policy that will be introduced is broadly as described back in April – first, that two types of “teaching delivery” are to be defined as follows:

    • Remote delivery is defined as “timetabled delivery of learning where there is no need for the student to attend the premises of the student sponsor or partner institution which would otherwise take place live in-person at the sponsor or partner institution site”.
    • Face-to-face delivery is defined as “timetabled learning that takes place in-person and on the premises of the student sponsor or a partner institution”.

    You’ll see that that difference isn’t (necessarily) between teaching designed as in-person or designed as remote – it’s between hours that a student is required to be on campus for, and hours that they either specifically aren’t expected to come in for, or have the option to not come in for. That’s an important distinction:

    Where the student has an option of online or in-person learning, this should count as a remote element for this purpose.

    Then with those definitions set, we get a ratio.

    As a baseline, providers (with a track record of compliance) will be allowed to deliver up to 20 per cent of the taught elements of any degree level and above course remotely.

    Then if a provider is able to demonstrate how the higher usage is consistent with the requirements of the relevant educational quality standards body (OfS in England, QAA in Wales and Scotland) and remains consistent with the principles of the student route, they’ll be able to have a different ratio – up to 40 per cent of the teaching will be allowed to be in that “remote” category.

    Providers keen to use that higher limit will need to apply to do so via the annual CAS allocation process – and almost by definition will attract additional scrutiny as a result, if only to monitor how the policy is panning out. They’ll also have to list all courses provided to sponsored students that include remote delivery within that higher band – and provide justification for the higher proportion of remote learning based on educational value.

    (For those not immersed in immigration compliance, a CAS (Confirmation of Acceptance for Studies) is an electronic document issued by a UK provider to an international student that serves as proof of admission, and is required when applying for a student visa. The CAS includes a unique reference number, details of the course, tuition fees, and the institution’s sponsorship license information – and will soon have to detail if an international agent is involved too.)

    One question plenty of people have asked is whether this changes things for disabled students – UKVI makes clear that, by exception, remote delivery can be permitted on courses of any academic level studied at a student sponsor in circumstances where requiring face-to-face delivery would constitute discrimination on the basis of a student’s protected characteristics under the Equality Act 2010.

    A concern about that was that providers might not know in advance whether a student needs that exception – UKVI says that it will trust providers to judge individual student circumstances in extenuating cases and to justify them during audits. The requirement to state protected characteristics on the CAS will be withdrawn.

    Oh – and sponsors will also be permitted to use remote delivery where continuity of education provision would otherwise be interrupted by unforeseen circumstances – things like industrial action, extreme weather, periods of travel restriction and so on.

    Notably, courses at levels 4 and 5 won’t be able to offer “remote delivery” at all – UKVI reckons they are “more vulnerable to abuse” from “non-genuine students”, so it’s resolved to link the more limited freedoms provided by Band 1 of the existing academic engagement policy to this provision of “remote” elements – degree level and above.

    Yes but what is teaching?

    A head-scratcher when the draft went out for consultation was what “counts” as teaching. Some will still raise questions with the answer – but UKVI says that activities like writing dissertations, conducting research, undertaking fieldwork, carrying out work placements and sitting exams are not “taught elements” – and are not therefore in scope.

    Another way of looking at that is basically – if it’s timetabled, it probably counts.

    Some providers have also been confused about modules – given that students on most courses are able to routinely choose elective modules (which themselves might contain different percentages of teaching in the two categories) after the CAS is assigned.

    UKVI says that sponsors should calculate the remote delivery percentage on the assumption that the student will elect to attend all possible remote elements online. So where elective modules form part of the course delivery, the highest possible remote delivery percentage will have to be stated (!) And where hours in the timetable are optional, providers will have to calculate remote delivery by assuming that students will participate in all optional remote elements online.

    The good news when managing all of that is that the percentage won’t have to be calculated on the basis of module or year – it’s the entire course that counts. And where the course is a joint programme with a partner institution based overseas, only elements of the course taking place in the UK will be taken into account.
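
    Putting the stated rules together – optional-attendance hours count as remote, electives are assessed at their most-remote mix, overseas-delivered elements are excluded, and the percentage is taken over the whole course – a provider’s calculation might look something like this rough sketch (the module data and field names are hypothetical, not a UKVI specification):

    ```python
    # Rough sketch of the remote delivery percentage as described above.
    # Hypothetical timetabled taught hours for one sponsored course.
    modules = [
        # in-person / remote-or-optional hours, UK delivery, elective flag
        {"in_person": 120, "remote": 20, "in_uk": True,  "elective": False},
        {"in_person": 80,  "remote": 40, "in_uk": True,  "elective": True},   # most-remote elective path
        {"in_person": 60,  "remote": 0,  "in_uk": False, "elective": False},  # overseas partner: excluded
    ]

    # Only UK-delivered taught elements count towards the calculation,
    # and every optional or remote hour is assumed to be taken remotely.
    uk_modules = [m for m in modules if m["in_uk"]]
    remote_hours = sum(m["remote"] for m in uk_modules)
    total_hours = sum(m["in_person"] + m["remote"] for m in uk_modules)

    pct = 100 * remote_hours / total_hours
    print(f"Remote delivery: {pct:.1f}%")  # 23.1% here: above the 20% baseline,
    # so this course would need the higher band (up to 40%), with
    # justification via the annual CAS allocation process.
    ```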

    What’s next

    There’s no specific date yet on implementation – IT changes to the sponsor management system are required, and new fields will be added to the CAS and annual CAS allocation request forms first. The “spring” is the target, and there’s also a commitment to reviewing the policy after 12 months.

    In any event, any university intending to utilise (any) remote delivery will need to have updated their internal academic engagement (ie attendance) policy ahead of submitting their next annual CAS allocation request – and UKVI may even require the policy to be submitted before deciding on the next CAS allocation request, and definitely by September 2025.

    During the consultation, a number of providers raised the issue of equity – how would one justify international and home students being treated differently? UKVI says that distinctions are reasonable because international students require permission to attend a course in the UK:

    If attendance is no longer necessary, the validity of holding such permission must be reassessed.

    There’s no doubt that – notwithstanding that providers are also under pressure to produce (in many cases for the first time) home student attendance policies because of concerns about attendance and student loan entitlements – the new policy will cause some equity issues between home and international students.

    In some cases those will be no different to the issues that exist now – some providers in some departments simply harmonise their requirements, some apply different regs by visa status, and some apply different rules for home students on different departments or courses depending on the relative proportion of international students in that basket. That may all have to be revisited.

    The big change – for some providers, but not all – is those definitions. The idea of a student never turning up for anything until they “cram” for their “finals” is built into many an apocryphal student life tale – that definitely won’t be allowed for international students, and it’s hard to see a provider getting away with that in their SFE/SFW/SAAS demanded home student policy either.

    Some providers won’t be keen to admit as much, but the idea of 100 per cent attendance at hours of teaching in that 80 per cent basket is going to cause a capacity problem in some lecture theatres and teaching spaces that will now need to be resolved. Module choice (and design) is also likely to need a careful look.

    And the wider questions of the way in which students use “optional” attendance and/or recorded lectures to manage their health and time – with all the challenges relating to part-time work and commuting/travelling in the mix – may result in a need to accelerate timetable reform to reduce the overall number of now very-much “required” visits to campus.

    One other thing not mentioned in here is the reality that UKVI is setting a percentage of a number of hours that is not itself specified – some providers could simply reduce the number of taught hours altogether to make the percentages add up. Neither the domestic version of this agenda nor this international version attempts to define what “full-time” really means in terms of overall taught hours – perhaps necessarily, given programme diversity – but it’ll be a worry for some.

    Add all of this up – mixing in UKVI stepping up compliance monitoring and stories of students sharing QR codes for teaching rooms on WhatsApp to evade attendance monitoring systems – and for some providers and some students, the change will be quite dramatic.

    The consultation on the arrangements has been carried out quite confidentially so far – I’d tentatively suggest here that any revision to arrangements implemented locally should very much aim to switch that trend away from “UKVI said so” towards detailed discussion with (international) student representatives, with a consideration of wider timetabling, housing, travel and other support arrangements in the mix.

  • student debt relief progress and new fact sheets (SBPC)

    The fight for student loan borrowers continues! In the last remaining days of the Biden-Harris Administration, the U.S. Department of Education (ED) is pushing some final relief through for student loan borrowers, new Income-Driven Repayment (IDR) Account Adjustment payment counts are live, and we have new fact sheets shedding light on the impact of the student debt crisis on borrowers.

    Here’s a roundup of the latest:

    Over 5 million borrowers have been freed from student debt.

    In a major win for borrowers, ED announced that the Biden-Harris Administration has now approved $183.6 billion in student debt discharges via various student debt relief fixes and programs. This relief has now reached over 5 million borrowers and includes new approvals for Public Service Loan Forgiveness (PSLF) relief, borrower defense relief, and Total and Permanent Disability Discharge relief.

    This relief is life-changing for millions of families, proving the power of bold, decisive action on student debt. Yet, there is much more work to do. Every step toward relief underscores the need to continue fighting for policies that reduce the burden of student debt and ensure affordable access to higher education.

    Final phase of the IDR Account Adjustment is underway—take screenshots!

    In tandem with the latest cancellation efforts, ED has also finally started updating borrower payment counts on the Federal Student Aid dashboard. Providing official payment counts will help borrowers receive the credit they have earned towards cancellation under IDR, and ensure that all borrowers who have been forced to pay for 20 years or longer are automatically able to benefit from relief they are entitled to under federal law. If you are a borrower with federal student loans, we recommend that you check your dashboard on studentaid.gov, screenshot your new count, and save it in your records.

    Previously, many borrowers—including those who work in public service jobs and low-income borrowers struggling to afford payments—were steered into costly deferments and forbearances, preventing them from reaching the 20 or more years of payments required for IDR relief or the 120 payments necessary for PSLF cancellation. Under the IDR Account Adjustment, these periods are now counted, even if borrowers were mistakenly placed in the wrong repayment plan or faced servicing errors. 

  • WEEKEND READING: Why Scotland’s student funding system is “unfair, unsustainable, unaffordable” and needs to be replaced with a graduate contribution model

    • These are the remarks by Alison Payne, Research Director at Reform Scotland, at the HEPI / CDBU event on funding higher education, held at Birkbeck, University of London, on Thursday of this week.
    • We are also making available Johnny Rich’s slides on ‘Making graduate employer contributions work’ from the same event, which are available to download here.

    Thanks to the CDBU and to HEPI for the invitation to attend and take part in today’s discussion. 

    My speech today has been titled ‘A graduate contribution model’. Of course, for UK graduates not from Scotland, I’m sure they would make the point that they very much do contribute through their fees, but the situation is very different in Scotland and I’m really grateful that I have the opportunity to feed the Scottish situation into today’s discussion.

    I thought it may be helpful if I gave a quick overview of the Scottish situation, as it differs somewhat from the overview Nick gave this morning covering the rest of the UK. 

    Although tuition fees were introduced throughout the UK in 1998, the advent of devolution in 1999 and the passing of responsibility for higher education to Holyrood began the period of diverging funding policies.

    The Labour / Lib Dem Scottish Executive, as it was then known, scrapped tuition fees and replaced them with a graduate endowment from 2001-02, with the first students becoming liable to pay the fee from April 2005. The scheme called for graduates to pay back £2,000 once they started earning over £10,000. 

    The graduate endowment was then scrapped by the SNP in February 2008. A quirk of EU law meant that students from EU countries could not be charged tuition fees if Scottish students were not paying them but students from England, Wales and Northern Ireland could be charged. This meant that from 2008 to 2021/22 EU students did not need to pay fees to attend Scottish universities, though students from the rest of the UK did. 

    We’re used to politics in Scotland being highly polarised and often toxic with few areas of commonality, but for the most part the policy of ‘free’ higher education has been supported by all of the political parties. Indeed, at the last Scottish election in 2021, all parties committed to maintaining the policy in their manifestos. It is only recently that the Scottish Tories have suggested a move away from this, following the election of their new leader, Russell Findlay.

    But behind this unusual political consensus, the ‘free’ policy is becoming increasingly unsustainable and unaffordable. Politicians will privately admit this, but politics, and a rock with an ill-advised slogan, have made it harder to have the much needed debate.

    The Cap

    While we don’t have tuition fees, we do have a cap on student numbers. And while more Scots are going to university, places are unable to keep up with demand. Since 2006 there has been a 56% increase in applicants, but an 84% increase in the number refused entry. 

    It is increasingly the case that students from the rest of the UK or overseas are accepted on to courses in Scotland while their Scottish counterparts are denied. For example, when clearing options are posted, often those places at Scotland’s top universities are only available to students from the rest of the UK and not to Scottish students, even if the latter have better grades. As a result, Scots can feel that they are denied access to education on their doorstep that those from elsewhere can obtain. Indeed, there are growing anecdotes about those who can afford it buying or renting property elsewhere in the UK so that they can attend a Scottish university, pay the higher fee and get around the cap.

    Basically, more people want to go to university, but the fiscal arrangements are holding that ambition back. This problem was highlighted by the Scottish Affairs Select Committee’s 2021 report on universities.

    Some commentators in Scotland have blamed the lack of places on widening access programmes, but I would challenge this. It is undoubtedly a good thing that more people from non-traditional backgrounds are getting into university; it is the cap that is limiting Scottish places, not access programmes. This is a point that has been backed by individuals such as the Principal of St Andrews, Professor Dame Sally Mapstone [who also serves as HEPI’s Chair].

    Financial Woes

    The higher education sector in Scotland, as elsewhere in the UK, is not in great financial health. Audit Scotland warned back in 2019 that half of our institutions were facing growing deficits. Pressures including pension contributions, Brexit and estate maintenance have all played a role, and in the face of this decline nothing has changed; we’re now seeing crises like the one at Dundee emerge. Against this backdrop, income from those students who pay higher fees is an important revenue stream.

    There is obviously a huge variation in what the fees are to attend a Scottish university, considerably more so than in the rest of the UK.

    For example, to study Accounting and Business as an undergraduate at Edinburgh University, the cost for a full-time new student for 2024/25 is £1,820 per year for a Scottish-domiciled student (met by the Scottish Government), £9,250 per year for someone from the rest of the UK and £26,500 for an international student. 

    It is clear why international students and UK students from outside Scotland are therefore so much more attractive than Scottish students.

    However, there is by no means an equal distribution of higher fee paying students among our institutions.

    For example, at St Andrews about one-third of undergraduate full-time students were Scots, with one-third from the rest of the UK and one-third international. The numbers for Edinburgh are similar.  

    At the other end of the scale, at the University of the Highlands and Islands and Glasgow Caledonian, around 90% of students are Scottish, with only around 1% being international.  

    So it is clear that institutions’ ability to raise money from fee-paying students varies very dramatically, increasing the financial pressures on those with low fee income.

    However, when looking at the issue, it is important to recognise that it is not just our universities that are struggling; Scotland’s colleges are facing huge financial pressures as well. 

    The current proposed Scottish budget would leave colleges struggling with a persistent, real-terms funding cut of 17 per cent since 2021/22. Our college sector is hugely important in terms of the delivery of skills, working with local economies and as a route to university for so many, but for too long colleges have been treated like the Cinderella service in Scotland. The prioritising of ‘free’ university tuition over the college sector is adding to this problem.

    Regardless of who wins the Holyrood election next year, money is, and will remain, tight for some time. It would be lovely to be able to have lots of taxpayer funded ‘free’ services, but that is simply unsustainable and difficult choices need to be made. 

    This is why we believe that the current situation is unfair, unsustainable, unaffordable and needs to change.

    Reform Scotland would offer an alternative solution. We believe that there needs to be a better balance between the individual graduate and Scottish taxpayers in the contribution towards higher education. 

    One way this could be achieved is through a fee after graduation, to be repaid once the graduate earns more than the Scottish average salary. This would not be a fee incurred on starting university and deferred until after graduation; rather, the fee would be incurred on graduation.

    In terms of what that fee could be, the Cubie report over 25 years ago suggested a graduate fee of £3,000, which would be about £5,500 today.  This could perhaps be the starting point for consideration.  

    Any figure should take account of variations in the true cost of courses and potential skill shortages. 

    However, introducing a graduate fee would not necessarily mean an end to ‘free’ tuition. 

    Rather it provides an opportunity to look at the skills gaps that exist in Scotland and the possibility of developing schemes which cut off or scrap repayments for graduates who work in specific geographic areas or sectors of Scotland for set periods of time. 

    Such schemes could also look to incorporate students from elsewhere, for Scotland is facing a demographic crisis. Our population is set to become older and smaller, and we are the only part of the UK projected to have a smaller population by 2045. 

    We desperately need to retain and attract more working-age people. Perhaps such graduate repayment waiver schemes could also be offered to students from the rest of the UK who choose to study in Scotland – stay here and work after graduation and we will pay a proportion of your fee. A wide range of different schemes could be considered and linked into the wider policy issues facing Scotland. 

    According to the Higher Education Statistics Agency (HESA), there were 3,370 graduates from the rest of the UK who attended a Scottish institution in 2020/21. Of those, only 990 chose to remain in Scotland for work after graduation. Could we encourage more people to stay after studying?

    Conclusion

    A graduate fee is only one possible solution, but I would argue that it is also one with a short shelf life. As graduates would not incur the fee until they graduated, there would be a four-year delay between the change in policy and revenue beginning to be received. Our institutions are facing very real fiscal problems and there is a danger of a university going to the wall. 

    If we get to the 2026 election and political parties refuse to shift the dial and at least recognise that the current system is unsustainable, then there is a danger that nothing will change for another Parliamentary term. I don’t think we can afford to wait until 2031.

    There is another interesting dynamic now as well. Labour in Scotland currently, publicly at least, oppose tuition fees. However, there are now 37 Scottish Labour MPs at Westminster who are backing increased fees for students from outside Scotland, as well as for Scottish students studying down south. Given the unpopularity of the Labour government as well as the tight contest between the SNP and Labour for Holyrood, it seems unlikely that position can be maintained.

    All across the UK there are increasing signs of the stark financial situation we are facing. Against that backdrop, and given the restrictions placed on the number of students able to attend, free university tuition is unsustainable and unaffordable. People outside Scottish politics seem able to see this reality; privately, so do many of our politicians. We need to shift this debate into the public domain in Scotland and develop a workable solution.

    Source link

  • Social Security Offsets and Defaulted Student Loans (CFPB)

    Social Security Offsets and Defaulted Student Loans (CFPB)

    Executive Summary

    When
    borrowers default on their federal student loans, the U.S. Department
    of Education (“Department of Education”) can collect the outstanding
    balance through forced collections, including the offset of tax refunds
    and Social Security benefits and the garnishment of wages. At the
    beginning of the COVID-19 pandemic, the Department of Education paused
    collections on defaulted federal student loans.
    This year, collections are set to resume and almost 6 million student
    loan borrowers with loans in default will again be subject to the
    Department of Education’s forced collection of their tax refunds, wages,
    and Social Security benefits.
    Among the borrowers who are likely to experience forced collections are
    an estimated 452,000 borrowers ages 62 and older with defaulted loans
    who are likely receiving Social Security benefits.

    This
    spotlight describes the circumstances and experiences of student loan
    borrowers affected by the forced collection of Social Security benefits.
    It also describes how forced collections can push older borrowers into
    poverty, undermining the purpose of the Social Security program.

    Key findings

    • The
      number of Social Security beneficiaries experiencing forced collection
      grew by more than 3,000 percent in fewer than 20 years; the count is
      likely to grow as the age of student loan borrowers trends older.

      Between 2001 and 2019, the number of Social Security beneficiaries
      experiencing reduced benefits due to forced collection increased from
      approximately 6,200 to 192,300. This exponential growth is likely driven
      by older borrowers who make up an increasingly large share of the
      federal student loan portfolio. The number of student loan borrowers
      ages 62 and older increased by 59 percent from 1.7 million in 2017 to
      2.7 million in 2023, compared to a 1 percent decline among borrowers
      under the age of 62.
    • The total amount of Social Security benefits the Department of Education
      collected between 2001 and 2019 through the offset program increased
      from $16.2 million to $429.7 million. Despite the exponential increase in
      collections from Social Security, the majority of money the Department
      of Education has collected has been applied to interest and fees and has
      not affected borrowers’ principal amount owed. Furthermore, between
      2016 and 2019, the Department of the Treasury’s fees alone accounted for
      nearly 10 percent of the average borrower’s lost Social Security
      benefits.
    • More than one in three
      Social Security recipients with student loans are reliant on Social
      Security payments, meaning forced collections could significantly
      imperil their financial well-being.
      Approximately 37 percent of the
      1.3 million Social Security beneficiaries with student loans rely on
      modest payments, an average monthly benefit of $1,523, for 90 percent of
      their income. This population is particularly vulnerable to reductions
      in their benefits, especially if benefits are offset year-round. In 2019,
      the average annual amount collected from individual beneficiaries was
      $2,232 ($186 per month).
    • The physical well-being of half of Social Security beneficiaries with student loans in default may be at risk.
      Half of Social Security beneficiaries with student loans in default and
      collections skipped a doctor’s visit or did not obtain prescription
      medication due to cost.
    • Existing minimum income protections fail to protect student loan borrowers with Social Security against financial hardship.
      Currently, only $750 per month of Social Security income—an amount that
      is $400 below the monthly poverty threshold for an individual and has
      not been adjusted for inflation since 1996—is protected from forced
      collections by statute. Even if the minimum protected income were
      adjusted for inflation, beneficiaries would likely still experience
      hardship, such as food insecurity and problems paying utility bills. A
      higher threshold could protect borrowers against hardship more
      effectively. The CFPB found that for 87 percent of student loan
      borrowers who receive Social Security, their benefit amount is below 225
      percent of the federal poverty level (FPL), an income level at which
      people are as likely to experience material hardship as those with
      incomes below the federal poverty level.
    • Large
      shares of Social Security beneficiaries affected by forced collections
      may be eligible for relief or outright loan cancellation, yet they are
      unable to access these benefits, possibly due to insufficient
      automation or borrowers’ cognitive and physical decline.
      As many as
      eight in ten Social Security beneficiaries with loans in default may be
      eligible to suspend or reduce forced collections due to financial
      hardship. Moreover, one in five Social Security beneficiaries may be
      eligible for discharge of their loans due to a disability. Yet these
      individuals are not accessing such relief because the Department of
      Education’s data matching process insufficiently identifies those who
      may be eligible.

    Taken together,
    these findings suggest that the Department of Education’s forced
    collections of Social Security benefits increasingly interfere with
    Social Security’s longstanding purpose of protecting its beneficiaries
    from poverty and financial instability.

    Introduction

    When
    borrowers default on their federal student loans, the Department of
    Education can collect the outstanding balance through forced
    collections, including the offset of tax refunds and Social Security
    benefits, and the garnishment of wages. At the beginning of the COVID-19
    pandemic, the Department of Education paused collections on defaulted
    federal student loans. This year, collections are set to resume and
    almost 6 million student loan borrowers with loans in default will again
    be subject to the Department of Education’s forced collection of their
    tax refunds, wages, and Social Security benefits.

    Among
    the borrowers who are likely to experience the Department of
    Education’s renewed forced collections are an estimated 452,000
    borrowers with defaulted loans who are ages 62 and older and who are
    likely receiving Social Security benefits.
    Congress created the Social Security program in 1935 to provide a basic
    level of income that protects insured workers and their families from
    poverty due to situations including old age, widowhood, or disability.
    The Social Security Administration calls the program “one of the most
    successful anti-poverty programs in our nation’s history.”
    In 2022, Social Security lifted over 29 million Americans from poverty,
    including retirees, disabled adults, and their spouses and dependents.
    Congress has recognized the importance of securing the value of Social
    Security benefits and on several occasions has intervened to protect
    them.

    This
    spotlight describes the circumstances and experiences of student loan
    borrowers affected by the forced collection of their Social Security
    benefits.
    It also describes how the purpose of Social Security is being
    increasingly undermined by the limited and deficient options the
    Department of Education has to protect Social Security beneficiaries
    from poverty and hardship.

    The forced collection of Social Security benefits has increased exponentially.

    Federal
    student loans enter default after 270 days of missed payments and
    transfer to the Department of Education’s default collections program
    after 360 days. Borrowers with a loan in default face several
    consequences: (1) their credit is negatively affected; (2) they lose
    eligibility to receive federal student aid while their loans are in
    default; (3) they are unable to change repayment plans and request
    deferment and forbearance; and (4) they face forced collections of tax refunds, Social Security benefits, and wages, among other payments.
    To conduct its forced collections of federal payments like tax refunds
    and Social Security benefits, the Department of Education relies on a
    collection service run by the U.S. Department of the Treasury called the
    Treasury Offset Program.

    Between
    2001 and 2019, the number of student loan borrowers facing forced
    collection of their Social Security benefits increased from at least
    6,200 to 192,300.
    That is a more than 3,000 percent increase in fewer than 20 years. By
    comparison, the number of borrowers facing forced collections of their
    tax refunds increased by about 90 percent from 1.17 million to 2.22
    million during the same period.

    This exponential growth of Social Security offsets between 2001 and 2019 is likely driven by multiple factors including:

    • Older
      borrowers accounted for an increasingly large share of the federal
      student loan portfolio due to increasing average age of enrollment and
      length of time in repayment.
      Data from the Department of Education
      (which are only available since 2017) show that the number of student
      loan borrowers ages 62 and older increased 24 percent, from 1.7 million
      in 2017 to 2.1 million in 2019, compared to growth of less than 1 percent
      among borrowers under the age of 62.
    • A larger number of borrowers, especially older borrowers, had loans in default.
      Data from the Department of Education show that the number of student
      loan borrowers with a defaulted loan more than doubled, from 3.8 million
      in 2006 to 8.8 million in 2019. Compounding these trends is the fact that older borrowers are twice as likely to have a loan in default as younger borrowers.

    Due to these factors, the total annual amount of Social Security benefits
    the Department of Education collected through the offset program increased
    from $16.2 million in 2001 to $429.7 million in 2019 (when adjusted for
    inflation). This increase occurred even though the average monthly amount
    the Department of Education collected from individual beneficiaries stayed
    roughly the same for most years, at approximately $180 per month.

    Figure 1: Number of Social Security beneficiaries and total amount collected for student loans (2001-2019)

    Source: CFPB analysis of public data from U.S. Treasury’s Fiscal Data portal. Amounts are presented in 2024 dollars.

    While the total collected from
    Social Security benefits has increased exponentially, the majority of
    money the Department of Education collected has not been applied to
    borrowers’ principal amount owed. Specifically, nearly three-quarters of
    the monies the Department of Education collects through offsets is
    applied to interest and fees, and not towards paying down principal
    balances.
    Between 2016 and 2019, the U.S. Department of the Treasury charged the
    Department of Education between $13.12 and $15.00 per Social Security
    offset, or approximately between $157.44 and $180 for 12 months of
    Social Security offsets per beneficiary with defaulted federal student
    loans. As a matter of practice, the Department of Education often passes these fees on directly to borrowers.
    Furthermore, these fees accounted for nearly 10 percent of the average
    borrower’s monthly lost Social Security benefits, which was $183 during
    this time.
    Interest and fees not only reduce beneficiaries’ monthly benefits, but
    also prolong the period that beneficiaries are likely subject to forced
    collections.
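
    To make the fee arithmetic concrete, here is a minimal sketch, assuming only the figures reported above (a $13.12 to $15.00 per-offset charge and an average monthly offset of $183); the helper function is ours and is not part of any Treasury or Department of Education tooling.

    ```python
    # Hedged sketch of the fee arithmetic described above: annualizing the
    # Treasury's per-offset charge and comparing it with the average monthly
    # offset amount. Figures come from this spotlight; the function name and
    # structure are illustrative, not from any official source.

    def annual_fee(per_offset_fee: float, offsets_per_year: int = 12) -> float:
        """Total Treasury fee for a year of monthly Social Security offsets."""
        return per_offset_fee * offsets_per_year

    low, high = 13.12, 15.00   # per-offset fee range charged in 2016-2019
    print(round(annual_fee(low), 2), round(annual_fee(high), 2))  # 157.44 180.0

    avg_monthly_offset = 183.0  # average borrower's monthly lost benefits
    print(f"fee share: {high / avg_monthly_offset:.1%}")  # ~8.2% of $183
    ```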

    Forced collections are compromising Social Security beneficiaries’ financial well-being.

    Forced
    collection of Social Security benefits affects the financial well-being
    of the most vulnerable borrowers and can exacerbate any financial and
    health challenges they may already be experiencing. The CFPB’s analysis
    of the Survey of Income and Program Participation (SIPP) pooled data for
    2018 to 2021 finds that Social Security beneficiaries with student
    loans receive an average monthly benefit of $1,524.
    The analysis also indicates that approximately 480,000 (37 percent) of
    the 1.3 million beneficiaries with student loans rely on these modest
    payments for 90 percent or more of their income,
    thereby making them particularly vulnerable to reductions in their
    benefits, especially if benefits are offset year-round. In 2019, the
    average annual amount collected from individual beneficiaries was $2,232
    ($186 per month).

    A
    recent survey from The Pew Charitable Trusts found that more than nine
    in ten borrowers who reported experiencing wage garnishment or Social
    Security payment offsets said that these penalties caused them financial
    hardship.
    Consequently, for many, their ability to meet their basic needs,
    including access to healthcare, became more difficult. According to our
    analysis of the Federal Reserve’s Survey of Household Economic and
    Decision-making (SHED), half of Social Security beneficiaries with
    defaulted student loans skipped a doctor’s visit and/or did not obtain
    prescription medication due to cost.
    Moreover, 36 percent of Social Security beneficiaries with loans in
    delinquency or in collections report fair or poor health. Over half of
    them have medical debt.

    Figure 2: Selected financial experiences and hardships among subgroups of loan borrowers

    Bar graph showing that borrowers who receive Social Security benefits and are delinquent or in collections are more likely than borrowers who receive Social Security benefits and are not delinquent or in collections to report that their spending is the same as or higher than their income, that they are unable to pay some bills, that they have fair or poor health, and that they skip medical care.

    Source: CFPB analysis of the Federal Reserve Board Survey of Household Economic and Decision-making (2019-2023).

    Social Security recipients
    subject to forced collection may not be able to access key public
    benefits that could help them mitigate the loss of income. This is
    because Social Security beneficiaries must list the unreduced amount of
    their benefits prior to collections when applying for other means-tested
    benefit programs such as Supplemental Security Income (SSI), the
    Supplemental Nutrition Assistance Program (SNAP), and the Medicare Savings Programs.
    Consequently, beneficiaries subject to forced collections must report
    an inflated income relative to what they are actually receiving. As a
    result, these beneficiaries may be denied public benefits that provide
    food, medical care, prescription drugs, and assistance with paying for
    other daily living costs.

    Consumers’
    complaints submitted to the CFPB describe the hardship caused by forced
    collections on borrowers reliant on Social Security benefits to pay for
    essential expenses.
    Consumers often explain their difficulty paying for such expenses as
    rent and medical bills. In one complaint, a consumer noted that they
    were having difficulty paying their rent since their Social Security
    benefit usually went to paying that expense.
    In another complaint, a caregiver described that the money was being
    withheld from their mother’s Social Security, which was the only source
    of income used to pay for their mother’s care at an assisted living
    facility.
    As forced collections threaten the housing security and health of
    Social Security beneficiaries, they also create a financial burden on
    non-borrowers who help address these hardships, including family members
    and caregivers.

    Existing minimum income protections fail to protect student loan borrowers with Social Security against financial hardship.

    The
    Debt Collection Improvement Act set a minimum floor of income below
    which the federal government cannot offset Social Security benefits and
    subsequent Treasury regulations established a cap on the percentage of
    income above that floor.
    Specifically, these statutory guardrails limit collections to 15
    percent of Social Security benefits above $750. The minimum threshold
    was established in 1996 and has not been updated since. As a result, the
    amount protected by law alone does not adequately protect beneficiaries
    from financial hardship and in fact no longer protects them from
    falling below the federal poverty level (FPL). In 1996, $750 was nearly
    $100 above the monthly poverty threshold for an individual.
    Today that same protection is $400 below the threshold. If the
    protected amount of $750 per month ($9,000 per year) set in 1996 were
    adjusted for inflation, it would total $1,450 per month ($17,400 per year)
    in 2024 dollars.
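
    As a worked illustration of the statutory guardrails described above, the sketch below computes the largest permissible monthly offset as the lesser of 15 percent of the benefit and the amount by which the benefit exceeds the $750 protected floor. This is a minimal reading of the rule as summarized in this spotlight, not Treasury's actual offset implementation, and the names are ours.

    ```python
    # Minimal sketch of the offset guardrails described above: collections are
    # limited to 15 percent of the monthly benefit, and the benefit may not be
    # reduced below the $750 statutory floor set in 1996. Function and constant
    # names are illustrative; this is not Treasury's actual implementation.

    PROTECTED_FLOOR = 750.00   # statutory minimum, unchanged since 1996
    OFFSET_RATE = 0.15         # maximum share of the monthly benefit

    def monthly_offset(benefit: float) -> float:
        """Largest permissible monthly offset for a given Social Security benefit."""
        above_floor = max(0.0, benefit - PROTECTED_FLOOR)
        return min(OFFSET_RATE * benefit, above_floor)

    # The report's average beneficiary receives roughly $1,524 per month; a 15%
    # offset (~$229) exceeds the ~$186/month average actually collected in 2019.
    for benefit in (700.0, 800.0, 1524.0):
        print(benefit, round(monthly_offset(benefit), 2))
    # 700.0 0.0      -- fully protected below the floor
    # 800.0 50.0     -- capped by the $750 floor, not the 15% rate
    # 1524.0 228.6   -- capped by the 15% rate
    ```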

    Figure
    3: Comparison of monthly FPL threshold with the current protected
    amount established in 1996 and the amount that would be protected with
    inflation adjustment

    Image with a bar graph showing the difference in monthly amounts for different thresholds and protections, from lowest to highest: (a) existing protections ($750), (b) the federal poverty level in 2024 ($1,255), (c) the amount set in 1996 if it had been CPI adjusted ($1,450), and (d) 225% of the FPL under the SAVE Plan ($2,824).

    Source: Calculations by the CFPB. Notes: Inflation adjustments based on the consumer price index (CPI).

    Even if the minimum protected
    income of $750 were adjusted for inflation, beneficiaries would likely
    still experience hardship as a result of their reduced benefits.
    Consumers with incomes above the poverty line also commonly experience
    material hardship, which suggests that a threshold higher than the poverty level would more effectively protect against hardship.
    Indeed, in determining an income threshold for $0 payments under the
    SAVE plan, Department of Education researchers used material
    hardship (defined as being unable to pay utility bills and reporting
    food insecurity) as their primary metric, and found similar levels of
    material hardship among those with incomes below the poverty line and
    those with incomes up to 225 percent of the FPL.
    Similarly, the CFPB’s analysis of a pooled sample of SIPP respondents
    finds the same levels of material hardship for Social Security
    beneficiaries with student loans with incomes below 100 percent of the
    FPL and those with incomes up to 225 percent of the FPL.
    The CFPB found that for 87 percent of student loan borrowers who
    receive Social Security, their benefit amount is below 225 percent of
    the FPL.
    Accordingly, all of those borrowers would be removed from forced
    collections if the Department of Education applied the same income
    metrics it established under the SAVE program to an automatic hardship
    exemption program.

    Existing options for relief from forced collections fail to reach older borrowers.

    Borrowers
    with loans in default remain eligible for certain types of loan
    cancellation and relief from forced collections. However, our analysis
    suggests that these programs may not be reaching many eligible
    consumers. When borrowers do not benefit from these programs, their
    hardship includes, but is not limited to, unnecessary losses to their
    Social Security benefits and negative credit reporting.

    Borrowers who become disabled after reaching full retirement age may miss out on Total and Permanent Disability discharge

    The
    Total and Permanent Disability (TPD) discharge program cancels federal
    student loans and effectively stops all forced collections for disabled
    borrowers who meet certain requirements. After recent revisions to the
    program, this form of cancellation has become common for those borrowers
    with Social Security who became disabled prior to full retirement age. In 2016, a GAO study documented the significant barriers to TPD that Social Security beneficiaries faced.
    To address GAO’s concerns, the Department of Education in 2021 took a
    series of mitigating actions, including entering into a data-matching
    agreement with the Social Security Administration (SSA) to automate the
    TPD eligibility determination and discharge process.
    This process was expanded further with new final rules, implemented July 1,
    2023, that broadened the categories of borrowers eligible for automatic TPD
    cancellation. In total, these changes resulted in loan cancellations for approximately 570,000 borrowers.

    However,
    the automation and other regulatory changes did not significantly
    change the application process for consumers who become disabled after
    they reach full retirement age or who have already claimed the Social
    Security retirement benefits. For these beneficiaries, because they are
    already receiving retirement benefits, SSA does not need to determine
    disability status. Likewise, SSA does not track disability status for
    those individuals who become disabled after they start collecting their
    Social Security retirement benefits.

    Consequently,
    SSA does not transfer information on disability to the Department of
    Education once the beneficiary begins collecting Social Security
    retirement.
    These individuals therefore will not automatically get a TPD discharge
    of their student loans; they must be aware of the program and be
    physically and mentally able to apply for the discharge proactively.

    The
    CFPB’s analysis of the Census survey data suggests that the population
    that is excluded from the TPD automation process could be substantial.
    More than one in five (22 percent) Social Security beneficiaries with
    student loans are receiving retirement benefits and report a disability
    such as a limitation with vision, hearing, mobility, or cognition.
    People with dementia and other cognitive disabilities are among those
    at the greatest risk of being excluded, since they are more likely to
    be diagnosed after age 70, which is the maximum age for claiming
    retirement benefits.

    These
    limitations may also help explain why older borrowers are less likely
    to rehabilitate their defaulted student loans. Specifically, 11 percent
    of student loan borrowers ages 50 to 59 facing forced collections
    successfully rehabilitated their loans, while only 5 percent of borrowers ages 75 and older did so.

    Figure
    4: Number of student loan borrowers ages 50 and older in forced
    collection, borrowers who signed a rehabilitation agreement, and
    borrowers who successfully rehabilitated a loan by selected age groups

    Age Group | Borrowers in Offset | Signed a Rehabilitation Agreement | Percent Who Signed | Successfully Rehabilitated | Percent Who Rehabilitated
    50 to 59 | 265,200 | 50,800 | 14% | 38,400 | 11%
    60 to 74 | 184,900 | 24,100 | 11% | 18,500 | 8%
    75 and older | 15,800 | 1,000 | 6% | 800 | 5%

    Source: CFPB analysis of data provided by the Department of Education.

    Shifting demographics of
    student loan borrowers suggest that the current automation process may
    become less effective at protecting Social Security benefits from forced
    collections as more and more older adults carry student loan debt. The
    fastest-growing segment of student loan borrowers is adults ages 62 and
    older. These individuals are generally eligible for retirement
    benefits, not disability benefits, because they cannot receive both
    classifications at the same time. Data from the Department of Education
    reflect that the number of student loan borrowers ages 62 and older
    increased by 59 percent from 1.7 million in 2017 to 2.7 million in 2023.
    In comparison, the number of borrowers under the age of 62 remained
    unchanged at 43 million in both years.
    Furthermore, additional data provided to the CFPB by the Department of
    Education show that nearly 90,000 borrowers ages 81 and older hold an
    average of $29,000 in federal student loan debt, a substantial amount
    given an estimated average life expectancy of less than nine years.

    Existing exceptions to forced collections fail to protect many Social Security beneficiaries

    In
    addition to TPD discharge, the Department of Education offers reduction
    or suspension of Social Security offset where borrowers demonstrate
    financial hardship.
    To show hardship, borrowers must provide documentation of their income
    and expenses, which the Department of Education then uses to make its
    determination.
    Unlike the Debt Collection Improvement Act’s minimum protections, the
    eligibility for hardship is based on a comparison of an individual’s
    documented income and qualified expenses. If the borrower has eligible
    monthly expenses that exceed or match their income, the Department of
    Education then grants a financial hardship exemption.

    The
    CFPB’s analysis suggests that the vast majority of Social Security
    beneficiaries with student loans would qualify for a hardship
    protection. According to CFPB’s analysis of the Federal Reserve Board’s
    SHED, eight in ten (82 percent) Social Security beneficiaries with
    student loans in default report that their expenses equal or exceed
    their income.
    Accordingly, these individuals would likely qualify for a full
    suspension of forced collections. Yet the GAO found that in 2015 (the most
    recent year for which data were available) fewer than ten percent of
    Social Security beneficiaries with forced collections applied for a
    hardship exemption or reduction of their offset.
    A possible reason for the low uptake rate is that many beneficiaries or
    their caregivers never learn about the hardship exemption or the
    possibility of a reduction in the offset amount.
    For those who do apply, only a fraction get relief. The GAO study found
    that, at the time of their initial offset, only about 20 percent of
    Social Security beneficiaries ages 50 and older with forced collections
    who applied were approved for a financial hardship exemption or a
    reduction of the offset amount.

    Conclusion

    As
    hundreds of thousands of student loan borrowers with loans in default
    face the resumption of forced collection of their Social Security
    benefits, this spotlight shows that the forced collection of Social
    Security benefits causes significant hardship among affected borrowers.
    The spotlight also shows that the basic income protections aimed at
    preventing poverty and hardship among affected borrowers have become
    increasingly ineffective over time. While the Department of Education
    has made some improvements to expand access to relief options,
    especially for those who initially receive Social Security due to a
    disability, these improvements are insufficient to protect older adults
    from the forced collection of their Social Security benefits.

    Taken
    together, these findings suggest that forced collections of Social
    Security benefits increasingly interfere with Social Security’s
    longstanding purpose of protecting its beneficiaries from poverty and
    financial instability. These findings also suggest that alternative
    approaches are needed to address the harm that forced collections inflict
    on beneficiaries and to compensate for the declining effectiveness of
    existing remedies. One potential solution may be found in the Debt
    Collection Improvement Act, which provides that when forced collections
    “interfere substantially with or defeat the purposes of the payment
    certifying agency’s program” the head of an agency may request from the
    Secretary of the Treasury an exemption from forced collections.
    Given the data findings above, such a request for relief from the
    Commissioner of the Social Security Administration on behalf of Social
    Security beneficiaries who have defaulted student loans could be
    justified. Unless the toll of forced collections on Social Security
    beneficiaries is considered alongside the program’s stated goals, the
    number of older adults facing these challenges is only set to grow.

    Data and Methodology

    To
    develop this report, the CFPB relied primarily upon original analysis
    of public-use data from the U.S. Census Bureau Survey of Income and
    Program Participation (SIPP), the Federal Reserve Board’s Survey
    of Household Economics and Decision-making (SHED), the U.S. Department of
    the Treasury’s Fiscal Data portal, consumer complaints received by the
    Bureau, and administrative data on borrowers in default provided by the
    Department of Education. The report also leverages data and findings
    from other reports, studies, and sources, and cites to these sources
    accordingly. Readers should note that estimates drawn from survey data
    are subject to measurement error resulting, among other things, from
    reporting biases and question wording.

    Survey of Income and Program Participation

    The
    Survey of Income and Program Participation (SIPP) is a nationally
    representative survey of U.S. households conducted by the U.S. Census
    Bureau. The SIPP collects data from about 20,000 households (40,000
    people) per wave. The survey captures a wide range of characteristics
    and information about these households and their members. The CFPB
    relied on a pooled sample of responses from 2018, 2019, 2020, and 2021
    waves for a total number of 17,607 responses from student loan borrowers
    across all waves, including 920 respondents with student loans
    receiving Social Security benefits. The CFPB’s analysis relied on the
    public use data. To capture student loan debt, the survey asked all
    respondents (variable EOEDDEBT) whether they owed any money for student
    loans or educational expenses in their own name only during the reference
    period. To capture receipt of Social Security benefits, the survey asked
    all respondents (variable ESSSANY): “Did … receive Social Security
    benefits for himself/herself at any time during the reference period?”
    To capture the amount of Social Security benefits, the survey asked all
    respondents (variable TSSSAMT): “How much did … receive in Social
    Security benefit payment in this month (1-12), prior to any deductions
    for Medicare premiums?”

    The public-use version of the survey dataset, and the survey documentation can be found at: https://www.census.gov/programs-surveys/sipp.html
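
    For readers who want to reconstruct this kind of pooled sample, here is a minimal pandas sketch, assuming hypothetical file names for the 2018 to 2021 public-use waves; the variable names are the ones cited above, but the yes/no response codings are assumptions to verify against the SIPP codebook.

    ```python
    # Hedged sketch of the pooled-sample construction described above, using
    # the SIPP variables named in the text. File names are hypothetical; see
    # the Census Bureau's SIPP documentation for the real public-use layout.
    import pandas as pd

    WAVES = ["sipp_2018.csv", "sipp_2019.csv", "sipp_2020.csv", "sipp_2021.csv"]
    COLS = ["EOEDDEBT", "ESSSANY", "TSSSAMT"]  # debt flag, SS receipt, SS amount

    # Pool the four waves into one analysis file, tagging each row with its wave.
    pooled = pd.concat(
        (pd.read_csv(path, usecols=COLS).assign(wave=path) for path in WAVES),
        ignore_index=True,
    )

    # Restrict to student loan borrowers, then to those receiving Social
    # Security (the value 1 for "yes" is assumed; check the SIPP codebook).
    borrowers = pooled[pooled["EOEDDEBT"] == 1]
    ss_borrowers = borrowers[borrowers["ESSSANY"] == 1]

    print(len(borrowers), "borrower responses across waves")
    print(ss_borrowers["TSSSAMT"].mean(), "average monthly Social Security benefit")
    ```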

    Survey of Household Economics and Decision-making

    The
    Federal Reserve Board’s Survey of Household Economics and
    Decision-making (SHED) is an annual web-based survey of households. The
    survey captures information about respondents’ financial situations. The
    CFPB relied on a pooled sample of responses from 2019 through 2023
    waves for a total number of 1,376 responses from student loan borrowers
    in collection across all waves. The CFPB analysis relied on the public
    use data. To capture default and collection, the survey asked all
    respondents with student loans (variable SL6): “Are you behind on
    payments or in collections for one or more of the student loans from
    your own education?” To capture receipt of Social Security benefits, the
    survey asked all respondents (variable I0_c): “In the past 12
    months, did you (and/or your spouse or partner) receive any income from
    the following sources: Social Security (including old age and DI)?”

    The public-use version of the survey dataset, and the survey documentation can be found at https://www.federalreserve.gov/consumerscommunities/shed_data.htm  
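
    A parallel sketch for the SHED sample follows, again with hypothetical file names and assumed response codings; only the variable names SL6 and I0_c come from the text above.

    ```python
    # Hedged sketch of the SHED subsample described above. Paths and response
    # codings are hypothetical; consult the SHED public-use codebook.
    import pandas as pd

    waves = [f"shed_{year}.csv" for year in range(2019, 2024)]  # 2019-2023
    pooled = pd.concat(
        (pd.read_csv(p, usecols=["SL6", "I0_c"]).assign(year=p) for p in waves),
        ignore_index=True,
    )

    # SL6: behind on payments or in collections for own-education student loans.
    # I0_c: received Social Security income in the past 12 months.
    in_collections = pooled[(pooled["SL6"] == 1) & (pooled["I0_c"] == 1)]
    print(len(in_collections), "beneficiaries with loans behind or in collections")
    ```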

    Appendix
    A: Number of student loan borrowers ages 60 and older, total
    outstanding balance, and average balance by age group, August 2024

    Age Group | Borrower Count (in thousands) | Balance (in billions) | Average balance
    60 to 65 | 1,951.4 | $87.49 | $44,834
    66 to 70 | 909.8 | $39.47 | $43,383
    71 to 75 | 457.5 | $18.95 | $41,421
    76 to 80 | 179.0 | $6.80 | $37,989
    81 to 85 | 59.9 | $1.90 | $31,720
    86 to 90 | 20.1 | $0.51 | $25,373
    91 to 95 | 7.0 | $0.14 | $20,000
    96+ | 2.8 | $0.05 | $17,857

    Source: Data provided by the Department of Education.

    The endnotes for this report are available here

    Source link

  • College Student Satisfaction: Reflecting on 30 Years

    College Student Satisfaction: Reflecting on 30 Years

    College students have changed greatly in 30 years, but how has student satisfaction changed?

    Think back 30 years ago to 1995. What is different for you now? Where were you and what were you doing in the mid 1990s? Perhaps you were still in school and living at home, or not even born yet. Perhaps you were in your early years of working in higher education. Take a moment to reflect on what has (and has not) changed for you in that span of time. 

    Thirty years ago, I was just starting my position at what was then Noel-Levitz. What stands out for me was that I was about to become a mom for the first time. Now my baby is grown and will be a new mom herself later this year. And I find myself being one of the “seasoned professionals” in the company, working alongside members of my team who were still in elementary school back in 1995.

    Thirty years ago, we were just beginning to utilize email and the internet. Now they have become the primary way we do business, communicate professionally, and discover information.  Artificial intelligence (AI) is the new technology that we are learning to embrace to improve our professional and personal lives.   

    Thirty years ago, students were arriving on our campuses, seeking an education, guidance, growth, belonging, value for their investment and ultimately a better life.  That’s still the case today.  Plus, students are navigating more technology options, they are more openly seeking mental health support, and they are living in a world full of distractions. Online learning is a reality now and continues to become more accepted as a modality, especially after the experiences of 2020. As the demographic cliff looms, colleges are expanding their focus to include lifelong learners. 

    Thirty years ago is also when the Student Satisfaction Inventory (SSI) was launched to provide four-year and two-year institutions with a tool to better understand the priorities of their students. (In the early 2000s, we added survey instruments specifically for adult and online populations.) The data identified where the college was performing well and where it mattered for them to do better in order to retain their students to graduation. The concept of looking at satisfaction within the context of the level of importance was new back then, but in the past three decades, it has become the standard for capturing student perceptions. Since 1995, we have worked with thousands of institutions and collected data from millions of individuals, documenting what is important and where students are satisfied or dissatisfied with their experience. As we reach this 30-year milestone for the SSI, I took some time to reflect on what has changed in students’ perceptions and what has stayed the same.

    Consistent priorities

    What stood out to me as I reviewed the national data sets over the past 30 years is that what matters to students has largely stayed the same. Students continue to care about good advising, quality instruction and getting access to classes. The academic experience is highly valued by students and is the primary reason they are enrolled, now and then. 

    Another observation is that there are two areas that have been consistent priorities for improvement, especially at four-year private and public institutions:

    • Tuition paid is a worthwhile investment.
    • Adequate financial aid is available for most students. 

    These two items have routinely appeared as national challenges (areas of high importance and low satisfaction) over the decades, which shows that institutions continue to have opportunities to communicate value and address the financial pain points of students to make higher education accessible and affordable. 

    Campus climate is key

    One thing we have learned over the past thirty years is that how students feel on campus is key to student success and retention. The research reflects the strongest links between students’ sense of belonging, feeling welcome, and enjoying their campus experience and their overall levels of satisfaction. High levels of satisfaction are linked to individual student retention and institutional graduation rates. Campuses that want to best influence students remaining enrolled are being intentional with efforts to show concern for students individually, building connections between students from day one, and continuing those activities as students progress each year. It is important for institutions to recognize that students have lots of options to receive a quality education, but the environment and the potential student “fit” is more likely to vary from location to location. What happens while a student is at the college they have selected is more impactful on them than which institution they ultimately chose. Creating welcoming environments and supporting students’ sense of belonging in the chosen college is a way for institutions to stand out and succeed in serving students. Colleges often ask, “Why do students leave?” when they could be asking, “Why do students stay?” Building positive campus cultures and expanding the “good stuff” being done for students is a critical way to improve student and institutional success.

    One sector where the data reflect high satisfaction scores and good consistency, especially in the past five years since the pandemic, is community colleges. Students attending their (often local) two-year institutions want to be there, with high percentages of students indicating the school is their first choice.  Community college students nationally indicate areas such as the campus staff being caring/helpful, students being made to feel welcome, and people on the campus respecting each other, as strengths (high importance and high satisfaction). These positive perceptions are also reflected with overall high levels of satisfaction and indications of a likelihood to re-enroll if the student had it to do over again. The data indicate that two-year institutions are doing a nice job of building a sense of community among primarily commuter student populations. 

    Systemic issues and pockets of improvement

    Everyone talks about “kids today,” but in reality, they have been doing that for generations. It can’t be a reason not to change and respond appropriately to the needs of current students. When we consider the priorities for improvement in higher education that have remained at the forefront, we may need to recognize that some of these areas are systemic to higher education, along with recognizing that higher education generally has not done enough to respond. There are certainly pockets of improvement at schools that have prioritized being responsive and, as a result, are seeing positive movement in student satisfaction and student retention, but that is not happening everywhere. Taking action based on student feedback is a powerful way to influence student success. The campuses that have bought into that concept are seeing the results. 

    Current student satisfaction national results

    Want to learn more about the current trends in student satisfaction? I invite you to download the 2024 National Student Satisfaction and Priorities Report.

    This year’s analysis takes a closer look at the national results by demographic subpopulations, primarily by class level, to get a clearer view on how to improve the student experience. Institutions have found that targeting initiatives for particular student populations can be an effective way to have the biggest impact on student satisfaction. Download your free copy today.

    Source link

  • Cosmetologists can’t shoot a gun? FIRE ‘blasts’ tech college for punishing student over target practice video

    Cosmetologists can’t shoot a gun? FIRE ‘blasts’ tech college for punishing student over target practice video

    Language can be complicated. According to Merriam-Webster, the verb “blast” has as many as 15 different meanings — “to play loudly,” “to hit a golf ball out of a sand trap with explosive force,” “to injure by or as if by the action of wind.”

    Recently, the word has added another definition to the list. Namely, “to attack vigorously” with criticism, as in, “to blast someone online” or “to put someone on blast.” This usage has become a common expression.

    That’s what Leigha Lemoine, a student at Horry-Georgetown Technical College, meant when she posted in a private Snapchat group that a non-student who had insulted her needed to get “blasted.” 

    But HGTC’s administration didn’t see it that way. When some students claimed they felt uncomfortable with Lemoine’s post, the college summoned her to a meeting. Lemoine explained that the post was not a threat of physical harm, but rather a simple expression of her belief that the person who had insulted her should be criticized for doing so. The school’s administrators agreed and concluded there was nothing threatening in her words.

    But two days later, things took a turn. Administrators discovered a video on social media of Lemoine firing a handgun at a target. The video was recorded off campus a year prior to the discovery and had no connection to the “blasted” comment, but because she had not disclosed the video’s existence (why would she be required to?), the college decided to suspend her until the 2025 fall semester. Adding insult to injury, HGTC indicated Lemoine would be on disciplinary probation when she returned.

    Screenshots of Leigha Lemoine’s video on social media.

    HGTC administrators claim Lemoine’s post caused “a significant amount of apprehension related to the presence and use of guns.” 

    “In today’s climate, your failure to disclose the existence of the video, in conjunction with group [sic] text message on Snapchat where you used the term ‘blasted,’ causes concern about your ability to remain in the current Cosmetology cohort,” the college added.

    Never mind the context of the gun video, which had nothing to do with campus or the person she said needed to get “blasted.” HGTC was determined to jeopardize Lemoine’s future over one Snapchat message and an unrelated video. 

    FIRE wrote to HGTC on Lemoine’s behalf on Oct. 7, 2024, urging the college to reverse its disciplinary action against Lemoine. We pointed out the absurdity of taking Lemoine’s “blasted” comment as an unprotected “true threat” and urged the college to rescind her suspension. Lemoine showed no serious intent to commit unlawful violence with her comment urging others to criticize an individual, and tying the gun video to the comment was both nonsensical and deeply unjust. 

    But HGTC attempted to blow FIRE off and plowed forward with its discipline. So we brought in the big guns — FIRE Legal Network member David Ashley at Le Clercq Law Firm took on the case, filing an emergency motion for a temporary restraining order. On Dec. 17, a South Carolina federal district court ordered HGTC to allow her to return to classes immediately while the case works its way through the courts.

    Jokes and hyperbole are protected speech

    Colleges and universities must take genuine threats of violence on campus seriously. That sometimes requires investigations and quick institutional action to ensure campus safety. But HGTC’s treatment of Lemoine is the latest in a long line of colleges misusing the “true threats” standard to punish clearly protected speech — remarks or commentary that are meant as jokes, hyperbole, or otherwise unreasonable to treat as though they are sincere. 

    Take over-excited rhetoric about sports. In 2022, Meredith Miller, a student at the University of Utah, posted on social media that she would detonate the nuclear reactor on campus (a low-power educational model with a microwave-sized core that one professor said “can’t possibly melt down or pose any risk”) if the football team lost its game. Campus police arrested her, and the Salt Lake County District Attorney’s Office charged her with making a terroristic threat.

    The office eventually dropped the charge, but the university tried doubling down by suspending her for two years. It was only after intervention from FIRE and an outside attorney that the university relented. But that it took such significant outside pressure — especially over a harmless joke that was entirely in line with the kind of hyperbolic rhetoric one expects in sports commentary — reveals how dramatically the university overreacted.

    Political rhetoric is often targeted as well. In 2020, Babson College professor Asheen Phansey found himself in hot water after posting a satirical remark on Facebook. After President Trump tweeted a threat that he might bomb 52 Iranian cultural sites, Phansey jokingly suggested that Iran’s leadership should publicly identify a list of American cultural heritage sites it wanted to bomb, including the “Mall of America” and the “Kardashian residence.” Despite FIRE’s intervention, Babson College’s leadership suspended Phansey and then fired him less than a day later. 

    Or consider an incident in which Louisiana State University fired a graduate instructor who left a heated, profanity-laced voicemail for a state senator in which he criticized the senator’s voting record on trans rights. The senator reported the voicemail to the police, who investigated and ultimately identified the instructor. The police closed the case after concluding that the instructor had not broken the law. You’re supposed to be allowed to be rude to elected officials. LSU nevertheless fired him.

    More examples of universities misusing the true threats standard run the political gamut: A Fordham student was suspended for a post commemorating the anniversary of the Tiananmen Square massacre; a professor was placed on leave for a social media post supporting a police officer who attacked a journalist; an adjunct instructor had his hiring revoked for wishing for President Trump’s assassination; another professor posted on Facebook supporting Antifa, was placed on leave, and then sued his college. Too often, the university discipline is made more egregious by the fact that administrators continue to use the idea of “threatening” speech to punish clearly protected expression even after local police departments conclude that the statements in question were not actually threatening.

    What is a true threat?

    Under the First Amendment, a true threat is defined as a statement where “the speaker means to communicate a serious expression of an intent to commit an act of unlawful violence to a particular individual or group of individuals.” 

    That eliminates the vast majority of threatening speech you hear each day, and for good reason. One of the foundational cases for the true threat standard is Watts v. U.S., in which the Supreme Court ruled that a man’s remark about his potential draft into the military — “If they ever make me carry a rifle, the first man I want to get in my sights is LBJ” — constituted political hyperbole, not a true threat. The Court held that such statements are protected by the First Amendment. And rightfully so: Political speech is where the protection of the First Amendment is “at its zenith.” An overbroad definition of threatening statements would lead to the punishment of political advocacy. Look no further than controversies in the last year and a half over calls for genocide to see how wide swathes of speech would become punishable if the standard for true threats were lower.

    Colleges and universities would do well to take Lemoine’s case as a reminder to safeguard the expressive freedoms associated with humor and hyperbolic statements. Because make no mistake, FIRE will continue to blast the ones that don’t.

    Source link

  • Five areas of focus for student equity in CTE completion

    Five areas of focus for student equity in CTE completion

    Career and technical education can support students’ socioeconomic mobility, but inequitable completion rates for students of color leave some behind.

    Career and technical education programs have grown more popular among prospective students as ways to advance socioeconomic mobility, but they can have inequitable outcomes across student demographics.

    A December report from the Urban Institute offers best practices in supporting students of color as they navigate their institution, including in advising, mentoring and orientation programming.

    Researchers identified five key themes in equity-minded navigation strategies that can impact student persistence and social capital building, as well as future areas for consideration at other institutions.

    The background: The Career and Technical Education CoLab (CTE CoLab) Community of Practice is a group led by the Urban Institute to improve education and employment outcomes for students of color.

    In February and May 2024, the Urban Institute invited practitioners from four colleges—Chippewa Valley Technical College in Wisconsin, Diablo Valley College in California, Wake Technical Community College in North Carolina and WSU Tech in Kansas—to virtual roundtables to share ideas and practices. The brief includes insights from the roundtables and related research, as well as an in-person convening in October 2024 with college staff.

    “Practitioners and policymakers can learn from this knowledge and experience from the field to consider potential strategies to address student needs and improve outcomes for students of color and other historically marginalized groups,” according to the brief authors.

    Strategies for equity: The four colleges shared how they target and support learners with navigation, including:

    • Using data to identify student needs, whether academic, basic needs or job- and career-focused. Data collection includes tracking success metrics such as completion and retention rates, as well as student surveys. Practitioners noted the need to do this early in the student experience—like during orientation—to help connect students directly with resources, particularly for learners in short courses. “Surveying students as part of new student orientation also provides program staff immediate information on the current needs of the student population, which may change semester to semester,” according to the report.
    • Reimagining their orientation processes to acclimate first-year students and ensure students are aware of resources. Chippewa Valley Technical College is creating an online, asynchronous orientation for one program, and Diablo Valley College is leveraging student interns to collect feedback on a new orientation program for art digital media learners. Some future considerations practitioners noted are ways to incentivize participation or attendance in these programs to ensure equity and how to engage faculty to create relationships between learners and instructors.
    • Supporting navigation in advising, mentoring and tutoring to help students build social capital and connections within the institution. Colleges are considering peer mentoring and tutoring programs that are equity-centered, and one practitioner suggested implementing a checklist for advisers to highlight various resources.
    • Leveraging existing initiatives and institutional capacity to improve navigation and delivery of services to students, such as faculty training. One of the greatest barriers in this work is effecting change across the institution to shift culture, operations, structures and values for student success, particularly when it disrupts existing norms. To confront this, practitioners identify allies and engage partners across campus who are aligned in their work or vision.
    • Equipping faculty members to participate in navigation through professional development support. Community colleges employ many adjunct faculty members who may be less aware of supports available to students but still play a key role in helping students navigate the institution. Adjuncts can also have fewer contract hours available for additional training or development, which presents challenges for campus leaders. Diablo Valley College revised its onboarding process for adjuncts to guarantee they have clear information on college resources available to students and student demographic information to help these instructors feel connected to the college.

    Do you have an academic intervention that might help others improve student success? Tell us about it.

    Source link