Finding the right accommodation is one of the most important decisions facing university students, especially in cities like Melbourne, where enrolments are high and housing supply is limited. Currently, the market offers a range of options, each differing in cost, support services, and overall stability.
For many, student housing in Melbourne is about more than proximity to campus. It’s also about access to a secure, well-managed environment that promotes academic progress and social well-being.
To help with this decision, here’s a breakdown of some of the most common housing models and how they align with students’ needs.
Purpose-built student accommodation
For students balancing academic demands with independent living, accommodation designed specifically for study and support can offer greater stability. This is the approach taken by Journal Student Living. It combines private rooms with shared study, kitchen, and recreational facilities, supported by on-site staff and secure building access.
At Campus House, students live just 20 metres from the University of Melbourne, 150 metres from Trinity College, and 850 metres from RMIT, with easy access to nearby institutions. The building also includes dedicated study zones, rooftop gardens, and communal areas designed to support focused study and social connection.
University-operated housing
Many universities offer accommodation either directly or through affiliated providers, often located near campus. These options provide convenience and a built-in student community. However, places are limited, applications are competitive, and inclusions vary by provider.
Shared living arrangements
Shared living is common for students, especially those moving in with friends or joining an existing flat. While it can seem cheaper upfront, it often comes with split bills, unclear responsibilities, and limited privacy. There’s also no formal support, which can make daily life harder for students settling into a new city.
As a new Journal Student Living location opening in 2026, Market Way offers a purpose-built alternative to shared living. It provides furnished rooms, dedicated study areas, social spaces, and on-site support, all covered by one weekly fee that includes internet, utilities, and building access.
The building is also centrally located, just 380 metres from RMIT and close to other major institutions. This makes it easier to stay connected to classes and campus life.
Private market rentals
Renting through the private market gives students full control over where and how they live, but it also means managing everything independently. Lease terms are often rigid, with tenants responsible for bills, maintenance, and any disputes.
For students balancing assignments and deadlines, this can add unnecessary stress. Availability can also be limited near major campuses, and students without a rental history may struggle to secure a lease.
Journal Student Living provides a simpler option, with move-in-ready rooms available in a range of layouts. Options include studios, suites, and two-, three-, and four-bedroom ensuite apartments. All rooms are fully furnished and located close to major universities, helping students stay focused without the complications of renting privately.
Compare options and find what fits
Students have access to a range of accommodation types, but not all offer the same level of support, comfort, or convenience. For those looking for well-located, move-in-ready housing with community and privacy built in, Journal Student Living offers a purpose-built model that addresses the gaps found in other types of housing.
To learn more about availability, room types, and support services, visit the Journal Student Living website.
Institutions spend a lot of time surveying students about their learning experience, but once you have crunched the numbers, the hard part is working out the “why.”
The qualitative information institutions collect is a goldmine of insight about the sentiments and specific experiences that are driving the headline feedback numbers. When students are especially positive, it helps to know why, to spread that good practice and apply it in different learning contexts. When students score some aspect of their experience negatively, it’s critical to know the exact nature of the perceived gap, omission or injustice so that it can be fixed.
Any conscientious module leader will run their eye down the student comments in a module feedback survey – but once you start looking across modules to programme or cohort level, or to large-scale surveys like NSS, PRES or PTES, the scale of the qualitative data becomes overwhelming for the naked eye. Even the most conscientious reader will find that bias sets in, as comments that are interesting or unexpected tend to be foregrounded as having greater explanatory power over those that seem run-of-the-mill.
Traditional coding methods for qualitative data require someone – or ideally more than one person – to manually break down comments into clauses or statements that can be coded for theme and sentiment. It’s robust, but incredibly laborious. For student survey work, where the goal might be to respond to feedback and make improvements at pace, institutions openly acknowledge that this kind of robust analysis is rarely, if ever, standard practice. Especially as resources become more constrained, devoting hours to this kind of detailed methodological work is rarely a priority.
Let me blow your mind
That is where machine learning technology can genuinely change the game. Student Voice AI was founded by Stuart Grey, an academic at the University of Strathclyde (now working at the University of Glasgow), initially to help analyse student comments for large engineering courses. Working with Advance HE, he was able to train the machine learning model on national PTES and PRES datasets. Now, having further trained the algorithm on NSS data, Student Voice AI offers same-day analysis of student comments on NSS results for subscribing institutions.
Put the words “AI” and “student feedback” in the same sentence and some people’s hackles will immediately rise. So Stuart spends quite a lot of time explaining how the analysis works. The term he uses to describe the version of machine learning Student Voice AI deploys is “supervised learning” – humans manually label categories in datasets and “teach” the machine about sentiment and topic. The larger the available dataset, the more examples the machine is exposed to and the more sophisticated it becomes. Through this process Student Voice AI has landed on a discrete set of comment themes and categories for taught students, and the same for postgraduate research students, into which the majority of student comments consistently fall – trained on, and distinctive to, UK higher education student data. Stuart adds that the categories can and do evolve:
“The categories are based on what students are saying, not what we think they might be talking about – or what we’d like them to be talking about. There could be more categories if we wanted them, but it’s about what’s digestible for a normal person.”
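To make the “supervised learning” idea concrete, here is a minimal from-scratch sketch of the general technique (a tiny multinomial Naive Bayes classifier). Every comment, category name and label below is invented for illustration; Student Voice AI’s actual model and categories are trained on far larger national PTES, PRES and NSS datasets, and this is not a description of their implementation.

```python
from collections import Counter, defaultdict
import math

# Hypothetical labelled examples: (comment text, topic label, sentiment label).
# In supervised learning, humans supply these labels and the model
# generalises from them to unseen comments.
train = [
    ("feedback on my essays was detailed and fast", "assessment_feedback", "positive"),
    ("marking took months and comments were vague", "assessment_feedback", "negative"),
    ("the library is open late and well stocked", "learning_resources", "positive"),
    ("could never find the readings online", "learning_resources", "negative"),
]

def tokens(text):
    return text.lower().split()

def train_nb(examples, label_index):
    """Fit a tiny multinomial Naive Bayes model: count words per label."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for row in examples:
        label = row[label_index]
        label_counts[label] += 1
        for w in tokens(row[0]):
            word_counts[label][w] += 1
            vocab.add(w)
    return word_counts, label_counts, vocab

def predict(text, model):
    """Pick the label with the highest (Laplace-smoothed) log probability."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in tokens(text):
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Separate models for topic and sentiment, trained from the same labelled data.
topic_model = train_nb(train, 1)
sentiment_model = train_nb(train, 2)

comment = "the feedback comments on assignments were vague"
print(predict(comment, topic_model), predict(comment, sentiment_model))
# → assessment_feedback negative
```

The key property this sketch shares with the real system is that the categories come from human-labelled data rather than from the model’s own guesses, which is what distinguishes supervised learning from asking a generative model to invent themes.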
In practice that means that institutions can see a quantitative representation of their student comments, sorted by category and sentiment. You can look at student views of feedback, for example, and see the balance of positive, neutral and negative sentiment overall, segment it by department, subject area or year of study, then click through to the relevant comments to see what’s driving that feedback. That’s significantly different from, say, dumping your student comments into a third-party generative AI platform (sharing confidential data with a third party while you’re at it) and asking it to summarise. There’s value in the time and effort saved, but also in the removal of individual personal bias, and in the potential for aggregation and segmentation for different stakeholders in the system. It also becomes possible to compare student qualitative feedback across institutions.
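The aggregation-and-segmentation step is simple once comments carry labels. A small sketch, with invented departments, categories and sentiment labels standing in for whatever a trained model would output:

```python
from collections import Counter

# Hypothetical labelled comments: (department, category, sentiment).
# In a real pipeline these labels would come from the trained model.
labelled = [
    ("Physics", "assessment_feedback", "negative"),
    ("Physics", "assessment_feedback", "positive"),
    ("Physics", "learning_resources", "positive"),
    ("History", "assessment_feedback", "negative"),
    ("History", "assessment_feedback", "negative"),
]

def sentiment_breakdown(rows, category, department=None):
    """Count sentiment for one category, optionally filtered to one department."""
    return Counter(
        sentiment
        for dept, cat, sentiment in rows
        if cat == category and (department is None or dept == department)
    )

print(sentiment_breakdown(labelled, "assessment_feedback"))
# → Counter({'negative': 3, 'positive': 1})
print(sentiment_breakdown(labelled, "assessment_feedback", department="History"))
# → Counter({'negative': 2})
```

The same filter-then-count pattern supports any segmentation (year of study, subject area, cohort), which is what makes the dashboard-style drill-down described above possible.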
Now, Student Voice AI is partnering with student insight platform evasys to bring machine learning technology to qualitative data collected via the evasys platform. And evasys and Student Voice AI have been commissioned by Advance HE to code and analyse open comments from the 2025 PRES and PTES surveys – creating opportunities to drill down into a national dataset that can be segmented by subject discipline and theme as well as by institution.
Bruce Johnson, managing director at evasys, is enthusiastic about the potential for the technology to drive culture change in how student feedback is used to inform insight and action across institutions:
“When you’re thinking about how to create actionable insight from survey data the key question is, to whom? Is it to a module leader? Is it to a programme director of a collection of modules? Is it to a head of department or a pro vice chancellor or the planning or quality teams? All of these are completely different stakeholders who need different ways of looking at the data. And it’s also about how the data is presented – most of my customers want, not only quality of insight, but the ability to harvest that in a visually engaging way.”
“Coming from higher education it seems obvious to me that different stakeholders have very different uses for student feedback data,” says Stuart Grey. “Those teaching at the coalface are interested in student engagement; at the strategic level the interest is in trends and sentiment analysis; and there are also various stakeholder groups in professional services who never normally get to see this stuff, but we can generate the reports that show them what students are saying about their area. Frequently the data tells them something they knew anyway, but it gives them the ammunition to be able to make change.”
The results are in
Duncan Berryman, student surveys officer at Queen’s University Belfast, sums up the value of AI analysis for his small team: “It makes our life a lot easier, and the schools get the data and trends quicker.” Previously schools had been supplied with Excel spreadsheets – and his team were spending a lot of time explaining and working through with colleagues how to make sense of the data on those spreadsheets. Being able to see a straightforward visualisation of student sentiment on the various themes means that, as Duncan observes rather wryly, “if change isn’t happening it’s not just because people don’t know what student surveys are saying.”
Parama Chaudhury, professor of economics and pro vice provost education (student academic experience) at University College London, explains where qualitative data analysis sits in the wider ecosystem for quality enhancement of teaching and learning. In her view, for enhancement purposes, comparing your quantitative student feedback scores to those of another department is not particularly useful – essentially it’s comparing apples with oranges. Yet the apparent ease of comparability of quantitative data, compared with the sense of overwhelm at the volume and complexity of student comments, can mean that people spend time trying to explain the numerical differences, rather than mining the qualitative data for more robust and actionable explanations that can give context to their own scores.
It’s not that people weren’t working hard on enhancement, in other words, but they didn’t always have the best possible information to guide that work. “When I came into this role quite a lot of people were saying ‘we don’t understand why the qualitative data is telling us this, we’ve done all these things,’” says Parama. “I’ve been in the sector a long time and have received my share of summaries of module evaluations and have always questioned those summaries because it’s just someone’s ‘read.’ Having that really objective view, from a well-trained algorithm makes a difference.”
UCL has tested two-page summaries of student comments for specific departments this academic year, and plans to roll out a version for every department this summer. The data is not assessed in a vacuum; it forms part of the wider institutional quality assurance and enhancement processes, which include data from a range of different perspectives on areas for development. Encouragingly, so far the data from students is consistent with what has emerged from internal reviews, giving the departments that have had the opportunity to engage with it greater confidence in their processes and action plans.
None of this stops anyone from going and looking at specific student comments, sense-checking the algorithm’s analysis and/or triangulating against other data. At the University of Edinburgh, head of academic planning Marianne Brown says that the value of the AI analysis is in the speed of turnaround – the institution carries out a manual review process to be sure that any unexpected comments are picked up. But being able to share the headline insight at pace (in this case via a PowerBI interface) means that leaders receive the feedback while the information is still fresh, and the lead time to effect change is longer than if time had been lost to manual coding.
The University of Edinburgh is known for its cutting-edge AI research and boasts the Edinburgh (access to) Language Models (ELM), a platform that gives staff and students access to generative AI tools without sharing data with third parties, keeping all user data onsite and secured. Marianne is clear that even a closed system like ELM is not appropriate for unfettered student comment analysis. Generative AI platforms offer the illusion of thematic analysis, but the result is far from robust, because generative AI operates through sophisticated guesswork rather than analysis of the implications of actual data. “Being able to put responses from NSS or our internal student survey into ELM to give summaries was great, until you started to interrogate those summaries. Robust validation of any output is still required,” says Marianne. Similarly, Duncan Berryman observes: “If you asked a gen-AI tool to show you the comments related to the themes it had picked out, it would not refer back to actual comments. Or it would have pulled this supposed common theme from just one comment.”
The holy grail of student survey practice is creating a virtuous circle: student engagement in feedback creates actionable data, which leads to education enhancement, and students gain confidence that the process is authentic and are further motivated to share their feedback. In that quest, AI, deployed appropriately, can be an institutional ally and resource-multiplier, giving fast and robust access to aggregated student views and opinions. “The end result should be to make teaching and learning better,” says Stuart Grey. “And hopefully what we’re doing is saving time on the manual boring part, and freeing up time to make real change.”
ROCHESTER, N.Y., June 18, 2025 — The Foundation for Individual Rights and Expression is urging the University of Rochester to reinstate an Eastman School of Music student who was expelled after blowing the whistle on a professor who sexually harassed her.
The case lays bare a university system that moved quickly to protect itself at the expense of a student’s right to voice criticisms — even though an internal investigation found the professor responsible for violating the harassment policy.
“There was no due process or hearing,” the student, Rebecca Bryant Novak, said. “The university’s administrators were more concerned about protecting the faculty than adhering to their own rules and addressing bad behavior. They basically tried to destroy my career beyond all comprehension.”
Shortly into her first semester as a Ph.D. student in fall 2023, Bryant Novak complained about abusive behavior by a professor who she said would scream at students and make lewd, sexist comments.
After a yearlong investigation, a panel of faculty and administrators agreed that the professor had indeed violated Rochester’s harassment policy and that Eastman’s Title IX coordinator had grossly mishandled her complaint.
Despite all this, Eastman allowed the same school authorities to retain oversight of Bryant Novak’s academic trajectory — with one official telling her that the school restricted her performance times because of her complaint against the professor.
When Bryant Novak complained, Eastman did nothing. As a result of the alleged retaliation, Rochester opened a second investigation into Eastman’s mishandling of the situation in December 2024, and Bryant Novak publicly disclosed the university’s new investigation in a Substack article on Feb. 10.
Two weeks later, Eastman abruptly expelled Bryant Novak, citing a failure to make academic progress. In doing so, the school ignored its written policy that calls for students to be given ample notice if they are in danger of falling short of academic standards.
“Rebecca’s expulsion smacks of retaliation for speech that is explicitly protected by the university’s policy,” FIRE Program Counsel Jessie Appleby said. “This is a profound violation of her free speech rights and sends a chilling message to every student at Eastman.”
FIRE is calling on university President Sarah C. Mangelsdorf to immediately reinstate Bryant Novak and ensure that she is able to complete her doctorate under the oversight of Eastman faculty and officials who are not already subject to investigation for misconduct in her case.
“I hope that by taking a stand here, I can help force Rochester to extend the kinds of protections to other students that were denied to me,” Bryant Novak said.
The Foundation for Individual Rights and Expression (FIRE) is a nonpartisan, nonprofit organization dedicated to defending and sustaining the individual rights of all Americans to free speech and free thought — the most essential qualities of liberty. FIRE educates Americans about the importance of these inalienable rights, promotes a culture of respect for these rights, and provides the means to preserve them.
CONTACT:
Karl de Vries, Director of Media Relations, FIRE: 215-717-3473; [email protected]
Let me tell you about Andrew, a motivated student who graduated high school early with impressive dual-enrollment credits. After attending a private college for a year and taking some time to work, he rekindled his educational ambitions at a community college. With approximately 30 credits remaining for his bachelor’s degree, he applied to an R-1 university, ready to complete his journey.
What should have been a seamless transition became an unexpected challenge. Despite submitting his transfer work in October and regularly checking in with his adviser, Andrew discovered in January—after classes had already begun—that he faced “at least three years of coursework” rather than the anticipated single year to graduation.
This isn’t a rare occurrence or some administrative anomaly. Rather, it is the norm for individuals who aren’t pursuing a four-year degree on the traditional timeline. Higher education talks endlessly about completion and student success while maintaining systems and policies that actively undermine these goals.
Andrew’s story represents a critical opportunity for higher education. While his family successfully advocated for a refund and found another institution that better recognized his prior learning, his experience highlights a fundamental challenge we must address collectively.
The Scale of the Challenge
We have 42 million Americans with some college credit but no degree. We have 200,000 military personnel transitioning to civilian life annually. We have an economy desperately needing upskilled workers. Yet higher education’s response to credit mobility remains anchored in outdated policies and processes that fail to serve today’s students, institutions or workforce needs.
Many institutions have made meaningful progress in supporting diverse student needs through childcare services, flexible scheduling and online options. These are important steps. Now we must extend this same commitment to the academic evaluation processes that directly impact students’ time to degree and financial investment.
The Disconnect
Transfer articulation agreements—where they have been struck—have created valuable pathways, but their implementation often lacks the consistency and transparency students deserve. When agreements include qualifying language without firm commitments, students can’t effectively plan their educational journeys or make informed financial decisions.
The contradiction is striking: We express concern about student debt and extended time to degree, questioning why students take 150 credits when they only need 120 to graduate. Meanwhile, our credit evaluation processes remain opaque, slow and often costly.
The current reality—where students frequently must apply, pay deposits or even enroll before understanding how their previous academic work will be valued—creates unnecessary barriers. We can do better—and, frankly, must. It’s like buying a car and finding out the price after you’ve signed the paperwork. In what other industry would this be acceptable?
The Opportunity
Consider the possibilities if we fully embraced credit mobility as a cornerstone of student success:
Students could make informed decisions about their educational pathways before committing financially.
Institutions could demonstrate their commitment to affordability by recognizing prior learning.
Graduation rates would improve as students avoid unnecessary course repetition.
The workforce would benefit from skilled professionals entering more quickly.
Addressing the Objections
The objections to credit mobility typically fall into three categories:
Faculty workload: Faculty are being asked to do more, and evaluating credits for prospective students can feel like an unnecessary burden. But what if more students could see that their learning had value, that their degree was within reach, that they didn’t have to retake classes they’ve already mastered? This shift in perspective could transform the evaluation process from a burden to an opportunity.
Lost revenue: The focus on enrollments often overshadows the reality that only 50 percent of students who start college actually finish within six years. What if our goal was to expand opportunities so more students could complete their degrees? What if students were taking classes that genuinely added to their experience and built their confidence rather than repeating content they’ve already learned?
Quality concerns: Quality is often cited as justification for delayed evaluation. In reality, transparent evaluation supports faculty’s desire to maintain academic standards. Clear processes allow for informed decisions and data collection that ensures the focus remains on student outcomes.
The AI Opportunity
The emergence of artificial intelligence presents a tremendous opportunity to enhance our credit-evaluation processes—addressing issues of time and cost while creating transparency for data analysis. A new study just released by AACRAO on the role of AI in credit mobility makes a compelling case as to why the technology could help unlock new ways of working. We can harness technology as a powerful tool to support faculty decision-making and administrative resource allocations. AI could:
Identify potential course equivalencies based on learning outcomes.
Highlight relevant information in transfer documentation.
Streamline evaluation processes, allowing human experts to focus on complex cases.
Provide leadership with insights into where credit mobility is operating effectively.
Identify areas needing additional resources or training.
With proper implementation and training, AI can become a tool to achieve our goals of access and completion at scale—reducing both the cost and timeline to graduation.
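As a sketch of the first capability in the list above, matching courses on learning outcomes, here is a toy keyword-overlap matcher. The courses, outcome keywords and threshold are all invented; a real system would use far richer representations than keyword sets (for example, text embeddings of full syllabi) and would surface candidates for faculty review rather than deciding equivalency itself.

```python
# Hypothetical catalog mapping course titles to learning-outcome keywords.
catalog = {
    "MATH 201 Linear Algebra": {
        "matrices", "vector", "spaces", "eigenvalues", "linear", "maps",
    },
    "STAT 110 Intro Statistics": {
        "probability", "distributions", "inference", "regression",
    },
}

def jaccard(a, b):
    """Overlap of two keyword sets: size of intersection over size of union."""
    return len(a & b) / len(a | b)

def suggest_equivalents(transfer_outcomes, catalog, threshold=0.4):
    """Flag catalog courses whose outcomes overlap enough for human review."""
    return sorted(
        course
        for course, outcomes in catalog.items()
        if jaccard(transfer_outcomes, outcomes) >= threshold
    )

# Outcomes extracted from a hypothetical incoming transcript.
incoming = {"matrices", "vector", "spaces", "linear", "maps", "determinants"}
print(suggest_equivalents(incoming, catalog))
# → ['MATH 201 Linear Algebra']
```

The point of the sketch is the division of labour: the machine narrows thousands of possible pairings down to a shortlist, and faculty judgment is reserved for the genuinely ambiguous cases.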
The Path Forward
If we truly believe in access and completion, then credit mobility must become a shared priority across higher education. This means:
Making course information, learning outcomes and sample syllabi readily accessible.
Expanding recognition of diverse learning experiences, including microcredentials, corporate training, internships and apprenticeships.
Establishing and honoring clear timelines for credit evaluation.
Eliminating financial barriers to credit assessment.
Providing updated articulation and equivalency tables in easy-to-find locations on admissions websites.
Andrew’s experience should be the exception, not the rule. Colleges and universities that embrace this challenge will not only better serve their students but will also position themselves for long-term sustainability in an increasingly competitive landscape. Those that resist change risk becoming irrelevant to the very students they aim to serve and perpetuating the cost and time-to-completion conundrum.
The Call to Action
The question before us isn’t whether credit mobility matters—it’s whether we have the collective will to make it a reality at scale, not just at a handful of institutions, but across systems and all institutions. We must recognize that our students are learning in new ways, on new timelines, and bringing knowledge that evolves faster than our curriculum. Our students deserve nothing less than our full commitment to recognizing their learning, regardless of where it occurred.
So I’ll ask: How committed are you to credit mobility at scale? Your answer says everything about how seriously you take college completion.
Jesse Boeding is the co-founder of Education Assessment System, an AI-powered platform mapping transfer, microcredentials and prior learning to an institution’s curriculum to enable decision-making and resourcing.
A growing number of programs in higher education focus on student athletes’ mental health, recognizing that the pressures of competing in collegiate athletics, combined with academic challenges, financial concerns and team relationships, can negatively impact student well-being.
At the University of Richmond, the athletics department created a new program to emphasize holistic student well-being, taking into account the different dimensions of a student athlete’s identity and development.
Spider Performance, named after the university mascot, unites various stakeholders on campus to provide a seamless experience for student athletes, ensuring they’re properly equipped to tackle challenges on the field, in the classroom and out in the world beyond college.
“The athlete identity is a really special part of [students’ identities], but it’s not the only part, so making sure they are [considered] human beings first—even before they’re students, they’re humans first. Let’s examine and explore that identity,” said Lauren Wicklund, senior associate athletics director for leadership and student-athlete development.
How it works: The university hosts 17 varsity sports in NCAA Division I, which include approximately 400 student athletes. Richmond has established four pillars of the student athlete experience: athletic, academic, personal and professional achievement.
“The whole concept is to build champions for life,” said Wicklund, who oversees the program. “It’s not just about winning in sport; it’s about winning in the classroom, winning personally and then getting the skills and tools to win for the rest of your life.”
These pillars have driven programming in the athletics department for years, but their messaging and implementation created confusion.
Now, under Spider Performance, the contributions and collaborations of stakeholders who support student athletes are more visible and defined, clarifying the assistance given to the athletes and demonstrating the program’s value to recruits. The offices in Spider Performance include academic support, sports medicine, leadership, strength and conditioning, mental health, and well-being.
“It’s building a team around them,” Wicklund explained. “Rather than our student athlete thinking, ‘I have to go eat here, I have to do my homework here, I have to do my workout here,’ it’s, ‘No, we want you to win at everything you do, and how you do one thing is how you do everything.’”
Outside of the specific athletic teams, Wicklund and her staff collaborate with other campus entities including faculty members, career services and co-curricular supports.
Preparing for launch: Richmond facilitates a four-year development model for student athletes, starting with an orientation experience for first-year students that helps them understand their strengths and temperament, up to more career-focused programming for seniors.
Recognizing how busy students’ schedules get during their athletic season, the university has also created other high-impact learning experiences that are more flexible and adaptive. Students can engage in a career trek to meet alumni across the country, study abroad for a short period, participate in a service project or take a wellness course, all designed to fit into their already-packed schedules.
Part of the goal is to help each student feel confident discussing their experience as an athlete and how it contributes to their long-term goals. For instance, students might feel ill-equipped for a full-time job because they never had a 12-week internship, but university staff help them translate their experiences on the field or the court into skills applicable to a workplace environment, Wicklund said.
The university is also adapting financial literacy programming to include information on name, image and likeness rights for student athletes, covering not just budgeting, investing and financial literacy topics but also more specific information related to their teams.
Encouraging athletes to attend extra sessions can be a challenge, but the Spider Performance team aims to help students understand the value of the program and how it applies to their daily lives. The program also requires buy-in from other role models in students’ lives, including trainers, coaches and professors.
“We work really hard to customize fits to different programs so we’re speaking the same language as our coaches,” which helps create a unified message to students, Wicklund said.
Just over half of student loan borrowers consider themselves financially insecure, and about three-quarters said they had experienced an adverse financial event, like skipping a bill, in the past year, according to a survey from the Pew Charitable Trusts. The survey, conducted in the summer of 2024, explored the attitudes of student loan borrowers after federal student loan repayments restarted in October 2023 following a three-year pause.
Existing financial challenges are closely associated with struggles to repay student loans, the survey found. About 23 percent of respondents indicated they had missed some or all of their student loan payments since October 2023, but that number was higher among those who are financially insecure (34 percent) and those who had experienced a negative financial event (30 percent).
But paying off student loans isn’t just challenging for those facing other financial difficulties. Among all borrowers, 57 percent said they found it difficult to afford their loans, including 41 percent of those who said they do not consider themselves financially insecure. Over a third of borrowers also said they found repaying their student loans more stressful than paying their other bills.
The Education Department estimates that nearly 25 percent of borrowers have either defaulted on their loans or will default in the next several months. In May, the agency restarted collections on unpaid loans.
New license agreement provides all students and faculty with free access to Top Hat, reinforcing UGA’s strategic focus on affordability, student success, and innovation in teaching.
TORONTO – June 17, 2025 – Top Hat, the leader in student engagement solutions for higher education, today announced that the University of Georgia has entered into a new enterprise agreement that will provide campus-wide access to the Top Hat platform at no cost to students or faculty. This initiative supports UGA’s continued efforts to promote high-impact teaching practices, student affordability, and innovation in the classroom.
Top Hat’s interactive teaching platform as well as content authoring and customization tools will be available to UGA faculty to enhance in-person, online, and hybrid courses across disciplines. With this agreement, UGA joins a growing number of leading institutions investing in Top Hat to empower instructors to improve learning outcomes and student success at scale.
“We are proud to support the University of Georgia in its efforts to deliver proven, student-centered teaching practices,” said Maggie Leen, CEO of Top Hat. “This partnership ensures every student and educator at UGA has access to the tools they need to drive learning and achievement, while reinforcing the university’s focus on affordability, innovation, and evidence-based instruction.”
This initiative reflects UGA’s commitment to both student affordability and instructional excellence. With Top Hat, faculty can adopt and customize low- or no-cost course materials—including OpenStax and OER—helping to reduce costs for students while delivering engaging, evidence-based instruction. The platform enables instructors to easily integrate active learning strategies, such as frequent low-stakes assessments and reflection prompts, which are proven to enhance student engagement and academic outcomes. Top Hat’s AI-powered assistant, Ace, streamlines course prep by generating high-quality questions directly from lecture content, and supports students with on-demand study help and unlimited practice opportunities—reinforcing learning both in and out of the classroom. Real-time data from polls, quizzes, and assignments also empowers educators to continuously monitor progress and improve instructional impact.
The University of Georgia is recognized nationally for excellence in teaching and learning, student completion, and affordability. The enterprise agreement with Top Hat is part of UGA’s broader commitment to building a world-class learning environment and increasing access to affordable, high-impact teaching and learning resources.
About Top Hat
As the leader in student engagement solutions for higher education, Top Hat enables educators to employ proven student-centered teaching practices through interactive content, AI-enhanced tools, and activities in in-person, online, and hybrid classroom environments. To accelerate student impact and return on investment, the company provides a range of change management services, including faculty training and instructional design support, integration and data management services, and digital content customization. Thousands of faculty at 900 leading North American colleges and universities use Top Hat to create meaningful, engaging, and accessible learning experiences for students before, during, and after class.
In Arkansas, a $7 million program approved last year aims to support students’ mental health by restricting their cellphone use and using telehealth to connect more students to mental health providers.
In Texas, a multiyear effort to study student mental and behavioral health yielded a host of recommendations, including putting Medicaid funds toward school-based mental health supports and better tracking of interventions.
And in West Virginia, state education leaders and partnership organizations have amassed a trove of resource documents and built out training to help schools address student mental health challenges.
All three states are working proactively to respond to the student mental health crisis that worsened due to the COVID-19 pandemic.
All three states are also considering or expected to pass laws allowing schools to implement tougher discipline policies.
Likewise, many states are tweaking their discipline policies at the same time they are putting more resources toward supporting students’ mental well-being.
Although school discipline and mental health supports are mostly addressed at the local level, state leadership is critical for setting expectations for accountability and requiring transparency in disciplinary actions, said Richard Welsh, founding director of the School Discipline Lab, a research center that shares information about school discipline.
And states are using a variety of measures, from proactively providing mental health supports to loosening restrictions on exclusionary discipline, said Welsh, who is also an associate professor of education and public policy at Vanderbilt University.
Post pandemic, “we did have an uptick in student misbehavior,” Welsh said. “But I think what also gets missing in that was we also had an uptick in student and teacher needs.”
The COVID factor
Post-COVID, schools have reported a rise in unruly behaviors, including among young students. Some of the behaviors have been violent and have even injured teachers, leading them to turn away from the profession.
Research published by the American Psychological Association last year found an increase in violence against K-12 educators over the past decade. After COVID restrictions ended in 2022, a survey of 11,814 school staff, including teachers and administrators, found that 2% to 56% of respondents reported physical violence at least once during the year, with rates varying by school staff role and aggressor.
Data also shows that the share of educators experiencing student verbal abuse at least once a week on average doubled, from 4.8% in the 2009-10 school year to 9.8% in 2019-20, according to APA.
Students’ mental health needs increased during and after the pandemic, according to studies. Additional research showed that teachers, administrators and other school staff lacked resources to properly address students’ needs.
Some educators, parents and advocates worry that harsher student discipline policies will undermine evidence-based practices for decreasing challenging behaviors and keeping students in school. They are also concerned that after several years of expanding positive behavior supports and restorative practices, a focus on stricter discipline policies will disproportionately affect students of color and those with disabilities.
The legislative activity at the state level is occurring at the same time President Donald Trump is calling for “reinstating common sense” to school discipline policies. An April executive order calls for the U.S. Department of Education to issue guidance to districts and states regarding their obligations under Title VI to protect students against racial discrimination in relation to the discipline of students. Title VI of the Civil Rights Act prohibits discrimination based on race, color or national origin in federally funded programs.
The Trump administration has called for the federal government to enact policies that are “colorblind,” not favoring one race over others.
The order also directs the Education Department to submit a report by late August on the “status of discriminatory-equity-ideology-based school discipline and behavior modification techniques in American public education.”
Welsh predicts that the executive order will lead to more state activity addressing student behavior and a specific focus on the guidelines for administering punitive discipline.
Some groups are urging caution over stricter discipline policies, concerned that harsher approaches will harm students from marginalized groups. Black boys were disciplined at higher rates than boys of other races during the 2021-22 school year, according to the Civil Rights Data Collection. While Black boys represented 8% of K-12 student enrollment, they accounted for 22% of those receiving one or more out-of-school suspensions, and 21% of those who were expelled.
“Reinstating ‘common sense’ school discipline policies may have significant adverse effects on Black students, who already face disproportionate disciplinary actions in educational settings,” said the Congressional Black Caucus Foundation in an April 23 statement. “By eliminating considerations of racial disparities in discipline, the order risks exacerbating existing inequities.”
Welsh said states can address racial disparities in discipline through accountability measures and by not conflating school discipline with school safety. “When that happens, we tend to bring safety reforms as a response to student behavior,” he said.
As districts and states consider reforms to school discipline policies, leaders should view school safety and school discipline concerns as distinct issues with overlapping yet separate opportunities for improvement and interventions, said a recent National Education Policy Center report.
This includes investing in supportive measures to behavior management; providing resources to educators and interventions for students; and addressing the causes for students’ challenging behaviors and educators’ perceptions and responses to misbehavior, the report said.
State proposals
According to the National Conference of State Legislatures, 97 state bills concerning school discipline and behavioral supports have been introduced so far this year. Of those, 10 have become law as of June 13.
Some of the introduced bills harden discipline policies.
West Virginia’s Senate Bill 199, signed by Gov. Patrick Morrisey in April, allows teachers to remove from the classroom students who are threatening or intimidating other students.
“This legislation provides teachers with the tools to regain control of the classroom and ensure safe learning environments for our kids,” Morrisey said in an April 15 statement.
Texas House Bill 6, which is awaiting Gov. Greg Abbott’s signature, extends in-school suspensions from a maximum of three days to the length of time administrators deem appropriate, although they would need to evaluate on the 10th day whether to extend the suspension or bring the student back. The legislation also would allow a teacher to remove from a classroom a student who “demonstrates behavior that is unruly, disruptive, or abusive toward the teacher, another adult, or another student.”
In Arkansas, HB 1062, known as the Teacher and Student Protection Act, dictates that if a student is removed from a classroom for violent behavior, that student may not re-enter a classroom that has the teacher or student who was the target of the violent behavior. The bill was signed into law in April.
However, a bill that would give teachers authority to remove a disruptive student from a classroom languished in the Montana legislature.
Other states are seeing a mix of legislation, including reforms to encourage discipline data analysis, expand behavior interventions, and provide student protections during discipline or emergency situations.
For example, HB 1248 in Colorado clarifies when physical restraint can be used and the limitations for such actions. That bill was signed into law May 24.
A bill being considered by the Illinois General Assembly, HB 3772, would limit who could determine if a preschooler can be suspended and for how long.
At the same time, a bill to prohibit corporal punishment failed in the Indiana General Assembly, while a bill to allow corporal punishment in schools lost momentum in the West Virginia Legislature, according to NCSL.
There are 23 states that permit corporal punishment in schools, a National Education Association report said last year.
Prior research shows attendance is one of the best predictors of class grades and student outcomes, creating a strong argument for faculty to incentivize or require attendance.
Attaching grades to attendance, however, can create its own challenges, because many students want more flexibility in their schedules and think they should be assessed on what they learn—not how often they show up. A student columnist at the University of Washington expressed frustration at receiving a 20 percent weighted participation grade, which the professor based on exit tickets students submitted at the end of class.
“Our grades should be based on our understanding of the material, not whether or not we were in the room,” Sophie Sanjani wrote in The Daily, UW’s student paper.
Keenan Hartert, a biology professor at Minnesota State University, Mankato, set out to understand the factors affecting students’ performance in his own course and found that attendance was one of the strongest predictors of their success.
His finding wasn’t an aha moment, but reaffirmed his position that attendance is an early indicator of GPA and class community building. The challenge, he said, is how to apply such principles to an increasingly diverse student body, many of whom juggle work, caregiving responsibilities and their own personal struggles.
“We definitely have different students than the ones I went to school with,” Hartert said. “We do try to be the most flexible, because we have a lot of students that have a lot of other things going on that they can’t tell us. We want to be there for them.”
Who’s missing class? It’s not uncommon for a student to miss class for illness or an outside conflict, but higher rates of absence among college students in recent years are giving professors pause.
An analysis of 1.1 million students across 22 major research institutions found that the number of hours students have spent attending class, discussion sections and labs declined dramatically from the 2018–19 academic year to 2022–23, according to the Student Experience in the Research University (SERU) Consortium.
More than 30 percent of students who attended community college in person skipped class sometimes in the past year, a 2023 study found; 4 percent said they skipped class often or very often.
Students say they opt out of class for a variety of reasons, including lack of motivation, competing priorities and external challenges. A professor at Colorado State University surveyed 175 of his students in 2023 and found that 37 percent said they regularly did not attend class because of physical illness, mental health concerns, a lack of interest or engagement, or simply because it wasn’t a requirement.
A 2024 survey from Trellis Strategies found that 15 percent of students missed class sometimes due to a lack of reliable transportation. Among working students, one in four said they regularly missed class due to conflicts with their work schedule.
High rates of anxiety and depression among college students may also impact their attendance. More than half of 817 students surveyed by Harmony Healthcare IT in 2024 said they’d skipped class due to mental health struggles; one-third of respondents indicated they’d failed a test because of negative mental health.
A case study: MSU Mankato’s Hartert collected data on about 250 students who enrolled in his 200-level genetics course over several semesters.
Using an end-of-term survey, class activities and his own grade book information, Hartert collected data measuring student stress, hours slept, hours worked, number of office hours attended, class attendance and quiz grades, among other metrics.
Mapping out the various factors, Hartert’s case study modeled other findings in student success literature: a high number of hours worked correlated negatively with the student’s course grade, while attendance in class and at review sessions correlated positively with academic outcomes.
Data analysis by Keenan Hartert, a biology professor at Minnesota State University, Mankato, found student employment negatively correlated with their overall class grade.
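The kind of analysis described above — checking how hours worked and class attendance each correlate with final grades — can be sketched in a few lines of Python. The data below is entirely illustrative, not Hartert’s actual gradebook, and the column choices are assumptions for the sake of the example.

```python
# Illustrative sketch of a factor-vs-grade correlation analysis.
# The records below are toy data, not Hartert's actual dataset.
import statistics

# Toy records: (hours_worked_per_week, classes_attended, final_grade_pct)
records = [
    (0, 28, 92), (10, 26, 88), (15, 24, 85), (20, 22, 78),
    (25, 20, 74), (30, 15, 70), (35, 12, 65), (40, 10, 60),
]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hours = [r[0] for r in records]
attendance = [r[1] for r in records]
grades = [r[2] for r in records]

# With this toy data, hours worked correlates negatively with grades,
# while attendance correlates positively -- mirroring the pattern the
# case study reports.
print(f"hours worked vs. grade: {pearson(hours, grades):+.2f}")
print(f"attendance vs. grade:   {pearson(attendance, grades):+.2f}")
```

A real analysis would of course use many more students and control for confounders (a student working 40 hours a week likely also attends less), which is why findings like these are reported as correlations rather than causes.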
The data also revealed to Hartert some of the challenges students face while enrolled. “It was brutal to see how many students [were working full-time]. Just seeing how many were [working] over 20 [hours] and how many were over 30 or 40, it was different.”
Nationally, two-thirds of college students work for pay while enrolled, and 43 percent of employed students work full-time, according to fall 2024 data from Trellis Strategies.
Hartert also asked students if they had any financial resources to support them in case of emergency; 28 percent said they had no fallback. Of those students, 90 percent were working more than 20 hours per week.
Data analysis of student surveys shows students who are working are less likely to have financial resources to support them in an emergency.
The findings illustrated to him the challenges many students face in managing their job shifts while trying to meet attendance requirements.
A Faculty Aside
While some faculty may be less interested in using predictive analytics for their own classes, Hartert found that tracking factors like how often a student attends office hours helped him advance his own career goals, because he could include those measurements in his tenure review.
An interpersonal dynamic: A less measured factor in the attendance debate is not a student’s own learning, but the classroom environment they contribute to. Hartert framed it as students motivating their peers unknowingly. “The people that you may not know that sit around you and see you, if you’re gone, they may think, ‘Well, they gave up, why should I keep trying?’ Even if they’ve never spoken to you.”
One professor at the University of Oregon found that peer engagement positively correlated with academic outcomes. Raghuveer Parthasarathy restructured his general education physics course to promote engagement by creating an “active zone,” or a designated seating area in the classroom where students sat if they wanted to participate in class discussions and other active learning conversations.
Compared to other sections of the course, the class was more engaged across the board, even among those who didn’t opt to sit in the participation zone. Additionally, students who sat in the active zone were more likely to earn higher grades on exams and in the course over all.
Attending class can also create connections between students and professors, something students say they want and expect.
A May 2024 student survey by Inside Higher Ed and Generation Lab found that 35 percent of respondents think their academic success would be most improved by professors getting to know them better. In a separate question, 55 percent of respondents said they think professors are at least partly responsible for becoming a mentor.
The SERU Consortium found student respondents in 2023 were less likely to say a professor knew or had learned their name compared to their peers in 2013. Students were also less confident that they knew a professor well enough to ask for a letter of recommendation for a job or graduate school.
“You have to show up to class then, so I know who you are,” Hartert said.
Meeting in the middle: To encourage attendance, Hartert employs active learning methods such as creative writing or case studies, which help demonstrate the value of class participation. His favorite is a jury scenario, in which students put their medical expertise into practice with criminal cases. “I really try and get them in some gray-area stuff and remind them, just because it’s a big textbook doesn’t mean that you can’t have some creative, fun ideas,” Hartert said.
For those who can’t make it, all of Hartert’s lectures are recorded and available online to watch later. Recording lectures, he said, “was a really hard bridge to cross, post-COVID. I was like, ‘Nobody’s going to show up.’ But every time I looked at the data [for] who was looking at the recording, it’s all my top students.” That was reason enough for him to leave the recordings available as additional practice and resources.
Students who can’t make an in-person class session can receive attendance credit by sending Hartert their notes and answers to any questions asked live during the class, proving they watched the recording.
Hartert has also made adjustments to how he uses class time to create more avenues for working students to engage. His genetics course includes a three-hour lab section, which rarely lasts the full time, Hartert said. Now, the final hour of the lab is a dedicated review session facilitated by peer leaders, who use practice questions Hartert designed. Initial data shows working students who stayed for the review section of labs were more likely to perform better on their exams.
“The good news is when it works out, like when we can make some adjustments, then we can figure our way through,” Hartert said. “But the reality of life is that time marches on and things happen, and you gotta choose a couple priorities.”
Do you have an academic intervention that might help others improve student success? Tell us about it.