Tag: Improves

  • Temple Research Lab Improves Student Athlete Support

    As the landscape of college athletics continues to shift, Temple University is experimenting with a new initiative that embeds academic research into the day-to-day operations of its athletics department.

    Launched last month, the Athletic Innovation, Research and Education Lab formalizes a partnership between the School of Sport, Tourism and Hospitality Management (STHM) and Temple Athletics.

    The AIRE Lab functions as both a research center and a practical hub, aiming to improve program management and student athletes’ development through evidence-based solutions.

    Jonathan Howe, an assistant professor at STHM and AIRE Lab co-director, said supporting the student-athlete experience is especially important at an institution like Temple University, which has fewer resources for name, image and likeness and revenue sharing than larger schools.

    “We’re able to engage in research and leverage university resources in a way that the athletics department may not traditionally be able to do,” Howe said.

    Elizabeth Taylor, an associate professor at STHM and AIRE Lab co-director, emphasized the importance of data-driven decision-making.

    “The folks who work in student athlete development may not have the capacity to do their full-time jobs while also staying up-to-date on the literature or evaluating the impact and effectiveness of the programs they offer,” Taylor said.

    She added that the goal is to “connect with people on campus who are already doing this work and share resources instead of recreating the wheel or paying someone from outside the university.”

    State of play: The launch of the AIRE Lab comes amid rapid changes in college athletics, including the rise of NIL compensation, evolving transfer rules and ongoing debates over athlete eligibility and governance. Taylor and Howe said these shifts have increased the need for institutions to understand how policy, culture and organizational decisions affect student athletes.

    “The additional opportunities through NIL and revenue-sharing create more time demands on student athletes,” Taylor said, noting that potential brand deals can complicate efforts to balance practices and competitions with classes, extracurriculars and internships.

    “What the research shows us is that they’re already strapped for time and what comes with that is stress, anxiety and mental health challenges,” she added.

    Transfer rules can further complicate the student athlete experience, particularly for athletes arriving from other institutions, Howe said. “Navigating the academic setting is a lot for athletes who may be transferring in or may have a lucrative NIL deal, so academics may be put on the back burner,” he said.

    To bridge the gap between research and daily operations, the athletics department appointed two staff members as lab practitioners to help translate research into practice.

    “Everything is changing by the second, and student athletes are having to navigate these changes,” Howe said. “So how can we provide a system that identifies the most beneficial programming to help athletes be as successful as possible in their professional pursuits once they leave campus?”

    In practice: One of the lab’s first initiatives was a cooking demonstration held at Temple University’s public health school. The session was designed to help student athletes learn how to prepare simple, nutritious meals.

    Taylor said the goal was to encourage student athletes to make practical, healthy choices and develop skills they can use outside of structured team meals.

    “The idea behind the cooking demonstration came from a research article on the experiences of college athletes, and one of the things that the athletes talked about is how so much of their life is planned out for them,” said Taylor. She added that while what student athletes eat and how they work out is often prescribed, they aren’t necessarily taught why they’re eating certain foods or doing specific workouts in the weight room.

    “It was a great experience for them to learn more about cooking safely and making healthy meals,” she added, noting that over 20 student athletes participated in the session.

    What’s next: Looking ahead, Howe said he hopes the lab will serve as a model for other institutions seeking to better integrate research, student athlete well-being and athletics administration.

    “We want to continue leveraging institutional, federal and state resources to provide athletes with opportunities they normally wouldn’t get, especially at a time when higher education budgets are being cut,” Howe said.

    “For me, the AIRE Lab allows us to break down some of the long-standing barriers we’ve had at the higher education level. Just because the budget is cut doesn’t mean we have to eliminate programs,” he said.

  • “Say My Name, Say My Name”: Why Learning Names Improves Student Success – Faculty Focus

  • Explainable AI That Improves Testing Decisions

    Artificial intelligence is now a practical tool reshaping how software teams work. It appears in code reviews, helps spot bugs early, and speeds deployment workflows. In testing, it is starting to take on a bigger role, like helping teams design better test cases, automate routine checks, and find patterns in test results. As AI becomes more involved in the software testing lifecycle, the key question is not just what it can do, but whether we understand how it works.

    A critical question arises: Can we explain how these models arrive at their decisions? 

    This blog is for developers, quality engineers, and DevOps teams who work extensively with AI. I hope to help clarify Explainable AI so that you can build transparent, dependable, and responsible systems.

    As someone architecting AI solutions across the software testing lifecycle, from test design and scripting to optimization and reporting, I have seen firsthand how teams struggle to interpret the outputs of models. Whether it is a prompt-driven LLM suggesting test cases or a machine learning algorithm flagging anomalies in test results, the lack of clarity around why a decision was made can lead to hesitation, misalignment, or even rejection of the solution.

    Let me introduce Explainable AI (XAI) in a way that’s practical, relevant, and actionable for technical teams.

     

    What Explainable AI Really Means for Your Team

    When we use AI in testing, whether it is generating test scripts or making predictions (test optimization, recommendation), it’s easy to lose track of how those decisions are made. That’s where XAI comes in. It helps teams understand the “why” behind each output, so they can trust the results, catch mistakes early, and improve how the system works.

    For instance, in our work building AI‑powered tools across the testing lifecycle, explainability has become a mandatory requirement. Whether it’s intelligent test design, web and mobile automation, API validation, optimization, or reporting, each solution we develop relies on models and agents making decisions that impact how teams test, deploy, and monitor software.

    When models make decisions, teams rightly ask why:

    • Why did the test optimization agent prioritize these specific test cases?
    • What factors influenced the bug prediction?
    • How was the optimization path determined?
    • What logic identifies DOM locators for UI automation?

    Answering these questions builds trust, and trust is the reason people want to use the system. That is where XAI steps in: it shows how AI tools make decisions so developers, QEs, and DevOps teams can understand the logic, catch issues faster, and trust the results.
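
    As a minimal sketch of the first question, a prioritization agent can emit a human-readable reason alongside every score. The weights, field names, and test names below are assumptions invented for illustration, not the API of any real tool:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    recent_failure_rate: float   # fraction of recent runs that failed
    touches_changed_code: bool   # does it cover files in the current diff?

def prioritize(tests):
    """Rank tests by a transparent score and attach the reason for each score."""
    ranked = []
    for t in tests:
        score = round(0.7 * t.recent_failure_rate + (0.3 if t.touches_changed_code else 0.0), 2)
        reasons = [f"failure rate {t.recent_failure_rate:.0%} adds {0.7 * t.recent_failure_rate:.2f}"]
        if t.touches_changed_code:
            reasons.append("covers changed code, adds 0.30")
        ranked.append((t.name, score, "; ".join(reasons)))
    # highest score first, each entry carrying its own explanation
    return sorted(ranked, key=lambda r: r[1], reverse=True)

suite = [
    TestCase("test_login", recent_failure_rate=0.5, touches_changed_code=True),
    TestCase("test_export", recent_failure_rate=0.1, touches_changed_code=False),
]
for name, score, why in prioritize(suite):
    print(f"{name}: {score} ({why})")
```

    Because the score is a transparent linear combination, the "why" string can be reconstructed term by term; an opaque learned ranker would need a post-hoc explanation layer instead.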

     

    Why Developers and QE Teams Need XAI

    Explainability is not optional; it is essential.

    • Trust in Automation: Teams adopt AI tools more readily when they grasp the underlying logic. For example, if a model suggests skipping regression tests, stakeholders need to know why.
    • Debugging and Iteration: When a model behaves oddly, like giving biased outputs or brittle prompts, XAI helps diagnose and fix issues faster.
    • Compliance and Auditing: Regulated industries need to explain how automated decisions are made. XAI makes that possible and keeps us on the right side of regulations.
    • Fairness and Ethics: XAI helps spot bias in how models treat data, so decisions remain fair, especially when they affect users or resource allocation.

     

    Real‑World Relevance in the Software Testing Lifecycle (STLC)

    Let’s ground this in practical scenarios:

    • Test Design: XAI clarifies which requirements or user stories guided LLM‑generated tests.
    • Test Automation: XAI provides explanations for how AI agents choose DOM locators, API endpoints, or interaction flows, which increases transparency in automation scripts.
    • Test Optimization: XAI reveals data patterns behind recommendations.
    • Reporting: XAI explains the logic of dashboard anomalies or trends, such as time‑series analysis or clustering.
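
    To make the reporting scenario concrete, here is a stdlib-only sketch of an anomaly check that states exactly why each run was flagged. The z-score threshold and the runtimes are assumptions made up for illustration:

```python
import statistics

def explain_anomalies(durations, threshold=2.0):
    """Flag outlier durations and state exactly why each one was flagged."""
    mean = statistics.mean(durations)
    stdev = statistics.stdev(durations)
    findings = []
    for i, d in enumerate(durations):
        z = (d - mean) / stdev   # how far this run sits from the mean
        if abs(z) > threshold:
            findings.append({
                "run": i,
                "seconds": d,
                "why": (f"{d}s is {z:.1f} standard deviations from the "
                        f"mean of {mean:.1f}s (threshold: {threshold})"),
            })
    return findings

nightly_runtimes = [12, 11, 13, 12, 11, 12, 48]  # seconds per nightly run
for finding in explain_anomalies(nightly_runtimes):
    print(finding["why"])
```

    The point is not the statistics, which are deliberately simple, but that every flag on a dashboard arrives with the evidence that produced it.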

     

    How to Integrate XAI into Your Workflow

    Actionable strategies:

    • Use Interpretable Models: Opt for decision trees or rule‑based systems. They’re simpler to explain and troubleshoot.
    • Layer Explanations on Complex Models: For deep learning or ensembles, use tools that provide post‑hoc explanations. These don’t change the model but help interpret its behavior.
    • Make It Easy to Follow: When building your interface, think about how someone on your team would use it. Keep the explanations simple and clear.
    • Check for Bias Early: Before your model goes live, evaluate fairness and safety (for example, LLM‑as‑a‑Judge, fairness checkers) to catch bias or PII exposure.
    • Document Decisions: Record model results and reasons for transparency and improvement.

    Challenges to Watch For

    1. Pick What Works Best: Simple models are easier to explain, but they don’t always give the most accurate results. Sometimes you need clarity, other times precision. So, choose based on what your project really needs.

    2. Scalability: Explaining every prediction uses resources. Focus on key cases.

    3. User Misinterpretation: Explanations can be misunderstood. Training and UX matter.

    4. Security Risks: Revealing model details can create vulnerabilities. Share selectively.

     

    Best Practices for Software Teams

    1. Speak Their Language: Tailor explanations to the audience; developers may want details, while business users need the big picture.

    2. Listen and Adjust: Share explanations with real users, see what makes sense to them, and keep tweaking until it clicks.

    3. Mix Your Methods: Don’t rely on just one way to explain things. Combine multiple techniques to give a fuller, clearer picture.

    4. Stay Updated: Track new XAI tools and research to keep practices up to date.

     

    XAI: What’s Next

    AI systems will soon not only explain decisions but also answer “what if” questions and provide causal reasoning. For teams building AI into STLC, this means:

    • Interactive Debugging: Ask the model why it skipped a test and get a clear, specific answer.
    • Causal Insights: Identify cause‑and‑effect links in failures or performance drops.
    • Standardized Explainability: Industry benchmarks and compliance rules will guide AI transparency.
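
    A hedged sketch of such a "what if" query: each feature is perturbed in turn (booleans flipped, numbers halved, a deliberately crude assumption) to see which single change would flip the decision. The skip policy and feature names are toy examples, not any real system's logic:

```python
def skip_test(features):
    """Toy policy: skip a test when it is slow and has not failed recently."""
    return features["runtime_s"] > 60 and not features["failed_last_week"]

def counterfactuals(decide, features):
    """Answer 'what single change would flip this decision?' by perturbing one feature at a time."""
    baseline = decide(features)
    flips = []
    for key, value in features.items():
        alt = dict(features)
        # crude perturbation: flip booleans, halve numbers
        alt[key] = (not value) if isinstance(value, bool) else value // 2
        if decide(alt) != baseline:
            flips.append(f"setting {key} to {alt[key]} flips the decision to {decide(alt)}")
    return flips

case = {"runtime_s": 100, "failed_last_week": False}
print(skip_test(case))   # True: the test would be skipped
for line in counterfactuals(skip_test, case):
    print(line)
```

    Real counterfactual methods search for the smallest plausible change rather than a fixed perturbation, but the interaction pattern, "what would have changed this outcome?", is the same.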

     

    The Real Value of XAI

    Explainability isn’t just a technical checkbox; it’s what helps teams trust the tools they use. As we build smarter systems, making sure people understand how they work should be part of the plan from the beginning.

    Integrating XAI into our strategy helps teams collaborate efficiently, iterate quickly, and deliver effective, ethical solutions.

     

  • School bus driver shortage improves slightly with bump in hiring, pay

    Dive Brief:

    • Higher hourly wages are credited for modest growth in the number of school bus drivers over the past year, but employment in the field remains down 9.5% compared to 2019 staffing levels, according to a recent analysis from the Economic Policy Institute.
    • The median hourly wage for school bus drivers was $22.45 on Aug. 1, a 4.2% increase year over year when accounting for inflation.
    • Nonetheless, the K-12 staffing outlook overall shows instability as school systems continue adjusting to the end of federal COVID-19 emergency funding and as changes implemented by the Trump administration put more fiscal pressures on state and local school systems, EPI said.

    Dive Insight:

    Employment for all K-12 positions is up 1.4% from August 2019 to August 2025, EPI found. Custodian positions dropped 12.4%, joining school bus drivers among those seeing the largest decreases. Slots for paraprofessionals, on the other hand, increased 16.5% during the same period, according to EPI.

    The recent wage growth for school bus drivers is not the typical pattern seen over the past 15 years, EPI said. In fact, from Nov. 1, 2012, through June 1, 2015, school bus drivers saw negative year-over-year wage growth. Negative growth also occurred for this role in July 2018, November 2018 and September 2019.

    EPI said the split-shift schedule required for the beginning and end of school days makes it difficult to recruit bus drivers. Moreover, school bus drivers — along with paraprofessionals, custodians and food service workers — tend to receive low pay. These jobs also are disproportionately held by women, Black and brown workers, and older employees, according to a 2024 EPI report.

    School bus driver employment has grown by about 2,300 jobs over the past year. This growth is due to state and local government school bus driver employment, which saw an increase of nearly 9,900 drivers since the fall of 2024. Private-sector school bus employment fell by 8,200 jobs over the same period.

    The institute’s most recent report said it’s hard to draw meaningful conclusions about the school bus driver wage growth over the last few years due to COVID-influenced changes in the labor market, as well as difficulty collecting labor data during the pandemic. 

    Still, EPI said “the wage growth for school bus drivers in the last year stands out as a much-needed investment in this critical segment of the education workforce.”

    Several schools in Pennsylvania and one school system in Ohio closed for at least a day this school year due to school bus driver shortages, according to local news reports. Other localities have consolidated bus routes or made other adjustments to respond to driver shortages.

  • Belonging Intervention Improves Pass Rates

    Sense of belonging is a significant predictor of student retention and completion in higher education; students who believe they belong are more likely to bounce back from obstacles, take advantage of campus resources and remain enrolled.

    For community colleges, instilling a sense of belonging among students can be challenging, since students often juggle competing priorities, including working full-time, taking care of family members and commuting to and from campus.

    To help improve retention rates, the California Community Colleges replicated a belonging intervention developed at Indiana University’s Equity Accelerator and the College Transition Collaborative.

    Data showed the intervention not only increased students’ academic outcomes, but it also helped close some equity gaps for low-income students and those from historically marginalized backgrounds.

    What’s the need: Community college students are less involved on campus than their four-year peers; they’re also less likely to say they’re aware of or have used campus resources, according to survey data from Inside Higher Ed.

    This isolation isn’t desired; a recent survey by the ed-tech group EAB found that 42 percent of community college students said their social life was a top disappointment. A similar number said they were disappointed they didn’t make friends or meet new people.

    Methodology

    Six colleges in the California Community Colleges system participated in the study, for a total of 1,160 students—578 in the belonging program and 582 in a control group. Students completed the program during the summer or at the start of the term and then filled out a survey at the end.

    Moorpark Community College elected to deliver the belonging intervention during first-semester math and English courses to ensure all students could benefit.

    How it works: The Social Belonging for College Students intervention has three components:

    1. First, students analyze survey data from peers at their college, which shows that many others also worry about their academic success, experience loneliness or face additional challenges, to help normalize anxieties about college.
    2. Then, students read testimonies from other students about their initial concerns starting college and how they overcame the challenges.
    3. Finally, students write reflections of their own transition to college and offer advice to future students about how to overcome these concerns or reassure them that these feelings are normal.

    The goal of the exercise is to achieve a psychological outcome called “saying is believing,” said Oleg Bespalov, dean of institutional effectiveness and marketing at Moorpark Community College, part of the Ventura Community College District in California.

    “If you’ve ever worked in sales, like, say I worked at Toyota. I might not like Toyota; I just really need a job,” Bespalov said. “But the more I sell the Toyota, the more I come to believe that Toyota is a great car.” In the same way, while a student might not think they can succeed in college, expressing that belief to someone else can change their behaviors.

    Without the intervention, students tend to spiral, seeing a poor grade as a reflection of themselves and their capabilities. They may believe they’re the only ones who are struggling, Bespalov said. Following the intervention, students are more likely to embrace the idea that everyone fails sometimes and that they can rebound from the experience.

    At Moorpark, the Social Belonging for College Students intervention is paired with teaching on the growth mindset, explained Tracy Tennenhouse, English instructor and writing center co-coordinator.

    “Belonging is a mindset,” Bespalov said. “You have to believe that you belong here, and you have to convince the student to change their mindset about that.”

    The results: Students who participated in the belonging program were more likely to re-enroll for the next term, compared to their peers in the control group. This was especially true for students with high financial need or those from racial minorities.

    In the control group, there was a 14-percentage-point gap between low- and high-income students’ probability of re-enrolling. After the intervention, the re-enrollment gap dropped to six percentage points.

    Similarly, low-income students who participated in the intervention had a GPA that was 0.21 points higher than their peers who did not. Black students who participated in the exercise saw average gains of 0.46 points in their weighted GPA.

    To researchers, the results suggest that students from underrepresented backgrounds had more positive experiences at the end of the fall term if they completed the belonging activity. Intervention participants from these groups also reported fewer identity-related concerns and better mental and physical health, compared to their peers who didn’t participate.

    What’s next: Based on the positive findings, Moorpark campus leaders plan to continue delivering the intervention in future semesters. Tennenhouse sees an opportunity to utilize the reflection as a handwritten writing sample for English courses, making the assignment both a line of defense against AI plagiarism and an effective measure for promoting student belonging.

    Administrators have also considered delivering the intervention during summer bridge programs to support students earlier in their transition, or as a required assignment for online learners who do not meet synchronously.

    In addition, Tennenhouse would like to see more faculty share their own failure stories. Research shows students are more likely to feel connected to instructors who open up about their own lives with students.

    How does your college campus encourage feelings of belonging in the classroom? Tell us more here.

  • Measuring What Matters: A Faculty Development System That Improves Teaching Quality – Faculty Focus

  • Data Shows Attendance Improves Student Success

    Prior research shows attendance is one of the best predictors of class grades and student outcomes, creating a strong argument for faculty to incentivize or require attendance.

    Attaching grades to attendance, however, can create its own challenges, because many students generally want more flexibility in their schedules and think they should be assessed on what they learn—not how often they show up. A student columnist at the University of Washington expressed frustration at receiving a 20 percent weighted participation grade, which the professor graded based on exit tickets students submitted at the end of class.

    “Our grades should be based on our understanding of the material, not whether or not we were in the room,” Sophie Sanjani wrote in The Daily, UW’s student paper.

    Keenan Hartert, a biology professor at Minnesota State University, Mankato, set out to understand the factors affecting students’ performance in his own course and found that attendance was one of the strongest predictors of their success.

    His finding wasn’t an aha moment, but reaffirmed his position that attendance is an early indicator of GPA and class community building. The challenge, he said, is how to apply such principles to an increasingly diverse student body, many of whom juggle work, caregiving responsibilities and their own personal struggles.

    “We definitely have different students than the ones I went to school with,” Hartert said. “We do try to be the most flexible, because we have a lot of students that have a lot of other things going on that they can’t tell us. We want to be there for them.”

    Who’s missing class? It’s not uncommon for a student to miss class for illness or an outside conflict, but higher rates of absence among college students in recent years are giving professors pause.

    An analysis of 1.1 million students across 22 major research institutions found that the number of hours students have spent attending class, discussion sections and labs declined dramatically from the 2018–19 academic year to 2022–23, according to the Student Experience in the Research University (SERU) Consortium.

    More than 30 percent of students who attended community college in person skipped class sometimes in the past year, a 2023 study found; 4 percent said they skipped class often or very often.

    Students say they opt out of class for a variety of reasons, including lack of motivation, competing priorities and external challenges. A professor at Colorado State University surveyed 175 of his students in 2023 and found that 37 percent said they regularly did not attend class because of physical illness, mental health concerns, a lack of interest or engagement, or simply because it wasn’t a requirement.

    A 2024 survey from Trellis Strategies found that 15 percent of students missed class sometimes due to a lack of reliable transportation. Among working students, one in four said they regularly missed class due to conflicts with their work schedule.

    High rates of anxiety and depression among college students may also impact their attendance. More than half of 817 students surveyed by Harmony Healthcare IT in 2024 said they’d skipped class due to mental health struggles; one-third of respondents indicated they’d failed a test because of negative mental health.

    A case study: MSU Mankato’s Hartert collected data on about 250 students who enrolled in his 200-level genetics course over several semesters.

    Using an end-of-term survey, class activities and his own grade book information, Hartert collected data measuring student stress, hours slept, hours worked, number of office hours attended, class attendance and quiz grades, among other metrics.

    Mapping out the various factors, Hartert’s case study modeled other findings in student success literature: a high number of hours worked correlated negatively with the student’s course grade, while attendance in class and at review sessions correlated positively with academic outcomes.

    Data analysis by Keenan Hartert, a biology professor at Minnesota State University, Mankato, found student employment negatively correlated with their overall class grade.

    The data also revealed to Hartert some of the challenges students face while enrolled. “It was brutal to see how many students [were working full-time]. Just seeing how many were [working] over 20 [hours] and how many were over 30 or 40, it was different.”

    Nationally, two-thirds of college students work for pay while enrolled, and 43 percent of employed students work full-time, according to fall 2024 data from Trellis Strategies.

    Hartert also asked students if they had any financial resources to support them in case of emergency; 28 percent said they had no fallback. Of those students, 90 percent were working more than 20 hours per week.

    Data analysis of student surveys shows that students who are working are less likely to have financial resources to support them in an emergency.

    The findings illustrated to him the challenges many students face in managing their job shifts while trying to meet attendance requirements.

    A Faculty Aside

    While some faculty may be less interested in using predictive analytics for their own classes, Hartert found that tracking factors like how often a student attends office hours helped him advance his own career goals, because he could include those measurements in his tenure review.

    An interpersonal dynamic: A less measured factor in the attendance debate is not a student’s own learning, but the classroom environment they contribute to. Hartert framed it as students motivating their peers unknowingly. “The people that you may not know that sit around you and see you, if you’re gone, they may think, ‘Well, they gave up, why should I keep trying?’ Even if they’ve never spoken to you.”

    One professor at the University of Oregon found that peer engagement positively correlated with academic outcomes. Raghuveer Parthasarathy restructured his general education physics course to promote engagement by creating an “active zone,” or a designated seating area in the classroom where students sat if they wanted to participate in class discussions and other active learning conversations.

    Compared to other sections of the course, the class was more engaged across the board, even among those who didn’t opt to sit in the participation zone. Additionally, students who sat in the active zone were more likely to earn higher grades on exams and in the course overall.

    Attending class can also create connections between students and professors, something students say they want and expect.

    A May 2024 student survey by Inside Higher Ed and Generation Lab found that 35 percent of respondents think their academic success would be most improved by professors getting to know them better. In a separate question, 55 percent of respondents said they think professors are at least partly responsible for becoming a mentor.

    The SERU Consortium found student respondents in 2023 were less likely to say a professor knew or had learned their name compared to their peers in 2013. Students were also less confident that they knew a professor well enough to ask for a letter of recommendation for a job or graduate school.

    “You have to show up to class then, so I know who you are,” Hartert said.

    Meeting in the middle: To encourage attendance, Hartert employs active learning methods such as creative writing or case studies, which help demonstrate the value of class participation. His favorite is a jury scenario, in which students put their medical expertise into practice with criminal cases. “I really try and get them in some gray-area stuff and remind them, just because it’s a big textbook doesn’t mean that you can’t have some creative, fun ideas,” Hartert said.

    For those who can’t make it, all of Hartert’s lectures are recorded and available online to watch later. Recording lectures, he said, “was a really hard bridge to cross, post-COVID. I was like, ‘Nobody’s going to show up.’ But every time I looked at the data [for] who was looking at the recording, it’s all my top students.” That was reason enough for him to leave the recordings available as additional practice and resources.

    Students who can’t make an in-person class session can receive attendance credit by sending Hartert their notes and answers to any questions asked live during the class, proving they watched the recording.

    Hartert has also made adjustments to how he uses class time to create more avenues for working students to engage. His genetics course includes a three-hour lab section, which rarely lasts the full time, Hartert said. Now, the final hour of the lab is a dedicated review session facilitated by peer leaders, who use practice questions Hartert designed. Initial data shows working students who stayed for the review section of labs were more likely to perform better on their exams.

    “The good news is when it works out, like when we can make some adjustments, then we can figure our way through,” Hartert said. “But the reality of life is that time marches on and things happen, and you gotta choose a couple priorities.”

    Do you have an academic intervention that might help others improve student success? Tell us about it.
