Tag: students

  • Student-Led Teaching Doesn’t Help Underprepared Students

    Introductory STEM courses serve as a gatekeeper for students interested in majors or careers in STEM fields, and students from less privileged backgrounds are often less likely to succeed in those courses.

    As a result, researchers have explored what practices can make a difference in student outcomes in such courses, including creating sections with diverse student populations and offering grade forgiveness for students who performed poorly.

    A recent research article from the University of Texas at Austin examined the role peer instructors play in helping students from a variety of backgrounds. Researchers discovered that students who were enrolled in an interactive peer-led physics course section had worse learning outcomes and grades than their peers in a lecture section taught by an instructor. Students with lower SAT scores were also less likely to achieve a high grade in the student-taught class.

    The research: Historically, instructors teaching STEM courses have delivered content through lectures, with students taking a largely passive role, according to the paper. However, more active learning environments have been tied to higher student engagement and are largely preferred by learners. A May 2024 Student Voice survey by Inside Higher Ed found that 44 percent of respondents said an interactive lecture format helps them learn and retain information best, compared to 25 percent who selected traditional lectures.

    Interactive lectures can include instructors asking students questions throughout the class period or creating opportunities for them to reflect on course material, according to the paper. Peer instruction is touted as an effective means of flipped classroom teaching, requiring students to finish readings prior to class and reserving class time for interactive activities.

    While previous studies show the value of peer-led courses, much of that research focused on selective, private institutions, where students may have more similar backgrounds or levels of academic preparation, according to the authors of the new study.

    So researchers designed a study that would compare apples to apples: They looked at the outcomes for students learning physics in one course section taught by a professor versus one taught by fellow students to see which had a greater impact.

    The results: The paper analyzed the learning outcomes of two sections of students in an introductory mechanics course at a large public institution over three years. One section of the course was taught by a peer instructor, mostly in a small-group discussion format. The other section was taught by the professor using interactive lectures.

    Students completed identical homework and midterm exams on the same days and had the opportunity to attend identical tutoring sessions supported by teaching assistants.

    Though the course is designed for physics and astronomy majors, students from other majors participated as well. Each section had between 41 and 82 students, for a total of 367 students taking the course over three years.

    Not only did students in the peer-instruction section have lower grades, but those with lower SAT scores were also less likely to demonstrate learning of fundamental concepts, and less likely to earn an A, than peers with similar test scores who were taught by a professor. Conversely, students with higher SAT scores made smaller gains in the lecture section than comparable students taught by peers. The researchers believe this could mean that students with less academic preparation benefit more from an instructor-led course, while those with a record of high achievement in high school may thrive in a peer-led section.

    The analysis does not explain why these differences exist, but the researchers theorized that group work or peer dependency could leave some students less knowledgeable about the subject matter because they trust others in the class to answer correctly. They suggest that post-discussion follow-up questions can lessen this learning gap.

  • Rates of Admitted Students Who Are Black, Hispanic Have Decreased

    In the wake of the Supreme Court’s 2023 decision to ban affirmative action in college admissions, no one knew exactly what the impact on Black and Hispanic enrollment might be going forward. In fall 2024, the numbers varied substantially by institution; Inside Higher Ed’s analysis of 31 institutions’ enrollment data showed massive drops in Black and Hispanic enrollment at some institutions and less drastic decreases—and even slight increases—at others.

    But enrollment data only tells part of the story. A new report from the Urban Institute, which uses data from 18 colleges and universities, highlights how the demographics of college applicants—and admits—shifted after the court’s decision in Students for Fair Admissions v. Harvard and the University of North Carolina at Chapel Hill. While the share of applicants who were Black or Hispanic increased from fall 2023 to fall 2024—by 0.47 and 0.65 percentage points, respectively—the portion who were admitted decreased.

    It marked the first time since at least 2018 that the share of admitted students who were Black had declined; Hispanic students hadn’t seen a drop since 2021, when their share of applicants also declined. White students’ share of applicants, admitted students and enrolled students has shrunk every year since 2018, a trend consistent with the declining number of white high school graduates (and of white Americans in general). White students are also the only group that consistently makes up a larger percentage of admitted students than of applicants.

    Jason Cohn, a higher education and workforce research associate for the Urban Institute and one of the report’s authors, said that these numbers shed more light on the impacts of affirmative action than enrollment figures alone.

    “We’ve seen a lot of enrollment numbers in news articles here and there since last fall. In some cases, they stay the same; in some cases, they change. But I think what these data are showing is that that’s not fully reflective of what might actually be happening,” he said. “One of the big takeaways for me is just how much can be hidden if you only look at the enrollment numbers and aren’t seeing what’s happening in the rest of the admissions pipeline.”

    For this study, the researchers partnered with two organizations, the Association of Undergraduate Education at Research Universities and the University of Southern California’s Center for Enrollment Research, Policy and Practice, to solicit data from a diverse group of 18 research universities (which they did not name). Although the sample is small, they said, it’s consistent with similar research conducted by the College Board, whose sample included about 60 institutions, indicating that the data is likely reflective of broader trends.

    It’s difficult to say definitively that the Supreme Court’s decision caused the decline in the share of admitted students from underrepresented backgrounds. That same class of high school seniors faced other barriers, including the lingering effects of the COVID-19 pandemic and delays and errors with the Free Application for Federal Student Aid. Bryan J. Cook, another author of the report and the Urban Institute’s director of higher education policy, noted that colleges in some states had begun rolling back diversity, equity and inclusion efforts at that time, including some programs aimed at recruiting students of color.

    “In this particular analysis, we’re not looking to isolate causation, but I think as we continue to look at this type of thing in future years, I think that’ll help us get a little closer,” Cohn said.

    But Robert Massa, a veteran enrollment professional, said he believes the shifts were likely caused in large part by the end of affirmative action.

    “I’m not at all surprised that Black students have increased their representation in the applicant pool and decreased their representation in the accepted pool, because universities are taking careful steps to make sure they don’t use race in and of itself as criteria in the admissions process,” he said.

    (Edward Blum, the president of SFFA, the anti–affirmative action nonprofit that was the plaintiff in the Supreme Court case, told Inside Higher Ed in an email that the organization has no opinion on the study.)

    The researchers plan to dig deeper into the data, analyzing other demographic information, including gender and family income, as well as academic variables such as the standardized test scores and grade point averages of the applicants and admitted students at these institutions.

    One possible hiccup for future research: The report also showed that post-SFFA, the share of applicants who chose not to identify their race increased, from 3.2 percent in 2023 to 5.1 percent in 2024. If that upward trend continues, Cohn said, it might make it “more difficult, over time, to unpack these trends and see who’s being served by the higher education system.”

  • It is high time higher education adopted a harm reduction approach to drug use among students

    While Gen Z is showing less interest in the normalised alcohol and drug excess that dogged preceding generations, England and Wales’ most recent statistics confirmed that 16.5 per cent of people aged 16 to 24 took an illegal drug in the last year.

    Additionally, a 2024 report by Universities UK found that 18 per cent of students have used a drug at some point, with 12 per cent doing so in the previous 12 months. With a UK student population of 2.9m, this suggests the drug-using portion is around 348,000 to 522,000 people.

    It’s prudent, therefore, for anyone involved within student safety provision to know that the UK is currently mired in a drug death crisis – a record 5,448 fatalities were recorded in England and Wales in the most recent statistics, while Scotland had 1,172, the highest rate of drug deaths in Europe.

    In an attempt to ameliorate some of this risk, seven UK universities recently took delivery of nitazene strips to distribute among students, facilitated by the charity Students Organising for Sustainability (SOS-UK). These instant testing kits – not dissimilar to a Covid-19 lateral flow test in appearance – examine pills or powders for nitazenes: a class of often deadly synthetic opioids linked with 458 UK deaths between July 2023 and December 2024.

    While these fatalities will have most likely been amongst older, habitual users of heroin or fake prescription medicines, these strips form part of a suite of innovative solutions aimed at helping students stay safe(r) if they do choose to use drugs.

    The 2024 Universities UK report suggested drug checking and education as an option in reducing drug-related harm, and recommended a harm reduction approach, adding: “A harm reduction approach does not involve condoning or seeking to normalise the use of drugs. Instead, it aims to minimise the harms which may occur if students take drugs.”

    With that in mind, let’s consider a world where harm reduction – instead of zero tolerance – is the de facto policy and how drug checking or drug testing plays a part in that.

    Drug checking and drug testing

    Drug checking and drug testing are terms that often get used interchangeably but have different meanings. Someone using a drug checking service can get expert lab-level substance analysis, for contents and potency, followed by a confidential consultation on the results during which they receive harm reduction advice. In the UK, this service is offered by The Loop, a non-profit NGO that piloted drug checking at festivals in 2016 and now runs a monthly city centre service in Bristol.

    Drug testing can take different forms. First, there is the analysis of a biological sample to detect whether a person has taken drugs, typically done in a workplace or medical setting. There are also UK-based laboratories offering substance analysis, the results of which are relayed to the public in different ways.

    WEDINOS is an anonymous service, run by Public Health Wales since 2013, to which users send a small sample of their substance alongside a downloadable form. After testing, WEDINOS posts the results on its website (covering content but not potency), normally within a few weeks.

    MANDRAKE is a laboratory operating out of Manchester Metropolitan University. It works in conjunction with key local stakeholders to test found or seized substances in the area. It is often first with news regarding adulterated batches of drugs or dangerously high-strength substances on the market.

    Domestic testing is also possible with reagent tests. These are legally acquired chemical compounds that change colour when exposed to specific drugs and can be used at home. They can provide valuable information about the composition of a pill or powder but not about its potency. The seven UK universities that took delivery of nitazene strips were already offering reagent kits to students as part of their harm reduction rationale.

    Although the Misuse of Drugs Act 1971 specifies that certain activities must not be allowed to take place within a university setting, the Universities UK report argued that universities have some discretion on how to manage this requirement. Specifically, it stated: “The law does not require universities to adopt a zero tolerance approach to student drug use.”

    How to dispense testing apparatus

    The mechanisms differ slightly between SUs but have broad similarities. We spoke with Newcastle and Leeds (NUSU and LUU), which both offer Reagent Tests UK reagent testing kits, fentanyl strips and now nitazene strips. The Reagent Tests UK kits do not test for either of these two synthetic opioids, which vary in strength compared to heroin – and there are multiple analogues of nitazene that vary in potency – but an established ballpark figure is at least 50 times as strong.

    All kits and tests are free. Newcastle’s are available from the Welfare and Support centre, in an area where students would have to take part in an informal chat with a member of staff to procure a kit. “We won’t ask for personal details. However, we do monitor this and will check in with a welfare chat if we think this would be helpful,” says Kay Hattam, Wellbeing and Safeguarding Specialist at NUSU. At Leeds, they’re available from the Advice Office and no meeting is required to collect a kit.

    Harm reduction material is offered alongside the kits. “We have developed messaging to accompany kits which is clear on the limitations of drug testing, and that testing does not make drugs safe to use,” says Leeds University Union.

    Before the donation, kits were paid for by the respective unions, and neither formally collected data on the results. Both SUs make clear that offering these kits is not an encouragement of drug use. Kay Hattam draws an analogy: “If someone was eating fast food every day and I mentioned ways to reduce the risks associated with this, would they feel encouraged to eat more? I would think not. But it might make them think more about the risks.”

    You’ll only encourage them

    In 2022, in a report for HEPI, Arda Ozcubukcu and Graham Towl argued, “Drug use matters may be much more helpfully integrated into mental health and wellbeing strategies, rather than being viewed as a predominantly criminal justice issue.”

    The evidence backs up the view that a harm reduction approach does not encourage drug use. A 2021 report authored by The Loop’s co-founder Fiona Measham, Professor in Criminology at the University of Liverpool, found that over half of The Loop’s users disposed of a substance that tested differently to their intended purchase, reducing the risk of poisoning. Additionally, three months after checking their drugs at a festival, around 30 per cent of users reported being more careful about polydrug use (mixing substances). One in five users were taking smaller doses of drugs overall. Not only does this demonstrate that better knowledge reduces risk of poisoning in the short-term, but it also has enduring positive impacts on drug-using behaviours.

    SOS-UK has developed the Drug and Alcohol Impact scheme with 16 universities and students’ unions participating. This programme supports institutions in implementing the Universities UK guidance by using a variety of approaches to educate and support students in a non-stigmatising manner.

    Alongside them is SafeCourse, a charity founded by Hilton Mervis after his son Daniel, an Oxford University student, died from an overdose in 2019. The charity – which counts the High Court judge Sir Robin Knowles and John de Pury, who led the development of the 2024 sector harm reduction framework, among its trustees – is working to encourage universities to move away from zero tolerance. It does this through various means, including commissioning legal advice to provide greater clarity on universities’ liability if they do not adopt best practice, and checking in one year on from the Universities UK report to ascertain how universities are adapting to the new era of harm reduction.

    SafeCourse takes the view that universities must not allow themselves to be caught up prosecuting a failed war on drugs when their focus should be student safety, wellbeing and success. A harm reduction approach is the best way of achieving those ends.

  • ED Won’t Fund CTE, Dual Enrollment for “Illegal” Students

    The Education Department said Thursday that federal money shouldn’t fund dual enrollment, adult education and certain career and technical education for “illegal alien” students, whether they’re adults or K–12 pupils who are accessing postsecondary education.

    Department officials said in a news release that they are rescinding parts of a 1997 Dear Colleague letter that had allowed undocumented students to access those programs.

    In the interpretative rule published on the Federal Register, the department declared that “non-qualified alien adults are not permitted to receive education benefits (postsecondary education benefits or otherwise) and non-qualified alien children are not eligible to receive postsecondary education benefits and certain other education benefits, so long as such benefits are not basic public education benefits. Postsecondary education benefits include dual enrollment and other similar early college programs.”

    Education Secretary Linda McMahon said in the release that “under President Trump’s leadership, hardworking American taxpayers will no longer foot the bill for illegal aliens to participate in our career, technical, or adult education programs or activities. The department will ensure that taxpayer funds are reserved for citizens and individuals who have entered our country through legal means who meet federal eligibility criteria.”

    Augustus Mays, vice president of partnerships and engagement at EdTrust, an education equity group, said in a statement that the change “derails individual aspirations and undercuts workforce development at a time when our nation is facing labor shortages in critical fields like healthcare, education, and skilled trades. This decision raises barriers even higher for undocumented students who are already barred from accessing federal financial aid like Pell Grants and student loans.

    “Across the country, we’re seeing migrant communities targeted with sweeping raids, amplified surveillance, and fear-based rhetoric designed to divide and dehumanize,” Mays said. “Policies like this don’t exist in a vacuum. They are rooted in a political agenda that scapegoats immigrants and uses fear to strip rights and resources from the most vulnerable among us.”

  • Missouri governor signs legislation securing students’ rights to freely associate on campus

    Missouri has passed a law protecting the right of students to gather and speak on campuses across the state. On Wednesday, Missouri Gov. Mike Kehoe signed into law SB 160, which defends the freedom of student organizations to set leadership and membership requirements that are consistent with their beliefs. 

    Although the bill was later amended to include provisions unrelated to the student organization protections for which we advocated, the final law still marks a meaningful victory for students at Missouri’s public colleges and universities.

    The First Amendment guarantees the right to freely associate with others who share one’s beliefs — or not to associate with those who don’t. FIRE has consistently opposed policies that force student groups to eliminate belief-based membership rules to gain official college recognition. As we said in March when Utah signed similar protections into law, it makes little sense, for example, “to force a Muslim student group to let atheists become voting members or for an environmentalist student group that raises awareness about the threats of climate change to allow climate change skeptics to hold office.”

    In a letter to Missouri’s legislature supporting SB 160, we explained that the right to associate freely extends to students at public universities and to the student organizations they form. The Supreme Court agrees, and has repeatedly upheld this principle, affirming in Healy v. James that public colleges cannot deny official recognition to student organizations solely based on their beliefs or associations. Similarly, in Widmar v. Vincent, the Court ruled that a public university violated the First Amendment by denying a religious student group access to campus facilities because of its religious beliefs.

    However, the Court’s decision in Christian Legal Society v. Martinez upheld the constitutionality of “all-comers” policies — requiring student organizations to accept any student as a member or leader, even those who oppose the group’s core beliefs. But the ruling applies only when such policies are enforced uniformly. In practice, universities often apply these policies selectively. For example, some religious organizations have been forced to accept members and leaders who do not share their faith, while secular groups have been allowed to set their own membership and leadership requirements without administrative intervention. 

    This selective enforcement results in viewpoint discrimination. SB 160 is meant to correct that imbalance. It states that schools cannot take any action against a student association or potential student association:

    (a) Because such association is political, ideological, or religious; 

    (b) On the basis of such association’s viewpoint or expression of the viewpoint by the association or the association’s members; or

    (c) Based on such association’s requirement that the association’s leaders be committed to furthering the association’s mission or that the association’s leaders adhere to the association’s sincerely held beliefs, practice requirements, or standards of conduct.

    With the enactment of this bill, Missouri joins a growing number of states strengthening protections for the First Amendment rights of student organizations on campus. 

    FIRE thanks Missouri lawmakers and Gov. Kehoe for affirming that students don’t shed their constitutional rights at the campus gates.

  • How Gimkit engages my students

    During the height of the COVID-19 pandemic, teachers had to become resourceful in how they delivered content to students. In that time, students experienced significant change and became a more technologically dependent group.

    This sparked a period when online learning and digital resources gained substantial popularity, and one tool that helps students learn–while also feeling like a game instead of a lesson–is Gimkit.

    I am an 8th-grade science teacher in a fairly large district, and I recognize the importance of these engaging and interactive resources to help students build knowledge and continue learning.

    What is Gimkit?

    To begin with, what is Gimkit? According to a tutorial, “Gimkit is an excellent game-based learning platform that combines fun and education, making it a highly engaging tool for both teachers and students. It works like a mashup of Kahoot and flash card platforms, but with several unique features that set it apart.

    “Unlike other platforms, Gimkit allows students to earn virtual currency for every correct answer, which they can use to purchase power-ups, adding a competitive edge that keeps students motivated.”

    Gimkit offers so much more than just a game-based learning experience for students–it can be used as an introduction to a lesson, as assigned homework, or as a tool for reviewing.

    Building a Gimkit

    From the teacher’s side of Gimkit, the platform makes it extremely easy to build lessons for the students to use. When you go to create a lesson, you are given many different options to help with the construction.

    Jamie Keet explains: “After establishing your basic Kit information, you will then move onto the fun part–adding your questions! You will be given the option of adding a question, creating your Kit with Flashcards, continuing with KitCollab, adding from Gimkit’s Question Bank, or importing from Spreadsheet.”

    Adding your own questions is a great way to make sure your students are getting exactly the information they have been given in class, but some of the other options can save a teacher time, which always seems to be scarce.

    The option to add questions from the question bank allows teachers to view other created kits similar to their topic. With a few simple clicks, a teacher can add questions that meet the needs of their lesson.

    Gimkit as data collection

    Gimkit isn’t just a tool for students to gain knowledge and play games; it is also an excellent way for teachers to collect data on their students. As Amelia Bree observes:

    “Gimkit reports explained show you both big pictures and small details. The look might change sometimes. But you will usually see:

    • Overall Class Performance: This shows the average right answers. It tells you the total questions answered. It also shows how long the game took. It’s a good first look at how everyone understood.
    • Individual Student Results: Click on each student’s name here. You see their personal game path. Their accuracy. Which questions did they get right or wrong? Sometimes, even how fast they answered.
    • Question Breakdown: This part is very powerful. It shows how everyone did on each question you asked. You see how many got it right. How many missed it? Sometimes, it shows common wrong answers for multiple-choice questions.”

    Being able to see this data can help ensure that your students are not just completing the required steps to finish the task, but are also working towards mastering the materials within your class.

    When examining the data, if you identify trends related to specific questions or concepts that students are struggling with, you have the opportunity to revisit and reteach these areas.

    Conclusion

    As you can see, Gimkit isn’t just a tool for students to play games and have fun in class; it is also an opportunity for students to gain knowledge in your lessons while potentially having some fun in the process. Teachers can make creating content for their classes much easier by utilizing some of the built-in features Gimkit provides.

    They can also collect the meaningful data needed to ensure students are making progress in the areas where teachers want them to.

    Works Cited

    Breisacher, J. (2024, October 7). How Teachers Can Use Gimkit in the Classroom (a tutorial). Student-Centered World. https://www.studentcenteredworld.com/gimkit/

    Keet, J. (2021, July 9). How to Use Gimkit - Step By Step Guide. Teachers.Tech. https://teachers.tech/how-to-use-gimkit/

    Bree, E. (2025, June 6). Unlock Data-Driven Teaching: Using Gimkit for Meaningful Assessment Insights. GIMKIT JOIN. https://gimkitjoin.net/gimkit-for-meaningful-assessment-insights/

  • Middle and high school students need education, career guidance

    Key points:

    Students need more support around education paths and career options, including hands-on experiences, according to a new nationwide survey from the nonprofit American Student Assistance.

    The survey of more than 3,000 students in grades 7-12 offers insights into teens’ plans after high school. The research, Next Steps: An Analysis of Teens’ Post-High School Plans, uncovers evolving trends in teenagers’ attitudes, perceptions, and decision-making about their post-high school plans.

    “This analysis of teens’ post high school plans reveals shifts in students’ thinking and planning. We need to change the way we help young people navigate the complex and evolving landscape of education and career options,” said Julie Lammers, Executive Vice President of ASA. “Starting in middle school, our young people need early access to opportunities that empowers them to explore careers that match their interests and strengths; hands-on, skills-based experiences in high school; and information and resources to navigate their path to postsecondary education and career. All of this will enable them to graduate informed, confident, and empowered about what they want to do with their futures.” 

    The survey offers notable findings regarding parental influence on teens’ planning, perceptions of nondegree pathways like trade or technical school, apprenticeships, and certificate programs, and a continued drop-off in kids’ plans to go to college immediately after high school graduation.

    Key findings include:

    Teens’ interest in college is down while nondegree paths are on the rise. Nearly half of all students said they aren’t interested in going to college, with just 45 percent saying two- or four-year college was their most likely next step. Meanwhile 38 percent of teens said they were considering trade or technical schools, apprenticeships, and technical bootcamp programs, although only 14 percent say that such a path is their most likely next step.

    Parents are one of teens’ biggest influencers, and they’re skeptical of nondegree options. A vast majority (82 percent) of teens said their parents agree with their plans to go to a four-year college, while only 66 percent said parents supported plans to pursue a nondegree route. In fact, teens reported parents were actually more supportive (70 percent) of forgoing education altogether right after high school vs. pursuing a nondegree program.

    A concerning number of young people don’t have plans for further education or training. Nearly one quarter (23 percent) said they have no immediate plans to continue formal education or training upon graduation. Teens not planning to continue education after high school indicated they were thinking of beginning full-time work, entering a family business, starting their own business, or joining the military.

    Teens, and especially middle schoolers, are feeling better prepared to plan their futures. In recent years, policymakers, educators, employers, and other stakeholders have pushed to make career-connected learning a more prominent feature of our education-to-workforce system. Survey results say it’s paying off. Agreement with the statement “my school provides me with the right resources to plan for my next steps after high school” grew from just 59 percent in 2018, to 63 percent in 2021, to 82 percent in 2024. Notably, the largest increase occurred at the middle school level, where confidence in in-school planning resources jumped from 60 percent in 2018 to 90 percent in 2024.

    Girls are much more likely to plan to attend college than boys. Boys and girls are equally interested in college when they’re in middle school, but by high school, more than half (53 percent) of girls say they’re likely to attend college compared to just 39 percent of boys. The gender gap is smaller when it comes to nondegree pathways: 15 percent of high school boys say they will likely attend vocational/trade school, participate in an apprenticeship, or take a certificate program, compared to 10 percent of high school girls.

    City kids aren’t as “into” college. Urban teens were least likely (39 percent) to say they plan to go to college. Suburban teens are much more likely to plan to attend a college program (64 percent), while 46 percent of rural students plan on college.

    Students of color are college bound. More than half (54 percent) of Black teens and 51 percent of Hispanic youth are planning to go to college, compared to 42 percent of White teens.  

    This press release originally appeared online.

    eSchool News Staff


  • Hard up for students, more colleges are offering college credit for life experience, or ‘prior learning’


    PITTSBURGH — Stephen Wells was trained in the Air Force to work on F-16 fighter jets, including critical radar, navigation and weapons systems whose proper functioning meant life or death for pilots.

    Yet when he left the service and tried to apply that expertise toward an education at Pittsburgh’s Community College of Allegheny County, or CCAC, he was given just three credits toward a required class in physical education.

    Wells moved forward anyway, going on to get his bachelor’s and doctoral degrees. Now he’s CCAC’s provost and involved in a citywide project to help other people transform their military and work experience into academic credit.

    What’s happening in Pittsburgh is part of growing national momentum behind letting students — especially the increasing number who started but never completed a degree — cash in their life skills toward finally getting one, saving them time and money. 

    Colleges and universities have long purported to provide what’s known in higher education as credit for prior learning. But they have made the process so complex, slow and expensive that only about 1 in 10 students actually completes it.

    Many students don’t even try, especially low-income learners who could benefit the most, according to a study by the Western Interstate Commission for Higher Education and the Council for Adult and Experiential Learning, or CAEL.

    “It drives me nuts” that this promise has historically proven so elusive, Wells said, in his college’s new Center for Education, Innovation & Training.

    Stephen Wells, provost at the Community College of Allegheny County in Pittsburgh. An Air Force veteran, Wells got only a handful of academic credits for his military experience. Now he’s part of an effort to expand that opportunity for other students. Credit: Nancy Andrews for The Hechinger Report

    That appears to be changing. Nearly half of institutions surveyed last year by the American Association of Collegiate Registrars and Admissions Officers, or AACRAO, said they have added more ways for students to receive these credits — electricians, for example, who can apply some of their training toward academic courses in electrical engineering, and daycare workers who can use their experience to earn degrees in teaching. 

    Related: Interested in innovations in higher education? Subscribe to our free biweekly higher education newsletter.

    The reason universities and colleges are doing this is simple: Nearly 38 million working-age Americans have spent some time in college but never finished, according to the National Student Clearinghouse Research Center. Getting at least some of them to come back has become essential to these higher education institutions at a time when changing demographics mean that the number of 18-year-old high school graduates is falling.

    “When higher education institutions are fat and happy, nobody looks for these things. Only when those traditional pipelines dry up do we start looking for other potential populations,” said Jeffrey Harmon, vice provost for strategic initiatives and institutional effectiveness at Thomas Edison State University in New Jersey, which has long given adult learners credit for the skills they bring.

    Being able to get credit for prior learning is a huge potential recruiting tool. Eighty-four percent of adults who are leaning toward going back to college say it would have “a strong influence” on their decision, according to research by CAEL, the Strada Education Foundation and Hanover Research. (Strada is among the funders of The Hechinger Report, which produced this story.)

    The Center for Education, Innovation & Training at the Community College of Allegheny County in Pittsburgh. The college is part of a citywide effort to give academic credit for older students’ life experiences. Credit: Nancy Andrews for The Hechinger Report

    When Melissa DiMatteo, 38, decided to get an associate degree at CCAC to go further in her job, she got six credits for her previous training in Microsoft Office and her work experience as everything from a receptionist to a supervisor. That spared her from having to take two required courses in computer information and technology and — since she’s going to school part time and taking one course per semester — saved her a year.

    “Taking those classes would have been a complete waste of my time,” DiMatteo said. “These are things that I do every day. I supervise other people and train them on how to do this work.”

    On average, students who get credit for prior learning save between $1,500 and $10,200 apiece and shave nearly seven months off the time it takes to earn a bachelor’s degree, the nonprofit advocacy group Higher Learning Advocates calculates. The likelihood that they will graduate is 17 percent higher, the organization finds.

    Related: The number of 18-year-olds is about to drop sharply, packing a wallop for colleges — and the economy 

    Justin Hand dropped out of college because of the cost, and became a largely self-taught information technology manager before he decided to go back and get an associate and then a bachelor’s degree so he could move up in his career.

    He got 15 credits — a full semester’s worth — through a program at the University of Memphis for which he wrote essays to prove he had already mastered software development, database management, computer networking and other skills.

    “These were all the things I do on a daily basis,” said Hand, of Memphis, who is 50 and married, with a teenage son. “And I didn’t want to have to prolong college any more than I needed to.”

    Meanwhile, employers and policymakers are pushing colleges to speed up the output of graduates with skills required in the workforce, including by giving more students credit for their prior learning. And online behemoths Western Governors University and Southern New Hampshire University, with which brick-and-mortar colleges compete, are way ahead of them in conferring credit for past experience.

    “They’ve mastered this and used it as a marketing tool,” said Kristen Vanselow, assistant vice president of innovative education and partnerships at Florida Gulf Coast University, which has expanded its awarding of credit for prior learning. “More traditional higher education institutions have been slower to adapt.”

    It’s also gotten easier to evaluate how skills that someone learns in life equate to academic courses or programs. This has traditionally required students to submit portfolios, take tests or write essays, as Hand did, and faculty to subjectively and individually assess them. 

    Related: As colleges lose enrollment, some turn to one market that’s growing: Hispanic students

    Now some institutions, states, systems and independent companies are standardizing this work or using artificial intelligence to do it. The growth of certifications from professional organizations such as Amazon Web Services and the Computing Technology Industry Association, or CompTIA, has helped, too.

    “You literally punch [an industry certification] into our database and it tells you what credit you can get,” said Philip Giarraffa, executive director of articulation and academic pathways at Miami Dade College. “When I started here, that could take anywhere from two weeks to three months.”

    Data provided by Miami Dade shows it has septupled the number of credits for prior learning awarded since 2020, from 1,197 then to 7,805 last year.

    “These are students that most likely would have looked elsewhere, whether to the [online] University of Phoenix or University of Maryland Global [Campus]” or other big competitors, Giarraffa said.

    Fifteen percent of undergraduates enrolled in higher education full time and 40 percent enrolled part time are 25 or older, federal data show — including people who delayed college to serve in the military, volunteer or do other work that could translate into academic credit. 

    “Nobody wants to sit in a class where they already have all this knowledge,” Giarraffa said. 

    At Thomas Edison, police academy graduates qualify for up to 30 credits toward associate degrees. Carpenters who have completed apprenticeships can get as many as 74 credits in subjects including math, management and safety training. Bachelor’s degrees are often a prerequisite for promotion for people in professions such as these, or who hope to start their own companies.

    Related: To fill ‘education deserts,’ more states want community colleges to offer bachelor’s degrees

    The University of Memphis works with FedEx, headquartered nearby, to give employees with supervisory training academic credit they can use toward a degree in organizational leadership, helping them move up in the company.

    The University of North Carolina System last year launched its Military Equivalency System, which lets active-duty and former military service members find out almost instantly, before applying for admission, if their training could be used for academic credit. That had previously required contacting admissions offices, registrars or department chairs. 

    Among the reasons for this reform was that so many of these prospective students — and the federal education benefits they get — were ending up at out-of-state universities, the UNC System’s strategic plan notes.

    “We’re trying to change that,” said Kathie Sidner, the system’s director of workforce and partnerships. It’s not only for the sake of enrollment and revenue, Sidner said. “From a workforce standpoint, these individuals have tremendous skill sets and we want to retain them as opposed to them moving somewhere else.”

    Related: A new way to help some college students: Zero percent, no-fee loans

    California’s community colleges are also expanding their credit for prior learning programs as part of a plan to increase the proportion of the population with education beyond high school.

    “How many people do you know who say, ‘College isn’t for me?’ ” asked Sam Lee, senior advisor to the system’s chancellor for credit for prior learning. “It makes a huge difference when you say to them that what they’ve been doing is equivalent to college coursework already.”

    In Pittsburgh, the Regional Upskilling Alliance — of which CCAC is a part — is connecting job centers, community groups, businesses and educational institutions to create comprehensive education and employment records so more workers can get credit for skills they already have.

    That can provide a big push, “especially if you’re talking about parents who think, ‘I’ll never be able to go to school,’ ” said Sabrina Saunders Mosby, president and CEO of the nonprofit Vibrant Pittsburgh, a coalition of business and civic leaders involved in the effort. 

    Pennsylvania is facing among the nation’s most severe declines in the number of 18-year-old high school graduates. 

    “Our members are companies that need talent,” Mosby said. 

    There’s one group that has historically pushed back against awarding credit for prior learning: university and college faculty concerned it might affect enrollment in their courses or unconvinced that training provided elsewhere is of comparable quality. Institutions have worried about the loss of revenue from awarding credits for which students would otherwise have had to pay.

    That also appears to be changing, as universities leverage credit for prior learning to recruit more students and keep them enrolled for longer, resulting in more revenue — not less. 

    “That monetary factor was something of a myth,” said Beth Doyle, chief of strategy at CAEL.

    Faculty have increasingly come around, too. That’s sometimes because they like having experienced students in their classrooms, Florida Gulf Coast’s Vanselow said. 

    Related: States want adults to return to college. Many roadblocks stand in the way 

    Still, while many recognize it as a recruiting incentive, most public universities and colleges have had to be ordered to confer more credits for prior learning by legislatures or governing boards. Private, nonprofit colleges remain stubbornly less likely to give it.

    More than two-thirds charge a fee for evaluating whether other kinds of learning can be transformed into academic credit, an expense that isn’t covered by financial aid. Roughly one in 12 charge the same as it would cost to take the course for which the credits are awarded. 

    Debra Roach, vice president for workforce development at the Community College of Allegheny County in Pittsburgh. The college is working on giving academic credit to students for their military, work and other life experience. Credit: Nancy Andrews for The Hechinger Report

    Seventy percent of institutions require that students apply for admission and be accepted before learning whether credits for prior learning will be awarded. Eighty-five percent limit how many credits for prior learning a student can receive.

    There are other confounding roadblocks and seemingly self-defeating policies. CCAC runs a noncredit program to train paramedics, for example, but won’t give people who complete it credits toward its for-credit nursing degree. Many leave and go across town to a private university that will. The college is working on fixing this, said Debra Roach, its vice president of workforce development.

    It’s important to see this from the students’ point of view, said Tracy Robinson, executive director of the University of Memphis Center for Regional Economic Enrichment.

    “Credit for prior learning is a way for us to say, ‘We want you back. We value what you’ve been doing since you’ve been gone,’ ” Robinson said. “And that is a total game changer.”

    Contact writer Jon Marcus at 212-678-7556, [email protected] or jpm.82 on Signal.

    This story about credit for prior learning was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for our higher education newsletter. Listen to our higher education podcast.



  • How can students’ module feedback help prepare for success in NSS?


    Since the dawn of student feedback there’s been a debate about the link between module feedback and the National Student Survey (NSS).

    Some institutions have historically doubled down on the idea that there is a read-across from the module learning experience to the student experience as captured by NSS and treated one as a kind of “dress rehearsal” for the other by asking the NSS questions in module feedback surveys.

    This approach arguably has some merits in that it sears the NSS questions into students’ minds to the point that when they show up in the actual NSS it doesn’t make their brains explode. It also has the benefit of simplicity – there’s no institutional debate about what module feedback should include or who should have control of it. If there isn’t a deep bench of skills in survey design in an institution there could be a case for adopting NSS questions on the grounds they have been carefully developed and exhaustively tested with students. Some NSS questions have sufficient relevance in the module context to do the job, even if there isn’t much nuance there – a generic question about teaching quality or assessment might resonate at both levels, but it can’t tell you much about specific pedagogic innovations or challenges in a particular module.

    However, there are good reasons not to take this “dress rehearsal” approach. NSS endeavours to capture the breadth of the student experience at a very high level, not the specific module experience. It’s debatable whether module feedback should even be trying to measure “experience” – there are other possible approaches, such as focusing on learning gains, or skills development, especially if the goal is to generate actionable feedback data about specific module elements. For both students and academics, seeing the same set of questions repeated ad nauseam is really rather boring, and is as likely to create disengagement and alienation from the “experience” construct NSS proposes as a comforting sense of familiarity and predictability.

    But separating out the two feedback mechanisms entirely doesn’t make total sense either. Though the totemic status of NSS has been tempered in recent years it remains strategically important as an annual temperature check, as a nationally comparable dataset, as an indicator of quality for the Teaching Excellence Framework and, unfortunately, as a driver of league table position. Securing consistently good NSS scores, alongside student continuation and employability, will feature in most institutions’ key performance indicators and, while vice chancellors and boards will frequently exercise their critical judgement about what the data is actually telling them, when it comes to the crunch no head of institution or board wants to see their institution slip.

    Module feedback, therefore, offers an important “lead indicator” that can help institutions maximise the likelihood that students have the kind of experience that will prompt them to give positive NSS feedback – indeed, the ability to continually respond and adapt in light of feedback can often be a condition of simply sustaining existing performance. But if simply replicating the NSS questions at module level is not the answer, how can these links best be drawn? Wonkhe and evasys recently convened an exploratory Chatham House discussion with senior managers and leaders from across the sector to gather a range of perspectives on this complex issue. While success in NSS remains part of the picture for assigning value and meaning to module feedback in particular institutional contexts there is a lot else going on as well.

    A question of purpose

    Module feedback can serve multiple purposes, and it’s an open question whether some of those purposes are considered to be legitimate for different institutions. To give some examples, module feedback can:

    • Offer institutional leaders an institution-wide “snapshot” of comparable data that can indicate where there is a need for external intervention to tackle emerging problems in a course, module or department
    • Test and evaluate the impact of education enhancement initiatives at module, subject or even institution level, or capture progress with implementing systems, policies or strategies
    • Give professional service teams feedback on patterns of student engagement with and opinions on specific provision such as estates, IT, careers or library services
    • Give insight to module leaders about specific pedagogic and curriculum choices and how these were received by students to inform future module design
    • Give students the opportunity to reflect on their own learning journey and engagement
    • Generate evidence of teaching quality that academic staff can use to support promotion or inform fellowship applications
    • Depending on the timing, capture student sentiment and engagement and indicate where students may need additional support or whether something needs to be changed mid-module

    Needless to say, all of these purposes can be legitimate and worthwhile but not all of them can comfortably coexist. Leaders may prioritise comparability of data, i.e. asking the same questions across all modules to generate comparable results and set priorities. Similarly, those operating across an institution may be keen to map patterns and capture differences across subjects – one example offered at the round table was whether students had met with their personal tutor. Such questions may be experienced at department or module level as intrusive and irrelevant to more immediately purposeful questions around students’ learning experience on the module. Module leaders may want to design their own student evaluation questions tailored to inform their pedagogic practice and future iterations of the module.

    There are also a lot of pragmatic and cultural considerations to navigate. Everyone is mindful that students get asked to feed back on their experiences A LOT – sometimes even before they have had much of a chance to actually have an experience. As students’ lives become more complicated, institutions are increasingly wary of the potential for cognitive overload that comes with being constantly asked for feedback. Additionally, institutions need to make their processes of gathering and acting on feedback visible to students so that students can see there is an impact to sharing their views – and will confirm this when asked in the NSS. Some institutions are even building questions that test whether students can see the feedback loop being closed into their student surveys.

    Similarly, there is also a strong appreciation of the need to adopt survey approaches that support and enable staff to take action and adapt their practice in response to feedback, affecting the design of the questions, the timing of the survey, how quickly staff can see the results and the degree to which data is presented in a way that is accessible and digestible. For some, trusting staff to evaluate their modules in the way they see fit is a key tenet of recognising their professionalism and competence – but there is a trade-off in terms of visibility of data institution-wide or even at department or subject level.

    Frameworks and ecosystems

    There are some examples in the sector of mature approaches to linking module evaluation data to NSS – it is possible to take a data-led approach that tests the correlation between particular module evaluation question responses and corresponding NSS question outcomes within particular thematic areas or categories, and builds a data model that proposes informed hypotheses about areas of priority for development or approaches that are most likely to drive NSS improvement. This approach does require strong data analysis capability, which not every institution has access to, but it certainly warrants further exploration where the skills are there. The use of a survey platform like evasys allows for the creation of large module evaluation datasets that could be mapped on to NSS results through business intelligence tools to look for trends and correlations that could indicate areas for further investigation.
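    As a rough illustration of the data-led approach described above – and not any institution’s actual model – the correlation step can be sketched in a few lines of Python. Every department score, question name and number below is synthetic, invented purely for the example:

    ```python
    # Illustrative sketch: given per-department average module-feedback scores
    # and a matching NSS theme score, compute a Pearson correlation for each
    # feedback question to flag candidate "lead indicators". Synthetic data only.
    from statistics import mean

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length sequences."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        varx = sum((x - mx) ** 2 for x in xs)
        vary = sum((y - my) ** 2 for y in ys)
        return cov / (varx * vary) ** 0.5

    # Hypothetical per-department averages (0-100 agreement scores).
    module_feedback = {
        "assessment_clarity": [72, 68, 81, 77, 64],
        "teaching_quality":   [85, 79, 88, 83, 75],
    }
    nss_assessment_theme = [70, 66, 80, 74, 63]

    correlations = {
        question: round(pearson(scores, nss_assessment_theme), 2)
        for question, scores in module_feedback.items()
    }
    print(correlations)
    ```

    In practice this would run over real module evaluation exports and NSS theme outcomes, with far more care about sample sizes and confounders; a high correlation here only flags a question as worth investigating as a lead indicator, not as a causal driver of NSS results.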

    Others take the view that maximising NSS performance is something of a red herring as a goal in and of itself – if the wider student feedback system is working well, then the result should be solid NSS performance, assuming that NSS is basically measuring the right things at a high level. Some go even further and express concern that over-focus on NSS as an indicator of quality can be to the detriment of designing more authentic student voice ecosystems.

    But while thinking in terms of the whole system is clearly going to be more effective than a fragmented approach, given the various considerations and trade-offs discussed it is genuinely challenging for institutions to design such effective ecosystems. There is no “right way” to do it but there is an appetite to move module feedback beyond the simple assessment of what students like or don’t like, or the checking of straightforward hygiene factors, to become a meaningful tool for quality enhancement and pedagogic innovation. There is a sense that rather than drawing direct links between module feedback and NSS outcomes, institutions would value a framework-style approach that is able to accommodate the multiple actors and forms of value that are realised through student voice and feedback systems.

    In the coming academic year Wonkhe and evasys are planning to work with institutional partners on co-developing a framework or toolkit to integrate module feedback systems into wider student success and academic quality strategies – contact us to express interest in being involved.

    This article is published in association with evasys.

    Source link

  • Why students reject second-language study – Campus Review


    Students are turning away from learning a second language other than English because they don’t see it as a viable qualification even though it is a core skill in other countries, experts have flagged.

