Tag: Career

  • Where do states stand on college and career readiness metrics?

    Dive Brief:

    • While nearly every state has some form of college and career readiness criteria for high school students, there are still areas for growth in how data on students’ postsecondary readiness is collected, according to a July report from All4Ed and the Urban Institute. 
    • Though criteria vary depending on each state’s priorities and goals, 42 states currently use at least one college and career readiness indicator in their school accountability systems.
    • Accountability systems include both indicators and measures. The report defines indicators as offering information on a critical aspect of school performance, while measures are the data points used within an indicator to determine whether particular student inputs or outcomes were achieved.

    Dive Insight:

    “Forty-two states are using a college and career readiness indicator, that’s great progress,” said Anne Hyslop, All4Ed’s director of policy development and the report’s author. “All of these measures have been developed in the last decade or so.”

    The report found that 39 of the 42 states with indicators include both college and career readiness measures, and 20 of these states also measure military or civic readiness.

    Advanced Placement or International Baccalaureate courses and exams are the most common measures of college readiness, used by 35 states. They are followed by dual or concurrent enrollment coursework (34 states) and college admission test scores, such as the SAT and ACT (26 states).

    For career readiness assessment, earning industry-recognized credentials or completing a career and technical education pathway are the most common measures. Some states also use work-based learning or internships.  

    Hyslop noted that not all states have a clear distinction between indicators for college, career and military readiness. Some states combine several measures into a single indicator, while others group different sets of measures into multiple indicators. 

    “This is where getting better transparency and data would be really helpful,” Hyslop said. “A lot of states may report readiness across all of the measures, but they don’t report how many students are ready for college, how many are ready for career, etc. They don’t report it separately.”

    The report highlighted North Dakota as a good example of this distinction. The state’s indicator, Choice Ready, has a list of essential skills required of all students that align with the state’s graduation requirements. Once students have demonstrated these essential skills, they need to show readiness in two of three areas: postsecondary ready, workforce ready or military ready. 

    For Hyslop, improving data collection is the “lowest-hanging fruit.” 

    “There is so much data that is being collected on student readiness, but the way that it is reported is not necessarily leading to the maximum value from that data, because it’s not always fully disaggregated by student subgroups,” said Hyslop. “It’s just a matter of packaging it in more useful formats.”

    The outlier states that do not currently have a college and career readiness indicator are Alaska, Kansas, Maine, Minnesota, Nebraska, New Jersey, Oregon and Wisconsin, according to the report. Illinois is currently in the final stages of developing its indicator.

  • How Technology Can Smooth Pain Points in Credit Evaluation

    Earlier this month, higher education policy leaders from all 50 states gathered in Minneapolis for the 2025 State Higher Education Executive Officers Higher Education Policy Conference. During a plenary session on the future of learning and work and its implications for higher education, Aneesh Raman, chief economic opportunity officer at LinkedIn, reflected on the growing need for people to be able to easily build and showcase their skills.

    In response to this need, the avenues for learning have expanded, with high numbers of Americans now completing career-relevant training and skill-building through MOOCs, microcredentials and short-term certificates, as well as a growing number of students completing postsecondary coursework while in high school through dual enrollment.

    The time for pontificating about the implications for higher education is past; what’s needed now is a pragmatic examination of our long-standing practices to ask: How do we evolve to keep up? We find it prudent and compelling to begin at the beginning—that is, with the learning-evaluation process (aka the credit-evaluation process), as it stands either to help integrate more Americans into higher education or to push them out.

    A 2024 survey of adult Americans conducted by Public Agenda for Sova and the Beyond Transfer Policy Advisory Board found, for example, that nearly four in 10 respondents attempted to transfer some type of credit toward a college credential. This included credit earned through traditional college enrollment as well as through nontraditional avenues, such as trade/vocational school, industry certification and work or military experience. Of those who tried to transfer credit, 65 percent reported one or more negative experiences, including having to repeat prior courses, feeling limited in where they could enroll based on how their prior learning was counted and running out of financial aid when their prior learning was not counted. Worse, 16 percent gave up on earning a college credential altogether because the process of transferring credit was too difficult.

    What if that process were drastically improved? The Council for Adult and Experiential Learning’s research on adult learners finds that 84 percent of likely enrollees and 55 percent of those less likely to enroll agree that the ability to receive credit for their work and life experience would have a strong influence on their college enrollment plans. Recognizing the untapped potential for both learners and institutions, we are working with a distinguished group of college and university leaders, accreditors, policy researchers and advocates who form the Learning Evaluation and Recognition for the Next Generation (LEARN) Commission to identify ways to improve learning mobility and promote credential completion.

    With support from the American Association of Collegiate Registrars and Admissions Officers and Sova, the LEARN Commission has been analyzing the available research to better understand the limitations of and challenges within current learning evaluation approaches, finding that:

    • Learning-evaluation decision-making is a highly manual and time-intensive process that involves many campus professionals, including back-office staff such as registrars and transcript evaluators and academic personnel such as deans and faculty.
    • Across institutions, there is high variability in who performs reviews; what information and criteria are used in decision-making; how decisions are communicated, recorded and analyzed; and how long the process takes.
    • Along with this variability, most evaluation decisions are opaque, with little data used, criteria established or transparency baked in to help campus stakeholders understand how these decisions are working for learners.
    • While there have been substantial efforts to identify course equivalencies, develop articulation agreements and create frameworks for credit for prior learning to make learning evaluation more transparent and consistent, the data and technology infrastructure to support the work remain woefully underdeveloped. Without adequate data documenting date of assessment and aligned learning outcomes, credit for prior learning is often dismissed in the transfer process; for example, a 2024 survey by AACRAO found that 54 percent of its member institutions do not accept credit for prior learning awarded at a prior institution.

    Qualitative research examining credit-evaluation processes across public two- and four-year institutions in California found that these factors create many pain points for learners. For one, students can experience unacceptable wait times—in some cases as long as 24 weeks—before receiving evaluation decisions. When decisions are not finalized prior to registration deadlines, students can end up in the wrong classes, take classes out of sequence or end up extending their time to graduation.

    In addition to adverse impacts on students, MDRC research illuminates challenges that faculty and staff experience due to the highly manual nature of current processes. As colleges face dwindling dollars and real constraints on personnel capacity, the status quo becomes untenable. Yet we are hopeful that the thoughtful application of technology—including AI—can help slingshot institutions forward.

    For example, institutions like Arizona State University and the City University of New York are leading the way in integrating technology to improve the student experience. The ASU Transfer Guide and CUNY’s Transfer Explorer democratize course equivalency information, “making it easy to see how course credits and prior learning experiences will transfer and count.” Further, researchers at UC Berkeley are studying how to leverage the plethora of data available—including course catalog descriptions, course articulation agreements and student enrollment data—to analyze existing course equivalencies and provide recommendations for additional courses that could be deemed equivalent. Such advances stand to reduce the staff burden for institutions while preserving academic quality.

    While such solutions are not yet widely implemented, there is strong interest due to their high value proposition. A recent AACRAO survey on AI in credit mobility found that while just 15 percent of respondents report currently using AI for credit mobility, 94 percent of respondents acknowledge the technology’s potential to positively transform credit-evaluation processes. And just this year, a cohort of institutions across the country came together to pioneer new AI-enabled credit mobility technology under the AI Transfer and Articulation Infrastructure Network.

    As the LEARN Commission continues to assess how institutions, systems of higher education and policymakers can improve learning evaluation, we believe that increased attention to improving course data and technology infrastructure is warranted and that a set of principles can guide a new approach to credit evaluation. Based on our emerging sense of the needs and opportunities in the field, we offer some guiding principles below:

    1. Shift away from interrogating course minutiae to center learning outcomes in learning evaluation. Rather than fixating on factors like mode of instruction or grading basis, we must focus on the learning outcomes. To do so, we must improve course data in a number of ways, including adding learning outcomes to course syllabi and catalog descriptions and capturing existing equivalencies in databases where they can be easily referenced and applied.
    2. Provide students with reliable, timely information on the degree applicability of their courses and prior learning, including a rationale when prior learning is not accepted or applied. Institutions can leverage available technology to automate existing articulation rules, recommend new equivalencies and generate timely evaluation reports for students. This can create more efficient advising workflows, empower learners with reliable information and refocus faculty time on other essential work (see No. 3).
    3. Use student outcomes data to improve the learning-evaluation process. Right now, the default is that all prior learning is manually vetted against existing courses. But what if we shifted that focus to analyzing student outcomes data to understand whether students can be successful in subsequent learning if their credits are transferred and applied? In addition, institutions should regularly review course transfer, applicability and student success data at the department and institution level to identify areas for improvement—including in the design of curricular pathways, student supports and classroom pedagogy.
    4. Overhaul how learning is transcripted and how transcripts are shared. We can shorten the time involved on the front end of credit-evaluation processes by shifting away from manual transcript review to machine-readable transcripts and electronic transcript transmittal. When accepting and applying prior learning—be it high school dual-enrollment credit, credit for prior learning or a course transferred from another institution—institutions should document that learning in the transcript as a course (or as a competency for competency-based programs) to promote its future transferability.
    5. Leverage available technology to help learners and workers make informed decisions to reach their end goals. In the realm of learning evaluation, this can be facilitated by integrating course data and equivalency systems with degree-modeling software to enable learners and advisers to identify the best path to a credential that minimizes the amount of learning that’s left on the table.

    In these ways, we can redesign learning evaluation processes to accelerate students’ pathways and generate meaningful value in the changing landscape of learning and work. Through the LEARN Commission, we will continue to refine this vision and identify clear actionable steps. Stay tuned for the release of our full set of recommendations this fall and join the conversation at #BeyondTransfer.

    Beth Doyle is chief of strategy at the Council for Adult and Experiential Learning and is a member of the LEARN Commission.

    Carolyn Gentle-Genitty is the inaugural dean of Founder’s College at Butler University and is a member of the LEARN Commission.

    Jamienne S. Studley is the immediate past president of the WASC Senior College and University Commission and is a member of the LEARN Commission.

  • SUNY Expands Local News Collaborations for Student Learning

    Over the past decade, local newsrooms have been disappearing from the U.S., leaving communities without a trusted information source for happenings in their region. But a recently established initiative from the State University of New York aims to deploy student reporters to bolster the state’s independent and public news organizations.

    Last year SUNY launched the Institute for Local News, engaging a dozen student reporting programs at colleges across the state—including Stony Brook University, the University at Buffalo and the University at Albany—to produce local news content. Faculty direct and edit content produced by student journalists for local media partners.

    This summer, the Institute sent its first cohort of journalism interns out into the field, offering 20 undergraduates paid roles in established newsrooms. After a successful first year, SUNY leaders plan to scale offerings to include even more student interns in 2026.

    The background: The Institute for Local News has a few goals, SUNY chancellor John B. King told Inside Higher Ed: to mobilize students to engage in local news reporting in places that otherwise may not be covered, to instill students with a sense of civic service and to provide meaningful experiential learning opportunities.

    News deserts, or areas that lack news sources, can impact community members’ ability to stay informed about their region. New York saw a 40 percent decrease in newspaper publications from 2004 to 2019, according to data from the University of North Carolina.

    Research from the University of Vermont’s Center for Community News found that over 1,300 colleges and universities are located in or near counties defined as news deserts, but last year nearly 3,000 student journalists in university-led programs helped those communities by publishing tens of thousands of stories in local news outlets.

    A 2024 study from the Business–Higher Education Forum found that the supply of high-quality internships falls short of the number of students who want these experiences. Research also shows students believe internships are a must-have to launch their careers, but not everyone can participate, often due to competing priorities or financial constraints.

    To combat these challenges, SUNY, aided by $14.5 million in support from the New York State budget, is working to expand internship offerings—including in journalism—by providing pay and funds for transportation and housing as needed.

    “We think having those hands-on learning opportunities enriches students’ academic experience and better prepares them for postgraduation success,” King said.

    The Institute for Local News is backed by funding from the Lumina Foundation and is part of the Press Forward movement.

    On the ground: Grace Tran, a rising senior at SUNY Oneonta majoring in media studies, was one of the first 20 students selected to participate in an internship with a local news organization this summer.

    Tran and her cohort spent three days on Governors Island learning about journalism, climate issues and water quality in New York City before starting their assignments for the summer. Tran worked at Capital Region Independent Media in Clifton Park as a video editor and producer, cutting interviews, filming on-site and interviewing news sources.

    “I wasn’t a journalism buff but more [focused on] video production,” Tran said. “But having this internship got me into that outlet, and it taught me so much and now I feel like a journalism buff.”

    In addition to exploring new parts of the region and digging deeper into news principles, Tran built a professional network and learned how to work alongside career professionals.

    “It’s my first-ever media job and there were no other interns there; it was just me with everyone else who’s been in this industry for such a long time,” Tran said. “It built a lot of [my] communication skills—how you should act, professionalism, you know, you can’t go to a site in jeans or with a bad attitude.”

    Meeting the other SUNY journalism interns before starting full-time was important, Tran said, because it gave her peers for feedback and support.

    What’s next: SUNY hopes to replicate this year’s numbers through the Institute for Local News (160 students publishing work and 20 summer interns) and to expand internships in the near future, King said.

    The Institute for Local News is just one avenue for students to get hands-on work experience, King said. SUNY is building out partnerships with the Brooklyn and New York Public Library systems for internships, as well as opportunities to place interns with the Department of Environmental Conservation to focus on climate action.

    “We have a ways to go to get to our goal for every SUNY undergraduate to have that meaningful internship experience,” King said. “But we really want to make sure every student has that opportunity.”

  • When Majors Matter

    I’ll admit a pet peeve: writers who set out two extreme views, attribute them vaguely to others, and then position themselves in the squishy middle as the embodiment of the golden mean. It seems too easy and feeds the cultural myth that the center is always correct.

    So, at the risk of annoying myself, I’ve been frustrated with the discourse recently around whether students’ choice of majors matters. It both does and doesn’t, though that may be more obvious from a community college perspective than from other places.

    “Comprehensive” community colleges, such as my own, are called that because they embrace both a transfer mission (“junior college”) and a vocational mission (“trade school”). The meaning of a major can be very different across that divide.

    For example, students who major in nursing have the inside track at becoming nurses in a way that students who major in, say, English don’t. Welding is a specific skill. HVAC repair is a skill set aimed squarely at certain kinds of jobs. In each case, the goal is a program—sometimes a degree, sometimes a diploma or certificate—that can lead a student directly into employment that pays a living wage. In some cases, such as nursing, it’s fairly normal to go on to higher degrees; in others, such as welding, it’s less common. Either way, though, the content of what’s taught is necessary to get into the field.

    In many transfer-focused programs, the opposite is true. A student with the eventual goal of, say, law school can take all sorts of liberal arts classes here, then transfer and take even more. Even if they want to stop at the bachelor’s level, the first two years of many bachelor’s programs in liberal arts fields are as much about breadth as about depth. Distribution requirements are called what they’re called because the courses are distributed across the curriculum.

    At the level of a community college, you might not be able to distinguish the future English major from the future poli sci major by looking at their transcripts. They’ll take basic writing, some humanities, some social science, some math, some science and a few electives. And many receiving institutions prefer that students don’t take too many classes in their intended major in the first two years. Whether that’s because of a concern for student well-roundedness or an economic concern among departments about giving away too many credits is another question.

    Of course, sometimes the boundary gets murky. Fields like social work straddle the divide between vocational and transfer, since the field often requires a bachelor’s degree. Similarly, a field like criminal justice can be understood as police training, but it also branches into criminology and sociology. And business, a perennially popular major, often leads to transfer despite defining itself as being all about the market.

    The high-minded defense of the view that majors don’t matter is that student interest is actually much more important than choice of major. I agree strongly with that. I’d much rather see a student who loves literature study that than force herself to slog through an HVAC program, hating every moment of it. The recent travails of computer science graduates in the job market should remind us that there are no guaranteed occupations. Students who love what they study, or who just can’t stop thinking about it, get the most out of it. And after a few years, most adults with degrees are working in fields unrelated to their degrees anyway. To me, that’s a strong argument for the more evergreen skills of communication, analysis, synthesis, research and teamwork: No matter what the next hot technology is, people who have those skills are much more likely to thrive than people who don’t. A candidate’s tech skill may get them the first job, but their soft skills—not a fan of the term—get them promoted.

    I want our students to be able to support themselves in the world that actually exists. I also want them to be able to support themselves in the world that will exist 20 years from now. Technological trends can be hard to get right. Remember when MOOCs were going to change everything? Or the Segway? In my more optimistic moments, I like to think that bridging the divide between the liberal arts and the vocational fields is one of the best things community colleges can do. Even if that feels squishy and centrist.

  • Embracing Transparency After a Rankings Scandal

    It’s college rankings season again, a time of congratulations, criticism and, occasionally, corrections for institutions and the organizations that rate them.

    Typically, U.S. News & World Report, the giant of the college rankings world, unranks some institutions months after its results are published over data discrepancies that are usually the result of honest mistakes. But in rare instances, erroneous data isn’t a mistake but outright fraud. And when that happens, it can result in soul-searching and, ideally, redemption for those involved.

    That’s what happened at Temple University, which was rocked by a rankings scandal in 2018, when it became clear that Moshe Porat, the dean of Temple’s Richard J. Fox School of Business and Management, had knowingly provided false data to U.S. News for years in a successful effort to climb the rankings. Temple’s online master of business administration program soared to No. 1—until the scheme was exposed. U.S. News temporarily unranked the program, the U.S. Department of Education hit Temple with a $700,000 fine and Porat was convicted of fraud.

    Since then, Temple has worked hard to restore its reputation. In the aftermath of the scandal, officials imposed universitywide changes to how the institution handles facts and figures, establishing a Data Verification Unit within the Ethics and Compliance Office. Now any data produced by the university goes through a phalanx of dedicated fact-checkers, whether it’s for a rankings evaluation or an admissions brochure.

    A Culture Shift

    Temple’s Data Verification Unit was introduced in 2019 amid the fallout of the rankings scandal.

    At first, it gave rise to “friction points,” as university officials were required to go through new processes to verify data before it was disseminated, said Susan Smith, Temple’s chief compliance officer. But now she believes the unit has won the trust of colleagues on campus who have bought in to more rigorous fact-checking measures.

    “It’s been an incredibly positive thing for Temple and I think for data integrity over all,” Smith said.

    Initially, Temple partnered with an outside law firm to verify data and lay the groundwork for the unit. Now that is all handled in-house by a small team that works across the university.

    While Smith said “the vast majority of mistakes” she sees “are innocent,” her team is there “to act as a sort of backstop” and to “verify that the data is accurate, that there’s integrity in the data.”

    The Data Verification Unit also provides training on best practices for data use and dissemination.

    University officials believe placing the Data Verification Unit under the centralized Ethics and Compliance Office—which reports directly to Temple’s Board of Trustees—is unique. And some say the process has created a bit of a culture shift as they run numbers by the unit.

    Temple spokesperson Stephen Orbanek, who joined the university after the rankings scandal, said running news releases by the Data Verification Unit represented a “total change” from the way he was accustomed to operating. And while it can sometimes slow down the release of certain data points or responses to media requests, he said he’s been able to give reporters more robust data.

    He also noted times when Temple has had to pull back on marketing claims and use “less impressive” statistics after the Data Verification Unit flagged issues with materials. As an example, he cited a fact sheet put out by the university in which officials wanted to refer to Temple as a top producer of Fulbright scholars. But the Data Verification Unit insisted that a caveat was needed: The statistic pertained only to the 2022–23 academic year.

    Ultimately, Orbanek sees the Data Verification Unit as a boon for a more transparent campus culture.

    “The culture has just kind of shifted, and you get on board,” Orbanek said.

    Other Rankings Scandals

    Other universities have been less forthcoming about fixing their own data issues.

    In 2022, a professor called out his employer, Columbia University, for submitting inaccurate data to U.S. News, which responded by unranking the institution for a short time. Following the scandal and accusations of fraud by some critics, Columbia announced the university would no longer submit data to U.S. News. Officials argued that the rankings have outsize influence on prospective students but don’t adequately measure institutional quality.

    Yet Columbia still publishes large swaths of data, such as its Common Data Set. Asked how the university has acted to verify data in the aftermath of the rankings scandal, a spokesperson wrote by email that data is “reviewed by a well-established, independent advisory firm to ensure reporting accuracy” but did not respond to a request for more details on the verification processes.

    The University of Southern California also navigated a rankings scandal in 2022. USC provided faulty data to U.S. News for its Rossier School of Education, omitting certain metrics, which helped it rise in the rankings, according to a third-party report that largely blamed a former dean.

    U.S. News temporarily unranked Rossier; graduate students sued the university, accusing officials of falsely advertising rankings based on fraudulent data. That legal battle is ongoing, and earlier this year a judge ruled that the case can proceed as a class action suit.

    Officials did not respond to a request from Inside Higher Ed for comment on whether or how USC has changed the way it verifies data for use in rankings or for other purposes.

    U.S. News also did not respond to specific questions about whether or how it verifies the accuracy of information institutions submit for ranking purposes. A spokesperson told Inside Higher Ed, “U.S. News believes that data transparency and internal accountability practices by educational institutions are good for those institutions and good for consumers.”

  • Mary Baldwin President Suddenly Resigns

    Mary Baldwin University president Jeff Stein resigned Tuesday after two years in the role, The News Leader reported. Fall classes at the formerly all-women’s private university in Staunton, Va., started Monday.

    A university spokesperson told Inside Higher Ed that Stein resigned for personal reasons, and the university has not shared any other information about his departure.

    Stein was the first male president at Mary Baldwin since 1976 and assumed the role in 2023 after former president Pamela Fox retired. The university’s Board of Trustees appointed Todd Telemeco, previously vice president and dean of Mary Baldwin’s Murphy Deming College of Health Sciences, as Stein’s permanent replacement.

    “We thank Dr. Stein and his wife, Chrissy, for their two years of service to the University, and we wish them the best in their future endeavors. We are especially grateful for Dr. Stein’s ability to reinvigorate the connection between the University and our alumni,” board co-chairs Eloise Chandler and Constance Dierickx wrote in a statement. “This renewed energy in alumni relations has also contributed to significantly higher alumni giving rates.”

    Prior to becoming president at Mary Baldwin, Stein served as vice president for strategic initiatives and partnerships and an associate professor of English at Elon University in North Carolina.

  • “Happiness Effect” of Higher Ed “Fades in Richer Places”

    In recent decades, the extra money that graduates earn has been touted as a good reason to attend university. But that claim has recently come under scrutiny, with evidence suggesting the graduate premium has fallen.

    And now two separate papers have found that another supposed benefit of higher education—increased lifetime happiness—is also not quite as straightforward as thought.

    A new study, which analyzed data from 36 countries, reveals that both higher education graduates and the rest of the population experience a steady increase in well-being as a country’s social and economic prosperity gradually improves.

    However, the well-being gains associated with higher education were found to “level off” when a country becomes more economically developed.

    Therefore, the paper argues that graduates in countries with lower GDP per capita experience greater relative gains in terms of economic security, social mobility, higher social status and life satisfaction—leading to a higher sense of well-being.

    In contrast, the “happiness advantage” of a university degree in countries with a higher GDP per capita is less pronounced.

    The paper suggests that stress and dissatisfaction can be caused by rising expectations, increased competition and a “relentless emphasis on achievement,” particularly among highly educated individuals.

    “Highly educated individuals in more prosperous countries are generally much happier than their counterparts in less prosperous countries, although they may be less happy than less educated individuals within their own country,” writes author Samitha Udayanga, a doctoral candidate at the University of Bremen.

    This suggests that the happiness derived from higher education tends to weaken in wealthier countries, he adds.

    A separate study published in June found that the level of happiness associated with completing college has quadrupled since the mid-1970s.

    The study of over 35,000 people in the U.S. showed that, over this period, higher education’s contribution to happiness has shifted from operating through occupations to operating through higher wages.

    The “happiness return” of higher education increased over the 45 years of the study and remains higher than the happiness linked to not studying for a degree.

    But the researchers discovered it “nosedived” in 2021–22 during the COVID-19 pandemic. And satisfaction linked to postgraduate degrees has stalled since the 2000s.

    “University graduates in contemporary America have a certain chance of gaining monetary rewards [by] bypassing occupations, resulting in a relatively higher probability of feeling happy,” they said. “Meanwhile, the same mechanism rarely operates for advanced degree holders, whose happiness largely depends on their occupational attainment.”

    The paper concludes that the overall happiness premium for higher education at both the undergraduate and postgraduate level may “vanish once their economic rewards become less pronounced.”

  • Are States Prepared for Workforce Pell?

    Thanks to the One Big Beautiful Bill Act becoming law this summer, workforce Pell is now a reality and federal aid dollars are expected to flow to low-income students in short-term programs as soon as next July.

    But now comes the hard work of figuring out which programs are eligible—and some states aren’t ready, according to a new report from the State Noncredit Data Project, which helps community college systems track data related to noncredit programs. Not all states collect the data needed to make that determination, and some offer programs that wouldn’t make the cut, the report concluded.

    Under the legislation, short-term programs need to meet certain requirements to qualify for Pell money. For example, state governors need to verify they align with high-skill, high-wage or in-demand jobs. Programs also must be able to build toward a credit-bearing certificate or degree program and be “stackable and portable across more than one employer” unless preparing students for jobs with just one recognized credential. They have to exist for at least a year and meet outcomes goals, including completion and job-placement rates of at least 70 percent. And programs can’t charge tuition higher than graduates’ median “value-added earnings,” or the amount by which their income exceeds 150 percent of the federal poverty line three years after completing the program.
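
    To make the arithmetic of that last guardrail concrete, here is a minimal sketch in Python. It is illustrative only: the function names and all dollar figures are hypothetical placeholders, not statutory values, and it models just the tuition test, not the full set of eligibility criteria.

        # Hypothetical sketch of the workforce Pell tuition guardrail described
        # above. All figures are placeholder inputs, not statutory values.

        def value_added_earnings(median_earnings: float, poverty_line: float) -> float:
            """Median graduate earnings in excess of 150 percent of the
            federal poverty line, measured three years after completion."""
            return median_earnings - 1.5 * poverty_line

        def tuition_within_cap(tuition: float, median_earnings: float,
                               poverty_line: float) -> bool:
            """A program clears this guardrail if tuition does not exceed
            graduates' median value-added earnings."""
            return tuition <= value_added_earnings(median_earnings, poverty_line)

        # Placeholder example: $4,000 tuition against $38,000 median earnings
        # and a $15,060 one-person poverty guideline -> $15,410 value added.
        print(tuition_within_cap(4_000, 38_000, 15_060))  # True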

    But some states collect more data than others on community colleges’ noncredit education, which encompasses many of the programs likely to qualify for workforce Pell, according to the report. It based its findings on course and program-level data from eight states: Iowa, Louisiana, Maryland, New Jersey, Oregon, South Carolina, Tennessee and Virginia.

    “What we’re going to see is varying degrees of difficulty” for different states, said co-author Mark D’Amico, a higher education professor at the University of North Carolina at Charlotte. “States that have more robust data on noncredit community college education are going to be at a little bit of an advantage.”

    The report found that most states track basic metrics such as the length of a program. But two out of the eight states had no state-level data on noncredit credential outcomes. Half of the states didn’t collect any data on labor market outcomes like earnings and employment rates. And multiple states didn’t keep track of whether students completed credentials or went on to pursue credit-bearing programs. The report emphasized that while individual institutions might have more detailed data on their programs, gaps in statewide data could create challenges as states work with institutions to prove their programs’ eligibility for workforce Pell.

    “Most states have some of the fundamental data,” D’Amico said, “but I think when it comes to the credentials’ labor market outcomes, completion, stackability, those are going to be a little bit more difficult to identify.”

    The report predicted that some states, like Iowa, Louisiana and Virginia, may have an easier time proving which programs meet the criteria because they already have state funding for noncredit programs that requires colleges to report relevant data. For example, Iowa includes noncredit education in its state funding formula for workforce training programs, and Louisiana has a state scholarship for such programs.

    Co-author Michelle Van Noy, director of the Education and Employment Research Center at Rutgers University, said states’ data infrastructure for noncredit programs is still a “work in progress,” but she’s seen “quite a progression” in recent years. She’s optimistic they’ll continue to improve.

    “It is my hope that Workforce Pell implementation can be done in a way that will support the broader development of data and quality systems for noncredit education and nondegree credentials within states,” Van Noy wrote in an email to Inside Higher Ed.

    But data isn’t the only issue. The report also found that typical noncredit programs weren’t necessarily long enough to meet the standards for workforce Pell. Except for lengthier workforce programs at the Tennessee Colleges of Applied Technology, the median number of hours for occupational training programs ranged from 15 hours in New Jersey to 100 hours in Virginia, falling short of the 150-hour, eight-week threshold. Institutions could group their courses into longer programs in the coming months. But it’s not yet clear if making such a change would affect the requirement that programs exist for at least a year.

    “Anyone that may be thinking that all of a sudden, all noncredit programs are going to be eligible, the data show that’s not the case,” D’Amico said. “We’ll see what happens over time.”

    The report offered a set of recommendations for how states can ready themselves for workforce Pell. For example, it urged state officials to take stock of which metrics they still need to collect to fall in line with the policy’s guardrails and encouraged state and college officials to work together to start identifying programs that could be eligible. The report also suggested colleges consider reconfiguring programs so noncredit offerings serve as on-ramps to credit-bearing programs and meet other structural requirements.

    Further details about how workforce Pell will work are going to be hashed out in a negotiated rule-making process this fall, but D’Amico said states shouldn’t wait for that.

    “I would use the guardrails now, use the data that they have now, to begin to do that pre-identification” so they have “a little bit of time to begin to fill some of those gaps in existing data,” D’Amico said.

    He also hopes states’ preparation for workforce Pell pushes forward “a larger conversation” they’re already having about the quality of short-term noncredit programs over all.

    The overarching goal is “ensuring that noncredit programs are designed well, have credentials associated with them linked to further education and are really designed in a way that’s going to be beneficial to students and ultimately help the local and state economies that these programs are going to serve,” D’Amico said.

  • Extremist Group Claims Responsibility for “Swatting” Calls

    A person who goes by the name Gores online claimed responsibility for the flurry of so-called swatting calls made to colleges and universities over the past several days, Wired reported.

    Gores is the self-proclaimed leader of an online group called Purgatory, which is linked to a violent online extremist network called The Com, according to Wired. Alongside another Purgatory member called tor, Gores began placing fake calls about active shooters to campus and local emergency services around noon on Aug. 21, the same day the University of Tennessee at Chattanooga and Villanova University received swatting calls.

    As of Wednesday afternoon, Inside Higher Ed counted 19 confirmed swatting calls since Aug. 19, including at Mercer University, the University of Wisconsin at Madison, the University of Utah and the University of New Hampshire.

    Not all of the calls placed by Purgatory have been successful. In some cases, authorities correctly identified the calls as hoaxes. When the group placed a call to Bucknell University in Lewisburg, Pa., a researcher listening in on the call was able to alert the university. The FBI is investigating the uptick in swatting calls and has not publicly confirmed Purgatory’s involvement. Gores told Wired that the swatting spree will continue for another two months. 

    Purgatory offers to make swatting calls for as little as $20, though the price has increased to $95 since this recent campaign of calls began, according to Wired. Three members of Purgatory were arrested in 2024 and pleaded guilty earlier this year to charges over threats made to a Delaware high school, a trailer park in Alabama, Albany International Airport, an Ohio casino and a private residence in Georgia.

    Ashley Mowreader contributed to this article.

  • New York Passes Law Requiring Title VI Coordinators

    New York is mandating that all colleges in the state designate a coordinator to oversee investigations into discrimination on the basis of race, color, national origin and shared ancestry, which is prohibited under Title VI of the Civil Rights Act of 1964, Gov. Kathy Hochul’s office announced Wednesday.

    According to Hochul, the state is the first in the country to pass such a law.

    “By placing Title VI coordinators on all college campuses, New York is combating antisemitism and all forms of discrimination head-on,” she said in the press release. “No one should fear for their safety while trying to get an education. It’s my top priority to ensure every New York student feels safe at school, and I will continue to take action against campus discrimination and use every tool at my disposal to eliminate hate and bias from our school communities.”

    Many colleges have begun hiring for Title VI coordinator roles in the past several months in response to the surge in reports of antisemitism and Islamophobia following Hamas’s deadly Oct. 7, 2023, attack on Israeli civilians. In some cases, the Department of Education’s Office for Civil Rights required institutions to add these roles after finding that they had failed to adequately address complaints of discrimination on their campuses.

    The State University of New York system had already required each of its campuses to bring on a Title VI coordinator by the fall 2025 semester.
