Tag: Higher

  • Why Ind. Fans Are Excited About First Football National Champs


    The Indiana Hoosiers defeated the Miami Hurricanes 27 to 21 to win the university’s first-ever NCAA Division I college football national championship this week. Any school would be thrilled to clinch this title and take home the trophy that accompanies it. But I will explain in this article why it hits different for IU students, alumni, employees and other supporters. Before doing so, I’ll first disclose how I know.

    Five of the best years of my life were spent in Bloomington. I have a master’s degree and Ph.D. from the extraordinary university that is the heartbeat of that beloved community. IU subsequently bestowed upon me two distinguished alumni awards. The university presented its first Bicentennial Medal to Indiana governor Eric Holcomb in July 2019; that same month, I became the second recipient.

    Since graduating with my doctorate 23 years ago, I have returned to campus to deliver several lectures and keynote speeches, including the 2024 Martin Luther King Jr. Day Address. My favorite trip back was in 2011 to celebrate my fraternity’s centennial. Ten visionary Black male students founded Kappa Alpha Psi there, a brotherhood that now has more than 150,000 members. I am proud to be one of them. These are just a few of countless reasons why I have long been one of IU’s proudest alums.

    Here is what I remember about football games in the late ’90s and early 2000s: Whew, yikes! Tons of people showed up to tailgate outside our stadium on Saturday mornings before home games. I was often one of them. Those gatherings were probably just as fun there as they were at schools that had won Power 4 conference titles and national championships. But there was one embarrassing feature of our pregame tailgates: Few people actually went inside Memorial Stadium for games. When I say “few,” I mean at least two-thirds of stadium seats were empty. I thought it rude and unsupportive of student athletes to eat and drink in the parking lot for hours then skip the game—hence, I opted for the tailgate-only experience no more than four times each season. I was inside cheering all the other times.

    Despite what had long been its shady tailgating culture, IU has amazing fans. I often screamed alongside them at basketball games. During one of my most recent visits to campus, President Pam Whitten generously hosted me for a Big Ten matchup in her fabulous suite inside the iconic Assembly Hall. I was instantly reminded that my beloved alma mater has an electrifying, inspiringly loyal fan base—for basketball. As it turns out, winning five men’s national basketball championships, clinching 22 Big Ten conference titles and making 41 NCAA tournament appearances (advancing to the Final Four eight times) excites people. Suffering so many defeats in football year after year, not so much.

    Throughout the last two seasons, ESPN commentators and other sportscasters have annoyingly repeated that Indiana has long been the losingest major college football team of all time; I will leave it to someone else to fact-check that. Going from being so bad for so long to an 11–2 season and a playoff berth last year, followed by a Big Ten Championship, a flawless 16–0 season and a national championship win this year, is just one reason why IU alumni and others are so excited. Oh, and then there is Fernando Mendoza, our first-ever Heisman Trophy winner, and Curt Cignetti, the inspirational head coach who accelerated our football program to greatness in just two seasons.

    Instantly improving from (reportedly) worst of all time to college football’s undisputed best is indeed exciting. Nevertheless, it is not the only reason why the Indiana faithful are so amped. Our university is beyond extraordinary in numerous domains. Its academic programs are exceptional; many, including the one from which I graduated, are consistently ranked among the nation’s best. The university employs many of the world’s best professors and researchers. Its connection to the Hoosier State is deep, measurable and in many ways transformative. The Bloomington campus, framed by its gorgeous tulip-filled Sample Gates, is a vibrant, exciting place to be a student. It feels like a great university because it has long been one, still is and forever will be. It is the birthplace of the greatest collegiate fraternity, a fact that requires no verification.

    Finally having a football program that matches all the other great things that IU is and does is why those of us who have experienced the place are so freakin’ excited about our first-ever college football national championship. Greatness deserves greatness. Thanks to Cignetti and his staff, Mendoza and every other student athlete on their team, Indiana University has finally achieved football greatness. They have given others and me one more reason to be incredibly proud of a great American university that excels in academics, public outreach, athletics and so many other domains. I conclude with this: Hoo-Hoo-Hoo-Hoosiers!

    Shaun Harper is University Professor and Provost Professor of Education, Business and Public Policy at the University of Southern California, where he holds the Clifford and Betty Allen Chair in Urban Leadership. His most recent book is titled Let’s Talk About DEI: Productive Disagreements About America’s Most Polarizing Topics.


  • Congress Proposes Increasing NIH Budget, Maintaining ED


    The House and Senate appropriations committees have jointly proposed legislation that would generally maintain the Education Department’s funding levels, plus increase the National Institutes of Health’s budget by more than $400 million this fiscal year. It’s the latest in a trend of bipartisan Congressional rebukes of President Trump’s call to slash agencies that support higher ed.  

    For the current fiscal year, Trump had asked Congress to cut the NIH by 40 percent and subtract $12 billion from ED’s budget. The president proposed eliminating multiple ED programs, including TRIO, GEAR UP and the Supplemental Educational Opportunity Grant program, which all help low-income students attend college. He also proposed reducing the ED Office for Civil Rights budget by over a third. 

    But the proposed funding package senators and representatives released this week maintains funding for all of those programs. 

    “We were surprised to see the level of funding for the higher education programs actually be increased, in some regards—and be maintained,” said Emmanual Guillory, senior director of government relations at the American Council on Education. “We knew that level funding would be considered a win in this political environment.” 

    This latest set of appropriations bills is the final batch that Congress must approve to avert another government shutdown at the end of the month. Democrats have said passing actual appropriations bills, as opposed to another continuing resolution, is key to ensuring that federal agencies spend money as Congress wants.

    Joanne Padrón Carney, chief government relations officer for the American Association for the Advancement of Science, told Inside Higher Ed that the NIH budget increase is essentially “flat funding,” considering inflation. But she said “this appropriations package once again demonstrates Congressional, bipartisan support for research and development and the importance of these investments, as well as rejecting the administration’s very dramatic cuts.”  

    Earlier this month, Congress largely rejected Trump’s massive proposed cuts to the National Science Foundation, the National Aeronautics and Space Administration, and the Energy Department, three significant higher ed research funders. These developments are adding up to a more encouraging 2026 funding picture for research and programs that support postsecondary students. 

    But Congress has just 10 days to pass this new funding package, and Trump must still sign both packages into law. A government shutdown will begin after Jan. 30 for those agencies without approved appropriations legislation. 

    Guillory noted that—despite the Justice Department declaring last month that minority-serving institution programs are unlawful because they “effectively [employ] a racial quota by limiting institutional eligibility to schools with a certain racial composition”—Congress still proposed funding these programs. 

    “Pretty much every single program that is a minority-serving institution program received an increase in funding,” he said. 

    The appropriators also want to send another roughly $790 million to the Institute of Education Sciences, compared to the $261 million Trump requested. Last year, his administration gutted IES, the federal government’s central education data collection and research funding agency. But, like the broader Education Department, laws passed by Congress continue to require it to exist. 

    Beyond the appropriations numbers, the proposed legislation to fund the NIH would also prevent the federal government from capping indirect research cost reimbursement rates for NIH grants at 15 percent, as the Trump administration has unsuccessfully tried to do. Indirect cost rates, which individual institutions have historically negotiated with the federal government, pay for research expenses that are difficult to pin to any single project, such as lab costs and patient safety. 

    The appropriations committees released an explanatory statement alongside the legislation that says “neither NIH, nor any other department or agency, may develop or implement any policy, guidance, or rule” that would change how “negotiated indirect cost rates have been implemented and applied under NIH regulations, as those regulations were in effect during the third quarter of fiscal year 2017.” 

    GOP members of the House Appropriations Committee didn’t say they were bucking the president in their news release on the proposal. Instead, they said the legislation demonstrates “the will of the American people who mandated new priorities and accountability in government, including priorities to ‘Make America Healthy Again’ and ‘Make America Skilled Again.’” 

    “Investments are directed to where they matter most: into lifesaving biomedical research and resilient medical supply chains, classrooms and training that prepare the next generation for success, and rural hospitals and primary care to end the chronic disease epidemic,” the release said. 

    Democrats claimed victory for Congress. 

    “This latest funding package continues Congress’s forceful rejection of extreme cuts to federal programs proposed by the Trump Administration,” said Rep. Rosa DeLauro, the top Democrat on the House Appropriations Committee, in a release.  

    “Where the White House attempted to eliminate entire programs, we chose to increase their funding,” DeLauro said. “Where the Administration proposed slashing resources, we chose to sustain funding at current levels. Where President Trump and Budget Director Russ Vought sought broad discretion over federal spending, Congress, on a bipartisan, bicameral basis, chose to reassert its power of the purse.”

    Carney says she thinks passage is “highly likely.” 

    “Ostensibly, what they call the ‘four corners’—the chair and ranking members from both chambers and both parties—have come to this agreement on this package,” she said. So, barring “last-minute surprises,” she said, “it should be relatively smooth sailing.”

    Rep. Tom Cole, the Republican chair of the House Appropriations Committee, urged his fellow lawmakers to pass the legislation.

    “At a time when many believed completing the FY26 process was out of reach, we’ve shown that challenges are opportunities,” Cole said in a statement. “It’s time to get it across the finish line.”


  • Five Ways Higher Ed Teams Can Improve AEO This Month


    There’s a growing tension I’m hearing across higher education marketing and enrollment teams right now: AI is answering students’ questions before they ever reach our websites, and we’re not sure how, or if, we’re part of those answers.

    That concern is valid, but the good news is that Answer Engine Optimization (AEO) isn’t some futuristic discipline that requires entirely new teams, tools, or timelines. 

    In most cases, it’s about getting much more disciplined with the content, structure, and facts you already publish so that AI systems can confidently use your institution as a source of truth.

    And with some dedicated time and attention, there’s meaningful progress you can make starting today.

    Here are five actions higher ed teams can realistically take right now to improve how they appear in AI-powered search and answer environments.

    1. Run a Simple “Answer Audit” to Establish Your Baseline

    Before you can improve how you show up in AI-generated answers, you need to understand where you stand today, and that starts with asking the same questions your prospective students are asking.

    Identify Real Student Questions

    Select five to ten realistic, high-intent student questions, ideally pulled directly from admissions conversations, search query data, or inquiry emails. 

    Test Visibility Across Major Answer Engines

    Run those questions through a handful of major answer engines, such as:

    • Google AI Mode or AI Overviews
    • ChatGPT
    • Gemini
    • Perplexity
    • Bing Copilot or AI Overview search mode

    This isn’t a perfect science, as your geography and past search history do affect visibility, but it will give you a quick general idea.

    Document What Appears—and What Doesn’t

    For each query, document a few critical things:

    • Does your institution appear in the answer at all?
    • If it does, what information is being shared, and is it accurate?
    • How is your institution being described? Is the tone neutral, positive, or cautious, and does it align with how you want to be perceived?
    • Which sources are cited or clearly influencing the response (your site, rankings, Wikipedia, third-party directories)?

    Log this in a simple spreadsheet. What you’ve just created is your initial visibility benchmark, and it’s far more informative than traditional rankings or traffic reports in an AI-first discovery environment.
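    For teams that prefer a script to a shared sheet, the audit log above can be sketched in a few lines of Python. The column names and the sample row below are illustrative only, not a prescribed schema; adapt them to whatever your team actually records.

    ```python
    import csv

    # Columns mirror the audit questions above: does the institution appear,
    # is the summary accurate, how is it framed, and which sources are cited.
    FIELDS = ["date", "engine", "query", "appears", "summary_accurate",
              "tone", "cited_sources", "notes"]

    def log_answer(path, row):
        """Append one audit observation; write the header if the file is new."""
        try:
            is_new = open(path).readline() == ""
        except FileNotFoundError:
            is_new = True
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if is_new:
                writer.writeheader()
            writer.writerow(row)

    # Hypothetical observation for one query on one engine.
    log_answer("answer_audit.csv", {
        "date": "2026-01-20",
        "engine": "Perplexity",
        "query": "best online MBA programs in Ohio",
        "appears": "yes",
        "summary_accurate": "partially",
        "tone": "neutral",
        "cited_sources": "institution site; third-party directory",
        "notes": "tuition figure is two years out of date",
    })
    ```

    Re-running the same queries monthly and appending to the same file turns the one-time benchmark into a simple trend line.
    
    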

    Where We Can Help

    In Carnegie’s AEO Audit, we expand this approach across a much broader and more structured evaluation set. Over a 30-day period, Carnegie evaluates visibility, sentiment, and competitive positioning to show how often you appear, what AI engines are saying about your brand and programs, how you compare to peers, and where focused changes will have the greatest impact on AI search presence.

    >> Learn More About Carnegie’s AEO Solution

    2. Fix the Facts on Your Highest-Impact Pages

    If there’s one thing AI systems punish consistently, it’s conflicting or outdated information, and those issues most often surface on pages that drive key enrollment decisions.

    Identify Your Highest-Impact Pages and Core Facts

    Start by identifying ten to twenty priority pages based on enrollment volume, traffic, revenue contribution, or strategic importance. These typically include:

    • High-demand program pages
    • Admissions and application requirement pages
    • Tuition, cost, and financial aid pages
    • Visit, events, and deadline-driven pages

    These pages frequently influence AI-generated answers and early student impressions, and they are where inaccuracies can most damage trust and decision-making, particularly as search continues to evolve toward more experience-driven models.

    For each priority page, verify that the core facts are correct, complete, and clearly stated wherever they apply.

    Program Name and Credential Type

    Ensure the official program name and credential are clearly stated upon first mention. For example, fully spell out the name—Bachelor of Arts in English—in the first paragraph of the page and abbreviate to B.A. in English, Bachelor’s in English, and/or English major in future mentions.

    Delivery Format

    Clearly indicate whether the program or experience is offered on-campus, online, hybrid, or through multiple pathways.

    Time to Completion or Timeline Expectations

    Include full-time, part-time, and accelerated timelines, or key dates where applicable.

    Concentrations or Specializations

    List available concentrations or specializations clearly and consistently.

    Tuition and Fees

    Confirm how costs are expressed and whether additional fees apply.

    Admissions Requirements and Deadlines

    List requirements and deadlines explicitly, avoiding conditional or outdated language.

    Outcomes, Licensure, and Accreditation

    Document licensure alignment, accreditation status, and any verified outcomes data.

    Align Facts Across Every Source

    Once verified, align that information everywhere it appears, including:

    • Primary program, admissions, and visit pages
    • Catalog and registrar listings
    • PDFs, viewbooks, and other downloadable assets
    • Major program directories and rankings where edits are possible

    Signal Freshness with Clear Update Dates

    For content that is time-bound or interpretive—such as admissions pages, deadlines, visit information, policies, blog posts, and thought leadership—clearly signaling recency helps reduce confusion for both students and AI systems.

    In those cases, a visible “last updated” date can help establish confidence that information reflects current realities.

    The goal isn’t to add dates everywhere. It’s to be intentional about where freshness signals meaningfully support clarity, trust, and accuracy.

    3. Restructure a Small Set of Program Pages for AI Readability

    With your facts aligned, the next step is making sure your most important program pages are structured in a way that both humans and machines can easily understand.

    Use a Predictable Page Structure AI Can Parse

    Choose five to ten priority programs and apply a clear, predictable structure that answer engines can parse with confidence, such as:

    • Program overview
    • Who this program is designed for
    • What students will learn
    • Delivery format and scheduling
    • Time to completion
    • Cost and financial support options
    • Admissions requirements
    • Career pathways and outcomes
    • Frequently asked questions

    Add Information Gain to Differentiate Your Program

    Rely on descriptive headings and bullet points, and avoid unnecessarily complex language. Most importantly, include at least one element of information gain: a specific detail that differentiates the program, such as outcomes data, employer partnerships, or experiential learning opportunities.

    Answer Student Questions Explicitly with FAQs

    And if you want to influence AI-generated answers, you need to be explicit about the questions you’re answering—FAQ sections remain one of the most effective ways to do that.

    On each optimized program page, add four to six student-centered questions that directly address decision-making concerns. 

    Answers should be brief, factual, and supported by links to official institutional data wherever possible. 

    Use FAQ Schema Where Possible

    If your CMS and development resources allow, mark these sections up with FAQ schema so answer engines can more reliably identify and reuse them.
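    FAQ markup follows schema.org’s FAQPage type: a page-level object whose `mainEntity` is a list of Question items, each carrying an acceptedAnswer. If your CMS can’t generate this automatically, a small helper can build the JSON-LD from question-and-answer pairs; the example questions below are hypothetical stand-ins for your page’s real FAQ copy.

    ```python
    import json

    def faq_jsonld(pairs):
        """Build a schema.org FAQPage JSON-LD string from (question, answer) pairs."""
        return json.dumps({
            "@context": "https://schema.org",
            "@type": "FAQPage",
            "mainEntity": [
                {
                    "@type": "Question",
                    "name": question,
                    "acceptedAnswer": {"@type": "Answer", "text": answer},
                }
                for question, answer in pairs
            ],
        }, indent=2)

    # Illustrative questions only; swap in the page's real FAQ content.
    snippet = faq_jsonld([
        ("Is the B.A. in English offered online?",
         "Yes. The program is available fully online and on campus."),
        ("How long does the program take to complete?",
         "Full-time students typically finish in four years."),
    ])
    print(snippet)
    ```

    The resulting string goes in the page head inside a `<script type="application/ld+json">` tag, alongside the visible FAQ section it describes.
    
    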

    If you don’t clearly answer these questions, AI will still respond, but it may not use your content to do so.

    4. Build a Net-New Content Strategy for AI Visibility

    Program pages matter, but institutions won’t win in AI search results by maintaining existing content alone.

    Why AI Systems Prefer Explanatory Content

    In practice, we’re seeing AI tools cite blog posts, explainers, and articles more often than traditional program pages, especially for the broader, earlier-stage questions students ask before they’re ready to search for a specific degree.

    That means AEO success requires more than restructuring what already exists. It requires a proactive content strategy that consistently publishes new points of expertise, experience, and trust around the topics students care about.

    The Types of Student Questions AI Is Answering

    For many institutions, that’s not just about program marketing. It’s about painting a credible picture of student life, outcomes, belonging, and the real-world value of higher education. The kinds of pieces AI systems surface tend to answer questions like:

    • What should I look for in an MBA program with an accounting concentration?
    • Is community college a good first step?
    • What kinds of jobs can I get in energy?
    • What does it mean to be an Emerging Hispanic-Serving Institution?

    In other words: content that helps students frame decisions before they compare institutions.

    Start with a Small, Intent-Driven Content Pipeline

    Start small. Choose five to ten priority student questions tied to your recruitment goals, informed by existing keyword research tools and site data from sources like Google Search Console.

    Use those insights to build a simple content pipeline that produces a handful of focused articles:

    • 3–5 new blog or explainer topics aligned to student intent
    • Outlines built around direct answers + structured headings
    • A short list of internal contributors or Subject Matter Experts (SMEs)
    • Clear calls-to-action that connect early-funnel content to next steps

    This is one of the fastest ways to expand your presence in AI-generated answers, and to build brand awareness earlier in the funnel, when students are still defining what they want.

    Where We Can Help

    Our AEO solution for higher ed turns insights from the audit into sustained visibility gains. Our experts deliver ongoing content development, asset optimization, visibility tracking and technical guidance to build your authority and improve performance across AI-driven search experiences.

    >> Learn More About Carnegie’s AEO Solution

    5. Establish a Lightweight Governance and Maintenance Cadence

    One of the biggest threats to long-term AEO success in higher education isn’t technology; it’s organizational drift.

    You don’t need an enterprise-wide governance overhaul to make a difference. Start with something intentionally simple:

    • A defined list of high-impact pages (programs, tuition, admissions, financial aid)
    • A basic owner matrix outlining responsibility for updates
    • A short monthly review checklist
    • A quarterly content review cadence by college or school

    Even a modest governance framework can dramatically reduce conflicting information and ensure your most important pages remain current as programs evolve.

    Good enough beats perfect every time.

    The Bigger Picture

    AEO isn’t about chasing every AI update or trying to “game” emerging platforms. It’s about being consistently clear, accurate, and helpful in the moments when students are asking their most important questions.

    If you do these five things this month, you won’t just improve your institution’s visibility in AI-driven search; you’ll also build trust at the exact point where enrollment decisions are being shaped.

    Ready to go deeper?

    Download The Definitive Guide to AI Search for Higher Ed for practical frameworks, examples, and checklists that will help your team move from experimentation to strategy without the overwhelm.

    Frequently Asked Questions About AEO in Higher Education

    What is Answer Engine Optimization (AEO)?

    Answer Engine Optimization (AEO) is the practice of improving how institutions appear in AI-driven search and answer environments like ChatGPT, Google AI Mode and Overviews, Gemini, and Perplexity. Instead of focusing only on rankings and clicks, AEO emphasizes clarity, accuracy, and structured content so AI systems can confidently cite and summarize your institution.

    How is AEO different from traditional SEO?

    SEO is designed to improve visibility in search engine results pages, while AEO focuses on how content is interpreted and reused by AI systems that generate direct answers. AEO prioritizes structured content, consistent facts, explicit question answering, and information gain over keyword density alone.

    Why does AEO matter for higher education institutions?

    Students increasingly ask AI platforms questions about programs, outcomes, cost, and fit before visiting institutional websites. AEO helps ensure your institution is accurately represented in those early discovery moments, when perceptions are formed and enrollment decisions begin taking shape.

    What types of content help improve AEO performance?

    AI systems tend to favor content that is clearly structured and informative, including program pages with consistent facts, FAQ sections, explainer articles, and blog posts that directly answer student questions. Content that demonstrates expertise, outcomes, and real-world context is more likely to be cited.

    Who can help implement AEO for higher education?

    Institutions can begin improving AEO internally by auditing content, aligning program facts, and adding structured FAQs. For more advanced support, higher education–focused partners like Carnegie provide AEO audits, content optimization, technical guidance, and ongoing visibility tracking tailored to AI-driven search environments.


  • Answer Engine Optimization (AEO) Higher Ed


    Artificial Intelligence (AI) is fundamentally reshaping how students discover colleges and universities, and how higher education institutions are evaluated before a single click ever happens.

    Instead of starting with traditional search engines and scrolling through results, many prospective students are turning to conversational platforms like ChatGPT, Perplexity, Gemini, Copilot, and Google’s AI Overviews/Mode. They’re asking direct questions and having conversations about programs, outcomes, cost, campus experience, and long-term value—and trusting the answers they receive.

    The challenge for higher education leaders is clear: most institutions have little to no visibility into how AI engines describe them, or whether they appear in AI-generated answers at all. And as Neil Patel has noted, more than half of searches in 2025 result in no website visit. When AI provides the answer, invisibility means missing the earliest—and often most influential—stage of student exploration.

    Why AI Search Changes the Stakes for Higher Education

    AI-driven search compresses what used to be weeks of research into a single moment.

    When AI engines summarize programs, affordability, outcomes, and reputation into confident responses, students form impressions before admissions teams, websites, or campaigns ever have a chance to engage. Institutions that are missing, misrepresented, or underexplained in these answers lose influence before recruitment efforts even begin.

    For enrollment and marketing leaders, this represents a fundamental shift. Visibility is no longer only about traffic or rankings. It is about early perception, trust, and preference in moments that increasingly determine whether a student continues exploring or moves on.

    Introducing Carnegie’s Answer Engine Optimization (AEO) For Higher Ed

    Answer Engine Optimization, or AEO, is the practice of ensuring that AI platforms accurately interpret, summarize, and surface institutional information in response to real student questions.

    Carnegie’s AEO service helps colleges and universities influence how AI platforms understand their programs, content, and brand. Designed specifically for higher education, Carnegie’s AEO solution combines AI visibility technology with enrollment strategy, brand expertise, and decades of leadership in SEO, search, and performance-driven discoverability.

    Key Components of Carnegie’s Answer Engine Optimization (AEO)

    Carnegie approaches AEO as a system rather than a one-time tactic. The solution is built around two connected phases that work together to create clarity and sustain visibility over time.

    Establish Clarity with the AEO Audit

    The AEO Audit provides a clear, data-driven view of how your institution appears in AI-generated answers across priority topics, audiences, and competitors. 

    Over a 30-day period, Carnegie evaluates AI visibility, sentiment, and competitive positioning to answer key questions such as:

    • How often your institution appears in AI-generated responses
    • What AI engines say about your brand, programs, and outcomes
    • How your visibility compares to peers and aspirational competitors
    • Where narrative, content, and structural gaps limit discoverability
    • What actions will most effectively strengthen AI search presence

    Build and Sustain Visibility Through AEO Activation

    AEO Activation moves institutions from understanding to execution.

    Carnegie partners with internal teams to continuously improve how institutions are interpreted and surfaced across AI-driven search experiences. Activation includes:

    • Ongoing optimization of content, structure, and AI-facing signals
    • Refinement of institutional narratives to improve accuracy, trust, and clarity
    • Recommendations for technical enhancements that support AI interpretation and discovery
    • Content development to drive AI visibility
    • Continuous monitoring to adapt to platform changes and student behavior

    Together, these two phases ensure AEO is not a one-time assessment, but a sustained strategy, helping institutions remain visible, accurately represented, and competitive as AI-powered discovery continues to evolve.

    The AEO Audit: Your First Step to AI Search Visibility

    Most institutions begin AEO with one critical question: How do we currently show up?

    The AEO Audit establishes a baseline for AI visibility, perception, and competitiveness so teams can move forward with confidence rather than assumptions.

    Inside the AEO Audit: What You Will Learn

    AI Visibility Score

    A measurement of how frequently your institution appears across prompts for 30 days, covering programs, brand terms, outcomes, and high-intent queries.

    Line graph titled "Visibility Score" shows a score of 54% on February 1, with a downward then upward trend.

    Competitive Analysis & Share of Voice

    Insight into how your institution stacks up to peers and competitors in AI-generated answers.

    Table showing share of voice: six universities ranked, with percentages from 43% to 1%.

    Brand Sentiment Insights

    A view of how AI platforms describe your institution—positive, negative, or neutral—and how those narratives influence perception.

    Sentiment analysis chart: 90.77% positive, 6.45% neutral, 2.78% negative.

    Technical + Structural Evaluation

    A review of your website’s structure and content signals to identify barriers that may be preventing AI engines from surfacing your information accurately.

    Graph showing ChatGPT outperforming Gemini in a technical audit for higher education.

    Strategic Roadmap

    A prioritized plan outlining the most impactful improvements, from content enhancements to technical recommendations and program-level opportunities.

    A content roadmap diagram showing "Article" and "PR" leading to a highlighted red "Blog" box.

    Ready to See How You Show Up in AI Search?

    AI-powered discovery is already shaping how students explore, compare, and choose colleges.

    The AEO Audit is the fastest way to understand how your institution is represented today and where opportunities exist to strengthen visibility, accuracy, and trust.

    If you are responsible for your institution’s enrollment growth, brand differentiation, or long-term strategy, this is the place to start.

    See how you show up in AI search.

    Frequently Asked Questions

    What is Answer Engine Optimization (AEO)?

    Answer Engine Optimization (AEO) is the practice of optimizing institutional content so AI platforms like ChatGPT, Gemini, Perplexity, and Google AI Overviews accurately interpret, summarize, and surface it in response to user questions.

    Why is AEO important for higher education institutions in 2026?

    AI search engines now shape how prospective students discover colleges, making AEO essential for institutions to maintain visibility and influence student decision-making in AI-generated results.

    Who benefits most from Carnegie’s AEO solution?

    Enrollment leaders, marketing leaders, presidents, and finance leaders benefit from AEO by gaining clarity into AI visibility, competitive positioning, and early-stage student perception.

    What makes Carnegie’s AEO approach different?

    Carnegie approaches AEO by pairing industry-leading AI visibility technology with deep higher education expertise. We combine large-scale AI monitoring and analysis with decades of leadership in search, SEO, and performance-driven discoverability—so institutions gain not just insight into how AI engines represent them, but expert guidance on how to improve accuracy, trust, and visibility over time.

    What is Answer Engine Optimization and how does it differ from traditional SEO?

    Answer Engine Optimization (AEO) optimizes content for direct citation by AI-powered platforms like ChatGPT and Google AI Overviews, focusing on structured, authoritative responses rather than search rankings alone.

    How do AEO strategies complement existing SEO and content marketing efforts?

    AEO extends traditional SEO by optimizing for AI-friendly structured formats, expanding institutional reach across both conventional search engines and emerging AI answer platforms simultaneously.
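    One common example of an "AI-friendly structured format" is schema.org markup. As a hedged illustration only (not a description of Carnegie's actual deliverables), the sketch below generates FAQPage JSON-LD, a format that both conventional search engines and AI answer platforms can parse; the sample questions are hypothetical:

```python
import json

def faq_jsonld(qa_pairs: list[tuple[str, str]]) -> str:
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }
    return json.dumps(data, indent=2)

# Hypothetical admissions FAQ entries
markup = faq_jsonld([
    ("What is the application deadline?", "Applications close February 1."),
    ("Is the GRE required?", "No, the GRE is optional for all programs."),
])
print(f'<script type="application/ld+json">\n{markup}\n</script>')
```

    Embedding markup like this in an FAQ page gives answer engines an unambiguous question-and-answer structure to cite, complementing rather than replacing traditional on-page SEO.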

    Source link

  • Trump 2.0’s impact on higher ed: The first year in 8 numbers

    Trump 2.0’s impact on higher ed: The first year in 8 numbers


    Monday marked the end of the first year of President Donald Trump’s second term, and higher education is still reeling from months of nonstop federal whiplash and policy changes.

    The Trump administration has used wide-ranging and unprecedented tactics to gain influence over the academic sector and advance its policy goals. In turn, some college leaders have been forced to decide between defending their institution’s independence and policies or yielding to the federal government’s demands due to financial pressure.

    Below, we’re breaking down some of the biggest impacts of the second Trump administration’s first year, number by number.

    150+

    The number of investigations the Trump administration either opened into colleges or cited while warning of a potential loss of federal funding.

    In March, the U.S. Department of Education put 60 colleges on notice over ongoing Title VI probes into allegations that they weren’t doing enough to protect Jewish students from discrimination or harassment. Title VI bans federally funded institutions from discriminating based on race, color or national origin.

    U.S. Secretary of Education Linda McMahon warned the colleges, many of whose investigations predated Trump’s second term, that federal funding “is a privilege” that is “contingent on scrupulous adherence to federal antidiscrimination laws.”

    Less than a week later, the Education Department opened 51 additional investigations into colleges over allegations they had programs or scholarships with race-based restrictions for participation or eligibility. The agency again cited potential Title VI violations, along with a February guidance letter aimed at snuffing out diversity, equity and inclusion efforts. That guidance was ultimately struck down in August by federal courts.

    Several well-known colleges were named in both sets of investigations, including Yale, Cornell, Tulane and Arizona State universities.

    Since last March, the Trump administration has opened additional investigations into colleges over institutional policies that run counter to the president’s higher education agenda, such as allowing transgender students to play on sports teams aligning with their gender identity.

    6

    The number of colleges that have publicly brokered deals with the Trump administration to settle allegations of civil rights violations.

    Most of the institutions — Brown University, Columbia University, Cornell University, Northwestern University, and the University of Pennsylvania — each faced hundreds of millions of dollars in frozen or canceled federal funding. By settling with the Trump administration, university leaders sought to restore their funding and remove political targets from their institutions.

    The remaining institution, the University of Virginia, still had its funding intact but faced five federal investigations that could have threatened access to such funds. The U.S. Department of Justice paused those probes with the promise of closing them if the university “completes its planned reforms prohibiting DEI” through 2028.

    But many higher education experts have decried such agreements as violating academic freedom and emboldening the Trump administration’s assault on the sector.

    In one deal, Columbia University agreed to pay the federal government $221 million — the most of any college so far — and implement sweeping policy changes. Those included reporting extensive admissions data to the Trump administration, socializing “all students to campus norms and values” via training, and allowing an independent monitor to oversee the university’s compliance with the agreement. 

    The settlement will also put up walls between Columbia and international students by requiring the university to reduce its financial dependence on their tuition dollars and making applicants declare why they wish to study in the U.S.


  • MSI Cuts Create Barriers for Indigenous Learners (opinion)

    MSI Cuts Create Barriers for Indigenous Learners (opinion)

    As we start the new year, my leadership team, like many others across the country, is confronting the financial fallout from the Department of Education’s decision to end grant programs for certain minority-serving institutions, including ours. The department has framed its September shift of funds away from MSIs and toward historically Black colleges and universities and tribal colleges and universities (TCUs) as an expansion of opportunity. Yet as an Indigenous education scholar and a college president, I see it creating new barriers for Indigenous learners. This decision is complex and requires deeper analysis to understand its lasting impacts.

    Federal support for Native education is a part of the federal trust responsibility, codified by at least 150 treaties, as well as various statutes and court decisions. Those treaties provide explicit provisions for various services, including education, that were guaranteed to Tribal Nations and their citizens by the United States government in exchange for land. This trust responsibility follows both Tribal Nations and individual tribal citizens. Ultimately, the federal trust responsibility is both a legal and moral obligation.

    In 2008, Congress created Native American–serving nontribal institutions (NASNTIs), a new category of MSI, to ensure federal grant support for institutions educating Native students outside of tribal colleges and universities. Only about 12 percent of Native students attend TCUs. Stripping more than $54 million away from the other institutions that serve large numbers of Native students effectively undermines the federal government’s trust responsibility. Furthermore, this funding, which went not just to NASNTIs but also to Asian American and Native American Pacific Islander–serving institutions (AANAPISIs) and Alaska Native and Native Hawaiian–serving institutions (ANNHs), typically supported programs open to all students at these institutions who qualified, not just Native learners.

    This loss is not abstract. At Fort Lewis College in Durango, Colo., where I am president, 37 percent of our students are Native American, representing more than 128 Tribal Nations and Alaska Native villages. We are the only NASNTI in the state. Recent federal cuts will mean a $2.27 million loss in critical grant support—dollars that have historically funded things like our peer educator tutoring, peer mentoring and summer bridge programs, all essential academic supports aimed at increasing student retention and graduation.

    In my role, I meet students every week who tell me that the support they received through these programs gave them the academic confidence to formally enroll or stay in school and a community to belong to on campus. For many students, these programs are the difference between continuing on the track toward graduation or leaving higher education altogether. Cutting this funding pulls away the very safety nets that level the playing field.

    Funding the institutions that support these students is also critical for boosting graduation rates, preparing a strong workforce and overall Tribal Nation building. Higher education access and success is a long-standing issue for Native communities, where only 42 percent of Native students graduate within six years, compared to 64 percent nationally, and only 17 percent of Native adults hold a bachelor’s degree. At a time when many communities are facing shortages of teachers, health-care providers and public servants, undermining critical pathways to higher education hurts our economy. Investing in these institutions is not only moral but profoundly practical.

    Finally, the decision to reallocate funding away from NASNTIs is especially damaging because it frames Native-serving institutions as competitors with TCUs, instead of partners in the shared mission of educating historically underserved students. There is no question that TCUs and HBCUs have both been woefully underfunded for decades. These institutions serve critical historical and present-day roles, providing access to higher education and meeting community and tribal needs. They deserve robust, sustained federal investment. TCUs, in particular, play an essential role in rural areas and tribal communities. That said, needed investments in these institutions should not come at the expense of the NASNTIs and other MSIs that educate vast numbers of Native students.

    By shifting this money, the Department of Education forces communities that are deeply aligned in our commitment to serving Native students and communities to fight for scarce resources, all while the department fails to meet its federal trust responsibility. NASNTIs and TCUs do not succeed at the expense of one another; we succeed together when federal policy recognizes the full breadth of our contributions.

    The Department of Education has an opportunity to reaffirm, not retreat from, its responsibility to Native students. That means sustaining investment in TCUs and HBCUs and restoring support for the NASNTIs that educate large numbers of Indigenous learners. When we fund the full ecosystem of Native-serving colleges and universities, we strengthen Native communities and the nation as a whole. True recognition of Native heritage lies in a commitment that honors the promises made and ensures that every Native student has the educational resources to thrive.

    Heather J. Shotton is president of Fort Lewis College.


  • Better Defining and Measuring Higher Ed’s Value

    Better Defining and Measuring Higher Ed’s Value

    News this month that a group of stakeholders convened by the U.S. Education Department agreed on a new federal approach to assessing colleges offered fresh evidence that we as a country have decided to judge the value of higher education based primarily on students’ economic outcomes.

    The mechanism approved by the federal negotiating panel will set minimum earnings thresholds for graduates of academic programs at all colleges and universities; programs that fail to hit the mark will lose federal loan access or even Pell Grant funds, depending on how widespread the failure is.

    Building a new government accountability scheme around postcollege economic outcomes makes sense: Ensuring that learners come out of their educational experience better off financially than they would have been otherwise is a logical minimum requirement.

    But it reflects a larger problem, which is that we don’t have good ways of defining, let alone measuring, what quality or success look like in postsecondary education. And those of us who believe in higher education have erred badly by letting politicians and critics judge it exclusively by a narrow economic outcome like postgraduation salary.

    Most importantly, we’ve never come close to being able to measure learning—how much students cognitively gain from a course of study or academic experience. What a game changer it would be if we could—we’d really know which institutions actually help their learners grow the most. (I suspect such a measurement would upend our thinking about which colleges and universities are “the best,” and that part of why we haven’t ever solved this problem is because it wouldn’t be in the interest of the institutions that are most esteemed now.)

    Instead we look for proxies, and as our ability to track people’s movements between education and work has improved, we’ve focused on postcollege economic outcomes as our primary (if not exclusive) way of judging whether institutions serve learners well.

    That’s logical in many ways:

    1. Most learners cite career success as their top reason for pursuing postsecondary education and training,
    2. Federal and state governments invest in higher education in large part because of the institutions’ economic contributions, and
    3. It’s comparatively easy. We can’t expect politicians with limited understanding and expertise to develop sophisticated accountability systems.

    But overdependence on postcollege economic outcomes to judge higher education’s success and value ignores the full range of benefits that colleges and universities purport to deliver for individuals and for society collectively. It also has a range of potential unintended consequences, including deterring students from entering fields that don’t pay well (and institutions from supporting those fields).

    Many academic leaders hoped that if they ignored calls for accountability, the demands would fade. But in that vacuum, we ended up with limited, flawed tools for assessing the industry’s performance.

    The resulting loss of public confidence has damaged higher education, and turning that tide won’t be easy. But it’s not too late—if college leaders take seriously their need to marshal proof (not just words) that their institutions are delivering on what they promise.

    What would that look like? College leaders need to collectively define for themselves and for the public how their institutions are willing to be held accountable for what they say they do for learners and for the public good.

    This needs to be a serious attempt to say (1) this is what we purport to provide to individuals and to society, (2) this is how we will gauge success in achieving those goals, and (3) we commit to publicly reporting on our progress.

    Pushback against this sort of measurement and accountability (excluding those who simply don’t believe colleges should have to prove themselves, who at this point must be ignored) tends to focus on two reasonable complications: (a) different types of institutions do different things and have differing missions, and (b) some of what colleges and universities do can be difficult (and perhaps impossible) to measure.

    On argument (a), it’s certainly true that any effort to compare the full contributions of major research universities and of community colleges, for example, would need to focus on different things. The research university indicators might account for how many inventions their scientists have developed and how many graduate students they train; the community college indicators might include reskilling of unemployed workers and ESL classes for new immigrants preparing to become citizens.

    But in their core functioning focused on undergraduate learners, most colleges do pretty much the same thing: try to help them achieve their educational goals, including a mix of the practical (developing knowledge, skills and preparation for work), the personal (intellectual and personal growth), and the collective (contributions to society, including being engaged participants in communities and society).

    And on critique (b), yes, it’s true that some of what colleges and universities say they do may be hard to measure. But have we really tried? There are lots of big brains on college and university campuses: Couldn’t a working group find ways to quantify whether or not participation in a postsecondary course of study produces people with greater intercultural understanding or empathy? Or that they are more likely to donate to charity or to vote in national elections?

    The goal of this initiative would be to develop (through the collective participation of a diverse group of institutional and other stakeholders, through an existing association or a new coalition of the willing created expressly for this purpose) a broadly framed but very specific menu of indicators that would present a fuller picture of whether colleges and universities are delivering on the promises they make to students and to society more broadly. Ideally we’d generate institution-level data that would scaffold up to an industrywide portrait.

    The information would almost certainly give college leaders fodder to make a better public case about what their institutions already do well. But it would just as likely also reveal areas where the institutions fall short of what they say in their mission statements and where they collectively need to improve, and provide a scorecard of sorts to show progress over time.

    At the core, it would give them a way of showing, to themselves and to their critics, that they are willing to look at their own performance and prove their value, rather than just asserting it as they have arrogantly done for a long time. Colleges and universities would get public credit for being willing to hold themselves accountable.

    What would we want to measure, and how would we do so? Smarter people than me would need to help answer those questions, but possible areas of exploration include the following, based on ground laid over the years by the Gates Foundation’s Postsecondary Value Commission, Lumina and Gallup in a 2023 report, and others.

    Economic indicators might include:

    • Lifetime earnings
    • Employment and unemployment rates/job placement in desired field
    • Return on investment (comparing learners’ spending on their education with their lifetime earnings)
    • Social mobility (Do colleges help people advance up the economic ladder? Can we update the 2017 Chetty data to become a regular part of the landscape?)
    • Debt repayment
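    The return-on-investment bullet above can be made concrete with a back-of-the-envelope calculation. This is a deliberately simplified sketch (serious ROI studies discount future earnings and construct counterfactual baselines); every figure below is hypothetical:

```python
def simple_roi(total_cost: float, grad_earnings: float,
               baseline_earnings: float, years: int = 40) -> float:
    """Naive ROI: lifetime earnings premium over a no-degree baseline,
    divided by the total cost of the education. Ignores discounting and
    selection effects, so treat it only as an illustration."""
    lifetime_premium = (grad_earnings - baseline_earnings) * years
    return lifetime_premium / total_cost

# Hypothetical figures: $100k total cost, $60k vs. $45k annual earnings
print(round(simple_roi(100_000, 60_000, 45_000), 1))  # → 6.0
```

    Even this crude version makes the measurement questions visible: which baseline counts as the counterfactual, over how many working years, and whose spending (student, family, taxpayer) belongs in the denominator.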

    Noneconomic indicators might include:

    • Employer alignment (Do higher education programs help students develop the skills and knowledge employers demand—technical skills like AI readiness and “human skills” such as critical thinking, problem-solving and creativity?)
    • Civic and democratic engagement (voting rates, charitable contributions)
    • Empathy and social cohesion (Does going to college make us more empathetic? More inclined to understand those who are different? Less racist?)
    • Health and emotional well-being/happiness (Surely with all the health data out there, one might be able to document some correlation, if not causation?)
    • Intercultural/global understanding

    Most of the indicators above would gauge contributions to individuals, rather than to society as a whole (though obviously some accrue to society). Those who believe we’ve stopped viewing higher education as a public good might argue for trying to measure the contributions institutions make to local and national economies (through their research, role as employers, etc.), as community anchors (medically, culturally, spiritually), and the like.

    Higher education has serious work to do to earn back the American public’s trust and confidence. Argumentation won’t suffice. I recognize that it may be hard to find (or develop) tangible information to build a data-based case that colleges and universities do what they say they do in their mission statements and promotional brochures.

    But could it hurt to try? What we’re doing now isn’t working.

    Doug Lederman was editor and co-founder of Inside Higher Ed from 2004 through 2024. He is now principal of Lederman Advisory Services.


  • The Dangers of Pathologizing Administration (opinion)

    The Dangers of Pathologizing Administration (opinion)

    “One of my most distinguished colleagues … for a time refused to attend any meetings and made a point of always working on a book while others met to discuss departmental and university issues. After two years of boycotting meetings … [he] published a very nice book on the presidency … [and] cheerfully pointed out that he had written virtually the entire book during hours when he was not present at meetings.” —Benjamin Ginsberg, The Fall of the Faculty: The Rise of the All-Administrative University and Why It Matters (Oxford, 2011)

    Popular culture is rife with depictions of the hapless or even evil academic administrator, typically a dean. Most administrators know and regularly use the “double secret probation” line from the authoritarian and humorless Dean Wormer in Animal House (1978). In Old School (2003), Jeremy Piven portrayed a particularly noxious and conniving dean, who finally met his death when he was crushed by a car while fly fishing.

    More recently, dean representations have been kinder. For example, the dean from the 2021 Netflix series The Chair both misquotes Shakespeare to English faculty and uses the line “butts in seats” when trying to juice his English Department into taking action to stem the loss of majors and students. He is at least nice and kind.

    Maybe the most accurate representation of a dean was the one portrayed by Oscar Nuñez in the 2023 TV drama Lucky Hank, a modernized version of an excellent academic satire, Richard Russo’s Straight Man (1997). Constrained by a hapless president hell-bent on cutting faculty positions, and frustrated by turbulent and upset professors, again in the English Department, Dean Rose at least tries to muddle through with compassion. So, ineffective but nice is about as good as it gets for the representation of deans in popular culture.

    Popular culture provides lenses through which many of us see the world. A year before Animal House’s Dean Wormer, moviegoers were introduced to George Lucas’s menacing dark side of the force in Star Wars. And today, when a promising colleague tries their hand at administration, some may say that they have “gone over to the dark side.” Indeed, one of our old Ph.D. advisers (Jeff’s) emailed him with that remark—and he certainly heard it from many others, too—when he took an associate dean role in 2013.

    Several years ago, Jeff gave a presentation on how senior tenured faculty can make change difficult and the need for deans to more effectively consult and lead with them through shared governance. As part of his presentation, he showed an image of Bill Lumbergh, the mediocre boss played by Gary Cole in Office Space (1999), wearing a Darth Vader helmet. The line Jeff used in the presentation was, essentially, “faculty find us to be an odd mix of both pure evil and mediocrity.”

    The line landed well, with steady laughter for around 10 seconds in a room of at least 50 deans and associate deans. That strong response reveals the degree to which attacks on administrators are ubiquitous across universities and even disciplines.

    Indeed, beyond popular culture, we tend to vilify and pathologize administrators even within academia. In an Inside Higher Ed article titled “Who and What Is ‘The Administration’?,” a piece designed to help academics understand governance and organizational charts, Kathy Johnson Bowles describes academics’ general feeling about “the administration” being “a shadowy, amorphous group of suit-wearing, exorbitantly paid employees. They are to be vilified for making knuckleheaded, illogical, tone-deaf decisions that put the institution at risk, insult the faculty, demoralize the staff, enrage students and underestimate the power of the alumni.”

    Rather than taking the temperature of faculty attitudes, as Bowles does, Ginsberg, in his The Fall of the Faculty, offers a host of disparaging remarks about administrators, using a broad brush to condemn them as incompetent. For example, in writing about associate deans, whom he disparagingly calls “deanlets,” he says, “Many deanlets’ managerial savvy consists mainly of having the capacity to spout last year’s management buzz words during meetings, retreats, and planning exercises.”

    Ginsberg summarizes his whole project as such: “My book sounds a warning and offers a prescription designed to slow if not halt the spread of administrative blight. The prescribed medication will come too late for some victims, but others may yet recover.” While the expansion of administration versus faculty positions is a legitimate problem, to compare it to a disease is unnecessarily critical and simply enlarges the gap between faculty and administration that is so damaging to academic culture.


    Our own journey into academic administration was not a direct one. Years ago, we were both working together at a university in east Texas, and we had a regular poker game that included three other faculty members. On a Saturday night, once we settled seriously into the steady work of picking cards, tossing chips and reading each other’s faces, we regularly hit on two or three subjects. Invariably, we would end up talking about departmental issues (we came from three different departments, all in the liberal arts) and our less-than-impressive dean. We were all relatively young assistant professors, so we made bold claims about the way things should be at the university.

    Looking back, some were very sharp ideas, and others were naïve. One night Jeff said something along the lines of, “If we are so smart, shouldn’t we become deans? You know, lead, follow or get out of the way.” We had a good chuckle and returned to our game. Nearly 16 years later, while Jeff was the only one to take the path to become a dean, at least three of the other four friends have spent significant time serving as department-level administrators.

    If years ago we began as youthful know-it-alls with a slight disdain for our dean, what happened to commit us to various forms of administration? What led us to the dark side? For Jeff, his pathologization of administration earlier in his career began to end upon reading The Fall of the Faculty, a book he finally closed in fatigue. A fatuous and stunningly self-indulgent, even mean-spirited book, it opened his eyes not only to his knee-jerk approach to his dean at the time but also to the degree to which faculty, and mostly senior faculty, had used ridicule and hatred of administration as a justification for not providing service and not engaging with the serious issues of the university. For Lee, his own concerns about the dangers of pathologization were driven home when a faculty colleague actually said to him that because he had an administrative role, he would continue to lose friends.

    In both of these examples, we find the myth of the dark side at play. Faculty render an image of Darth Administrator so they can imagine themselves to be the light side of the force—Professor Skywalkers all, pure in defending the virtue and mission of higher education. But light and dark are complementary opposites, and as Jeff’s example above should indicate to anyone familiar with Star Wars lore, anger and hatred are the way of the Sith.

    An essay about the othering of university administrators written by two middle-aged, straight, white full professors may seem problematic, to say the least. To be clear, we are not claiming this othering as an issue of oppression. And indeed, we note that administrators from underrepresented backgrounds can be othered in very troubling ways. Rather, we identify this pathologizing of administration because it disrupts the functioning of higher education.

    It would be unfair if we did not acknowledge that administrators also grouse about faculty. For Lee, in his less generous moments, this may take the form of simply repeating a faculty complaint in a new setting as a bit of dry humor (e.g., “Did you know that requiring faculty to teach more than twice a week might cause the university to lose its R-1 status?”). We are not so naïve as to suggest that there should be no tension between faculty and administration or in any workplace. But what makes the faculty pathologizing of administration so different is its pervasive and public nature. Treating administration as the “dark side” has become the norm within academia, but it is a norm that is our undoing.


    Probably the most important problem that arises from this pathologization is the inability of faculty and administrators to cross the divide and work effectively together. There are always faculty who figure out how to do it, or do it because they know it is key to winning the support and advocacy they require. But what happens when faculty disdain or distrust for administration creates an obstacle? Perhaps a faculty member, lacking faith in their administration, will fail to ask for support for a student to attend a conference. In such a case, it is the student who will suffer the consequences. Or perhaps upon receiving a request from a faculty member who has repeatedly slighted the administration, an administrator may do their job in a professional but minimal way, still helping the faculty member, but maybe not moving heaven and Earth to make their life better. Why should they?

    Constant negativity coarsens administrator experiences and attitudes. Over the years we have openly heard “We need fewer deans here,” “You’re just going to leave soon for another higher-paying job,” “I don’t know why you are paid so much,” “We need to return to the old model with no deans,” “Administrators don’t teach real classes” and other troubling statements. With all this in mind, we ask our faculty colleagues—because faculty are the colleagues of administrators and vice versa—to consider a few questions.

    • Think of the damage that has been done to U.S. institutions by politicians vilifying university professors as lazy and ineffective. Why would you contribute to this effort? And how would you feel about your colleagues if that is how they spoke about you, and so unabashedly?
    • Effective administration often requires learning the culture of an institution and building strong relationships. Faculty rightly complain about administrators job-hopping across institutions. But to what degree do faculty drive away potential leaders and allies?
    • Consider also the opportunity cost for faculty. Viewing administrators through the “dark side” lens, or knowing that their colleagues hold these negative views, may deter talented faculty from moving into leadership roles and accomplishing great things in their careers. This, of course, leaves a lot of space for the less talented among us. Whom do you want in the administrative role—the person with the strongest knowledge of how the university works, vision for the program, capacity for listening, etc.? Or simply the person with the thickest skin, who can take the most guff from faculty and who plays favorites to make the right people happy?

    Finally, we need to shift the debate away from faculty versus administration. If we remember that the purpose of higher education is our students, and if we always center our students in conversations between faculty and administration, we stand a much better chance of working together.


    Closing this gap is a responsibility that falls on all of us. Administrators and faculty can do a lot more to communicate and engage more effectively, thereby making such othering less likely. In an earlier essay, we discussed ways to improve shared governance. Administrators who build trust through small actions—i.e., doing the thing they said they would do, closing out communications and being as transparent and consultative as possible—will close the gap on their side substantially. Faculty who are able and willing to set aside the casual critiques and invite administrators into collaborations, to bring problems with solutions to them—or who are even willing to have a chat over a cup of coffee—will likewise do a great deal to close the gap from their side.

    Return to Ginsberg's example of the faculty member who wrote a book instead of attending departmental meetings: that moment epitomizes the desire of some faculty to see themselves as islands alone in the ocean. A university, however, is not a place for islands. It is more like one of those ancient Mediterranean warships, the triremes, with masses of people rowing in unison. By refusing department meetings and service, Ginsberg's colleague took his oar out of the water, making the rowing harder for everyone else. Likewise, as junior faculty we watched some senior faculty neglect their own work while casually slandering administrators. To what degree does faculty abdication of their duties actually contribute to the growth of administration? Somebody has to do the work.

    So, please, do the work, step into leadership, put your oar in the water, come to the dark side, acknowledge the humanity of administrators and let us work together to build a stronger and more positive university for everyone.

    Jeff Crane is the dean of the College of Arts, Humanities and Social Sciences at California State Polytechnic University, Humboldt, and host of the Yeah, I Got a F#%*ing Job With a Liberal Arts Degree podcast and co-host of the SNAFUBAR podcast.

    Lee Bebout is a professor of English and recovering departmental administrator at Arizona State University whose recent research on political efforts to thwart social transformation has provided insight into how higher education resists change.

  • Misrepresenting Prison Education Risks Harming Students

    To the editor:

    We write from a Big Ten prison education program, where we've worked for a decade to increase access to higher education for incarcerated individuals. We found the framing of the article "Prison Education May Raise Risk of Reincarceration for Technical Violations" (Jan. 12, 2026) misleading, and we have deep concerns about its potential impact on incarcerated students and prison education programming.

    The article fails to acknowledge decades of evidence about the benefits of prison education. The title and framing deceptively imply that college programs increase criminal activity post-release at a national scale, yet the Grinnell study, an unpublished working paper, draws only on data collected in Iowa. Most consequential for incarcerated students, the title and introductory paragraphs mislead readers by implying that blame for technical violations and reincarceration lies with justice-impacted individuals themselves. Buried in the article is a nuanced, accurate, structural interpretation of the data: according to the Iowa data, incarcerated individuals who pursue college may be unfairly targeted by parole boards and other decision-making bodies in the corrections system, leading to a higher rate of technical violations.

    The impact of the article’s misleading framing could be devastating for incarcerated college students, especially in a climate where legislators often value being “tough on crime.”

    We understand the importance of journalism telling the full story, and many of the Grinnell study's findings may be useful for understanding programmatic challenges; however, this particular framing could lead to unintended consequences of its own. The 1994 repeal of Pell Grant funding collapsed prison education for nearly thirty years; as a result, the U.S. went from 772 prison education programs to eight. Blaming incarcerated individuals for a structural failure could cause colleges and universities to pull support from their programs. We've already seen programs (e.g., Georgia State University's) collapse without institutional support, leaving incarcerated students without any access to college. This material threat is further amplified by the article's premature conclusions about a field that has only recently, with the restoration of Pell eligibility in 2022, begun to rebuild.

    In a world where incarcerated students are denied their humanity on a daily basis, it is our collective societal obligation to responsibly and fairly represent information about humanizing programming. Otherwise, we risk harming students’ still emerging—and still fragile—access to higher education.

    Liana Cole is the assistant director of education at the Restorative Justice Initiative at Pennsylvania State University.

    Efraín Marimón is an associate teaching professor of education; director of the Restorative Justice Initiative; and director of the Social Justice Fellowship at Pennsylvania State University.

    Elizabeth Siegelman is the executive director of the Center for Alternatives in Community Justice.

  • Montana President Eyes Senate Run

    While the politician–to–college president pipeline is thriving in red states like Florida and Texas, University of Montana president Seth Bodnar aims to go the other direction with a Senate run.

    Bodnar is expected to launch a bid for the U.S. Senate as an independent and will resign from his role as president, a job he has held since 2018, to do so, The Montana Free Press reported.

    A Bodnar spokesperson confirmed the run and the resignation plans to the news outlet but said he would wait until after a formal announcement to provide more details. The move is reportedly part of a plan backed by Jon Tester, a Democrat who served in the Senate from 2007 to 2024. Tester was unseated by Republican Tim Sheehy in 2024.

    Tester has reportedly expressed skepticism about the chances of a Democratic victory but signaled support for Bodnar in a text message, viewed by local media, in which he pointed to the UM president's background in private business, his military service and his status as a Rhodes Scholar.

    Bodnar holds degrees from the United States Military Academy and the University of Oxford. He served in Iraq as a member of the 101st Airborne Division, was a Green Beret in the U.S. Army’s First Special Forces Group, and later a lieutenant colonel in the Montana National Guard.

    Bodnar taught at West Point from 2009 to 2011 before joining General Electric, where he served in a variety of corporate leadership roles before he was recruited to take the UM presidency.

    A university spokesperson did not respond to a request for comment from Inside Higher Ed asking when a formal campaign announcement will be made or when Bodnar may step down.
