Tag: Changing

  • AI Is Changing How Students Search: Will Your Website Show Up?

    AI is no longer a distant disruption. It’s already influencing how prospective students and families search, navigate, and make decisions on higher education websites. As teams responsible for delivering seamless digital experiences, we need to understand the behavioral shifts underway and how to respond strategically.

    Across the institutions we support, we’re seeing early but consistent signals: users expect smarter, faster, and more personalized interactions. These changes are subtle in some places and dramatic in others. But they’re accelerating.

    How AI is changing search behavior

    AI tools like Google’s Search Generative Experience (SGE), ChatGPT, and other large language models are changing how people expect to interact with information. According to a 2023 Pew Research study, 58% of U.S. adults are aware of ChatGPT, and younger audiences are among the most active users. Meanwhile, Google continues testing SGE, which presents AI-generated summaries above traditional search results.

    Students are learning to type full, natural language questions — and they expect precise, context-aware responses in return. This behavior is now showing up in on-site search patterns.

    Across higher ed websites, here are a few things we’re noticing:

    • A rise in long-form, conversational search queries, especially within internal site search tools
    • Increased use of search bars over menu navigation (particularly on mobile). A recent E-Expectations Trend report found that half of high school students use the site search to navigate a website.
    • Across the higher ed websites we support, we see stronger performance on pages that are tailored to high-intent topics like cost, admissions, and outcomes. A recent analysis of over 200 higher ed sites found that 53% of engaged sessions come from organic search — highlighting the importance of content that’s built for both SEO and AI-driven discovery.
    • Additionally, research indicates that 80% of high school juniors and seniors consider an institution’s website the most influential resource when exploring schools. This highlights the critical role of personalized and relevant content in engaging prospective students effectively.
    • These findings underscore the need for higher ed institutions to develop and maintain website content tailored to the needs and questions of their target audiences, both to deepen engagement and to support enrollment goals.
    • Parents and adult learners demonstrate similar behavior as they vet institutions with a clearer sense of goals and outcomes.


    We still need to get the fundamentals right

    It’s important to say: AI-driven search doesn’t eliminate the need for strong site structure. Navigation menus, clear page hierarchy, and thoughtful content design still matter — a lot. Most users move fluidly between browsing and searching. What’s changing is the expectation for speed, relevance, and control.

    To meet this moment, higher ed websites should focus on:

    • Modernizing internal search tools to move beyond keyword matching toward relevance-based or semantic search, using tools like Vertex AI in full-site search or even program finders (a minimal sketch of the idea follows this list).
    • Designing content around user intent, not just institutional priorities. Emphasize topics that students are searching for — like affordability, flexibility, and outcomes — rather than internal program structures or catalog-style descriptions.
    • Making calls to action easy to find and easy to act on (especially for first-time visitors). We help partners optimize for conversion with A/B testing for placement, messaging, and functionality that best resonates with your audience.
    • Better leveraging personalized and dynamic content to deliver tailored experiences based on user behavior, location, or stage in the journey. For instance, high-intent pages like “How to Apply” can be leveraged to serve personalized content blocks based on the visitor’s context. A returning user who previously viewed graduate programs might see a prompt to schedule a call with a graduate admissions counselor. A visitor browsing from New York in the evening hours could be shown a message about flexible online options for working professionals. These dynamic cues guide prospective students forward in their journey without overhauling the entire site.
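
    As a rough illustration of what “beyond keyword matching” means in practice, here is a minimal sketch of embedding-based (semantic) site search in Python. The page summaries, URLs, and model choice are purely illustrative assumptions; a production implementation would more likely rely on a managed service such as Google’s Vertex AI Search over a real content index rather than hand-rolled code.

        # Minimal sketch: semantic site search with text embeddings (illustrative only).
        import numpy as np
        from sentence_transformers import SentenceTransformer

        model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

        # Hypothetical content: short summaries of key pages on a .edu site.
        pages = {
            "/admissions/apply": "How to apply: deadlines, requirements, and application steps.",
            "/tuition-aid": "Tuition, fees, scholarships, and financial aid for undergraduates.",
            "/programs/nursing": "BSN program overview: curriculum, clinical placements, outcomes.",
        }

        def normalize(vectors):
            """Scale each embedding to unit length so dot products equal cosine similarity."""
            return vectors / np.linalg.norm(vectors, axis=-1, keepdims=True)

        page_urls = list(pages.keys())
        page_vecs = normalize(model.encode(list(pages.values())))

        def search(query, top_k=2):
            """Rank pages by semantic similarity to a natural-language query."""
            query_vec = normalize(model.encode([query]))[0]
            scores = page_vecs @ query_vec
            return sorted(zip(page_urls, scores), key=lambda pair: pair[1], reverse=True)[:top_k]

        # A conversational query with little keyword overlap still surfaces the right page.
        print(search("how much does it cost to go here if I need financial help"))

    The point is simply that relevance comes from meaning rather than exact keyword overlap, which is what lets long, conversational queries land on the right page.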

    Why this isn’t a one-time fix

    This is not a single redesign or one-time upgrade. Optimizing your site for how people actually use it needs to be a continuous process.

    This should include the following:

    • Reviewing analytics and user behavior regularly
    • Conducting search query audits to identify gaps
    • A/B testing calls to action and user pathways
    • Collecting both qualitative and quantitative research to understand different audience needs

    Higher ed website performance is directly tied to enrollment growth. According to a 2024 survey conducted by UPCEA and Collegis Education to better understand the perspectives of post-baccalaureate students, 62% of respondents said not being able to easily find basic program information on the institution’s website would cause them to disengage.

    The survey focused on program preferences, delivery methods, and expectations during the inquiry and application processes, and it offered insights into how these preferences vary by age and degree level.

    How to prepare for what’s next

    To stay competitive and relevant, institutions need to invest in both smart search experiences and a streamlined digital journey. Here are some high-level recommendations:

    1. Audit your internal search functionality. How are users searching your site, and are they getting the right results?
    2. Map user journeys for key audiences. This includes traditional students, adult learners, and family decision-makers.
    3. Evaluate AI integration options. Tools like Google’s Vertex AI or other semantic search platforms can enhance search accuracy and personalization.
    4. Don’t overlook AEO (answer engine optimization). As AI-powered tools reshape how students discover and evaluate schools, it’s time to think beyond traditional SEO. AEO focuses on structuring content to directly answer the natural-language questions students now ask in tools like ChatGPT and Google’s SGE (see the illustrative sketch after this list). We can help you begin integrating AEO into your strategy and content planning, so your institution stays visible in the next wave of search.
    5. Treat optimization as ongoing. Staying competitive in the AI era requires continuous improvement grounded in data, user behavior, and evolving search trends.
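
    On point 4, one common and concrete AEO tactic is to pair the natural-language questions students actually ask with concise answers in structured data, such as schema.org FAQPage markup, so answer engines can lift the Q&A directly. The sketch below generates that markup in Python; the questions, answers, and figures are entirely hypothetical placeholders, and this is an illustration of the idea rather than a complete AEO strategy.

        # Minimal sketch: schema.org FAQPage structured data (JSON-LD) for answer engines.
        # All question and answer text below is hypothetical placeholder content.
        import json

        faq = {
            "@context": "https://schema.org",
            "@type": "FAQPage",
            "mainEntity": [
                {
                    "@type": "Question",
                    "name": "How much does tuition cost per year?",
                    "acceptedAnswer": {
                        "@type": "Answer",
                        "text": "Full-time undergraduate tuition is $32,000 per year, and most "
                                "students receive institutional aid.",
                    },
                },
                {
                    "@type": "Question",
                    "name": "Can I complete the program fully online?",
                    "acceptedAnswer": {
                        "@type": "Answer",
                        "text": "Yes. All core courses are offered online, with optional "
                                "on-campus residencies.",
                    },
                },
            ],
        }

        # Embed the JSON-LD in the page so crawlers and AI answer engines can parse the Q&A pairs.
        print(f'<script type="application/ld+json">{json.dumps(faq, indent=2)}</script>')

    The same question-first structure also helps human visitors and traditional SEO, so it is a low-risk place to start.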

    Smarter web experiences start now

    The future of higher ed websites isn’t just about making information accessible. It’s about making it findable, meaningful, and actionable – and being able to act fast and stay committed to this work.

    Institutions that recognize how AI is already reshaping user expectations, and respond with thoughtful, strategic digital experiences, will meet today’s learners where they are and build trust for the long term.

    We’re paying close attention to these shifts and helping institutions make smart, scalable updates. If you’re rethinking how your website supports recruitment, engagement, or conversion, now is the right time to start. Collegis Education supports institutions with strategic marketing and web solutions designed to meet these evolving needs.

    Let’s talk about how we can work together to future-proof your web and digital experiences to best support enrollment growth for years to come.

    See how your website stacks up — Contact us to request your AI Readiness Assessment


    Source link

  • Navigating higher education in a changing landscape

    • Ahead of TASO’s annual conference, ‘How to evaluate’, on 29–30 April, Omar Khan, CEO of TASO (the Centre for Transforming Access and Student Outcomes in Higher Education), discusses the challenges facing higher education, particularly in the face of wider discussions around the value and purpose of higher education in the UK and beyond.

    We all know of the challenges facing higher education. The questions can feel existential: from the financial sustainability of institutions to the social consensus on the value and purpose of higher education itself.

    Without seeming Pollyannaish, I believe higher education can and must continue to argue for its value and purpose in these difficult times. There remains significant agreement that higher education brings value, for individuals as well as the economy, with reputational benefits for the UK internationally too. Similarly, there is broad consensus that addressing inequalities of participation as well as of the student experience is a priority. While we shouldn’t be complacent about the impact of criticism of ‘DEI’ (diversity, equity and inclusion) in the US, so far UK higher education has remained committed to the widening participation agenda and the sector has not been subject to sustained public attacks from the government.

    One reason that widening participation remains on the agenda is the legislative and regulatory environment. Significantly, for over a decade, the principle has been established that rising fees should be matched by a clear commitment to demonstrating improved access. As the sector will now know, in England this is delivered through providers submitting access and participation plans (APPs) to the Office for Students.

    A commitment to evaluation

    APPs are now also expected to have a clear commitment to evaluation. Unsurprisingly, given my role as CEO of TASO, the higher education What Works Centre, I think this is a good thing. At TASO we’ve seen a significant improvement in the number and robustness of evaluations across the sector since our founding some five years ago.

    As we gather for our fourth annual conference (29–30 April), we will continue to support the sector in understanding the evidence base on inequalities in higher education. We do this in two main ways: through synthesising and commissioning research, and by producing more practical guidance to help the sector deliver effective evaluations itself.

    A library of providers’ evaluations

    Recently, we announced a key way we will bring this work together: the Higher Education Evaluation Library, or HEEL (like the rest of the sector, we too love an acronym), which we are working in partnership with HEAT, the Higher Education Access Tracker, to deliver. The library will bring together in one place higher education evaluations that are otherwise published across a wide range of institutions in the sector.

    At our conference, we will continue our consultation with the sector about the library to ensure we understand and are responsive to how evaluators and others can best use this resource. Once we have consulted and worked with HEAT to develop the infrastructure for HEEL, and once providers upload their evaluations into this online library, we will produce regular digests summarising what we find. Ultimately, the goal or promise is that these digests will improve the evidence base, reduce duplication across the sector and improve outcomes for students.

    Navigating the financial landscape

    At TASO we are optimistic about the future of evaluation in the sector, not least as we have seen a wider cultural and institutional commitment to joint learning as well as to the value of equal opportunity and social mobility that motivates all of us to do this work. However, I want to recognise and to flag a serious concern that TASO (and no doubt many others) is seeing across the sector, that is, how the financial situation impacts widening participation activity.

    To effectively evaluate and assess whether activities improve outcomes for students, those activities need to be adequately resourced. We have heard evidence that redundancies and cost-cutting across the sector are impacting on the ability of staff to deliver these activities, as well as to evaluate them. This is in a context where child poverty is increasing, where inequalities in school attainment are rising, and where the higher education attainment gap between free school meal students and their more advantaged counterparts is at its widest, at over 20.8 percentage points.

    A refocus on values and mission

    We recognise that times are tight, that tough decisions need to be made and that this has an impact on staff morale. At the same time, higher education must continue to prioritise its values and mission: a commitment to evidence as well as to equality and social mobility. Furthermore, at a time of increased public scepticism of how the sector is delivering on these aims, delivering for the most disadvantaged students becomes a matter of public support and democratic consensus.

    As we’ve spent the past decade building the foundations to better address inequalities in higher education, it’s vital we continue to work together to make the promise of higher education a reality for everyone who wants to access it, regardless of their background.

    While TASO is here to support the sector to do this, we cannot do this alone, and I want to recognise and thank all of those who do this important work day in and day out: senior leaders, evaluators, practitioners, third sector organisations, teachers, parents and of course student leaders and activists committed to ensuring better lives for themselves and their peers.

    Source link

  • ‘How can I know what I think till I see what I say?’: How AI is changing education and writing

    • Following HEPI’s recent Policy Note on students’ use of artificial intelligence (AI), HEPI Director Nick Hillman reviews a new book from the United States on what AI means for writing.

    ‘ChatGPT cannot write.’ It’s a bold statement but one near the start of the new book More Than Words: How to Think about Writing in the Age of AI that explains what comes in the following 300 pages.

    The author John Warner’s persuasive argument is that generative AI creates syntax but doesn’t write because ‘writing is thinking.’ (I hope this is the only reason why, when asked to write a higher education policy speech ‘in the style of Nick Hillman’, ChatGPT’s answer is so banal and vacuous…) People are, Warner says, attracted to AI because they’ve not previously been ‘given the chance to explore and play within the world of writing.’

    Although Warner is not as negative about using ChatGPT to retrieve information as he is on using it to write wholly new material, he sees the problems it presents as afflicting the experience of ‘deep reading’ too: ‘Reading and writing are being disrupted by people who do not seem to understand what it means to read and write.’

    The book starts by reminding the reader how generative AI based on Large Language Models actually works. ChatGPT and the like operate as machines predicting the next word in a sentence (called a ‘token’). To me, it is reminiscent of Gromit placing the next piece of train track in front of him as he goes. It’s all a bit like a more sophisticated version of how the iPhone Notes app on which I’m typing this keeps suggesting the next word for me. (If you click on the suggestions, it tends to end up as nonsense though – I’ve just done it and got, ‘the app doesn’t even make a sentence in a single note’, which sounds like gibberish while also being factually untrue.)
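
    For readers who want the mechanism spelled out, here is a toy sketch in Python of the next-word-prediction loop the book describes. The vocabulary and probabilities are invented purely for illustration and bear no relation to how any real model is built or trained.

        # Toy sketch of next-token prediction: repeatedly pick the most probable next word.
        # The "model" here is a hand-written lookup table, purely for illustration.
        toy_model = {
            ("the",): {"university": 0.6, "student": 0.4},
            ("the", "university"): {"offers": 0.7, "is": 0.3},
            ("the", "university", "offers"): {"courses": 0.9, "degrees": 0.1},
        }

        def predict_next(context):
            """Return the highest-probability continuation for the given context, if any."""
            options = toy_model.get(tuple(context))
            if not options:
                return None
            return max(options, key=options.get)

        sentence = ["the"]
        while (word := predict_next(sentence)) is not None:
            sentence.append(word)  # lay the next "piece of track", as the review puts it

        print(" ".join(sentence))  # -> "the university offers courses"

    Real systems sample from far larger probability tables learned from text, but the loop itself is the whole trick: there is no plan for the sentence, only the next word.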

    ‘The result’, we are told of students playing with ChatGPT and the like, ‘is a kind of academic cosplay where you’ve dressed up a product in the trappings of an academic output, but the underlying process is entirely divorced from the genuine article.’

    Writing, Warner says, is a process in which ‘the idea may change based on our attempts to capture it.’ That is certainly my experience: there have been times when I’ve started to bash out a piece not quite knowing if it will end up as a short blog based on one scatty thought or flower into a more polished full-length HEPI paper. Academics accustomed to peer review and the slow (tortuous?) procedures of academic journals surely know better than most that writing is a process.

    The most interesting and persuasive part of the book (and Warner’s specialist subject) is the bit on how formulae make writing mundane rather than creative. Many parents will recognise this. It seems to me that children are being put off English in particular by being forced to follow the sort of overweening instructions that no great author ever considered (‘write your essay like a burger’, ‘include four paragraphs in each answer’, ‘follow PEE in each paragraph’ [point / evidence / explain]). Warner sees AI taking this trend to its logical and absurd conclusion where machines are doing the writing and the assessment – and ruining both.

    Because writing is a process, Warner rejects even the popular idea that generative AI may be especially useful in crafting a first draft. He accepts it can produce ‘grammatically and syntactically sound writing … ahead of what most students can produce.’ But he also argues that the first draft is the most important draft ‘as it establishes the intention behind the expression.’ Again, I have sympathy with this. Full-length HEPI publications tend to go through multiple drafts, while also being subjected to peer review by HEPI’s Advisory Board and Trustees, yet the final published version invariably still closely resembles the first draft because that remains the original snapshot of the author’s take on the issue at hand. Warner concludes that AI ‘dazzles on first impression but … has significantly less utility than it may seem at first blush.’

    One of the most interesting chapters compares and contrasts the rollout of ChatGPT with the old debates about the rise of calculators in schools. While calculators might mean mental arithmetic skills decline, they are generally empowering; similarly, ChatGPT appears to remove the need to undertake routine tasks oneself. But Warner condemns such analogies: for calculators ‘the labor of the machine is identical to the labor of a human’, whereas ‘Fetching tokens based on weighted probabilities is not the same process as what happens when humans write.’

    At all the many events I go to on AI in higher education, three areas always come up: students’ AI use; what AI might mean for professional services; and how AI could change assessment and evaluation. The general outcome across all three issues is that no one knows for sure what AI will mean, but Warner is as big a sceptic on AI and grading as he is on so much else. Because it is formulaic and based on algorithms, Warner argues:

    Generative AI being able to give that “good” feedback means that the feedback isn’t actually good. We should instead value that which is uniquely human. … Writing is meant to be read. Having something that cannot read generate responses to writing is wrong.

    The argument that so many problems are coursing through education as a result of new tech reminds me a little of the argument common in the 1980s that lead pipes brought down the Roman Empire. Information is said to become corrupted by AI in the way that the water supposedly became infected by the lead channels. But the theory about lead pipes is no longer taken seriously and I remain uncertain whether Warner’s take will survive the passage of time in its entirety either.

    Moreover, Warner’s criticisms of the real-world impact of ChatGPT are scattergun in their approach. They include the ‘literal army of precarious workers doing soul-killing tasks’ to support the new technology as well as the weighty environmental impact. This critique calls to mind middle-class drug-takers in the developed world enjoying their highs while dodging the real-world impact on developing countries of their habit.

    In the end, Warner’s multifarious criticisms tot up to resemble an attack on technology that comes perhaps just a little too close for comfort to the attacks in the early 1980s by the Musicians’ Union on synthesisers and drum machines. In other words, the downsides may be exaggerated while the upsides might be downplayed.

    Nonetheless, I was partially persuaded. The process of writing is exactly that: a process. Writing is not just mechanical. (The best young historian I taught in my first career as a school teacher, who is now an academic at UCL, had the worst handwriting imaginable as his brain moved faster than his hand / pen could manage.) So AI is unlikely to replace those who pen words for a living just yet.

    Although, paradoxically, I also wished the author had run his text through an AI program and asked it to knock out around 40% of his text. Perhaps current iterations of generative AI can’t write like a smart human or think like a smart human, but they might be able to edit like a smart human? Perhaps AI’s biggest contribution could come at the end of the writing process rather than the beginning? Technology speeds up all our lives, leaving less time for a leisurely read, and it seems to me that all those ‘one-idea’ books that the US floods the market with, including this one, could nearly always be significantly shorter without losing anything of substance.

    Source link

  • Strengthening data and insights into our changing university research landscape by Jessica Corner

    The UK continues to be a global leader in research and innovation, and our universities are uniquely strong contributors, among them some of the highest performing in the world. We have some of the highest-intensity innovation ecosystems in the world, with universities as the core driver. As a country, our invention record is well recognised. The UK, with its powerful life sciences effort, delivered one of the first COVID-19 vaccines, saving millions of lives around the world, a feat only possible because of long-standing investment in research that became serendipitously essential. In cities across the UK, universities act as pillar institutions with positive and reinforcing effects on their local economies. We have a rich network of specialist institutions that excel in music, the arts, medicine and life sciences. Our universities continue to deliver discoveries, technologies, creative insights, talent for our industries and public services, and much more. Many have the scale and reach to deliver across the full span of research and innovation, through to enterprise and commercialisation.

    A unique feature, and underpinning this extraordinary record, is our dual support funding system. That system balances competitive grant funding from UKRI Research Councils, charities, business, and others with long-term stable underpinning funds to enable universities to pursue ambitious and necessary strategies, develop research strengths, foster talent, pivot towards new fields, collaborate and maintain research infrastructure.

    However, the sector faces unprecedented challenges. Erosion of the value of student fees and the growing costs of delivering education, disruptions to anticipated income from international student fees, a slow erosion of the value of QR, rising costs of research and a mismatch between those costs and cost recovery from grants have created a perfect storm and unsustainable operating models for most institutions. The additional £5bn a year in funding from universities’ own surpluses towards research and innovation is no longer guaranteed. The sector has evolved and continues to evolve in response to a changing landscape, but consideration is needed of how best to support it to change.

    Research England’s role is to support a healthy, dynamic, diverse and inclusive research and innovation system in universities in England. We work by facilitating and incentivising system coherence, acting as both champion and challenger. In partnership, we aim to create and sustain the conditions for the system to continue delivering excellence and to leverage resources far beyond the funding provided by government. We are working to enhance the data and evidence that support our role as an expert, evidence-based funder, and on the outcomes that the funding delivers. In fulfilling this role, and against the current context, Research England has two initiatives that we will be taking forward in the coming weeks.

    Our ongoing programme to review the principles underpinning our funding and the mechanisms by which we allocate research funds to institutions has reached a point where we are seeking to increase the visibility and transparency of how these funds are deployed by institutions. We are developing an approach, designed to be light touch and low burden, that asks universities to report back on their use of strategic institutional research funding. We will begin testing the approach with a selection of institutions in the coming months and, subject to the outcomes of this initial engagement, aim to roll out a pilot with institutions in the 2025/26 academic year. We will be communicating directly with institutions about the pilot in the early autumn. In the second phase of this work, we intend to work with institutions to develop a forward-looking strategic element that will give insight into their plans and into how decisions are made about the deployment of funding. As part of the programme, we are also reviewing the effectiveness of the different unhypothecated and ring-fenced research funds provided to institutions. When fully implemented, the information we acquire will give Research England greater visibility of the role of institutions and of the contribution of our formula-based research funding (including QR) to the research and innovation system, while also contributing to efforts to produce more systematic and timely data.

    A second strand of work is our programme to monitor the implications for the sustainability of research in universities against the current financial context. We are seeking to better understand how challenges are impacting universities’ ability to deliver research and innovation and maintain research capabilities, capacity, and facilities and, in turn, to further strengthen assurance with more robust data. In partnership with the Department for Science, Innovation and Technology, we have commissioned the Innovation Research Caucus with OMB Research Ltd to undertake a survey into how institutions are responding to current pressures with respect to research and innovation. The survey will provide important data that can support advice to government and others on the extent of universities’ financial challenges, how these issues are being managed, and how this impacts their investment and planning in the research and innovation space. The approach is to provide insights that are currently not available at an aggregate level or in a timely way through national data sets. Additionally, Research England will be asking institutions to report on material changes they are making to research and innovation capabilities and capacity, or in relation to wider changes in institutional form or organisation, when these may affect the basis on which our funding is awarded.

    We continue to see our role as facilitator, enabler and partner and believe we have a strong reputation for having timely and robust insights into the conditions underpinning our great research and innovation system. These two programmes of work are being taken forward in support of universities and, against the current backdrop, will strengthen Research England’s fundamental role in the research and innovation system. We look forward to working in close partnership with universities as we take these critical work programmes forward.

    Source link

  • How AI is Changing the Way I Teach Business Law

    AI has taken the world by storm, and the education field is no exception. After over two decades teaching at The Paul Merage School of Business at the University of California, Irvine, I have seen lots of changes related to curriculum, teaching resources and students. However, I’ve seen nothing quite like the wave of AI tools popping up in classrooms. It’s exciting, a little daunting and definitely something we all need to talk about.

    So, here’s the deal: I’m not an AI expert. But I have spent a lot of time experimenting with AI, learning from my mistakes and figuring out what works and what doesn’t. I’d like to share some of these experiences with you.

    AI in education: What’s the big deal?

    AI is already here, whether we’re ready for it or not. According to Cengage research, use of AI has nearly doubled among instructors, from 24% in 2023 to 45% in 2024. Many of us are using AI to create lectures, craft assignments and even grade assessments. The challenge is not whether we adopt AI. Rather, it’s doing so in a way that enhances our students’ learning outcomes, while maintaining academic integrity in our courses.

    In my online undergraduate business law course, I have always required my students to take written assessments, where they analyze a set of facts to reach a legal conclusion. Not only am I trying to teach them the principles of law, but I want them to improve their writing skills.

    A shift in focus

    A few years ago, I noticed a subtle increase in the overall scores for these written assessments. I have taught this course for over 20 years, so I knew what the historical scores were. Looking into it further, I started hearing about how some students were using ChatGPT in their courses. This got me wondering whether some of my students had already been using it to take my written assessments. Quick answer: yes, they were. This now presented a problem: what do I do about it? In an online course, how can I prohibit the use of AI tools on a written assessment while effectively enforcing that ban?  I shifted my focus from policing and enforcing a ban on the use of AI in my courses to teaching my students how to use AI responsibly.

    Teaching students to use AI responsibly

    In my course, I developed assignments called “Written ApprAIsals.” These three-part writing assignments combine traditional learning with AI-assisted refinement. These teach students how to use AI responsibly while improving their critical thinking and writing skills. Here’s how it works:

    Step 1: Write a first draft without AI

    Students are given a law and related news article about a current legal issue. They must write a memo which analyzes the constitutionality of this law. I also provide them with guidance on utilizing the standard legal memo format, known as IRAC (Issue, Rule, Analysis, Conclusion), to help organize their thoughts and writing.

    Students are permitted to use whatever materials they have, including eBooks, my lecture videos and outlines, Cengage’s online learning platform MindTap and its resources, and any other information they ethically obtain online. But they’re not permitted to use AI.

    The purpose of this first draft is for them to demonstrate the foundational knowledge they should have already learned. Students must attest to completing this first draft without using AI, and it’s worth 30% of the total “Written ApprAIsal” grade.

    Step 2: Integrate AI to resolve deficiencies

    Once I have given them feedback on their first drafts, students are required to use AI to improve their first draft. Students must submit the URL to their AI queries and responses (“AI log”). Or less ideally, they can submit a PDF or screenshot of them. I can assess the effort they put in, evaluate their queries, and provide guidance on how to more effectively use AI. This part is worth 40% of the total “Written ApprAIsal” grade.

    Step 3: Use AI to help write a final draft

    Using what they’ve obtained from AI, along with my feedback, students must transform their first draft into an improved final draft. Students are permitted to continue using AI as well.  They must turn on track changes in their document so I can see what changes they’ve made to the first draft.

    Why has this approach worked in my course?

    1. It makes students aware of my familiarity with AI and how it’s used. Students now know I am on the lookout for improper usage of AI in our course.
    2. It encourages their acquisition of foundational knowledge. Students quickly figure out that they must know the basic legal principles. Without them, they will have no idea if AI is providing them with inaccurate information, which can sometimes happen, especially when it comes to legal cases.
    3. This approach promotes academic integrity. Students recognize their first drafts must reflect their genuine understanding. There is no benefit to using AI for the first draft. Because the remaining parts are based on their use of AI to improve the first draft, there will not be much room for improvement if the first draft is too good. And because students must submit their AI logs, I can easily ascertain if they actually did the work.
    4. Students build necessary skills for their future careers. They can improve their writing and analysis skills in a low-stakes way, while receiving useful feedback.
    5. It helps me focus my efforts on helping them understand the law, rather than having to enforce a ban on the use of AI.

    Issues related to this approach

    It takes a lot of effort to find the right law and related news article to use. Not only does the law have to be current, but it also must be interesting and relevant to the students. Legal issues must be presented in a way that is factually neutral to avoid bias. And the news articles must be factual and not cluttered with distracting commentary or opinions.

    Additionally, rapid feedback is required. With up to 150 students in my course, I only have a little more than 24 hours to turn around written feedback and comments on their first drafts and AI logs. Frankly, it can be overwhelming.

    Tips on integrating AI into your course

    I have learned a few things along the way about integrating AI into my courses.

    Establish clear rules: Be upfront and clear about when, and how, AI can be used. Stick to those rules and enforce them.

    Consider accessibility: Not every student has easy or affordable access to AI tools. Make sure you have alternatives available for these students.

    Teach foundational knowledge first: Students need to know the core concepts so they can critically evaluate any AI-generated content.

    Require transparency: Students must show how they used AI. It is a great way to keep them honest.

    Most importantly, be flexible and open to experimentation: Mistakes are inevitable. There will be times when something you thought would work just doesn’t. That’s OK. Adjust and keep innovating.

    Final Thoughts

    AI is here to stay, and that’s not necessarily a bad thing. AI is a tool that can help students learn. But, it’s up to us to show our students how to use AI responsibly. Whether it’s helping them improve their writing skills, gain foundational knowledge or develop critical thinking skills, AI has so much potential in our courses. Let’s embrace it and figure out how to make it work for each of us.

    Got ideas or experiences with AI in your courses? Let’s connect. I would love to hear how you are using it!

    Machiavelli (Max) Chao is a full-time Senior Continuing Lecturer at the Paul Merage School of Business at the University of California, Irvine and Cengage Faculty Partner. 

    Source link

  • Greater width and greater depth: changing higher education in an electronic age

    • Ronald Barnett (www.ronaldbarnett.co.uk) is Emeritus Professor of Higher Education at the Institute of Education and President of the Philosophy and Theory of Higher Education Society.

    Chris Husbands’ latest HEPI blog is fine so far as it goes but, I suggest, it goes neither wide enough nor deep enough.  Yes, the world is changing, and higher education institutions have to change, but the analysis of ‘change’ has to encompass the whole world (have great width) and also to burrow down into the deep structures of a world in turbulent motion (and so have much depth).

    The current crisis in UK higher education – and especially in higher education in England – has to be set more fully in the context of massive global shifts that directly affect higher education. These are too many to enumerate entirely here, but they include:

    • A hyper-fast world: Theorists speak of cognitive capitalism, but we have already moved into a new stage of electronic capitalism (of which AI is but the most evident feature).
    • Volatile labour markets (disrupting the relationship between higher education and the world of work)
    • A fragmenting state of student being: now, their higher education is just part of an incredibly complex life and set of challenges with which students are confronted
    • Geo-political shifts affecting (and the reduction of) the internationalisation of higher education; and
    • An entirely new complex of human needs (physical, cognitive, social, political, environmental, and phenomenological) for learning to be part of life, ‘cradle-to-grave’. 

    In short, the world is fast-changing not only around higher education but in the depths of higher education in ways not yet fully appreciated.  It is a world that is going through multiple and unimagined transformations, transformations that are replete with conflicts and antagonisms that are both material and discursive. These changes are already having effects on higher education, especially on what it is to be a student.

    Now, students are increasingly part-time (whatever the formal designation of their programmes of study). Moreover, they play out their lives across multiple ecosystems – including the economy, social institutions, culture, the polity, the self, knowledge, learning, the digital environment, and Nature. In the context of these nine ever-moving ecosystems, what it is to be a student is now torn open, fragmented, and bewildering to many. 

    More still, many students of today will be alive in the 22nd century and will have to navigate an Earth, a world, that will exhibit many stages of anxiety-provoking and even possibly terrifying change. Sheer being itself is challenged under such circumstances and, more so, the very being of students. And we see all this in spades in (quite understandable) growing student anxiety and even suicide.

    In this context, and under these conditions, a model and an idea of higher education bequeathed to the world largely from early 19th century Europe (Germany and England) no longer matches the present and the future age for which students are destined.

    Crucially, here, we are confronted by profound changes that are not just institutional and material in that sense, but we are – and students are – confronted with unnoticed changes in the discourse of higher education.  Not just what it means to be a student but the very meaning of student is changing in front of us; yet largely unnoticed. (Is being a student a matter of dutifully acquiring skills for an AI world or is it to set up encampments so as actively to engage with and to be a troublesome presence in world-wide conflicts?)

    Fundamental here is the idea of higher education. Fifty years ago, some remnants of the idea of the university from 19th century Europe could romantically be held onto. Ideas of reason, truth, student development, consciousness-raising, critical thinking and even emancipation were concepts that could be used without too much embarrassment. But now, that discourse and the pedagogical goals and relationships that it stood for have almost entirely dissolved, overtaken and trivialised by talk of skills, work-readiness, employability, impact, and of using AI while being ‘critical’ of it.

    It is a commonplace – not least in the higher education consultancies and think-tanks – to hear murmurings to the effect that institutions of higher education must change (and are insufficiently changing). Yes, most certainly.  And some signs of change are apparent. In England alone, we see talk of (but little concerted action on) life-long learning, formation of tertiary education as such, better higher education-further education connections, micro-credentials, ‘alternative providers’, shortened degrees, AI and recovering part-time higher education (disastrously virtually vanquished, regretted now by David Willetts, its progenitor).

    These are but some of the adjustments that the large and complex higher education system can be seen to be making to the challenges of the age. But it is piecemeal in a fundamental sense, namely that it is not being advanced on the basis of a broad and deep analysis of the problem situation.

    Some say we need a new Robbins, and there is more to that idea than many realise. Robbins was a free-market economist, but the then general understanding as to what constituted a higher education and the balance of the Robbins committee, with a phalanx of heavy-weight educationalists, resulted in a progressive vision of higher education. Now, though, we need new levels of analysis and imaginative thinking. 

    Consider just one of the changes into which English higher education is stumbling, that of life-long learning and its associated idea of credit accumulation. Credit accumulation was first enunciated as an idea in English higher education in the 1980s through the Council for National Academic Awards (and there through the efforts especially of Norman Evans and Peter Toyne). But the idea of universities as sites for the formation of human beings for a life of never-ending inquiry, learning and self-formation has never seriously been pursued, either practically or theoretically.

    Now, life-long learning is more urgent than ever, but the necessary depth and width of the matters it prompts are hardly to be seen or heard.  However, and as intimated, this matter is just one of a raft of interconnected and mega issues around post-school education today.

    As is said, there is no magic bullet here in such an interconnected world. Joined-up progress is essential, from UNESCO and suchlike to the teacher and class of students. The key is the individual institution of higher education, which now has a responsibility to become aware of its multiple ecosystem environment and work out a game-plan in each of the nine ecosystems identified above.

    At the heart of that ecosystem scanning has to be the individual student. Let this design process start from the bottom up – the flourishing of the individual student living into the C22. It would be a design process that tackles head-on what it is to live purposefully in a world of constant change, challenge and conflict, and what we might hope for from university graduates against such a horizon. Addressing and answering this double question within each university – for each university will have its own perspective – will amount to nothing less than a revolution in higher education.

    Source link

  • The Changing Landscape of Internships in Higher Education

    Title: Internships Index 2025

    Source: Handshake

    The latest research from Handshake reveals a troubling reality in higher education: the internship landscape is becoming both more competitive and less accessible, particularly for students already facing systemic barriers. Based on a November 2024 survey of over 6,400 students and recent graduates, combined with job posting and application data from over 15 million students and 900,000 employers on Handshake, this report highlights key trends shaping the internship experience today.

    Internship listings have fallen by more than 15 percent from January 2023 to January 2025. At the same time, applications have dramatically increased, doubling the competition for each available position. The decline is even more severe in high-paying fields—technology postings dropped by 30 percent, and professional services postings dropped by 42 percent.

    There are persistent participation gaps:

    • First-generation college students (50 percent) lag behind their peers (66 percent) in internship participation.
    • Students at institutions classified as “inclusive” in the Carnegie Classifications (those with less selective admissions) have much lower internship participation rates (48 percent) compared to students at institutions classified as “selective” or “more selective” (70 percent).
    • Students at these inclusive institutions are twice as likely as those at selective schools to cite financial constraints as their main reason for not pursuing internships.

    These disparities are exacerbated by practical realities. More than 80 percent of first-generation students and those at inclusive institutions report struggling to balance internships with coursework or employment. The timing of internship recruitment adds another challenge, with larger employers typically concentrating on hiring in fall and winter while smaller employers tend to recruit later into the spring.

    Yet internships remain transformative experiences when students can access them. Among those who have completed internships, 56 percent report that the experience was essential in making progress toward their career goals and 79 percent say the experience had a moderate or significant impact on their interest in working for that employer. Of students who haven’t yet participated in internships but hope to do so, 59 percent believe internships will be essential to clarifying their career goals.

    Quality of experience matters as much as access to the opportunity itself. Students who felt fairly compensated were more likely to accept a job offer from that employer (82 percent) versus those who felt underpaid (63 percent), and over half (58 percent) report that mentorship had a major influence on their desire to work for their internship employer.

    Internships have long been a critical bridge from college to career, offering more than just a line on a resume. By investing in robust internship programs, we not only nurture individual potential but also cultivate a dynamic, forward-thinking workforce prepared to meet the challenges of tomorrow’s workplace.

    To read the full report, click here.

    —Alex Zhao


    If you have any questions or comments about this blog post, please contact us.

    Source link

  • Restoring Public Health by Changing Society (Rupa Marya)

    We are told that our personal health is our individual responsibility based on our own choices. Yet, the biological truth is that human health is dependent upon the health of nature’s ecosystems and our social structures. Decisions that negatively affect these larger systems and eventually affect us are made without our consent as citizens and, often, without our knowledge. Dr. Rupa Marya, Associate Professor of Medicine at UC San Francisco and Faculty Director of the Do No Harm Coalition (https://www.donoharmcoalition.org/), says “social medicine” means dismantling harmful social structures that directly lead to poor health outcomes, and building new structures that promote health and healing.

    Learn more about Rupa Marya and her work here (https://profiles.ucsf.edu/rupa.marya).

    Source link

  • A Game Changing App for Faculty Researchers!

    Consensus – A Game Changing App for Faculty Researchers

    Today, I started to utilize a new AI app for my research. This app, Consensus, is a game changer for faculty researchers. I wish that I had this app in graduate school – it would have definitely made life easier!

    Step 1 – Here are some screen shots of the software. You can type a question in the box (yes, a question) and the system does the work. Yes, the work that you would usually have to do!

    Step 2 – Then, AI does the rest. You receive AI-powered answers for your results. Consensus analyzes your results (before you even view them) and then summarizes the studies collectively.

    Step 3 – You can view the AI-powered answers which review each article for you.

    *I would encourage you to review each article independently as well.

    Step 4 – View the study snapshots! Yes, a snapshot of the population, sample size, methods, outcomes measured, and more! Absolutely amazing!

    Step 5 – Click the “AI Synthesis” button to synthesize your results. Even better!

    Step 6 – Use the “powerful filters” button. You can view the “best” research results by: a) population, b) sample size, c) study design, d) journal quality, and other variables. 

    I plan to make a video soon, but please take a look at this video to discover exactly how Consensus can help you in your research! 

    ***

    Check out my book – Retaining College Students Using Technology: A Guidebook for Student Affairs and Academic Affairs Professionals.

    Remember to order copies for your team as well!


    Thanks for visiting! 


    Sincerely,


    Dr. Jennifer T. Edwards
    Professor of Communication

    Executive Director of the Texas Social Media Research Institute & Rural Communication Institute

    Source link