Tag: critical

  • Reimagining the Flipped Classroom: Integrating AI, Microlearning, and Learning Analytics to Elevate Student Engagement and Critical Thinking – Faculty Focus

    Source link

  • As Cuts to Department of Veterans Affairs Loom, Our Commitment to Veterans Education Faces a Critical Test

    “VA support isn’t a gift, it’s a debt.”

    That was the message displayed on signs across the National Mall on June 6, where thousands of veterans rallied against sweeping federal job cuts. With the Dropkick Murphys on stage and lawmakers like Sens. Tammy Duckworth (D-IL) and Ruben Gallego (D-AZ) in the crowd, the “Unite for Veterans, Unite for America” rally marked a striking show of both unity and frustration.

    While many agencies are facing delays or court injunctions, the Department of Veterans Affairs (VA) is moving forward with plans to eliminate approximately 83,000 positions, or about 15 percent of its workforce. Public attention has been understandably focused on the impact these cuts may have on veterans’ health care. But staffing losses could also disrupt access to veterans’ education benefits, just as even more veterans and service members may be turning to higher education and career training.

    Among the many education and training benefits administered by the VA, the Post-9/11 GI Bill is the cornerstone of financial aid for military learners, including veterans, service members, and their families. From 2009 to 2019, the federal government budgeted nearly $100 billion for the program, with 2.7 million enlisted veterans eligible to use those benefits over the next decade. And the return on investment is clear: Veterans who use their education benefits complete college at twice the rate of other independent students—those typically supporting themselves without parental aid—according to research by the American Institutes for Research.

    Despite the GI Bill’s importance, military learners often struggle to access the benefits they’ve earned. Eligibility rules can be confusing, and transferring benefits to spouses or dependents involves time-consuming red tape. Many students and the institutions that serve them rely on VA staff to interpret the rules, resolve disputes, and ensure benefits are processed on time. With fewer staff, that support system is at risk of breaking down.

    This strain comes amid a broader wave of federal downsizing that is hitting the veteran community especially hard. The federal government has long been the largest employer of veterans, and the current reduction in force across the federal government is disproportionately affecting them. In just one example, the Department of Defense is reportedly cutting 50,000 to 60,000 civilian jobs, many held by veterans.

    At the same time, the Army is considering reducing its active-duty force by as many as 90,000 troops, amid shrinking reenlistment options. Even senior military leadership have seen targeted cuts. The result is that more veterans and service members will be leaving military service and looking to build new careers. This in turn will increase the demand for VA education and training benefits, just as fewer staff may be available to help them access those benefits.

    For decades, support for military learners has united policymakers across party lines. In a time of significant change in Washington, we need to uphold our commitments to those who have dedicated their lives and careers to serving our nation. This includes a commitment to ensuring that the VA has the staffing and resources it needs to deliver on its promise—so every veteran can access the education benefits they’ve earned.

    Source link

  • The spending review is a critical moment for UK science and innovation

    A series of key government announcements over the coming weeks will set the direction of travel for research and innovation for years to come. Next week’s spending review will set the financial parameters for the remainder of this Parliament – and we shouldn’t expect this outcome to maintain the status quo, given this is the first zero-based review under a Labour government for 17 years.

    Accompanying this will be the industrial strategy white paper, which is likely to have a focus on driving innovation and increasing the diffusion and adoption of technologies across the economy – in which the UK’s universities will need to be key delivery partners. We can also expect more detail on the proposals in the immigration white paper, with implications for international student and staff flows to the UK.

    The outcome for higher education and research remains hard to call, but the government has sent early signals that it recognises the value of investment in R&D as crucial to transforming the UK’s economy. In a volatile fiscal environment, DSIT’s R&D budget saw a real-terms increase of 8.5 per cent for 2025–26 with protection for “core research” activity within this.

    Looking ahead to the spending review, the Institute for Fiscal Studies has pointed out that the fiscal envelope set by the Chancellor for capital spending – which is how R&D is classified – at the spring statement is significantly frontloaded. There is scope for increases in the early years of the spending review period and then real-terms declines from 2027–28. With such significant constraints on the public finances, it’s more essential than ever that the UK’s R&D funding system maximises efficiency and impact, making the best possible use of available resources.

    International comparisons

    Last month, the Russell Group published a report commissioned from PwC and funded by Wellcome which considered the experiences of countries with very different R&D funding systems, to understand what the UK might learn from our competitors.

    Alongside the UK, the report examined four countries: Canada, Germany, the Netherlands and South Korea, scoring them across five assessment criteria associated with a strong R&D system: strategic alignment to government priorities; autonomy; stability and sustainability; efficiency; and leveraging external investment. It also scored the countries on two measures of output: research excellence and innovation excellence.

    The analysis can help to inform government decisions about how to strike a balance between these criteria. For example, on the face of it there’s a trade-off between prioritising institutional autonomy and ensuring strategic alignment to government priorities. But PwC found that providing universities with more freedom in how they allocate their research funding – for example, through flexible funding streams like Quality-Related (QR) funding – means they can also take strategic long-term decisions, which create advantage for the UK in key research fields for the future.

    Over the years, QR funding and its equivalents in the devolved nations have enabled universities to make investments which have led to innovations and discoveries such as graphene, genomics, opto-electronics, cosmology research, and new tests and treatments for everything from bowel disease to diabetes, dementia and cancer.

    Conversely, aligning too closely to changing political priorities can stifle impact and leave the system vulnerable. PwC found that, at its extreme, a disproportionate reliance on mission-led or priority-driven project grant funding inhibits the ability of institutions to invest outside of government’s immediate priority areas, resulting in less long-term strategic investment.

    With a stretching economic growth mission to deliver, policymakers will be reaching for interventions which encourage private investment into the economy. The PwC report found long-term, stable government incentives are crucial in leveraging industry investment in R&D, alongside supporting a culture of industry-university collaboration. This has worked well in Germany and South Korea with a mix of incentives including tax credits, grants and loans to strengthen innovation capabilities.

    Getting the balance right

    The UK currently lags behind global competitors on the proportion of R&D funded by the business sector, at just over 58 per cent compared to the OECD average of 65 per cent. However, when considering R&D financed by business but performed by higher education institutions, the UK ranks fifth highest in the OECD – well above the average.

    This demonstrates that the current system is successfully leveraging private sector collaboration and investment into higher education R&D. We should now be pursuing opportunities to bolster this even further. Schemes such as the Higher Education Innovation Fund (HEIF) deliver a proven return on investment: every £1 invested in HEIF yields £14.80 in economic return at the sector level. PwC’s report noted that HEIF has helped develop “core knowledge exchange capabilities” within UK HEIs which are crucial to building successful partnerships with industry and spinning out new companies and technologies.

    In a time of global uncertainty, economic instability and rapid technological change, investments in R&D still play a key role in tackling our most complex challenges. In its forthcoming spending review – the Russell Group submission is available here – as well as in the industrial strategy white paper and in developing reforms to the visa system, the government will need to balance a number of competing but interrelated objectives. Coordination across government departments will be crucial to ensure all the incentives are pointing in the right direction and to enable sectors such as higher education to maximise the contribution they can make to delivering the government’s missions.

    Source link

  • Only connect: why investing in relational infrastructure is critical for universities

    Today on the HEPI blog, we explore the discussions at a recent HEPI roundtable with Elsevier on the topic of the Fourth Generation University – which combines teaching, research and knowledge exchange.

    You can read a full write-up of the roundtable, by HEPI’s own Director of Partnerships, Lucy Haire, at this link – or read on for a discussion of relational infrastructure by Sarah Chaytor.

    • Sarah Chaytor is Director of Strategy & Policy and Joint Chief of Staff to the UCL Vice-Provost, Research, Innovation & Global Engagement.

    At a recent HEPI roundtable dinner with Elsevier to discuss how universities could strengthen their regional and civic contributions, there was a rather sobering discussion of the ‘low stock’ of universities amongst both government and the public.

    This was in the context of an ongoing, international discussion about the concept of ‘fourth generation’ universities. These are defined as ‘global universities that are fully integrated in their local innovation ecosystem with the aim of tackling worldwide societal challenges and driving regional economic growth.’

    We are well-versed in our sector on the economic benefits of universities and well-practised in trumpeting these to ourselves and to government. Yet at the same time, there is a growing evidence base on the disconnect between the British public and universities. Reports from UPP/HEPI and from Public First suggest a significant lack of awareness amongst many citizens of how universities positively affect their daily lives or contribute to the places they live. As someone working in university research, I am particularly concerned by public attitudes to research and development (R&D) – important work done by CaSE on public perceptions of R&D has found that a significant majority of people think that ‘R&D doesn’t benefit people like them’ or feel neutral or unsure about R&D’s impacts.

    I’m not sure that, as a sector, we have fully grasped how serious this is. It cannot be a state of affairs that we simply shrug our shoulders at. As CaSE has observed: ‘This is a precarious position for a sector that receives substantial public investment.’  We risk undermining the ‘social compact’ that exists between universities and the public – that is, the basis on which we receive public funding (especially for R&D) is our ability to make a broader contribution.

    I conclude from this that the focus over the past 20 years or so on universities’ economic contribution doesn’t cut through to those citizens who feel that the economy simply doesn’t work for them. Making universities part of an abstract and disconnected concept of economic growth is of no interest to people worried about access to housing, cost of living or the state of their local high street. It also overlooks the multifaceted ways in which universities are contributing to places across the UK, from providing jobs to sports facilities to cultural institutions to working with community groups to undertaking the research that can save lives or tackle pressing challenges. 

    I think we need to focus more on how universities can make human connections and articulate their research benefit in human terms. To draw from Peter Kyle’s framing of innovation, we need to show how universities are putting their considerable assets and resources to use for the public good. From a research perspective, this requires us to think about the purpose of knowledge and how we connect knowledge to communities across the country. In particular, we need to get much better at building trusted relationships that enable us to understand the needs of communities and citizens around the country and ensure that we are demonstrably meeting these.

    For me, that starts with taking much more seriously the need to invest in the ‘relational infrastructure’ that can support those connections. Put simply, relational infrastructure is the people, structures and processes that support universities to connect with other parts of society. At its core are people – people who build and maintain relationships, who manage processes and structures for engagement, who keep connections going between specific projects and funding periods.

    In my own world of academic-policy engagement, this relational infrastructure is the crucial ‘glue’ which underpins a whole host of interactions, projects, and exchange of ideas. It supports ways of working with policymakers that are about long-term partnership and collaboration rather than one-off transactions. (More on this in the final report from the Capabilities in Academic Policy Engagement project.)

    We know that universities can tell a powerful story about their civic contribution – as the Civic University Commission noted, universities are ‘hugely important to the economic, social, cultural and environmental wellbeing of the places in which they are located’. This concept is echoed in the idea of the ‘fourth generation’ university. But perhaps we have focused too much on shiny projects and initiatives, and not enough on the simple relational approaches which underpin successful and long-term engagement and meaningful partnerships.

    Relational infrastructure is all too easy to overlook or to take for granted. It rarely appears in business cases or exciting new project proposals. But it is one of our most precious assets and should be actively cultivated. This requires institutions to acknowledge the need for long-term investment and to recognise that, whilst it will deliver dividends for universities, these will not necessarily arise in a short time-frame or via our ‘usual’ metrics. What relational infrastructure will deliver is deep and meaningful connections with other parts of society, which enable universities to put their research (and other) assets to public good use.

    It’s time to take our responsibility to develop and maintain relational infrastructure seriously – it is the route to rebuilding our relationship with wider society.

    Source link

  • Why History Instruction is Critical for Combating Online Misinformation – The 74

    Can you tell fact from fiction online? In a digital world, few questions are more important or more challenging.

    For years, some commentators have called for K-12 teachers to take on fake news, media literacy, or online misinformation by doubling down on critical thinking. This push for schools to do a better job preparing young people to differentiate between low- and high-quality information often focuses on social studies classes.

    As an education researcher and former high school history teacher, I know that there’s both good and bad news about combating misinformation in the classroom. History class can cultivate critical thinking – but only if teachers and schools understand what critical thinking really means.

    Not just a ‘skill’

    First, the bad news.

    When people demand that schools teach critical thinking, it’s not always clear what they mean. Some might consider critical thinking a trait or capacity that teachers can encourage, like creativity or grit. They could believe that critical thinking is a mindset: a habit of being curious, skeptical and reflective. Or they might be referring to specific skills – for instance, that students should learn a set of steps to take to assess information online.

    Unfortunately, cognitive science research has shown that critical thinking is not an abstract quality or practice that can be developed on its own. Cognitive scientists see critical thinking as a specific kind of reasoning that involves problem-solving and making sound judgments. It can be learned, but it relies on specific content knowledge and does not necessarily transfer between fields.

    Early studies on chess players and physicists in the 1970s and ’80s helped show how the kind of flexible and reflective cognition often called critical thinking is really a product of expertise. Chess masters, for instance, do not start out with innate talent. In most cases, they gain expertise by hours of thoughtfully playing the game. This deliberate practice helps them recognize patterns and think in novel ways about chess. Chess masters’ critical thinking is a product of learning, not a precursor.

    Because critical thinking develops in specific contexts, it does not necessarily transfer to other types of problem-solving. For example, chess advocates might hope the game improves players’ intelligence, and studies do suggest learning chess may help elementary students with the kind of pattern recognition they need for early math lessons. However, research has found that being a great chess player does not make people better at other kinds of complex critical thinking.

    Historical thinking

    Since context is key to critical thinking, learning to analyze information about current events likely requires knowledge about politics and history, as well as practice at scrutinizing sources. Fortunately, that is what social studies classes are for.

    Social studies researchers often describe this kind of critical thinking as “historical thinking”: a way to evaluate evidence about the past and assess its reliability. My own research has shown that high school students can make relatively quick progress on some of the surface features of historical thinking, such as learning to check a text’s date and author. But the deep questioning involved in true historical thinking is much harder to learn.

    Social studies classrooms can also build what researchers call “civic online reasoning.” Fact-checking is complex work. It is not enough to tell young people that they should be wary online, or to trust sites that end in “.org” instead of “.com.” Rather than learning general principles about online media, civic online reasoning teaches students specific skills for evaluating information about politics and social issues.

    Still, learning to think like a historian does not necessarily prepare someone to be a skeptical news consumer. Indeed, a recent study found that professional historians performed worse than professional fact-checkers at identifying online misinformation. The misinformation tasks the historians struggled with focused on issues such as bullying or the minimum wage – areas where they possessed little expertise.

    Powerful knowledge

    That’s where background knowledge comes in – and the good news is that social studies can build it. All literacy relies on what readers already know. For people wading through political information and news, knowledge about history and civics is like a key in the ignition for their analytical skills.

    Readers without much historical knowledge may miss clues that something isn’t right – signs that they need to scrutinize the source more closely. Political misinformation often weaponizes historical falsehoods, such as the debunked and recalled Christian nationalist book claiming that Thomas Jefferson did not believe in a separation of church and state, or claims that the nadir of African American life came during Reconstruction, not slavery. Those claims are extreme, but politicians and policymakers repeat them.

    For someone who knows basic facts about American history, those claims won’t sit right. Background knowledge will trigger their skepticism and kick critical thinking into gear.

    Past, present, future

    For this reason, the best approach to media literacy will come through teaching that fosters concrete skills alongside historical knowledge. In short, the new knowledge crisis points to the importance of the traditional social studies classroom.

    But it’s a tenuous moment for history education. The Bush- and Obama-era emphasis on math and English testing resulted in decreased instructional time in history classes, particularly in elementary and middle schools. In one 2005 study, 27% of schools reported reducing social studies time in favor of subjects on state exams.

    Now, history teachers are feeling heat from politically motivated culture wars over education that target teaching about racism and LGBTQ+ issues and that ban books from libraries and classrooms. Two-thirds of instructors say that they’ve limited classroom discussions about social and political topics.

    Attempts to limit students’ knowledge about the past imperil their chances of being able to think critically about new information. These attacks are not just assaults on the history of the country; they are attempts to control its future.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.

    Source link

  • Simulations and AI: Critical Thinking Improvement

    As an educator teaching undergraduates and graduates, both online and face-to-face, it’s always a challenge to find meaningful ways to engage students. Now that artificial intelligence has come into play, that challenge has become even greater, creating a need to design “AI-proof” assignments and content.

    Simulations in different types of courses

    According to Boston College, simulations are designed to engage students “directly with the information or the skills being learned in a simulated authentic challenge.” In my teaching over the past decade-plus, I have gone from using simulations in a single operations management course to using them in almost every course I teach. I don’t necessarily use them as stand-alone assignments, although they can be used that way; how I use a simulation depends on the course.

    Face-to-face

    In some face-to-face courses, I will run the simulation in class with everyone participating. Sometimes I will have teams work in a “department,” or have true, open discussions. Sometimes I will run the room, ensuring every single student is paying attention and contributing. Using simulations in this fashion gives flexibility in the classroom. It shows me who truly gets the concepts and who is going through the motions. The dynamic of the class itself can dictate how I run the simulation.

    Online

    In online courses, I typically assign simulation work. This can be one simulation assignment or a progressive unit of simulations. It’s a great way to see students improve as they move through various concepts, ideas, and applications of the topics covered. Creating assignments that are both grounded in the simulation and tied to the work environment makes them AI-proof. Students must think about what they have actually done in class and relate it to their workplace environment and/or position.

    Why simulations work for all levels

    There are many simulations that can be used and incorporated in both undergraduate and graduate level courses. As much as we don’t think of graduate students relying on AI to complete work, I have seen this happen multiple times. The results aren’t always ideal. Using simulations at the graduate level, and ensuring your assignments reflect both the simulation and real-world comparisons, can help your students use AI to gather thoughts, but not rely on it for the answers.

    Student benefits

    Using simulations offers many benefits for your students. Over the years, I have received feedback from many students about the value of making decisions and seeing the results that simulations provide. My capstone students often want to keep running the simulation, just to see how well they can do with their “business.” I have had students in lower-level management courses ask how they can get full access to run these when I offer them as “in-class only” options. The majority of feedback includes:

    1. Anything is better than lecture!
    2. Seeing how their decisions impact other areas is very helpful. Students actually remember it, reinforcing the material more than reading or watching can do.
    3. Students want more simulations throughout their courses, rather than just one or two, so they can keep making those decisions and seeing those impacts. They feel this prepares them even more for the workforce.

    As a retention and engagement tool, simulations are among the best I have found. Are there students who don’t like them? Yes, there always are. Even so, they’re forced to think through solutions and determine the best course of action to reach an optimal result. From an instructor’s perspective, there’s nothing better than seeing those wheels turn. Students are guided on how to recover from an issue, and are advised on what may happen if different solutions were attempted. The questions that emerge are often more valuable than the results.

    Instructor benefits

    For instructors, there are many benefits. As I stated earlier, you can see improvements in student behavior: they ask questions and take a genuine interest in the results of their actions. In classes with teams, it can become friendly competition. With individual assignments, you get more questions, which is something we always want to see. More questions show interest.

    Ease of use

    I usually include recorded instructions and tips for simulations in my online courses, and I prefer to make these recordings myself, since I can give examples relevant to student majors and interests. For example, in an entrepreneurship class, I would walk through a piece of the simulation and discuss how it might affect a new business entering the market versus an established one.

    Auto-grading

    Simulation assignments are usually auto-graded, which can drastically lighten our workload. I personally have around 150-200 students each term, so being able to streamline grading is a huge benefit. There are trade-offs, however. Since I also create simulation-based questions and assignments, there are no textbook answers to refer to. You must know the simulations and be the content expert, so you can effectively guide your students.

    Thoughtful responses

    AI can be a great tool when used productively. But seeing the tool overused is what led me to adopt more simulations. This adjustment on my end has resulted in students presenting me with more thoughtful, accurate, and relevant responses. Feedback from students has been positive.

    Sims for all industries

    An additional benefit of simulations is that there are sims for virtually every industry. Pilot and healthcare sims have existed for a very long time. But even if you only have access to one or two, you can make them relatable to any field. If you’re like me and teach a variety of classes, you can use one simulation for almost any class.

    Overall success

    I was using simulations before AI became so influential, but the now-pervasive use of AI has driven me to use more of them in all of my courses. By adjusting the tools I use, I have been able to encourage more thorough problem solving, active listening and reasoning. Plus, I get strategic and effective questions from my students. The overall results include intense engagement, better critical thinking skills, and content retention.

    Written by Therese Gedemer, Adjunct Instructor and Workforce Development Trainer, Marian University, Moraine Park Tech College and Bryant & Stratton College

    Source link

  • Eight critical questions for the new chief executive of UKRI

    The appointment of a new chief executive for UK Research and Innovation (UKRI) could not happen at a more crucial time.

    With public finances under strain, the case for public investment in R&D needs to be made cogently, focusing both on addressing the government’s five missions and on sustaining the fantastic research asset that the UK university sector represents. The list of issues for the new appointee will no doubt be lengthy, but we put forward the following as a possible shortlist of priorities.

    1. The interface (pipeline) between research councils and Innovate UK

    One of the main goals in establishing UKRI was to ensure a smooth pipeline from the research undertaken by the individual research councils to the industrial/end user base, thereby bringing both economic and societal benefit. However, despite years of intent, this pipeline seems as obstructed as ever. The fundamental question remains: to what extent is the role of Innovate UK to aid the transition of the outcomes resulting from research council funding, versus simply supporting UK-based enterprises in their own research?

    Currently there are disconnects between the research priorities, often defined by government and implemented by the research councils, and the Innovate UK funding mechanisms intended to ensure they are exploited. There are some exceptions here of course: the Creative Industries Clusters programme was a good example of a joint initiative between the AHRC and Innovate UK which did connect industry demand to local research strengths.

    A key priority for the new chief executive is to join up the pipeline more effectively across the whole range of industry sectors and ensure a very clear role for Innovate UK in partnership with the research councils and the subsequent interface to the National Wealth Fund or British Business Bank.

    2. Articulating and agreeing the balance between UKRI spend on government priorities and investment in the research base of the future

    As we have argued elsewhere on Wonkhe, the nation needs UKRI both to fund the research required by current government priorities relating to industrial strategies or societal challenges, and to invest in the broader research base that, in the words of science minister Patrick Vallance, will feed the “goose that lays the golden egg” of our research base and the opportunities of tomorrow.

    Currently, this balance is, at best, hidden from view, suiting neither the needs of government nor the future aspirations of the sector. We urge UKRI to quantify this balance historically and to articulate a proposal to government for moving forward. We also require balance between the budget committed long-term to institutes, infrastructure, international subscriptions and facilities, and the shorter-term funding that flows into the wider research and innovation community. Balancing these priorities requires a strengthening of the relationship, and open discussion, between UKRI, DSIT and wider government.

    3. Ensuring UKRI is relevant to the government’s regional economic development agenda

    As part of the government’s economic agenda, driving productivity growth in the tier-2 cities outside the South-East and the wealthier places in the UK is key to executing its growth mission. There is a clear tension between UKRI acting as the key funding agency for public R&D spending driven solely by excellence, and serving a regional economic development mission for which additional criteria apply. This tension must be addressed and not ignored.

    The creation of innovation accelerators – in which additional funding was provided by government, but UKRI was involved in evaluating the merit of proposals – is a good example of how UKRI can drive change. As the government develops new levers to address and fund regional economic development, UKRI should play a key role in ensuring that this dovetails with the research and innovation base of the nation.

    4. Creating a highly skilled workforce

    As is becoming clear, the number of doctoral students supported by UKRI continues to fall – an issue highlighted, for example, by Cambridge vice chancellor Deborah Prentice in a recent Guardian interview. This is particularly clear in areas which have traditionally relied upon UKRI funding, such as the engineering and physical sciences. The corresponding research effort is in part bolstered by an increase in the number of fee-paying overseas students, but this does little to create the UK-based workforce industry needs.

    UKRI needs to prioritise funding and work with government to find new ways of addressing the skills the nation needs if we are to drive a productive knowledge-based economy. The skills required extend beyond doctoral degrees to include technical professionals and engineers.

    5. Sector confidence around REF as a rigorous, fair process, supportive of excellence

    The HE sector is in financial turmoil, manifested in the unprecedented number of UK higher education institutions currently implementing severance schemes. Ongoing uncertainties over the REF process need resolution – from the portability of outputs, to the lack of an essential mechanism to ensure a diversity of authors (current proposals have no cap on the number of outputs that can be submitted by any one individual), to the absence of clarity on how the people, culture and environment template will support excellence.

    This resolution is required, firstly, so that the research strategies institutions put in place prior to any census date have time to drive the changes required, given that REF is meant to be formative as well as summative; and secondly, so that institutions can efficiently deliver their REF returns to the standard and level of detail a government should expect, providing assurance over future quality-related (QR) spend.

    6. The importance and accountability of QR

    Virtually everyone in the sector embraces the notion that QR is central to the agility and sustainability of the UK research base. This certainty is matched by uncertainty within government as to the value for money this investment provides. If we are to maintain this level of trust in the sector’s ability to derive benefit from this investment, collectively we need to do a better job of showing how QR is central to the agility of our investment in the research outcomes of tomorrow, and not simply a plugging of other, non-research-related financial holes. As both assessor and funder, UKRI can lead and co-ordinate this response.

    7. Completion of the new funding service (the software needs to work!)

    The joint electronic submission system (Je-S) was outdated and potentially no longer supportable. Its back-room equivalent, Siebel, was even worse. Their replacement, the new funding service, is an acceptable portal for applicants but seemingly still provides inadequate assurance as a system from which to make financial commitments. This shortcoming seems almost incomprehensible given that it was an in-house development.

    Moving beyond the essential financial controls, it seems to offer little by way of the AI assistance in identifying reviewers that the software behind the submission systems of many of our research publications has offered for decades. Whether we lack the skills or the investment to solve these issues is unclear, but the inefficiency of the current situation wastes perhaps an even more precious resource, namely the time of UKRI staff to add human value to our research landscape. This seeming lack of skills and systems is also worrying for the future REF exercise, even once the framework is known.

    8. Evidencing the effects of change

    Of course the world should and must move on. As a funder of research, it is appropriate that UKRI experiments with better ways of funding, becoming an expert in metascience. Changes inspired by ideology are fine, but it is essential that these changes are then assessed to see if the outcomes are those we desired.

    One example is the narrative CV, a well-meaning initiative to recognise a wider definition of excellence and an equality of opportunity. Is this what it achieved? Do we acknowledge the risks associated with AI, or the unintended consequence of favouring the confident individual with English as their first language? While not advocating a return to the traditional list-based CV, we urge a formal reporting of the outcomes achieved through the narrative CV, using both quantitative and qualitative data, and an evidence-based plan to move forward.

    Looking to the future

    We realise that criticism is easy and solutions are hard to find. So, in case of doubt, we would like to finish with a call-out to UKRI’s greatest resource: its committed and highly professional staff at all levels. We know at first hand the dedication of a workforce committed to fairly supporting the community, the research it does and the impact it creates.

    The role of chief executive of UKRI provides vital leadership not just to UKRI but to the sector as a whole, and the sector must unite to stand behind the new incumbent in solving the challenges that lie ahead.

    Source link

  • Career coaches fill critical gaps in Ph.D. training

    To the editor:

    In “The Doctoral Dilemma” (Feb. 3, 2025), Inside Higher Ed reporter Johanna Alonso describes career coaching as a “cottage industry” of “gurus” that emerged to fill critical gaps in graduate training. As a career coach cited in the article, I was disappointed to see such an inaccurate and biased portrayal of my work. 

    Coaching is a professional industry with proven methods, tools, and credentialing provided by the International Coaching Federation (ICF). Coaching is distinct from “consulting,” and it’s an intentional, strategic step for anyone seeking to change careers. This is why Johns Hopkins University employs coaches as part of its Doctoral Life Design Studio. Yet, the article portrays these university-led coaching initiatives as legitimate, structured and holistic, while describing coaching outside of the university as an opportunistic “cottage industry.” Why frame the same service in two very different ways?

    From our wide-ranging, 20-minute interview, Alonso only highlighted my hourly rate—$250/hour for a single one-to-one meeting—without any context. There is no mention of the benefits of career coaching, or whether universities like Johns Hopkins pay their coaches a similar rate. The monetary cost, presented in isolation, suggests exploitation. The reality? As a neurodivergent person, I find one-to-one meetings draining, so I’ve priced them to limit bookings. Instead, I direct Ph.D.s toward my free library of online content, my lower-cost group programs and my discounted coaching packages, all of which have helped Ph.D.s secure industry roles that double or triple their academic salaries. The article doesn’t include these details.

    The most telling sign of the article’s bias is the use of the word “guru.” Why use a loaded term like “guru” instead of “expert” to describe career coaches? As I frequently remind my clients, language shapes perception. Ph.D.s are more likely to be seen as industry-ready professionals if they use terms like “multi-year research project” instead of “dissertation” or “stakeholders” instead of “academic advisers.” The same logic applies here—calling career coaches “gurus” trivializes our work, implying we are self-appointed influencers rather than qualified professionals. I’ll never forget the professor who once tweeted, “If life outside of academia is so great, why do alt-ac gurus spend so much time talking about it? Don’t they have better things to do?”

    My response? “I wouldn’t have to do this if professors provided ANY professional development for non-academic careers.”

    Because contrary to what the article claims, I didn’t start my coaching business because I wished there were more resources available to me. I started it because, after I quit my postdoctoral fellowship for an industry career, I spent untold hours providing uncompensated career support to Ph.D.s. For nearly two years, I responded to thousands of messages, created online resources, reviewed résumés and met one-to-one with hundreds of Ph.D. students, postdocs and even tenured professors—all for free, in my leisure time. Eventually, I burned out from the incessant demand. I realized that, if I was going to continue pouring my time into helping Ph.D.s, I needed to be compensated. That’s when I started my business.

    Academia conditions us to see for-profit businesses as unethical, while “nonprofit” universities push students into a lifetime of high-interest debt. It convinces us that charging for expertise is predatory, while asking Ph.D.s to work for poverty wages is somehow noble. It forces us to internalize the idea that, if you truly care about something, you should sacrifice your well-being and life for it. But our time is valuable. Our skills are valuable. We deserve to be fairly compensated for our labor, inside and outside of academia.

    Career coaching isn’t the problem. The real problem is that academia still refuses to take a critical look in the mirror.

    Ashley Ruba is the founder of After Academia.

    Source link

  • Why unified data and technology is critical to student experience and university success

    The Australian higher education sector continues to evolve rapidly, with hybrid learning, non-linear education, and the current skills shortage all shaping how universities operate.

    At the same time, universities are grappling with rising operational costs and decreased funding, leading to fierce competition for new enrolments.

    Amidst the dynamic landscape of higher education, the student experience has become a crucial factor in attracting and retaining students.

    The student experience encompasses a wide array of interactions, from how students first learn about an institution through to the enrolment process, coursework, social activities, wellbeing support and career connections. With so many student touchpoints to manage, institutions are turning to data and technology integrations to help streamline communications and improve their adaptability to change.

    Download the white paper: Why Unifying Data and Technology is Critical to the Success and Future of Universities

    Enhancing institutional efficiency and effectiveness

    Universities face an increasingly fragmented IT landscape, with siloed data and legacy systems making it difficult to support growth ambitions and improve student experiences.

    By integrating systems and data, institutions are starting to align digital and business strategies so that they can meet operational goals while providing more connected, seamless and personalised experiences for students.

    One of the most effective ways universities can achieve this is by consolidating disparate systems into a cloud-based Customer Relationship Management (CRM) solution, such as Salesforce.

    Optimising admissions and enhancing student engagement

    In recent years, there have been significant fluctuations in the enrolment of higher education students for numerous reasons – Covid-19 restrictions, declining domestic student numbers, high cost of living, proposed international student caps, and volatile labour market conditions being just a few.

    To better capture the attention of prospective students, institutions are now focusing on delivering more personalised and targeted engagement strategies. Integrated CRM and marketing automation is increasingly being used to attract more prospective students with tailored, well-timed communication.

    Universities are also using CRM tools to support student retention and minimise attrition. According to a Forrester study, students are 15 per cent more likely to stay with an institution when Salesforce is used to provide communications, learning resources and support services.

    Streamlining communication and collaboration

    By creating a centralised system of engagement, universities can not only support students throughout their academic journey, but also oversee their wellbeing.

    For example, a leading university in Sydney has developed a system that provides a comprehensive view of students and their needs, allowing for integrated and holistic support and transforming its incident reporting and case management.

    Fostering stronger alumni and industry relations

    Another area where CRM systems play a pivotal role is in building alumni and industry relationships. Alumni who feel valued by their university – through personalised engagement – are more likely to return when seeking upskilling, or to lend financial support.

    Personalising communication to industry partners can also help strengthen relationships, potentially leading to sponsored research, grants, and donations, as well as internships and career placements.

    University of Technology Sydney, for example, adopted a centralised data-led strategy for Corporate Relations to change how it works with strategic partners, significantly strengthening its partner network across the university.

    Unlocking the value of data and integration

    With unified data and digital technology driving personalised student interactions, university ICT departments can empower faculty and staff to exceed enrolment goals, foster lifelong student relationships and drive institutional growth.

    To learn more about the strategies and technologies to maximise institutional business value, download the white paper.


    Source link