Blog

  • Higher education in England needs a special administration regime


    Extra government funding for the higher education sector in England means the debate about the prospect of an HE provider facing insolvency and a special administration regime has gone away, right?

    Unfortunately not. There is no additional government funding; the only extra financial support facilitated by the new Labour government so far is an increase in tuition fees for the next academic year, for those students to whom universities can apply it. The tuition cost per student is estimated at more than £14K per year, so the funding gap has not been closed. Add in increased National Insurance contributions and many HE providers will find themselves back where they are right now.

    It is a problem that there is no viable insolvency process for universities. But a special administration regime is not solely about “universities going bust.” In fact, such a regime, based on the existing FE special administration legislation, is much more about providing legal clarity for providers, stakeholders and students, than it is about an insolvency process for universities.

    Managing insolvency and market exit

    The vast majority of HE providers are not companies. This means there is a lack of clarity as to whether current Companies and Insolvency legislation applies to them. In practice, providers cannot avail themselves of many of the insolvency processes available to companies, namely administration, company voluntary arrangements and voluntary liquidation. It is debatable whether they can propose a restructuring plan or be wound up by the court, although a fixed charge holder can appoint receivers over assets.

    Of these processes, the one most likely to assist a provider is administration, as it allows insolvency practitioners to trade an entity to maximise recoveries for creditors, usually through a business and asset sale.

    At best, therefore, an HE provider might be wound up by the court or have receivers appointed over its buildings. Neither process allows continued trading and, unlike administration, neither provides moratorium protection against creditor enforcement. They are not therefore conducive to a distressed merger, teach out or transfer of students on an orderly basis.

    Whilst it is unlikely that special administration would enable the survival of an institution, due to the adverse PR it would generate in the market, it would provide a structure for a more orderly market exit that does not currently exist for most providers.

    Protections for lenders

    In addition to there being no viable insolvency process for the majority of HE providers, there is also no viable enforcement route for secured lenders. That is a bad thing because if secured lenders have no route to recovering their money, then they are not going to be incentivised to lend more into the sector.

    If government funding is insufficient to plug funding gaps, providers will need alternative sources of finance. The most logical starting point is to ask their existing lenders. Yes, giving lenders more enforcement rights could lead to more enforcements, but the high street lenders active in the sector are broadly supportive of it, and giving lenders the right to do something does not necessarily mean that they will exercise that right.

    Lenders are not courting the negative press that would be generated by enforcing against a provider and most probably forcing a disorderly market exit. They are however looking for a clearer line to recovery, which, in turn, will hopefully result in a clearer line to funding for providers.

    Protections for students

    Students are obviously what HE providers are all about, but, if you are short of sleep and scour the Companies and Insolvency legislation, you will find no mention of them. If an HE provider gets into financial distress, then our advice is that the trustees should act in the best interest of all creditors. Students may well be creditors in respect of claims relating to potential termination of courses and/or having to move to another provider, potentially missing a year and waiting longer to enter the job market.

    However, the duty is to all creditors, not just some, and under the insolvency legislation, students have no better protection than any other creditor. Special administration would change that. The regime in the FE sector specifically provides for a predominant duty to act in the best interest of students and would enable the trustees to put students at the forefront of their minds in a time of financial distress.

    A special administration regime would therefore help trustees focus on the interest of students in a financially distressed situation, aligning them with the purposes of the OfS and charitable objects, where relevant.

    Protections for trustees

    Lastly, and probably most forcefully, a special administration regime would assist trustees of an HE provider in navigating a path for their institution in financial distress. As touched on above, it is not clear, for the vast majority of HE providers, whether the Companies and Insolvency legislation applies.

    It is possible that a university could be wound up by the court as an unregistered company. If it were, then the Companies and Insolvency legislation would apply. In those circumstances, the trustees could be personally liable if they fail to act in the best interest of creditors and/or do not have a reasonable belief that the HE provider could avoid an insolvency process.

    Joining a meeting of trustees to tell them that they could be personally liable, but it is not legally clear, is a very unsatisfactory experience; trust me, this is not a message they want to hear from their advisors.

    A special administration regime, applying the Companies and Insolvency legislation to all HE providers, regardless of their constitution or whether they are incorporated, would allow trustees to have a much clearer idea of the risks that they are taking and the approach that they should follow to protect stakeholders.

    If a special administration regime were brought in, we would hope it would never need to be applied in a market exit situation. Its real value, however, lies in bringing greater legal clarity for lenders and trustees, and more protection for students, in the current financial circumstances that HE providers find themselves in.


  • The data dark ages | Wonkhe


    Is there something going wrong with large surveys?

    We asked a bunch of people but they didn’t answer. That’s been the story of the Labour Force Survey (LFS) and the Annual Population Survey (APS) – two venerable fixtures in the Office for National Statistics (ONS) arsenal of data collections.

    Both have just lost their accreditation as official statistics. A statement from the Office for Statistics Regulation highlights just how much of the data we use to understand the world around us is at risk as a result: statistics about employment are affected by the LFS concerns, whereas the APS covers everything from regional labour markets, to household income, to basic facts about the population of the UK by nationality. These are huge, fundamental sources of information on the way people work and live.

    The LFS response rate has historically been around 50 per cent, but it had fallen to 40 per cent by 2020 and is now below 20 per cent. The APS is an additional sample using the LFS approach – current advice suggests that response rates have deteriorated to the extent that it is no longer safe to use APS data at local authority level (the resolution it was designed to be used at).

    What’s going on?

    With so much of our understanding of social policy issues coming through survey data, problems like these feel almost existential in scope. Online survey tools have made it easier to design and conduct surveys – and often design in the kind of good survey development practices that used to be the domain of specialists. Theoretically, it should be easier to run good quality surveys than ever before – certainly we see more of them (we even run them ourselves).

    Is it simply a matter of survey fatigue? Or are people less likely to (less willing to?) give information to researchers for reasons of trust?

    In our world of higher education, we have recently seen the Graduate Outcomes response rate drop below 50 per cent for the first time, casting doubt on its suitability as a regulatory measure. The survey still has accredited official statistics status, and there has been important work done on understanding the impact of non-response bias – but it is a concerning trend. The National Student Survey (NSS) is an outlier here – it had a 72 per cent response rate last time round (so you can be fairly confident in validity right down to course level), but it enjoys an unusually good level of survey population awareness, despite the removal of the requirement for providers to promote the survey to students. And of course, many of the more egregious issues with HESA Student have centred on student characteristics – the kind of thing gathered during enrolment or entry surveys.

    A survey of the literature

    There is a literature on survey response rates in published research. A meta-analysis by Wu et al (Computers in Human Behavior, 2022) found that the average online survey response rate was 44.1 per cent – and found benefits from using (as the NSS does) a clearly defined and refined population, pre-contacting participants, and using reminders. A meta-analysis by Daikeler et al (Journal of Survey Statistics and Methodology, 2020) found that, in general, online surveys yield lower response rates (on average, 12 percentage points lower) than other approaches.

    Interestingly, Holtom et al (Human Relations, 2022) show an increase in response rates over time in a sample of 1,014 published studies, and do not find a statistically significant difference linked to survey mode.

    ONS itself works with the ESRC-funded Survey Futures project, which:

    aims to deliver a step change in survey research to ensure that it will remain possible in the UK to carry out high quality social surveys of the kinds required by the public and academic sectors to monitor and understand society, and to provide an evidence base for policy

    It feels like timely stuff. Nine strands of work in the first phase included work on mode effects, and on addressing non-response.

    Fixing surveys

    ONS have been taking steps to repair the LFS – implementing some of the recontacting and reminder approaches that have been successfully documented in the academic literature. There’s a renewed focus on households that include young people, and a return to the larger sample sizes we saw during the pandemic (when the whole survey had to be conducted remotely). Reweighting has led to a number of tweaks to the way samples are chosen and the way non-responses are accounted for.
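    To make the reweighting idea concrete: post-stratification is one common approach, sketched below with invented numbers (this is an illustration of the general technique, not ONS’s actual methodology). Each responding group is scaled so that the weighted sample matches known population shares.

```python
# Toy illustration of post-stratification reweighting (invented numbers).
# Each responding group is scaled so the weighted sample matches known
# population shares, offsetting differential non-response.

population_share = {"16-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Achieved sample shares after non-response: younger people under-respond.
sample_share = {"16-34": 0.15, "35-54": 0.35, "55+": 0.50}

weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

# Under-represented groups get weights above 1, over-represented below 1.
print(weights)  # {'16-34': 2.0, '35-54': 1.0, '55+': 0.7}
```

    The catch, of course, is that weighting only corrects for differences captured by the weighting variables; it cannot fix bias arising from unmeasured differences between responders and non-responders.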

    Longer term, the Transformed Labour Force Survey (TLFS) is already being trialled, though the initial March 2024 plan for full introduction has been revised to allow for further testing – important given a bias towards responses from older age groups, and an increased level of partial responses. Yes, there’s a lessons learned review. The old LFS and the new, online-first TLFS will run in parallel at least until early 2025 – with a knock-on impact on the APS.

    But it is worth bearing in mind that, even given the changes made to drive up responses, trial TLFS response rates have been hovering just below 40 per cent. This is a return to 2020 levels, addressing some of the recent damage, but still a long way from the historic norm.

    Survey fatigue

    More usually the term “survey fatigue” is used to describe the impact of additional questions on completion rate – respondents tire during long surveys (as Jeong et al observe in the Journal of Development Economics) and deliberately choose not to answer questions to hasten the end of the survey.

    But it is possible to consider the idea of a civilisational survey fatigue. Arguably, large parts of the online economy are propped up on the collection and reuse of personal data, which can then be used to target advertisements and reminders. Increasingly, you have to pay to opt out of targeted ads on websites – assuming you can view the website at all without paying. After a period of abeyance, concerns around data privacy are beginning to re-emerge. Forms of social media that rely on a constant drive to share personal information are unexpectedly beginning to struggle – for younger generations, participatory social media is more likely to be a group chat or Discord server, while formerly participatory services like YouTube and TikTok have become platforms for media consumption.

    In the world of public opinion research, the struggle with response rates has partially been met via a switch from randomised phone or in-person sampling to the use of pre-vetted online panels. This (as with the rise of focus groups) has generated a new cadre of “professional respondents” – with huge implications for the validity of polling even when weighting is applied.

    Governments and industry are moving towards administrative data – the most recognisable example in higher education being the LEO dataset of graduate salaries. But this brings problems of its own – LEO tells us how much income graduates pay tax on from their main job, but deals poorly with the portfolio careers that are the expectation of many graduates. LEO never cut it as a policymaking tool precisely because of how broad-brush it is.

    In a world where everything is data driven, what happens when the quality of data drops? If we were ever making good, data-driven decisions, a problem with the raw material suggests a problem with the end product. There are methodological and statistical workarounds, but the trend appears to be shifting away from people being happy to give out personal information without compensation. User interaction data – the traces we create as we interact with everything from ecommerce to online learning – are for now unaffected, but are necessarily limited in scope and explanatory value.

    We’ve lived through a generation where data seemed unlimited. What tools do we need to survive a data dark age?


  • Collaboration is key when it comes to addressing harassment and sexual misconduct


    In all of the noise about the OfS’s new regulation on harassment and sexual misconduct there’s one area where the silence is notable and disappointing – sector collaboration.

    Back in 2022, the independent evaluation of the OfS statement of expectations on harassment and sexual misconduct made a clear recommendation that OfS and DfE “foster more effective partnership working both between HE providers and with those external to the sector”. Now, having published details of the new condition E6 and the accompanying guidance, this seems to have been largely forgotten.

    There’s a nod to the potential benefit of collaboration in OfS’s analysis of consultation responses, but it only goes as far as to say that providers “may wish to identify collective steps” – with little explanation of what this could look like and no intention or commitment to proactively support this.

    This feels like a significant oversight, and one that is disappointing to say the least. It’s become clear from our work with IHE members that collaboration needs to be front and centre if we have any hope as a sector of delivering in this area. Without it, some providers – especially smaller ones – will not be able to meet the new requirements, creating risk and failing to achieve the consistency of practice and experience that students expect. This feels even more true given the current context of widespread financial insecurity. Any new regulation ought to be presenting mechanisms and incentives to collaborate – and reduce costs in doing so.

    Working together for a stronger sector – or only sometimes?

    The silence around collaboration is also surprising, given that in other spheres it is seen to be – and in many cases is – the solution to institutions meeting regulatory requirements and student expectations. John Blake’s latest speech on a regional approach to access and participation is just one example of this. There is implicit recognition that in this era of “diminishing resources”, working together is the solution. There’s also the recognition that partnership working needs funding – more on that later.

    It’s also surprising given that OfS has made clear that both providers in any academic partnership are responsible for compliance with the new condition, including where there’s a franchise arrangement. This seems like an open door for collaborative approaches, given that over half the providers on the register do not have their own degree awarding powers. However, as usual, it is unclear what this means in practice. There is no reference in the regulation to how the OfS would view any collaborative efforts, or examples of what this might look like in practice.

    Academic partnerships make logical collaborators

    IHE’s recent project on academic partnerships demonstrates the potential of such arrangements for collaboration that benefits both providers and their students. Our research found a number of innovative models where awarding institutions facilitated collaboration with and between their academic partners in areas including shared learning opportunities and use of shared platforms.

    There’s a clear opportunity here when it comes to staff training. All institutions need to have staff who are “appropriately trained”. Training in areas such as receiving disclosures and conducting investigations benefits from group delivery – where staff can learn from each other. A small provider might only have one or two staff who require it, meaning they are unlikely to draw much benefit from group sessions, and the training becomes prohibitively expensive. It is likely to need to be delivered by an external organisation (to ensure the “credible and demonstrable expertise” required), and such solutions aren’t scaled to an institution with just a handful of relevant staff. Awarding institutions sharing such group training would solve this – and also benefit shared processes, in that staff across both institutions would have the same level of knowledge and competence.

    A further benefit of shared training would be that partners could share staff when investigations need greater independence than a small provider can offer. This could be staff from the awarding partner, or another academic partner. This would effectively bring together useful knowledge of institutional context, policies and processes with the necessary external objectivity to run a credible investigation.

    Another opportunity for collaboration is in shared online reporting tools. These can be an effective way of encouraging disclosure, but such systems are often not scaled for small institutions. As well as being more cost-effective, sharing them could give students greater confidence in the independence of the tool and of the process that follows a report.

    Think local – for everyone’s sake!

    Regional or local collaboration is the other area with the potential to benefit students, providers, and other services supporting those who experience harassment or sexual misconduct.

    Local or regional collaboration on reporting and investigation can support disclosure by creating more independence in the system. The independent evaluation spoke specifically of this, recommending the facilitation of

    formal or informal shared services, such as regional support networks, and in particular regional investigation units or hubs.

    And it would enable more effective partnerships with external support services. Rather than every provider trying to establish a partnership with a local service (putting a greater burden on groups who are often charities or not-for-profits), group collaborations could streamline this. This needs to include all types of provider, including small providers and FE colleges delivering HE. This would be more efficient, reduce unhelpful competition for the limited resource of the service, and ensure that all students have access to these support services irrespective of their place of study.

    Where there aren’t local services, providers could pool resource and expertise to develop and deliver these. This would reduce competition for specialist staff in the same geographic location, and again ensure parity of support for students across providers.

    It’s important that such collaborations involve all parts of the sector, including small providers – with the burden of their participation reflective of their smaller size. This is vital to ensure that collaborative models are cost effective for everyone.

    Getting it right on student engagement

    Collaborative approaches are also going to be critical to make sure we get it right on student engagement. The OfS expectation is clear that providers work with students and their representatives to develop policies and procedures. But what happens when an institution doesn’t have an SU, or a formal representative structure, or the necessary experience in student engagement to do this? There’s a risk that it won’t be done properly or be done at all.

    We need to consider how we facilitate students to support each other to engage in co-production. This could include sharing staff or exploring the development of local student union services that bring in smaller providers or FE colleges without the means to partner with students in the way that is needed.

    Making it happen

    The sort of collaboration outlined above will need more than just the goodwill of institutions to make it happen. It needs regulatory backing, with more explicit recognition of the value of these approaches and guidance on what this might look like in practice. We also need to recognise that it’s costly.

    Catalyst funding, like that provided back in 2019, would represent far better value to the sector than asking individual providers to fund collaboration. The risk is that without it, the burden of developing a system that works for all students at all providers will be left to the smallest institutions who need these collaborative options the most. Funding would also boost evaluation and resource sharing across the sector. It could consider the benefits of collaborative approaches between awarding and teaching institutions as well as regional structures which ensure a greater parity of support across providers large and small.

    Somewhere on this path to regulation we lost the perspective that harassment and sexual misconduct is a societal issue. What we do now to educate, prevent harm to and support students will have a lasting impact on the future as students become employees, employers, parents and educators themselves. It is not a task to be shouldered alone.


  • NM Vistas: What’s New in State Student Achievement Data?


    By Mo Charnot
    For NMEducation.com

    The New Mexico Public Education Department has updated its student achievement data reporting website — NM Vistas — with a renovated layout and school performance data from the 2023-2024 academic year, with expectations for additional information to be released in January 2025. 

    NM Vistas is crucial to informing New Mexicans about school performance and progress at the school, district and state levels through yearly report cards. The site displays student reading, math and science proficiency rates taken from state assessments, as required by the federal Every Student Succeeds Act. Districts and schools receive scores between 0 and 100 based on performance, and schools also receive designations indicating the level of support the school requires to improve.

    Other information on the site includes graduation rates, attendance and student achievement growth. Data also shows rates among specific student demographics, including race, gender, disability, economic indicators and more. 

    PED Deputy Secretary of Teaching, Learning and Innovation Amanda DeBell told NM Education in an interview that this year’s recreation of the NM Vistas site came from a desire to go beyond the state’s requirements for school performance data.

    “We knew that New Mexico VISTAs had a ton of potential to be a tool that our communities could use,” DeBell said. 

    One new data point added to NM Vistas this year is early literacy rates, which measure the percentage of students in grades K-2 who are reading proficiently at their grade level. Currently, federal law only requires proficiency rates for grades 3-8 to be published; New Mexico also publishes 11th grade SAT scores. In the 2023-2024 school year, 34.6% of students in grades K-2 were proficient in reading, the data says.

    DeBell said several advisory groups encouraged the PED to report early literacy data through NM Vistas.

    “We were missing some key data-telling opportunities by not publishing the early literacy [rates] on our website, so we made a real effort to get those early literacy teachers the kudos that they deserve by demonstrating the scores,” DeBell said.

    The PED also added data on individual schools through badges indicating specific programs and resources the school offers. For example, Ace Leadership High School in Albuquerque has two badges: one for being a community school offering wraparound services to students and families, and another for qualifying for the career and technical education-focused Innovation Zone program.

    “What we are really trying to do is provide a sort of one-stop shopping for families and community members to highlight all of the work that schools are doing,” DeBell said.

    The updated NM Vistas website has removed a few things as well, most notably the entire 2021-2022 NM Vistas data set. DeBell said this was because the PED changed the way it measured student growth data, which resulted in the 2021-2022 school year’s data being incomparable to the most recent two years. 

    “You could not say that the schools in 2021-2022 were doing the same as 2022-2023 or 2023-2024, because the mechanism for calculating their scores was different,” DeBell said.

    However, this does leave NM Vistas with less data overall, only allowing viewers to compare scores from the latest data set to last year’s. 

    In January 2025, several new indicators are expected to be uploaded to the site, including:

    • Student performance levels: Reports the percentage of students who are novices, nearing proficiency, proficient and advanced in reading, math and science at each school, rather than only distinguishing between proficient and not proficient.
    • Results for The Nation’s Report Card (also known as NAEP): Compares student proficiencies between US states.
    • Educator qualifications: DeBell said this would include information on individual schools’ numbers of newer teachers, substitute teachers covering vacancies and more.
    • College enrollment rates: Initially statewide numbers only, indicating the percentage of New Mexico students attending college after graduating, but DeBell said she hopes the PED can later narrow the data down to each K-12 school.
    • Per-pupil spending: How much money each school, district and the state spends per-student on average. 
    • School climate: Links the viewer to results of school climate surveys asking students, parents and teachers how they feel about their school experience.
    • Alternate assessment participation: Percentage of students who take a different assessment in place of the NM-MSSA or SAT.

    “We want VISTAs to be super, super responsive, and we want families to be able to use this and get good information,” DeBell said. “We will continue to evolve this until it’s at its 100th iteration, if it takes that much.”

    This year, the PED released statewide assessment results for the 2023-2024 school year to NM Vistas on Nov. 15. Results show 39% of New Mexico students are proficient in reading, 23% are proficient in math and 38% are proficient in science. Compared to last year’s scores, reading proficiency increased by one percentage point, math proficiency decreased by one percentage point and science proficiency increased by four percentage points.


  • Lessons Learned from the AI Learning Designer Project


    We recently wrapped up our AI Learning Design Assistant (ALDA) project. It was a marathon. Multiple universities and sponsors participated in a seven-month intensive workshop series to learn how AI can assist in learning design. The ALDA software, which we tested together while my team and I built it, was an experimental apparatus designed to help us learn various lessons about AI in education.

    And learn we did. As I speak with project participants about how they want to see the work continue under ALDA’s new owner (and my new employer), 1EdTech, I’ll use this post to reflect on some lessons learned so far. I’ll finish by reflecting on possible futures for ALDA.

    (If you want a deeper dive from a month before the last session, listen to Jeff Young’s podcast interview with me on EdSurge. I love talking with Jeff. Shame on me for not letting you know about this conversation sooner.)

    AI is a solution that needs our problems

    The most fundamental question I wanted to explore with the ALDA workshop participants was, “What would you use AI for?” The question was somewhat complicated by the state of AI when I started development work about nine months ago. Back then, ChatGPT and its competitors struggled to follow the complex directions required for serious learning design work. While I knew this shortcoming would resolve itself through AI progress—likely by the time the workshop series was completed—I had to invest some of the ALDA software development effort into scaffolding the AI to boost its instruction-following capabilities at the time. I needed something vaguely like today’s AI capabilities back then to explore the questions we were trying to answer, such as what we could be using AI for a year later.

    Once ALDA could provide that performance boost, we came to the hard part. The human part. When we got down to the nitty-gritty of the question—What would you use this for?—many participants had to wrestle with it for a while. Even the learning designers working at big, centralized, organized shops struggled to break down their processes into smaller steps with documents the AI could help them produce. Their processes relied heavily on humans interpreting organizational rules as they worked organically through large chunks of design work. Faculty designing their own courses had a similar struggle. How is their work segmented? What are the pieces? Which pieces would they have an assistant work on, if they had one?

    The answers weren’t obvious. Participants had to discover them by experimenting throughout the workshop series. ALDA was designed to make that discovery process easier.

    A prompt engineering technique for educators: Chain of Inquiry

    Along with the starting question, ALDA had a starting hypothesis: AI can function as a junior learning designer.

    How does a junior learning designer function? It turns out that their primary tool is a basic approach that makes sense in an educator’s context and translates nicely into prompt engineering for AI.

    Learning designers ask their teaching experts questions. They start with general ones. Who are your students? What is your course about? What are the learning goals? What’s your teaching style?

    These questions get progressively more specific. What are the learning objectives for this lesson? How do you know when students have achieved those objectives? What are some common misconceptions they have?

    Eventually, the learning designer has built a clear enough mental model that they can draft a useful design document of some form or other.

    Notice the similarities and differences between this approach and scaffolding a student’s learning. Like scaffolding, Chain of Inquiry moves from the foundational to the complex. Unlike scaffolding, it’s not intended to help the person learn; it’s intended to help them think. Specifically, the interview progression helps the educator being interviewed think more clearly about hard design problems by bringing relevant context into focus. This process of prompting the interviewee to recall salient facts relevant to thinking through challenging, detailed problems is very much like the AI prompt engineering strategy called Chain of Thought.

    In the interview between the learning designer and the subject-matter expert, the chain of thought they spin together is helpful to both parties for different reasons. It helps the learning designer learn while helping the subject-matter expert recall relevant details that help with thinking. The same is true in ALDA. The AI is learning from the interview, while the same process helps both parties focus on helpful context. I call this AI interview prompt style Chain of Inquiry. I hadn’t seen it used when I first thought of ALDA and haven’t seen it used much since then, either.

    In any case, it worked. Participants seemed to grasp it immediately. Meanwhile, a well-crafted Chain of Inquiry prompt in ALDA produced much better documents after it elicited good information through interviews with its human partners.

    Improving mental models helps

    AI is often presented, sold, and designed to be used as a magic talking machine. It’s hard to imagine what you would and wouldn’t use a tool for if you don’t know what it does. We went at this problem through a combination of teaching, user interface design, and guided experimentation.

    On the teaching side, I emphasized that a generative AI model is a sophisticated pattern-matching and completion machine. If you say “Knock knock” to it, it will answer “Who’s there?” because it knows what usually comes after “Knock knock.” I spent some time building up this basic idea, showing the AI matching and completing more and more sophisticated patterns. Some participants initially reacted to this lesson as “not useful” or “irrelevant.” But it paid off over time, as participants found that this understanding helped them think more clearly about what to expect from the AI, with some additional help from ALDA’s design.

    ALDA’s basic structure is simple:

    1. Prompt Templates are reusable documents that define the Chain of Inquiry interview process (although they are generic enough to support traditional Chain of Thought as well).
    2. Chats are where those interviews take place. This part of ALDA is similar to a typical ChatGPT-like experience, except that the AI asks questions first and provides answers later based on the instructions it receives from the Prompt Template.
    3. Lesson Drafts are where users can save the last step of a chat, which hopefully will be the draft of some learning design artifact they want to use. These drafts can be downloaded as Word or PDF documents and worked on further by the human.

    A lot of the magic of ALDA is in the prompt template page design. It breaks down the prompts into three user-editable parts:

    1. General Instructions provide the identity of the chatbot that guides its behavior, e.g., “I am ALDA, your AI Learning Design Assistant. My role is to work with you as a thoughtful, curious junior instructional designer with extensive training in effective learning practices. Together, we will create a comprehensive first draft of curricular materials for an online lesson. I’ll assist you in refining ideas and adapting to your unique context and style.

      “Important: I will maintain an internal draft throughout our collaboration. I will not display the complete draft at the end of each step unless you request it. However, I will remind you periodically that you can ask to see the full draft if you wish.

      “Important Instruction: If at any point additional steps or detailed outlines are needed, I will suggest them and seek your input before proceeding. I will not deviate from the outlined steps without your approval.

    2. Output template provides an outline of the document that the AI is instructed to produce at the end of the interview.
    3. Steps provide the step-by-step process for the Chain of Inquiry.

    The UI reinforces the idea of pattern matching and completion. The Output Template gives the AI the structure of the document it is trying to complete by the end of the chat. The General Instructions and Steps work together to define the interview pattern the system should imitate as it tries to complete the document.
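    To make that structure concrete, here is a minimal sketch of how the three template parts might be combined into a single system prompt. The function and field names below are my own invention for illustration, not ALDA’s actual implementation:

    ```python
    # Hypothetical sketch: assembling a Chain of Inquiry system prompt from
    # ALDA's three user-editable template parts. All names are illustrative.

    def build_system_prompt(general_instructions: str,
                            output_template: str,
                            steps: list[str]) -> str:
        """Combine the three template parts into one system prompt.

        The Output Template gives the model the document it is trying to
        complete; the General Instructions and Steps define the interview
        pattern it should follow to get there.
        """
        numbered_steps = "\n".join(
            f"{i}. {step}" for i, step in enumerate(steps, start=1)
        )
        return (
            f"{general_instructions}\n\n"
            "Conduct the interview one step at a time, asking questions and "
            "waiting for answers before moving on.\n\n"
            f"Interview steps:\n{numbered_steps}\n\n"
            "When the interview is complete, produce a document matching "
            f"this outline:\n{output_template}"
        )

    prompt = build_system_prompt(
        general_instructions="I am a junior learning design assistant.",
        output_template="1. Learning objectives\n2. Activities\n3. Assessment",
        steps=["Ask who the students are.", "Ask what the learning goals are."],
    )
    ```

    The point of the sketch is the division of labor: the outline tells the model what pattern to complete, and the steps tell it how to gather the material first.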

    Armed with the lesson and scaffolded by the template, participants got better over time at understanding how to think about asking the AI to do what they wanted it to do.

    Using AI to improve AI

    One of the biggest breakthroughs came with the release of a feature near the very end of the workshop series. It’s the “Improve” button at the bottom of the Template page.

    When the user clicks on that button, it sends whatever is in the template to ChatGPT. It also sends any notes the user enters, along with some behind-the-scenes information about how ALDA templates are structured.

    Template creators can start with a simple sentence or two in the General Instructions. Think of it as a starting prompt, e.g., “A learning design interview template for designing and drafting a project-based learning exercise.” The user can then tell “Improve” to create a full template based on that prompt. Because ALDA tells ChatGPT what a complete template looks like, the AI returns a full draft of all the fields ALDA needs to create a template. The user can then test that template and go back to the Improve window to ask for the AI to improve the template’s behavior or extend its functionality.
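    As a rough sketch of the mechanics (the wording, field names, and schema description below are my guesses, not ALDA’s real payload), the message an “Improve” button sends to the model might be assembled like this:

    ```python
    # Hypothetical sketch of an "Improve"-style meta-prompt: the current
    # template, the user's notes, and a description of the structure a
    # complete template must have. Everything here is illustrative.

    TEMPLATE_SCHEMA = (
        "An ALDA template has three fields: General Instructions (the "
        "chatbot's identity and ground rules), an Output Template (the "
        "outline of the final document), and Steps (the Chain of Inquiry "
        "interview sequence)."
    )

    def build_improve_prompt(template_fields: dict[str, str],
                             user_notes: str) -> str:
        """Assemble the request asking the model to draft or refine a full template."""
        filled = "\n".join(
            f"{name}:\n{text or '(empty)'}"
            for name, text in template_fields.items()
        )
        return (
            f"{TEMPLATE_SCHEMA}\n\n"
            f"Current template:\n{filled}\n\n"
            f"User notes: {user_notes}\n\n"
            "Return a complete template with all three fields filled in."
        )

    msg = build_improve_prompt(
        {
            "General Instructions": "A learning design interview template "
            "for a project-based learning exercise.",
            "Output Template": "",
            "Steps": "",
        },
        user_notes="Keep the interview under ten steps.",
    )
    ```

    Because the schema description travels with the request, even a one-sentence starting prompt can come back as a fully fleshed-out template; the user then iterates by re-running “Improve” with new notes.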

    Building this cycle into the process created a massive jump in usage and creativity among the participants who used it. I started seeing more and more varied templates pop up quickly. User satisfaction also improved significantly.

    So…what is it good for?

    The usage patterns turned out to be very interesting. Keep in mind that this is a highly unscientific review; while I would have liked to conduct a study or even a well-designed survey, the realities of building this on the fly as a solo operator managing outsourced developers limited me to anecdata for this round.

    The observations from the learning designers on large, well-orchestrated teams seem to line up with my theory that the big task will be breaking down our design processes into chunks that are friendly to AI support. I don’t see a short-term scenario in which we can outsource all learning design—or replace it—with AI. (By the way, “air gapping” the AI, by which I mean conducting an experiment in which nothing the AI produced would reach students without human review, substantially reduced anxieties about AI and improved educators’ willingness to experiment and explore the boundaries.)

    For the individual instructors, particularly in institutions with few or no learning designers, I was pleasantly surprised to discover how useful ALDA proved to be in the middle of the term and afterward. We tend to think about learning design as a pre-flight activity. The reality is that educators are constantly adjusting their courses on the fly and spending time at the end tweaking aspects that didn’t work the way they liked. I also noticed that educators seemed more willing to try newer, more challenging pedagogical experiments, like project-based learning or AI-enabled teaching exercises, when they had ALDA as a thought partner that could both accelerate the planning and bring in some additional expertise. I don’t know how much of this can be attributed to the pure speed of the AI-enabled template improvement loop and how much to the holistic experience, which helped them feel they understood and had control over ALDA in a way that other tools may not offer them.

    Possible futures for ALDA under 1EdTech

    As for what comes next, nothing has been decided yet. I haven’t been blogging much lately because I’ve been intensely focused on helping the 1EdTech team think more holistically about the many things the organization does and many more that we could do. ALDA is a piece of that puzzle. We’re still putting the pieces in place to determine where ALDA fits in.

    I’ll make a general remark about 1EdTech before exploring specific possible futures for ALDA. Historically, 1EdTech has solved problems that many of you don’t (and shouldn’t) know you could have. When your students magically appear in your LMS and you don’t have to think about how your roster got there, that was because of us. When you switch LMSs and your students still magically appear, that’s 1EdTech. When you add one of the million billion learning applications to your LMS, that was us too. Most of those applications probably wouldn’t exist if we hadn’t made it easy for them to integrate with any LMS. In fact, the EdTech ecosystem as we know it wouldn’t exist. However much you may justifiably complain about the challenges of EdTech apps that don’t work well with each other, without 1EdTech, they mostly wouldn’t work with each other at all. A lot of EdTech apps simply wouldn’t exist for that reason.

    Still. That’s not nearly enough. Getting tech out of your way is good. But it’s not good enough. We need to identify real, direct educational problems and help to make them easier and more affordable to solve. We must make it possible for educators to keep up with changing technology in a changing world. ALDA could play several roles in that work.

    First, it could continue to function as a literacy teaching tool for educators. The ALDA workshops covered important aspects of understanding AI that I’ve not seen other efforts cover. We can’t know how we want AI to work in education without educators who understand and are experimenting with AI. I will be exploring with ALDA participants, 1EdTech members, and others whether there is the interest and funding we need to continue this aspect of the work. We could wrap some more structured analysis around future workshops to find out what the educators are learning and what we can learn from them.

    Speaking of which, ALDA can continue to function as an experimental apparatus. Learning design is a process that is largely dark to us. It happens in interviews and word processor documents on individual hard drives. If we don’t know where people need the help—and if they don’t know either—then we’re stuck. Product developers and innovators can’t design AI-enabled products to solve problems they don’t understand.

    Finally, we can learn the aspects of learning design—and teaching—that need to be taught to AI because the knowledge it needs isn’t written down in a form that’s accessible to it. As educators, we learn a lot of structure in the course of teaching that often isn’t written down and certainly isn’t formalized in most EdTech product data structures. How and when to probe for a misconception. What to do if we find one. How to give a hint or feedback if we want to get the student on track without giving away the answer. Whether you want your AI to be helping the educator or working directly with the student—which is not really an either/or question—we need AI to better understand how we teach and learn if we want it to get better at helping us with those tasks. Some of the learning design structures we need are related to deep aspects of how human brains work. Other structures evolve much more quickly, such as moving to skills-based learning. Many of these structures should be wired deep into our EdTech so you don’t have to think or worry about them. EdTech products should support them automatically. Something like ALDA could be an ongoing laboratory in which we test how educators design learning interventions, how those processes co-evolve with AI over time, and where feeding the AI evidence-based learning design structure could make it more helpful.

    The first incarnation of ALDA was meant to be an experiment in the entrepreneurial sense. I wanted to find out what people would find useful. It’s ready to become something else. And it’s now at a home where it can evolve. The most important question about ALDA hasn’t changed all that much:

    What would you find ALDA at 1EdTech useful for?

    Source link

  • Recommendations for States to Address Postsecondary Affordability

    Recommendations for States to Address Postsecondary Affordability

    Authors: Lauren Asher, Nate Johnson, Marissa Molina, and Kristin D. Hultquist

    Source: HCM Strategists

    An October 2024 report, Beyond Sticker Prices: How States Can Make Postsecondary Education More Affordable, reviews data to evaluate the affordability of postsecondary education across nine states: Alabama, California, Indiana, Louisiana, Ohio, Oklahoma, Texas, Virginia, and Washington.

    The authors emphasize the importance of considering net price, or the full cost of attendance less total aid. Depending on the state, low-income students pay 16-27 percent of their total family income to attend community college.

    At public four-year colleges with high net prices, students with family income of $30,000-48,000 pay more than half of their income (51-53 percent) for school in two of the nine states. Four-year colleges with low net prices show cost variability based on whether a student is in the lowest income group, earning $0-30,000, or has $30,000-48,000 in income. Students in the former group pay 21-27 percent of their family income toward education, while students in the latter group pay 40-41 percent of their income.

    The brief recommends that policymakers take the following issues into account:

    • The way states fund public institutions is critical for low-income students. Consider increasing funding for community colleges as well as evaluating how student income factors into allocation of state funds.
    • Tuition policy is integral to making decisions about postsecondary education. Public perception of college affordability is influenced by tuition costs. States have the power to set limits on how much institutions can raise or change costs, but states also must be careful not to limit institutions from charging what they require to adequately support students’ needs.
    • Transparency and consistency among financial aid programs increase their reach. States should consider making financial aid programs more readily understandable. State financial aid policies should also increase flexibility to adjust for transferring, length of time to graduate, and financial aid from other sources.
    • How states support basic needs affects students’ ability to afford attending college. Policies at the state level can offer students more options for paying for food, housing, caregiving, and more.

    To read the full report, click here.

    Kara Seidel


    If you have any questions or comments about this blog post, please contact us.


  • From Pause to Progress: Predictors of Success and Hurdles for Returning Students

    From Pause to Progress: Predictors of Success and Hurdles for Returning Students

    Title: Some College, No Credential Learners: Measuring Enrollment Readiness

    Source: StraighterLine, UPCEA

    UPCEA and StraighterLine carried out a survey to examine the driving factors, obstacles, preparedness, and viewpoints of individuals who started but did not finish a degree, certificate, technical, or vocational program. This population, according to the National Student Clearinghouse Research Center, has grown to 36.8 million, a 2.9 percent increase from the year prior. A total of 1,018 participants completed the survey.

    Key findings related to respondents’ readiness to re-enroll include:

    • Predictive factors: Mental resilience, routine readiness, a positive appraisal of institutional communication, and belief in the value of a degree strongly predict re-enrollment intentions.
    • Academic preparedness: A majority of respondents (88 percent) feel proficient in core academic skills (e.g., reading, writing, math, critical thinking), and 86 percent feel competent using technology for learning tasks.
    • Financial readiness: More than half (58 percent) believe they cannot afford tuition and related expenses, while only 22 percent feel financially prepared.
    • Career and personal motivations: The top motivators for re-enrolling include improving salary (53 percent), personal goals (44 percent), and pursuing a career change (38 percent).
    • Beliefs in higher education: Trust in higher education declines after stopping out. While 84 percent of those who had been enrolled in a degree program initially believed a degree was essential for their career goals, only 34 percent still hold that belief. Additionally, just 42 percent agree that colleges are trustworthy.
    • Grit readiness: Four in five respondents feel adaptable and persistent through challenges, and 71 percent say they can handle stress effectively.
    • Flexibility and adaptability: Three-fourths of respondents are open to changing routines and adjusting to new environments.
    • Learning environment: Half of respondents report having access to a study-friendly environment, but 11 percent report not having such access.
    • Time management: Nearly two-thirds are prepared to dedicate the necessary time and effort to complete their education.
    • Support systems: About three in every five respondents receive family support for continuing education, but only 31 percent feel supported by their employers.

    Key findings related to enrollment funnel experiences include:

    • Preferred communication channels: When inquiring about a program, 86 percent of respondents like engaging via email, 42 percent through phone calls, and 39 percent via text messages, while only 6 percent want to use a chatbot.
    • Timeliness and quality of communication: A majority (83 percent) agree or strongly agree that the communication they received when reaching out to a college or university about a program was timely, and 80 percent found it informative.
    • Enrollment experience: Among those who re-enrolled, 88 percent found that the enrollment process was efficient, 84 percent felt adequately supported by their institution, and 78 percent found the process easy.
    • Challenges from inquiry to enrollment: Nearly one-third (31 percent) encountered difficulties with financial aid support, 29 percent experienced delays in getting their questions answered, and 21 percent reported poor communication from the institution.

    Click to read the full white paper.

    —Nguyen DH Nguyen



  • What is academic freedom? With Keith Whittington

    What is academic freedom? With Keith Whittington

    “Who controls what is taught in American universities — professors or politicians?”

    Yale Law professor Keith Whittington answers this timely question and more in his new book, “You Can’t Teach That! The Battle over University Classrooms.” He joins the podcast to discuss the history of academic freedom, the difference between intramural and extramural speech, and why there is a “weaponization” of intellectual diversity.

    Keith E. Whittington is the David Boies Professor of Law at Yale Law School. Whittington’s teaching and scholarship span American constitutional theory, American political and constitutional history, judicial politics, the presidency, and free speech and the law.


    Read the transcript.

    Timestamps:

    00:00 Intro

    02:00 The genesis of Yale’s Center for Academic Freedom and Free Speech

    04:42 The inspiration behind “You Can’t Teach That!”

    06:18 The First Amendment and academic freedom

    09:29 Extramural speech and the public sphere

    17:56 Intramural speech and its complexities

    23:13 Florida’s Stop WOKE Act

    26:34 Distinctive features of K-12 education

    31:13 University of Pennsylvania professor Amy Wax

    39:02 University of Kansas professor Phillip Lowcock

    43:42 Muhlenberg College professor Maura Finkelstein

    47:01 University of Wisconsin–La Crosse professor Joe Gow

    54:47 Northwestern professor Arthur Butz

    57:52 Inconsistent applications of university policies

    01:02:23 Weaponization of “intellectual diversity”

    01:05:53 Outro

    Show notes:


  • Top 10 U.S. Higher Ed Stories of 2024 with Robert Kelchen

    Top 10 U.S. Higher Ed Stories of 2024 with Robert Kelchen

    Robert Kelchen is a prolific higher education researcher and head of the University of Tennessee at Knoxville’s Department of Educational Leadership and Policy Studies. He is also a pretty steady blogger on higher education, but he doesn’t have the time to post quite as much as he did before he took on all those extra admin duties. One of the casualties of his reduced blogging schedule is that he no longer posts his regular “top ten” stories of the year in US higher education, which I, as an outsider, always found a handy way to keep track of what mattered over the long term in the US.

    But last year, Robert agreed to reprise his role of summarizer-in-chief for us on the year’s final pod, and the reaction was so positive that we thought we would have him on again for our final podcast of 2024. As always, Robert is sharp, succinct, and not one to shy away from unconventional calls. And so, without further ado, let’s hear Robert’s Top Ten.


    The World of Higher Education Podcast
    Episode 3.14 | Top 10 U.S. Higher Ed Stories of 2024 with Robert Kelchen

    Transcript

    Alex Usher (AU): Robert, let’s start things off. What’s your number 10 story this year?

    Robert Kelchen (RK): Number 10 out of the U.S. is more changes to big-time college athletics. It seems like things cannot stay stable, and that’s in part because there is so much money involved. So, the big changes this year are more teams changing athletic conferences. Everyone is trying to jockey for position in big-time college athletics to be on the right side of TV contracts. Never mind that the next round of TV contracts may look very different with people cutting the cord from cable. The other big piece is a landmark settlement with former athletes. That requires a financial settlement and then also athletes going forward are going to get about 20 percent or so of all revenue.

    AU: Gross revenue?

    RK: Yeah. So, this also affects the number of scholarships that programs can offer. Previously for big-time athletics, that number was limited. Now, it’s not limited. They focus more on roster sizes instead. This means colleges have some really tough financial choices to make. Because they have to pay athletes, and if they want programs to be competitive, they need to offer more scholarships. That means what will probably happen is some colleges are going to look at dropping sports to club status so they don’t have to pay for scholarships. While also keeping in mind they can’t just drop the women’s sports, at least under Title IX regulations. Although, who knows what’s going to happen for regulations.

    AU: We’ll get to that. We’ll get to that. Let’s move along to number nine.

    RK: Number nine is college closures. It always seems to hang on the list because we continue to see closures. We had a really chaotic closure in early June with the University of the Arts in Philadelphia. I don’t think they were on anyone’s radar for closing.

    Their public financials at the time looked decent, but then their accreditor stepped in, saying, “We’re going to shut you down,” and it happened within a week.

    It was apparently for financial reasons. And it wasn’t immediately obvious from the financial statements from, say, a year and a half ago, what was going on. But it seems like they just ran out of cash very quickly. And it got to the point where, with a week’s notice, students couldn’t finish, faculty couldn’t find jobs, and staff couldn’t find jobs. It was just the absolute worst way to do things.

    AU: Has the number of closures actually ticked up—I mean, you’ve made the point on many occasions that there are always program closures.

    RK: Yeah, you know, there are always program closures. They really did try to push a lot of the low-performing for-profits out, and there just aren’t as many now.

    But I think the big piece that’s coming now is not college closures as much as program closures and academic restructuring. It’s a great time to be a consultant in this industry. Because consultants are the ones brought in to help do the studies on this, identify programs that may need to be closed, and institutional leaders like it because someone else is making the tough calls.

    AU: What about number eight?

    RK: Does anyone want international students? They’ve been a cash cow for many institutions for a while now, but that’s beginning to change. Australia’s gotten the majority of the global news coverage on this, with their efforts to try to cap enrollment, which is really divisive there, especially among the more rural institutions that would like more international students. You’re seeing it in Canada and the UK, and the US is looking to move in that direction. That potentially creates opportunities in Southeast Asia or in Europe.

    Another wildcard in international students is what’s going to happen with both China and India? Where China is always at risk of having a major policy change, and there seems to be a fair amount of instability in India right now.

    AU: Number seven?

    RK: Number seven is state funding for higher education. There’s been a lot made in the U.S. about disinvestment in public higher education, but over the last decade or so, state funding for higher education in most states has been pretty strong. The states where it’s been the weakest are often the more politically liberal states, and that’s basically because they’ve had more longstanding budget issues. But a number of the more conservative states have funded pretty well, and state funding is at a two-decade high right now.

    I have a hard time seeing that continuing because state budgets have largely flatlined for the upcoming fiscal year. There have been some states that have gone down the route of tax cuts from post-pandemic money that’s starting to come due. But also, there’s just more skepticism about the value of public higher education. And there are states like Utah where enrollment is up substantially. But they’re looking at cutting funding and telling universities and colleges to expect less in the way of enrollment. This really creates the haves and have-nots in public higher education. The big-name public universities are growing like crazy. The regionally focused colleges are struggling mightily.

    AU: You’ve talked about a flight to quality among students. Is it likely that state funding starts to follow into the flagships more than it used to?

    RK: It depends in part on the funding model. If it’s an enrollment or performance funding type model, then that will happen. But also, states don’t want to see regional institutions fail. So they need to have some kind of capacity there.

    The big question that states have to wrestle with is how big they want their flagship institution to be. Do they want to push students to regional institutions? In some states, they have the governance structure in place to do that, even though it’s extremely politically painful. And in other states, there’s no centralization whatsoever, so there’s really nothing they can do about it.

    AU: What about number six?

    RK: Number six is the protests about the war in Gaza and the fall of several Ivy League presidents. I did some analysis back in the spring, and these protests really happened at only a fairly small number of colleges. But they happened at the institutions that policymakers care about — the super-elite private colleges and some of the big public flagships. Congressional Republicans found that hauling in college presidents — especially women of color — plays really well to their base. And I think that was one of the reasons behind Republican electoral success.

    AU: That appearance in front of Congress by the presidents of Penn, MIT, and Harvard really was kind of the flashpoint of the year, wasn’t it? I mean, two of them were out within a month of that appearance. It’s another example of Americans assuming that what happens at a very small handful of prominent private institutions is actually reflective of something bigger, isn’t it?

    RK: That’s exactly it. And one of the big reasons is that so many of the policymakers and so many of the journalists — that is their sphere, that’s what they know. We’re also seeing a really interesting dichotomy as President-elect Trump announces his key political appointments. He’s abolishing the Department of Education, reforming higher education, but at the same time, all his press releases highlight the colleges these people went to. So, he’s saying, “They went to NYU, they went to Penn,” while simultaneously dumping on them.

    AU: Robert, what about number five?

    RK: Number five is the increased political realignment by educational attainment. It used to be that if people had a bachelor’s degree, there was a pretty good chance they were pro-business Republicans. That was a substantial part of the base — part of what really kept the party going post-Reagan through the George W. Bush years.

    Then, I think we saw a bit of this starting with Obama, and then it really moved forward. The Democrats made substantial gains among college-educated individuals, especially those with postgraduate degrees. Then Trump came in 2016 and really accelerated the realignment, where college-educated individuals shifted to the Democratic Party, while non-college-educated individuals moved toward the Republican Party.

    That is a sea change to where pollsters now are focusing on weighting polls based on education instead of race or gender. There are still divides in those areas, of course. But what this means for higher ed is that higher education has long been relatively apolitical in the U.S. — probably had a 50-year run that way. But that has started to change dramatically, and that change threatens higher education enrollment as well as public support for the sector.

    AU: It’s tough for a public university. I mean, it’s like saying hospitals are Democrats, right? Or K-12 schools are Republican. It’s weird for a public institution to be identified as partisan. It can’t be easy for public university presidents to be in that position. What can they do? What are they doing to try to reverse that trend?

    RK: One piece of it is who becomes a president of a university or system. We’re seeing more politicians take on those roles. Some of them are unsuccessful, but some of them are very successful as they try to be the bridge between academics and the legislature.

    The other big piece is focusing on outreach and the public mission. Public higher education has two main advantages: one is community outreach, which includes things like agricultural extension classes and community programming. The other is athletics, such as football, which is a big driver of public support.

    AU: Okay, what about number four?

    RK: Number four is accreditation. It’s a topic that’s deep in the weeds for a lot of people, but it’s in the political spotlight right now.

    Two big examples stand out. One is the toughest accreditation job in the U.S., which is at the Southern Association of Colleges and Schools (SACS). We no longer have truly regional accreditation in the U.S. — that went away under the first Trump administration. But SACS is still largely focused on conservative southern states, and those states are not happy with accreditation. In Florida, for instance, they decided you have to switch accreditors every cycle. SACS President Belle Wheelan is retiring, and I have no idea who in the world would want that job. That is probably the most difficult job in American higher education.

    AU: What’s the potential impact of accreditation becoming more politicized?

    RK: Some of it is just administrative burden for higher ed. If institutions are expected to switch accreditors or if accreditation standards change constantly, that’s a lot of administrative cost.

    But the bigger issue is, will accreditors uphold basic standards? They’ve largely punted on academic standards because every time they try, they get sued. They often win those cases, but it’s expensive. So, accreditors have largely focused on finance. But the perception is that they’re focused too much on diversity, equity, and inclusion. SACS is actually the only major accreditor that does not require that.

    Another big pressure on accreditation is that several accreditors are now trying to push for shorter bachelor’s degrees. The U.S. traditionally has 120-credit bachelor’s degrees, but there’s a push for 90-credit degrees — shorter, faster, cheaper, better. There’s a strong rationale for it, but also concerns about educational quality. This could completely upend the higher ed finance system. If you get less revenue per student and you eliminate some of the upper-level courses, that might work. But it seems like they’re taking away more of the lower-level general education courses, and those courses subsidize other parts of the system.

    AU: Interesting. Okay, I think DEI has something to do with number three as well.

    RK: Yes. State governments are pushing higher education hard on more of these social issues. Texas and Florida have taken the lead on trying to ban any mention of diversity, equity, and inclusion. In a lot of conservative states — including mine — DEI is now known as “access and engagement” or “access and belonging” or something else. They don’t want to use those words because people expect emails and course syllabi to be searched for those terms.

    At the University of North Texas, for example, the new leader, who came from the Texas Higher Education Coordinating Board, required that all mentions of DEI be eliminated. They focused on the education school, which is also searching for a new dean.

    AU: But it’s gone beyond just excising words or renaming units. If I recall correctly, at North Texas, they were even getting rid of words like “racism” from course syllabi, which makes it hard to teach U.S. history, doesn’t it?

    RK: It does. There was a round of this about half a dozen years ago, where the response was to get rid of the words and keep doing the same thing. The legislatures did not like that, so now they’re trying to go back and root all of these out.

    AU: Alright, let’s move on. What’s number two? We’ve got to be coming pretty soon to the election, right?

    RK: We are. But I actually don’t think the election is number one this year. The election of Trump is a big deal, and it will have large effects on American higher education. Will the U.S. Department of Education go away? I’m still extremely skeptical of that. Every Republican since 1979 has said they want to abolish it, but it’s difficult to get rid of an agency. And also, Republicans may have unified control in Washington, D.C., but it’s by the skin of their teeth. They can afford to lose, I think, only two votes in the House of Representatives, and it’s a fractured caucus. They’ve got a lot of other priorities, too.

    Plus, you have members looking ahead to 2026 and wondering if they can get re-elected when the majority party typically loses seats in a midterm election. So, it’s going to be a very unsettled, interesting time. But I don’t see the Department of Education going away.

    The bigger question is, what can sneak its way onto that one bill each year that can be passed completely on a partisan basis? The U.S. has a mechanism called reconciliation, where anything with a budgetary impact can go through the Senate with just 50 votes instead of 60. So, that’s where the action will be.

    If they wanted to make changes to student loans, for example, that would have a direct budgetary impact, so it could be part of a reconciliation bill. The challenge is then uniting the Republican caucus. They’re not always well-aligned. And they’ll have to figure out their priorities. Is it immigration? Is it tax cuts, since the Trump tax cuts are set to expire at the end of 2025?

    And even within education, how big is their focus going to be on K-12 education versus higher education? If history is any guide, K-12 will get most of the attention.

    AU: We also have a new Secretary of Education. She seems quite different from Betsy DeVos. What do you expect from her?

    RK: Yeah, she’s definitely different. Her name’s Linda McMahon. She ran the Small Business Administration, and by all accounts, she got fairly good marks from employees over there. She’s actually one of the few high-level Trump appointees who did not go to an elite institution. She got a teaching certificate and a French degree from East Carolina University. I just found that fascinating. But I think it’s part of the strategy — put the person with a teaching credential in charge of the Department of Education. From a management perspective, she seems competent. From a policy perspective, it’s a little less clear.

    The stated goal is still to get rid of the Department of Education. But even if that’s their goal, actually pulling it off is another story. There’s legislation to basically break apart the department and shuffle its components into other federal agencies. But that’s a long, complicated process. I’d probably say the chances of it happening are maybe 5 to 10 percent at best.

    AU: Yeah, that sounds about right. Okay, bring us to number one.

    RK: Number one doesn’t come from the White House this year — it comes from the U.S. Supreme Court. And it’s a big one. The Supreme Court decision in Loper Bright overturned a 40-year-old precedent called Chevron. The Chevron doctrine gave federal agencies broad discretion to interpret laws where the statute was vague, and courts would generally defer to the agency’s interpretation. It was seen as a major source of power for the so-called “administrative state.”

    But conservatives have wanted to get rid of Chevron for years. They saw it as giving too much power to unelected bureaucrats. Well, they finally got what they wanted. The Supreme Court’s ruling says, “No more deference to agencies. If the statute isn’t clear, it’s Congress’s job to fix it.”

    AU: So why is that such a big deal for higher ed?

    RK: It’s a big deal because so much of higher education policy in the U.S. happens through administrative rulemaking. Look, the Higher Education Act hasn’t been reauthorized since 2008. Congress hasn’t done anything. So everything that’s happened since then — like changes to student loans, Title IX rules, and accreditation requirements — has been done through executive action or rulemaking by the Department of Education.

    With Loper Bright, that power is now significantly reduced. Agencies can no longer just “interpret” laws as they see fit. They need clear statutory authority from Congress.

    So, here’s the twist. Loper Bright was something conservatives pushed for because they didn’t like how Democratic administrations used Chevron to expand regulations on, say, environmental protection or labor standards. But now, with a Republican administration on the way, they’ve tied their own hands.

    If Trump wants to make big changes to higher education — like dismantling the Department of Education, reforming student loans, or changing Title IX — he’s going to have a harder time doing it through executive action. He’s going to need Congress, and Congress isn’t exactly known for its efficiency.

    AU: So, to summarize, when Democrats were in power, Chevron was seen as a bad thing because it gave them more power. But now, with a Republican in power, they’ve realized that Chevron would’ve been useful for them, too.

    RK: That’s it. It’s ironic, right? They dismantled their own ability to govern. And I think the Trump administration learned a lot the first time about how to effectively use executive authority. They were pretty bad at it in the early years, but they figured it out by the end. Well, now their hands are tied in some crucial areas.

    AU: So, in the end, the impact of the Trump presidency might be a lot less than people think because he won’t be able to wield executive power in the same way.

    RK: That’s quite possible.

    AU: Fascinating. Well, Robert, thank you so much for being with us today. It’s been a great ride, as always. We’ll see you back here in 12 months, and we’ll see how much has changed by the end of 2025.

    RK: Probably quite a bit.

    AU: Yeah, no doubt. Thanks, Robert. And it just remains for me to thank our excellent producers, Tiffany MacLennan and Sam Pufek, and of course, you — our listeners — for tuning in. If you have any questions or comments about today’s episode, feel free to reach out to us at [email protected]. And don’t forget to subscribe to our YouTube channel so you never miss an episode of The World of Higher Education.

    We’ll be back on January 9th with our first episode of the new year. Our guest is a mystery for now — you’ll just have to wait and see. Stay well, have a good holiday season, and bye for now.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service.


  • Institutional constraints to higher education datafication: an English case study

    Institutional constraints to higher education datafication: an English case study

    by Rachel Brooks

    ‘Intractable’ datafication?

    Over recent years, both policymakers and university leaders have extolled the virtues of moving to a more metricised higher education sector: statistics about student satisfaction with their degree programme are held to improve the decision-making processes of prospective students, while data analytics are purported to help the shift to more personalised learning, for example. Moreover, academic studies have contended that datafication has become an ‘intractable’ part of higher education institutions (HEIs) across the world.

    Nevertheless, our research into data use with respect to widening participation to undergraduate ‘sandwich’ courses (where students spend a year on a work placement, typically during the third year of a four-year degree programme) – conducted in ten English HEIs and funded by TASO – indicates that, despite the strong claims about the advantages of making more and better use of data, in this particular area of activity at least, significant constraints operate, limiting the advantages that can accrue through datafication.

    Little evidence of widespread data use

    Our interviewees were those responsible for sandwich course provision in their HEI. While most thought that data could offer useful insights into the effectiveness of their area of activity, there was little evidence of ‘intractable’ data use. This was for three main reasons. First, in some cases, interviewees explained that no relevant data were collected – in relation to access to sandwich courses and/or the outcomes of such courses. Second, in some HEIs, relevant data were collected but not analysed. Such evidence tends to support the contention that ‘data lakes’ are emerging, as HEIs collect more and more data that often remain untapped. Third, in other cases, appropriate data were collected and analysed, but in a very limited manner. For example, one interviewee explained how data were collected and analysed in relation to the participation of students from under-represented ethnic groups, but not with respect to any other widening participation categories. This limited form of datafication, in which only some social characteristics were datafied, was not, therefore, able to inform any action with respect to the participation of widening participation students generally. Indeed, across all ten HEIs, there was only one example of where data were used in a systematic fashion to help analyse who was accessing sandwich courses within the institution, and the extent to which they were representative of the wider student population.

    Constraints on data use

    Lack of institutional capacity

    In explaining this absence of data use, the most commonly identified constraint was the lack of institutional capacity to collect and/or analyse appropriate data. For example, one interviewee commented that they did not have a very good data system for placements – ‘we are still quite Excel-based’. Excel spreadsheets were viewed as limited because they could not easily be shared or updated, and the data were relatively hard to manipulate. This, according to the interviewee, made the collection of appropriate data laborious, and systematic analysis of the data difficult. Interviewees also pointed to the limited time staff had available to analyse data that the institution had collected.

    Prioritisation of ‘externally-facing’ data

    Several interviewees described how ‘externally-facing data’ – i.e. that required by regulatory bodies and/or that fed into national and international league tables – was commonly prioritised, leaving little time for information officers to devote to generating and/or analysing data for internal purposes. One interviewee, for example, was unclear about what data, if any, were collected about equity gaps but believed that they were generally only pulled together for high-level reports ‘such as for the TEF’.

    Institutional cultures

    A further barrier to using data to analyse access to and outcomes of sandwich courses was perceived to be the wider culture of the institution, including its attitude to risk. An interviewee explained that the data collected in their institution was limited to two main variables – subject of study and fee status (home or international) – because of ‘ongoing cautiousness at the university about how some of that data is used and how it’s shared with different teams’.

    In addition, many participants outlined the struggles they had faced in gaining access to relevant data, and in influencing decisions about what should be collected and what analyses should be run. Several spoke of having to ‘request’ particular analyses to be run (which could be turned down), leading to a fairly ad hoc and inefficient way of proceeding, and illustrating the relative lack of agency accorded to staff – typically occupying mid-level organisational roles – in accessing and manipulating data.

    Reflections

    Examining a discrete set of activities within the UK higher education sector – those relating to sandwich courses – provides a useful lens to examine quotidian practices with respect to the availability and use of data. Despite the strong emphasis on data by government bodies and HEI senior management teams, as well as the claims made about the ‘intractability’ of HEI data use in the academic literature, our research suggests that datafication is perhaps not as widespread as some have claimed. Indeed, it indicates that some areas of activity – even those linked to high profile political and institutional priorities (in this case, employability and widening participation) – have remained largely untouched by ‘intractable’ datafication, with relevant data either not being collected or, where it is collected, not being made available to staff working in pertinent areas.

    As a consequence, the extent to which students from widening participation backgrounds were accessing sandwich courses – and then succeeding on them – relative to their peers typically remained invisible. While the majority of our interviewees were able to speculate on the extent of any under-representation and/or poor experience, this was typically on the basis of anecdotal evidence and their own ‘sense’ of how inequalities were played out in this area. Although reflecting on professional experience is obviously important, many inequalities may not be visible to staff (for example, if a student chooses not to talk about their neurodiversity or first-in-family status), even if they have regular contact with those eligible to take a sandwich course. Moreover, given the status often accorded to quantitative data within the senior management teams of universities, the lack of any statistical reporting about inequalities by social characteristic, as they pertain to sandwich courses, makes it highly likely that such issues will struggle to gain the attention of senior leaders. The barriers to the effective use of metrics highlighted above may thus have a direct impact on HEIs’ capacity to recognise and address inequalities.  

    The research on which this blog is based was carried out with Jill Timms (University of Surrey) and is discussed in more detail in the article ‘Institutional constraints to higher education datafication: an English case study’, published in Higher Education.

    Rachel Brooks is Professor of Higher Education at the University of Oxford and current President of the British Sociological Association. She has conducted a wide range of research on the sociology of higher education; her most recent book is Constructing the Higher Education Student: perspectives from across Europe, published (open access) with Policy Press.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education
