Tag: Paper

  • Innovation and skills in the English devolution white paper

    Devolution is a central plank of the government’s growth agenda. Providing places with the tools and resources to address local problems in ways that make sense on the ground is a means to unleash potential – and to end what English devolution minister Jim McMahon is happy to call the “top-down micromanaging” approach of ringfencing funds and centralising decision making.

    The launch of the English devolution white paper is the first step on that journey. Strategic authorities, led (for preference) by elected mayors, will cover the entirety of England. Integrated settlements will provide powers covering transport, infrastructure, housing, public services – and, of particular interest to the higher education sector, skills and innovation.

    A big part of the work of the white paper is in consolidating and standardising what had become an unruly system. Sitting above unitary, county, and district councils, a layer of strategic authorities will take on the services that larger areas need to thrive:

    Our goal is simple. Universal coverage in England of Strategic Authorities – which should be a number of councils working together, covering areas that people recognise and work in. Many places already have Combined Authorities that serve this role.

    The forthcoming English Devolution Bill will enshrine this concept in law. We get a computer game-like hierarchy of how strategic authorities will level up (so to speak): foundation strategic authorities (which do not – yet – have a mayor), followed by mayoral strategic authorities, which can then “unlock” designation as established mayoral strategic authorities through fulfilling various criteria. This will grant integrated funding settlements and other treats such as the ability to pilot new kinds of devolution.

    Already eligible for this top designation are Greater Manchester, Liverpool City Region, the North East, South Yorkshire, West Midlands, and West Yorkshire. There’s an aspiration for something similar to apply to London as well, but some legislative fiddling will be needed due to the capital’s “unique circumstances.”

    Innovation

    If you’ve got your head around the different levels of hierarchy, there’s actually quite a lot in the white paper for research and innovation, dependent on an area’s level of devolution.

    In language echoing the industrial strategy green paper, we are told that a strong local network of public and private institutions focused on R&D, innovation, and the diffusion of ideas “is one of the factors which sets highly productive local economies apart.” A big part of this is closer join-up between UKRI and local government.

    Working our way up the devolution ladder, all strategic authorities (including foundation level) will be able to draw on UKRI data on the location of R&D investments, to better allow them to “understand publicly supported innovation activity in their region and how to best take advantage of it.”

    Those mayoral strategic authorities will additionally work with Innovate UK to produce joint plans, to shape long-term innovation strategies and investments in places. UKRI will also be extending its regional partnerships and “network of embedded points of contact” with mayoral strategic authorities.

    And then coming up to the pinnacle of devolution, those established mayoral strategic authorities – to remind you: Greater Manchester, Liverpool City Region, the North East, South Yorkshire, West Midlands, and West Yorkshire, and possibly London – will get actual devolved research funding, in the form of a future regional innovation funding programme allowing local leaders to develop “bespoke innovation support offers for their regions.”

    This draws somewhat on the spirit of the Regional Innovation Fund, though this was allocated to individual higher education institutions – what’s on offer here sounds like a pot of money controlled by mayors. Its format is also to be based on lessons learned from the Innovation Accelerator pilot, which was funded by levelling up money.

    Plus, established mayoralties will get an annual meeting with the science minister, more regular engagement with senior staff at UKRI, and the chance to be consulted on the development of relevant DSIT and UKRI strategies.

    All in all, it’s a decent start down the road of a more significantly devolved research landscape. Important to note, however, that the actual funding on offer to established mayors is contingent on next year’s spending review, and so we’re talking about 2026–27 onwards here. And we might also observe that the House of Commons science committee’s inquiry into regional R&D, announced last week, has clearly been set up with an eye to influencing how this all comes together.

    At least to begin with, there will also be a not insignificant gap between what’s on offer to the most established sites of devolution – some funds to spend as desired, a seat at the strategy table – and what those “foundation” strategic authorities receive, which will be little more than a bit of regional R&D data. There’s potential for imbalance between regions here. Foundation-level authorities are described as a “stepping stone” to later acquiring a mayor, but it could be a long and drawn-out process.

    Skills and more

    On skills, strategic authorities will retain ownership of the Adult Skills Fund (with ringfencing removed from bootcamp and free course pots to allow for flexibility), take on joint ownership of Local Skills Improvement Plans alongside employer representative bodies, and work with employers to take on responsibility for promoting 16-19 pathways. In future, strategic authorities will have a “substantial role” in careers and employment support design outside of the existing Jobcentre Plus network, as the Get Britain Working white paper gestured towards.

    You’ll have spotted that this does not immediately extend to higher education, except to the extent that universities and colleges already get involved with adult skills provision. However, the centre of gravity is such that any provider with an avowed interest in its local area will end up developing close relationships with strategic authorities. It isn’t just skills or innovation – many universities work with local government on issues that affect students (and staff!), such as housing, infrastructure, and transport, and will have a strong interest in working with strategic authorities that have new and wider powers to act.

    Administrative geography corner

    If you are labouring under the impression that dividing England up into administrative chunks is a fairly straightforward task, may we introduce you to possibly the single finest document ever published by the Office for National Statistics: the Hierarchical Representation of UK Geographies.

    Pedants may also note that the existing geography of LSIPs, which was controversially allowed to develop outside established local authority boundaries, does not map cleanly to current or proposed local authorities – something that a future iteration of these plans may need to consider. Likewise, the scope of university core recruitment areas or civic aspirations may not map cleanly to either.

    What we’d have loved to have shown you is a map showing which of the new strategic authorities your campus might be in. Sadly the boundaries of the “current map of English devolution” included in the white paper do not cleanly map to England’s many contradictory systems of administrative geography. Some of the devolved areas depicted are almost LSIP regions, one (Surrey) is a non-metropolitan ceremonial county, and one – Devon, including Torbay but not under any circumstances Plymouth – is just plain mad.

    As soon as we get an answer and some boundaries from ONS, we’ll let you know. In the meantime, here’s the map from the white paper:


  • CBE Learning Platform Architecture White Paper

    Earlier this year, I had the pleasure of consulting for the Education Design Lab (EDL) on their search for a Learning Management System (LMS) that would accommodate Competency-Based Education (CBE). While many platforms, especially in the corporate Learning and Development space, talked about skill tracking and pathways in their marketing, the EDL team found a bewildering array of options that looked good in theory but failed in practice. My job was to help them separate the signal from the noise.

    It turns out that only a few defining architectural features of an LMS will determine its fitness for CBE. These features are significant but not prohibitive development efforts. Rather, many of the firms we talked to, once they understood the true core requirements, said they could modify their platforms to accommodate CBE but do not currently see enough demand among customers to invest the resources required.

    This white paper, which outlines the architectural principles I discovered during the engagement, is based on my consulting work with EDL and is released with their blessing. In addition to the white paper itself, I provide some suggestions for how to move the vendors and a few comments about other missing pieces in the CBE ecosystem that may be underappreciated.

    The core principles

    The four basic principles for an LMS or learning platform to support CBE are simple:

    • Separate skill tree: Most systems have learning objectives that are attached to individual courses. The course is about the learning objectives. One of the goals of CBE is to create more granular tracking of progress that may run across courses. A skill learned in one course may count toward another. So a CBE platform must include a skill tree as a first-class citizen of the architecture, separate from the course.
    • Mastery learning: This heading includes a range of features, from standardized and simplified grading (e.g., competent/not-yet) to gates in which learners may only pass to the next competency after mastering the one they’re on. Many learning platforms already have these features. But they are not tied to a separate skill tree in a coherent way that supports mastery learning. This is not a huge development effort if the skill tree exists. And in a true CBE platform, it could mean being able to get rid of the grade book, which is a hideous, painful, never-ending time sink for LMS product developers. (A minimal sketch of how these first two principles might fit together appears after this list.)
    • Integration: In a traditional learning platform, the main integration points are with the registrar or talent management system (tracking registrations and final scores) and external tools that plug into the environment. A CBE platform must import skills, export evidence of achievement, and sometimes work as a delivery platform that gets wrapped into somebody else’s LMS (e.g., a university course built and run on their learning platform but appearing in a window of a corporate client’s learning platform). Most of these are not hard if the first two requirements are developed but they can require significant amounts of developer time.
    • Evidence of achievement: CBE standards increasingly lean toward rich packages that provide not only certification of achievement but also evidence of it. That means the learner’s work must be exportable. This can get complicated, particularly if third-party tools are integrated to provide authentic assessments.

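    Before getting to the full paper, here is a deliberately minimal sketch of how the first two principles (a course-independent skill tree plus mastery gates) might be modelled. It is illustrative only, not drawn from the white paper or any vendor’s product: every name in it (Competency, CourseActivity, record_judgement, may_attempt) is hypothetical and does not correspond to any interoperability standard.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical data model: competencies live in their own tree, not inside courses.
    @dataclass
    class Competency:
        id: str
        title: str
        parent_id: str | None = None  # position in the skill tree, independent of any course

    # A course activity merely *references* the competencies it provides evidence for,
    # so the same competency can be assessed in more than one course.
    @dataclass
    class CourseActivity:
        id: str
        course_id: str
        assesses: list[str] = field(default_factory=list)  # competency ids

    # Mastery is recorded against the competency, not the course, so progress earned
    # in one course can count toward any other course that references the same skill.
    mastery: dict[tuple[str, str], str] = {}  # (learner_id, competency_id) -> "competent" | "not-yet"

    def record_judgement(learner_id: str, competency_id: str, judgement: str) -> None:
        mastery[(learner_id, competency_id)] = judgement

    def may_attempt(learner_id: str, competency_id: str,
                    prerequisites: dict[str, list[str]]) -> bool:
        """Mastery-learning gate: a learner may attempt a competency only after all
        of its prerequisites have been judged 'competent'."""
        return all(mastery.get((learner_id, p)) == "competent"
                   for p in prerequisites.get(competency_id, []))
    ```

    Notice what is absent from a sketch like this: there is no grade book, only judgements recorded against competencies, which is exactly the simplification described under mastery learning above.
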
    The full white paper is here:

    (The download button is in the top right corner.)

    Getting the vendors to move

    Vendors are beginning to move toward support for CBE, albeit slowly and piecemeal. I emphasize that the problem is not a lack of capability on their part to support CBE. It’s a lack of perceived demand. Many platform vendors can support these changes if they understand the requirements and see strong demand for them. CBE-interested organizations can take steps to accelerate vendor progress.

    First, provide the vendors with this white paper early in the selection process and tell them that your decision will be partly driven by their demonstrated ability to support the architecture described in the paper. Ask pointed questions and demand demos.

    Second, go to interoperability standards bodies like 1EdTech and work with them to establish a CBE reference architecture. Nothing in the white paper requires new interoperability standards any more than it requires a radical, ground-up rebuild of a learning platform. But if a standards body were to put them together into one coherent picture and offer a certification suite to test for the integrations, it could help. (Testing for the platform-internal functionality like competency dashboards is often outside the remit of interoperability groups, although there’s no law preventing them from taking it on.)

    Unfortunately, the mere existence of these standards and tests doesn’t guarantee that vendors will flock to implement CBE-friendly architectures. But the creation process can help rally a group that demonstrates demand, while the standard itself makes the bar vendors have to meet clear and verifiable.

    What’s still missing

    Beyond the learning platform architecture, I see two pieces that seem to be under-discussed amid the impressive amount of CBE interoperability and coalition-building work that’s been happening lately. I already wrote about the first, which is capturing real job skills in real-time at a level of fidelity that will convince employers your competencies are meaningful to them. This is a hard problem, but it is becoming solvable with AI.

    The second one is tricky to even characterize but it has to do with the content production pipeline. Curricular materials publishers, by and large, are not building their products in CBE-friendly ways. Between the weak third-party content pipeline and the chronic shortage of learning design talent relative to the need, CBE-focused institutions often either tie themselves in knots trying to solve this problem or throw up their hands, focusing on authentic certification and mentoring. But there’s a limit to how much you can improve retention and completion rates if you don’t have strong learning experiences, including formative assessments that enable you to track students’ progress toward competency, address the sticking points in learning particular skills, and so on. This is a tough bind since institutions can’t ignore the quality of learning materials, can’t rely on third parties, and can’t keep up with demand themselves.

    Adding to this problem is a tendency to follow the CBE yellow brick road to what may look like its logical conclusion of atomizing everything. I’m talking about reusable learning objects. I first started experimenting with them at scale in 1998. By 2002, I had given up, writing instead about instructional design techniques to make recyclable learning objects. And that was within corporate training—as it is, not as we imagine it—which tends to focus on a handful of relatively low-level skills for limited and well-defined populations. The lack of a healthy Learning Object Repository (LOR) market should tell us something about how well reusable learning object strategy holds up under stress.

    And yet, CBE enthusiasts continue to find it attractive. In theory, it fits well with the view of smaller learning chunks that show up in multiple contexts. In practice, the LOR usually does not solve the right problems in the right way. Version control, discoverability, learning chunk size, and reusability are all real problems that have to be addressed. But because real-world learning design needs often can’t be met with content legos, starting from a LOR and adding complexity to fix its shortcomings usually brings a lot of pain without commensurate gain.

    There is a path through this architectural mess, just like there is a path through the learning platform mess. But it’s a complicated one that I won’t lay out in detail here.


  • Ben Sasse and Paper Tigers in Academia

    This quote caught my eye in the Gainesville Sun today. It is about Ben Sasse, the likely new president of UF, and faculty opposition: “I think many of my colleagues feel that his academic credentials are not where we would have wanted them to be.”

    I’ve deleted the name of the person quoted because that quote is representative of how law professors speak. They say things that mean nothing or, put differently, allow for total deniability while at the same time stirring the pot ever so gently. It’s the reason I was always an outsider in the Ivory Tower.

    The statement, and those of law professors generally, reminds me of something John Cage said: “I have nothing to say and I am saying it.”

    For example, note the speaker only “thinks” this could be the case. This leaves room to say, if asked to defend the statement, “It’s only what I thought or the impression I had. I could be wrong.” And then there is the word “many.” What is “many”? Is it 12? Could be. Is it a majority? Maybe, maybe not.

    This reminds me of what I call faculty trolling. For example, say you think someone up for tenure does not deserve it but you are too much of a wuss to say it. You go office to office and say, “I have heard that some people are concerned about Joe’s (the candidate) scholarship.” Not you, of course, unless the person you are talking to says something like “Yes, I too was wondering about this.” If that is the response, the troller has hit pay dirt and gets a movement started without ever actually taking a position. If the answer is “I have not heard anything about that,” the troller moves on to the next office.

    And could someone tell me what “where we would have wanted them to be” means? How about “are not satisfactory”? What on earth does “where we would have wanted them to be” actually say? “We would have.” Would have what? In a different universe? On Mars?

    But wait. In the same passage the speaker does use the word “we,” which includes “I.” So it could mean “I wish his credentials were better.” The problem is nearly everyone wishes everything were better. I wish my car got better mileage, but what it gets is fine. I wish my dinner had been better last night, but it was fine. Wishing for better or wanting better is saying nothing.

    So what would my quote have been if the Sun had asked me? “I can’t speak for everyone, but his academic credentials make him unfit. In addition, he is obviously the product of a rigged search that was guaranteed to produce a candidate to the liking of our right-wing, mean-spirited Governor.”
