Category: Academic Professional Development

  • Anticipating Impact of Educational Governance – Sijen


    It was my pleasure last week to deliver a mini-workshop at the Independent Schools of New Zealand Annual Conference in Auckland. Intended to be more dialogue than monologue, I’m not sure if it landed quite where I had hoped. It is an exciting time to be thinking about educational governance and my key message was ‘don’t get caught up in the hype’.

    Understanding media representations of “Artificial Intelligence”.

    Mapping types of AI in 2023

    We need to be wary of the hype around the term AI, Artificial Intelligence. I do not believe there is such a thing. Certainly not in the sense the popular press purports it to exist, or deems to have sprouted into existence with the advent of ChatGPT. What there is, is a clear exponential increase in the capabilities demonstrated by computational algorithms. These computational capabilities do not represent intelligence in the sense of sapience or sentience. They are not informed by the senses derived from an organic nervous system. However much we perceive these systems to mimic human behaviour, it is important to remember that they are machines.

    This does not negate the criticisms of those researchers who argue that there is an existential risk to humanity if A.I. is allowed to continue to grow unchecked in its capabilities. The language in this debate presents a challenge too. We need to acknowledge that intelligence means something different to the neuroscientist than to the philosopher, and something different again to the psychologist and the social anthropologist. These semiotic discrepancies become unbridgeable when we start to talk about consciousness.

    In my view, there are no current Theory of Mind applications… yet. Sophia (Hanson Robotics) is designed to emulate human responses, but it does not display either sapience or sentience.

    What we are seeing, in 2023, is the extension of the ‘memory’, or scope of data inputs, into larger and larger multi-modal language models, which are programmed to see everything as language. The emergence of these polyglot super-savants is remarkable, and we are witnessing their unplanned and (in my view) cavalier mass deployment.

    Three ethical spheres for Governing Boards to reflect on in 2023

    Ethical and Moral Implications

    Educational governing bodies need to stay abreast of the societal impacts of Artificial Intelligence systems as they become more pervasive. This is more important than having a detailed understanding of the underlying technologies or the way each school’s management decides to establish policies. Boards are required to ensure such policies are in place, are realistic, can be monitored, and are reported on.

    Policies should already exist around the use of technology in supporting learning and teaching, and these can, and should, be reviewed to ensure they stay current. There are also policy implications for admissions and recruitment, and for selection processes (of both staff and students); where A.I. is being used, Boards need to ensure that, wherever possible, no systemic bias is evident. I believe Boards would benefit from devising their own scenarios and discussing them periodically.

     


  • Journal of Open, Flexible and Distance Learning (JOFDL) Vol 26(2) – Sijen


    It is my privilege to serve alongside Alison Fields as co-editor of the Journal of Open, Flexible and Distance Learning, an international, high-quality, peer-reviewed academic journal. I also have a piece in this issue entitled ‘Definitions of the Terms Open, Distance, and Flexible in the Context of Formal and Non-Formal Learning’.

    Issue 26 (2) of the Journal of Open, Flexible and Distance Learning (JOFDL) is now available to the world. It begins with an editorial looking at readership and research trends in the journal post-COVID, followed by a thought-provoking Invited Article about the nature of distance learning by Professor Jon Dron. This general issue continues with seven articles on different aspects of research after COVID-19.
    Alison Fields and Simon Paul Atkinson, JOFDL Joint Editors. 

    Editorial

    Post-pandemic Trends: Readership and Research After COVID-19
    Alison Fields, Simon Paul Atkinson
    pp. 1–6

    Invited Article

    Technology, Teaching, and the Many Distances of Distance Learning
    Jon Dron
    pp. 7–17

    Position Piece

    Definitions of the Terms Open, Distance, and Flexible in the Context of Formal and Non-Formal Learning
    Simon Paul Atkinson
    pp. 18–28

    Articles – Primary studies

    The Role of Non-Verbal Communication in Asynchronous Talk Channels
    Hulbert and Koh

    An Initial Assessment of Soft Skills Integration in Emergency Remote Learning During the COVID-19 Pandemic: A Learners’ Perspective
    Leomar Miano

    Supporting English Language Development of English Language Learners in Virtual Kindergarten: A Parents’ Perspective

    Parents’ Experience with Remote Learning during COVID-19 Lockdown in Zimbabwe
    Lockias Chitanana

    First-year Secondary Students’ Perceptions of the Impact of iPad Use on Their Learning in a BYOD Secondary International School
    Martin Watts & Ioannis Andreadis

    Teaching, Engaging, and Motivating Learners Online Through Weekly, Tailored, and Relevant Communication: Academic Content, Information for the Course, and Motivation (AIM)



  • Book on Writing Good Learning Outcomes – Sijen


    Introducing a short guide entitled “Writing Good Learning Outcomes and Objectives”, aimed at enhancing the learner experience through effective course design. Available at https://amazon.com/dp/0473657929

    The book has sections on the function and purpose of intended learning outcomes as well as guidance on how to write them with validation in mind. Sections explore the use of different educational taxonomies as well as some things to avoid, and the importance of context. There is also a section on ensuring your intended learning outcomes are assessable. The final section deals with how you might go about designing an entire course structure based on well-structured outcomes, breaking these outcomes down into session-level objectives that are not going to be assessed.

    #ad #education #highereducation #learningdesign #coursedesign #learningoutcomes #instructionaldesign



  • Empower Learners for the Age of AI: a reflection – Sijen


    During the Empower Learners for the Age of AI (ELAI) conference earlier in December 2022, it became apparent to me personally that not only does Artificial Intelligence (AI) have the potential to revolutionize the field of education, it is already doing so. But beyond the hype and enthusiasm there are enormous strategic policy decisions to be made, by governments, institutions, faculty and individual students. Some of the ‘end is nigh’ messages circulating on social media in the light of the recent release of ChatGPT are fanciful click-bait; some, however, fire a warning shot across the bow of complacent educators.

    It is certainly true to say that if your teaching approach is to deliver content knowledge and assess the retention and regurgitation of that same content knowledge then, yes, AI is another nail in that particular coffin. If you are still delivering learning experiences the same way that you did in the 1990s, despite Google Search (b.1998) and Wikipedia (b.2001), I am amazed you are still functioning. What the emerging fascination with AI is delivering is an accelerated pace to the self-reflective processes that all university leadership should be undertaking continuously.

    AI advocates argue that by leveraging the power of AI, educators can personalize learning for each student, provide real-time feedback and support, and automate administrative tasks. Critics argue that AI dehumanises the learning process, is incapable of modelling the very human behaviours we want our students to emulate, and that AI can be used to cheat. Like any technology, AI also has its disadvantages and limitations. I want to unpack these from three different perspectives: the individual student, faculty, and institutions.


    Get in touch with me if your institution is looking to develop its strategic approach to AI.


    Individual Learner

    For learners whose experience is often orientated around learning management systems, or virtual learning environments, existing learning analytics are being augmented with AI capabilities. Where in the past students might be offered branching scenarios that were preset by learning designers, the addition of AI functionality offers the prospect of algorithms that more deeply analyze a student’s performance and learning approaches, and provide customized content and feedback tailored to their individual needs. This is often touted as especially beneficial for students who may have learning disabilities or those who are struggling to keep up with the pace of a traditional classroom, but surely the benefit is universal when realised. We are not quite there yet. Identifying ‘actionable insights’ is possible; the recommended actions are harder to define.
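
    To illustrate that gap, here is a purely hypothetical sketch of how an ‘actionable insight’ might be surfaced from simple engagement data; the field names and thresholds are invented rather than drawn from any particular platform, and deciding which intervention actually fits each flagged student remains a human judgement.

```python
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    """Minimal, hypothetical analytics record for one student."""
    student_id: str
    logins_last_7_days: int
    avg_quiz_score: float      # 0.0 to 1.0
    submissions_missed: int

def actionable_insights(records):
    """Flag students who may need support.

    The 'insight' (who looks at risk) is easy to compute; the recommended
    *action* still requires a human judgement about why engagement has
    dropped and which intervention fits.
    """
    flagged = []
    for r in records:
        if (r.logins_last_7_days < 2
                or r.avg_quiz_score < 0.5
                or r.submissions_missed > 1):
            flagged.append(f"{r.student_id}: low engagement or performance - refer to tutor")
    return flagged

# Example usage with made-up data.
cohort = [
    EngagementRecord("s001", 5, 0.82, 0),
    EngagementRecord("s002", 1, 0.44, 2),
]
print(actionable_insights(cohort))  # flags s002 only
```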

    The downside for the individual learner will come from poorly conceived and implemented AI opportunities within institutions. Being told to complete a task by a system, rather than by a tutor, will be received very differently depending on the epistemological framework that you, as a student, operate within. There is a danger that companies presenting solutions that may work for continuing professional development will fail to recognise that a 10-year-old has a different relationship with knowledge. As an assistant to faculty, AI is potentially invaluable; as a replacement for tutor direction, it will not work for the majority of younger learners within formal learning programmes.

    Digital equity becomes important too. There will undoubtedly be students today, from K-12 through to university, who will be submitting written work generated by ChatGPT. Currently free, for ‘research’ purposes (them researching us), ChatGPT is being raved about across social media platforms by anyone who needs to author content. But for every student who is digitally literate enough to have found their way to the OpenAI platform and can use the tool, there will be others who do not have access to a machine at home, or the bandwidth to make use of the internet, or even to have the internet at all. Merely accessing the tools can be a challenge.

    The third aspect of AI implementation for individuals is around personal digital identity. Everyone, regardless of their age or context, needs to recognise that ‘nothing in life is free’. Whenever you use a free web service you are inevitably being mined for data, which in turn allows the provider of that service to sell your presence on their platform to advertisers. Teaching young people about the two fundamental economic models that operate online, subscription services and surveillance capitalism, MUST be part of every curriculum. I would argue this needs to be introduced in primary schools and built on in secondary. We know that AI data models require huge datasets to be meaningful, so our data is what fuels these AI processes.

    Faculty

    Undoubtedly faculty will gain from AI algorithms’ ability to provide real-time feedback and support, continuously monitoring a student’s progress and offering immediate suggestions for improvement. On a cohort basis this is proving invaluable already, allowing faculty to adjust the pace or focus of content and learning approaches. A skilled faculty member can also, within the time allowed to them, differentiate their instruction, helping students to stay engaged and motivated. Monitoring students’ progress through well-structured learning analytics is already available through online platforms.

    What of the in-classroom teaching spaces? One of the sessions at ELAI showcased AI operating in a classroom, interpreting students’ body language, interactions and even eye tracking. Teachers will tell you that class sizes are a prime determinant of student success. Smaller classes mean that teachers can ‘read the room’ and adjust their approaches accordingly. AI could allow class sizes to grow well beyond anything an individual faculty member could reasonably claim to manage.

    One could imagine a school built with extensive surveillance capability: every classroom equipped with total audio and visual detection, physical-behaviour algorithms, eye tracking and audio analysis. In that future, advocates would suggest that the role of the faculty becomes more that of a stage manager than a subject authority. Critics would argue that a classroom without a meaningful human presence is a factory.

    Institutions

    The attraction for institutions of AI is the promise to automate administrative tasks, such as grading assignments and providing progress reports, currently provided by teaching faculty. This in theory frees up those educators to focus on other important tasks, such as providing personalized instruction and support.

    However, one concern touched on at ELAI was the danger of AI reinforcing existing biases and inequalities in education. An AI algorithm is only as good as the data it has been trained on. If that data is biased, its decisions will also be biased. This could lead to unfair treatment of certain students, and could further exacerbate existing disparities in education. AI will work well with homogeneous cohorts where the perpetuation of accepted knowledge and approaches is what is expected, and less well with diverse cohorts where assumptions are there to be challenged.

    This is a problem. In a world in which we need students to be digitally literate and AI literate, to challenge assumptions but also to recognise that some sources are verified and others are not, institutions that implement AI based on existing cohorts are likely to restrict the intellectual growth of those that follow.
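
    As a minimal sketch of why biased training data produces biased decisions, the hypothetical example below ‘trains’ on a deliberately skewed admissions history and faithfully reproduces that skew; the data, groups and threshold are all invented for illustration.

```python
from collections import defaultdict

# Invented, deliberately skewed admissions history: (school_type, admitted).
# Applicants from school type "A" were admitted far more often than type "B".
history = [("A", True)] * 80 + [("A", False)] * 20 + \
          [("B", True)] * 20 + [("B", False)] * 80

def train(data):
    """'Learn' the admission rate per group - a stand-in for any model
    that absorbs whatever patterns are present in its training data."""
    counts = defaultdict(lambda: [0, 0])  # group -> [admitted, total]
    for group, admitted in data:
        counts[group][0] += int(admitted)
        counts[group][1] += 1
    return {g: a / t for g, (a, t) in counts.items()}

def decide(model, group):
    """Admit if the learned admission rate for the group exceeds 50%."""
    return model[group] > 0.5

model = train(history)
print(decide(model, "A"))  # True  - the bias in the data...
print(decide(model, "B"))  # False - ...is reproduced in the decision
```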

    Institutions rightly express concerns about the cost of both implementing AI in education and monitoring its use. While the initial investment in AI technologies may be significant, the long-term cost savings and potential benefits may make it worthwhile. No one can be certain how the market will unfurl. It is possible that many AI applications will become so cheap under some model of surveillance capitalism as to be negligible in cost, even free. However, many AI applications, such as ChatGPT, consume enormous computing power, little of which is cacheable or retained for reuse, and these are likely to remain costly.

    Institutions wanting to explore the use of AI are likely to find they are being presented with additional, or ‘upgraded’ modules to their existing Enterprise Management Systems or Learning Platforms.

    Conclusion

    It is true that AI has the potential to revolutionize the field of education by providing personalized instruction and support, delivering real-time feedback, and automating administrative tasks. However, institutions need to be wary of the potential for bias, aware of privacy issues and very attentive to the nature of the learning experiences they enable.


    Get in touch with me if your institution is looking to develop its strategic approach to AI.


    Image created using DALL-E
