Tag: Ways

  • How it Breaks in Subtle Ways –

    In my last post, I explained how generative AI memory works and why it will always make mistakes without a fundamental change in its foundational technology. I also gave some tips for working around that problem so that imperfect AI can be incorporated safely and productively into EdTech (and other uses). Today, I will draw on that memory issue as a case study of why embracing our imperfect tools also means recognizing where they are likely to fail us and thinking realistically about how to deal with their limitations.

    This is part of a larger series I’m starting on a term of art called “product/market fit.” The simplest explanation of the idea is the degree to which the thing you’re building is something people want and are willing to pay the cost for, monetary or otherwise. In practice, achieving product/market fit is complex, multifaceted, and hard. This is especially true in a sector like education, where different contextual details often create the need for niche products, where the buyer, adopter, and user of the product are not necessarily the same, and where measurable goals to optimize your product for are hard to find and often viewed with suspicion.

    Think about all the EdTech product categories that were supposed to be huge but disappointed expectations. MOOCs. Learning analytics. E-portfolios. Courseware platforms. And now, possibly OPMs. The list goes on. Why didn’t these product categories achieve the potential that we imagined for them? There is no one answer. It’s often in the small details specific to each situation. AI in action presents an interesting use case, partly because it’s unfolding right now, partly because it seems so easy, and partly because it’s odd and unpredictable, even to the experts. I have often written about “the miracle, the grind, and the wall” with AI. We will look at a couple of examples of moving from the miracle to the grind. These moments provide good lessons in the challenges of product/market fit.

    In my next post, I’ll examine product/market fit for universities in a changing landscape, focusing on applying CBE to an unusual test case. In the third post, I’ll explore product/market fit for EdTech interoperability standards and facilitating the growth of a healthier ecosystem.

    Khanmigo: the grind behind the product

    Khan Academy’s Kristen DiCerbo did us all a great service by writing openly about the challenges of producing a good AI lesson plan generator. They started with prompt engineering. Well-written prompts are miracles. They’re like magic spells. Generating a detailed lesson plan in seconds with a well-written prompt is possible. But how good is that lesson plan? How well did Khanmigo’s early prompts produce the lesson plans?

    Kristen writes,

    At first glance, it wasn’t bad. It produced what looked to be a decent lesson plan—at least on the surface. However, on closer inspection, we saw some issues, including the following:

    • Lesson objectives just parroted the standard
    • Warmups did not consistently cover the most logical prerequisite skills
    • Incorrect answer keys for independent practice
    • Sections of the plan were unpredictable in length and format
    • The model seemed to sometimes ignore parts of the instructions in the prompt

    Prompt Engineering a Lesson Plan: Harnessing AI for Effective Lesson Planning

    You can’t tell the quality of the AI’s lesson plans without having experts examine them closely. You also want feedback from people who will actually use those lesson plans. I guarantee they will find problems that you will miss. Every time. Remember, the ultimate goal of product/market fit is to make something that the intended adopters will actually want. People will tolerate imperfections in a product. But which ones? What’s most important to them? How will they use the product? You can’t answer these questions confidently without the help of actual humans who would be using the product.

    At any rate, Khan Academy realized their early prompt engineering attempts had several shortcomings. Here’s the first:

    Khanmigo didn’t have enough information. There were too many undefined details for Khanmigo to infer and synthesize, such as state standards, target grade level, and prerequisites. Not to mention limits to Khanmigo’s subject matter expertise. This resulted in lesson plans that were too vague and/or inaccurate to provide significant value to teachers.

    Prompt Engineering a Lesson Plan: Harnessing AI for Effective Lesson Planning

    Read that passage carefully. With each type of information or expertise, ask yourself, “Where could I find that? Where is it written down in a form the AI can digest?” The answer is different for each one. How can the AI learn more about what state standards mean? Or about target grade levels? Prerequisites? Subject-matter expertise for each subject? No matter how much ChatGPT seems to know, it doesn’t know everything. And it is often completely ignorant about anything that isn’t well-documented on the internet. A human educator has to understand all these topics to write good lesson plans. A synthetic one does too. But a synthetic educator doesn’t have experience to draw on. It only has whatever human educators have publicly published about their experiences.

    Think about the effort involved in documenting all these various types of knowledge for a synthetic educator. (This, by the way, is very similar to why learning analytics disappointed as a product category. The software needed to know too much that wasn’t captured in the systems for it to make sense of the data.)

    Here’s the second challenge that the Khanmigo team faced:

    We were trying to accomplish too much with a single prompt. The longer a prompt got and the more detailed its instructions were, the more likely it was that parts of the prompt would be ignored. Trying to produce a document as complex and nuanced as a comprehensive lesson plan with a single prompt invariably resulted in lesson plans with neglected, unfocused, or entirely missing parts.

    Prompt Engineering a Lesson Plan: Harnessing AI for Effective Lesson Planning

    I suspect this is a subtle manifestation of the memory problem I wrote about in my last post. Even with a relatively short text like a complex prompt, the AI couldn’t hold onto all the details. The Khanmigo team ended up breaking the prompt up into smaller pieces. This produced better results because the AI could “concentrate on”—or remember the details of—one step at a time. I’ll add that this approach provides more opportunities to put humans in the loop. An expert—or a user—can examine and modify the output of each step.
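
    To make that concrete, here is a minimal sketch of what prompt decomposition can look like, using the OpenAI Python client as an illustration. The step prompts, the model name, and the helper function are my own assumptions, not Khan Academy’s actual pipeline; the point is simply that each call handles one narrow piece of the lesson plan, and a human can review the output between steps.

    ```python
    # Illustrative prompt chaining: one small call per lesson-plan section, with room
    # for human review between steps. A sketch, not Khanmigo's actual implementation.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def ask(prompt: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o",  # any capable chat model; the name is an assumption
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

    standard = "Recognize and represent proportional relationships between quantities"

    # Step 1: objectives only -- a prompt small enough that nothing gets ignored.
    objectives = ask(
        f"Write 2-3 specific, measurable lesson objectives for this standard: {standard}"
    )

    # A human expert can review and edit `objectives` here before the next step uses it.

    # Step 2: a warm-up built on the reviewed objectives, again as its own focused call.
    warmup = ask(
        "Given these lesson objectives, design a 5-minute warm-up that reviews the most "
        f"logical prerequisite skills:\n{objectives}"
    )

    print(objectives)
    print(warmup)
    ```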

    We fantasize about AI doing work for us. In some cases, it’s not just a fantasy. I use AI to be more productive literally every day. But it fails me often. We can’t know what it will take for AI to solve any particular problem without looking closely at the product’s capabilities and the user’s very specific needs. This is product/market fit.

    Learning design in the real world

    Developing skill in product/market fit is hard. Think about all the different topics the Khanmigo team needed not only to know, but to know well enough in their bearing on lesson planning to diagnose the gaps in the AI’s understanding.

    Refining a product is also inherently iterative. No matter how good you are at product design, how well you know your audience, and how brilliant you are, you will be wrong about some of your ideas early on. Because people are complicated. Organizations are complicated. The skills workers need are often complicated and non-obvious. And the details of how the people need to work, individually and together, are often distinctive in ways that are invisible to them. Most people only know their own context. They take a lot for granted. Good product people spend their time uncovering these invisible assumptions and finding the commonalities and the differences. This is always a discovery process that takes time.

    Learning design is a classic case of this problem. People have been writing and adopting learning design methodologies longer than I’ve been alive. The ADDIE model—”Analyze, Design, Develop, Implement, and Evaluate”—was created by Florida State University for the military in the 1970s. “Backward Design” was invented in 1949 by Ralph W. Tyler. Over the past 30 years, I’ve seen a handful of learning design or instructional design tools that attempt to scaffold and enforce these and other design methodologies. I’ve yet to see one get widespread adoption. Why? Poor product/market fit.

    While the goal of learning design (or “instructional design,” to use the older term) is to produce a structured learning experience, the thought process of creating it is non-linear and iterative. As we develop and draft, we see areas that need tuning or improving. We move back and forth across the process. Nobody ever follows learning design methodologies strictly in practice. And I’m talking about trained learning design professionals. Untrained educators stray even further from the model. That’s why the two most popular learning design tools, by far, are Microsoft Word and Google Docs.

    If you’ve ever used ChatGPT and prompt engineering to generate the learning design of a complex lesson, you’ve probably run into unexpected limits to its usefulness. The longer you spend tinkering with the lesson, the more likely your results are to get worse rather than better. It’s the same problem the Khanmigo team had. Yes, ChatGPT and Claude can now have long conversations. But both research and experience show us that they tend to forget the stuff in the middle. By itself, ChatGPT is useful in lesson design up to a point. But I find that when writing complex documents, I end up pasting different pieces of my conversation into Word and stitching them together.

    And that’s OK. If that process saves me design time, that’s a win. But there are use cases where the memory problems are more serious in ways that I haven’t heard folks talking about yet.

    Combining documents

    Here’s a very common use case in learning design:

    First, you start with a draft of a lesson or a chapter that already exists. Maybe it’s a chapter from an OpenStax textbook. Maybe it’s a lesson that somebody on your team wrote a while ago that needs updating. You like it, but you don’t love it.

    You have an article with much of the information you want to add to the new version you want to create. If you were using a vendor’s textbook, you’d have to require the students to read the outdated lesson and then read the article separately. But this is content you’re allowed to revise. If you’re using the article in a way that doesn’t violate copyright—for example, because you’re using it to capture publicly known facts that have changed rather than something novel in the article itself—you can simply use the new information to revise the original lesson. That was often too much work the old way. But now we have ChatGPT, so, you know…magic.

    While you’re at it, you’d like to improve the lesson’s diversity, equity, and inclusion (DEI). You see opportunities to write the chapter in ways that represent more of your students and include examples relevant to their lived experiences. You happen to have a document with a good set of DEI guidelines.

    So you feed your original chapter, new article, and DEI guidelines to the AI. “ChatGPT, take the original lesson and update it with the new information from the article. Then apply the DEI guidelines, including examples in topics X, Y, and Z that represent different points of view. Abracadabra!”

    You can write a better prompt than this one. But no matter how carefully you engineer your prompt, you will be disappointed with the results. Don’t take my word for it. Try it yourself.
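
    For concreteness, the naive version of that workflow looks something like the sketch below, written with the OpenAI Python client. The file names, prompt wording, and model name are hypothetical; the point is the pattern of stuffing all three documents into one prompt, which is exactly what strains the model’s memory.

    ```python
    # The "feed it everything at once" approach, sketched for illustration.
    # File names, prompt wording, and model name are hypothetical.
    from pathlib import Path

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    lesson = Path("original_lesson.md").read_text()
    article = Path("new_article.md").read_text()
    guidelines = Path("dei_guidelines.md").read_text()

    prompt = f"""Update the ORIGINAL LESSON with the new information in the ARTICLE,
    then apply the DEI GUIDELINES, adding examples that represent different points of view.
    Preserve the original wording wherever it is still accurate.

    ORIGINAL LESSON:
    {lesson}

    ARTICLE:
    {article}

    DEI GUIDELINES:
    {guidelines}"""

    resp = client.chat.completions.create(
        model="gpt-4o",  # model name is an assumption
        messages=[{"role": "user", "content": prompt}],
    )

    # The output will read fluently, but material from the middle of the inputs tends to
    # be dropped or paraphrased rather than faithfully preserved -- the memory problem.
    print(resp.choices[0].message.content)
    ```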

    Why does this happen? Because the generative AI doesn’t “remember” these three documents perfectly. Remember what I wrote in my last article:

    The LLMs can be “trained” on data, which means they store information like how “beans” vs. “water” modify the likely meaning of “cool,” what words are most likely to follow “Cool the pot off in the,” and so on. When you hear AI people talking about model “weights,” this is what they mean.

    Notice, however, that none of the original sentences are stored anywhere in their original form. If the LLM is trained on Wikipedia, it doesn’t memorize Wikipedia. It models the relationships among the words using combinations of vectors (or “matrices”) and probabilities. If you dig into the LLM looking for the original Wikipedia article, you won’t find it. Not exactly. The AI may become very good at capturing the gist of the article given enough billions of those tensor/workers. But the word-for-word article has been broken down and digested. It’s gone.

    How You Will Never Be Able to Trust Generative AI (and Why That’s OK)

    Your lesson and articles are gone. They’ve been digested. The AI remembers them, but it’s designed to remember the meaning, not the words. It’s not metaphorically sitting down with the original copy and figuring out where to insert new information or rewrite a paragraph. That may be fine. Maybe it will produce something better. But it’s a fundamentally different process than human editing. We won’t know whether the results it generates have good product/market fit until we test them with folks.
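
    A rough way to see “meaning, not words” in action is to compare a sentence with a paraphrase of it. The sketch below uses the OpenAI embeddings endpoint as an illustration; the model name and example sentences are my own assumptions. An exact string comparison fails, but the two vectors land close together, which is roughly the sense in which the gist survives while the word-for-word text does not.

    ```python
    # Toy illustration: embeddings capture gist, not exact wording.
    # Model name and example sentences are assumptions; any embedding model would do.
    import math

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    original = "Cool the pot off in the sink before you add the beans."
    paraphrase = "Let the pot cool down in the sink, then add the beans."

    def embed(text: str) -> list[float]:
        resp = client.embeddings.create(model="text-embedding-3-small", input=text)
        return resp.data[0].embedding

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    print(original == paraphrase)                                # False: the words differ
    print(round(cosine(embed(original), embed(paraphrase)), 3))  # high: the meaning is close
    ```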

    To the degree that you need to preserve the fidelity of the original documents, you’ve got a problem. And the more you push generative AI to do this kind of fine-tuning work across multiple documents, the worse it gets. You’re running headlong into one of your synthetic co-worker’s fundamental limitations. Again, you might get enough value from it to achieve a net gain in productivity. But you might not because this seemingly simple use case is pushing hard on functionality that hasn’t been designed, tested, and hardened for this kind of use.

    Engineering around the problem

    Any product/market fit problem has two sides: product and market. On the market side, how good will be good enough? I’ve specifically positioned my ALDA project as producing a first draft with many opportunities for a human in the loop. This is a common approach we’re seeing in educational content generation right now, for good reasons. We’re reducing the risk to the students. Risk is one reason the market might reject the product.

    Another is failing to deliver the promised time savings. If the combination of the documents is too far off from the human’s goal, it will be rejected. Its speed will not make up for the time required for the human to fix its mistakes. We have to get as close to the human need as possible, mitigate the consequences of the remaining imperfections, and test to see whether we’ve achieved a cost/benefit good enough that users will adopt the product.

    There is no perfect way to solve the memory problem. You will always need a human in the loop. But we could make a good step forward if we could get the designs solid enough to be directly imported into the learning platform and fine-tuned there, skipping the word processor step. Being able to do so requires tackling a host of problems, including (but not limited to) the memory issue. We don’t need the AI to get the combination of these documents perfect, but we do need it to get close enough that our users don’t need to dump the output into a full word processor to rewrite the draft.
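
    Here is one plausible shape for that kind of engineering, sketched under my own assumptions (this is not ALDA’s actual design): revise the lesson one section at a time, so each call only has to hold a small slice of the documents, and ask for structured output that a learning platform could import directly, with a human checkpoint before anything is pushed.

    ```python
    # A sketch of one mitigation: section-by-section revision with structured JSON output
    # that could be imported into a learning platform. The section splitting, prompt, and
    # output schema are assumptions for illustration, not ALDA's actual design.
    import json

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def revise_section(section_title: str, section_text: str, article_excerpt: str) -> dict:
        resp = client.chat.completions.create(
            model="gpt-4o",  # model name is an assumption
            response_format={"type": "json_object"},  # ask for machine-readable output
            messages=[{
                "role": "user",
                "content": (
                    "Revise this lesson section using only the facts in the excerpt. "
                    "Keep the original wording where it is still accurate. Return JSON with "
                    '"title", "revised_text", and "changes" (a list of the edits you made).\n\n'
                    f"SECTION ({section_title}):\n{section_text}\n\nEXCERPT:\n{article_excerpt}"
                ),
            }],
        )
        return json.loads(resp.choices[0].message.content)

    # Each small call gets a human checkpoint: a learning designer reviews result["changes"]
    # before the JSON is handed to the platform, skipping the word-processor step entirely.
    result = revise_section(
        "Photosynthesis overview",
        "Plants convert sunlight into chemical energy...",
        "Recent measurements refine earlier efficiency estimates...",
    )
    print(json.dumps(result, indent=2))
    ```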

    When I raised this problem with a colleague who is a digital humanities scholar and an expert in AI, he paused before replying. “Nobody is working on this kind of problem right now,” he said. “On one side, AI experts are experimenting with improving the base models. On the other side, I see articles all the time about how educators can write better prompts. Your problem falls in between those two.”

    Right. As a sector, we’re not discussing product/market fit for particular needs. The vendors are, each within their own circumscribed world. But on the customer side? I hear people tell me they’re conducting “experiments.” It sounds a bit like when university folk told me they were “working with learning analytics,” which turned out to mean that they were talking about working with learning analytics. I’m sure there are many prompt engineering workshops and many grants being written for fancy AI solutions that sound attractive to the National Science Foundation or whoever the grantor happens to be. But in the middle ground? Making AI usable to solve specific problems? I’m not seeing much of that yet.

    The document combination problem can likely be addressed adequately through a combination of approaches that improve the product and mitigate the consequences of the imperfections so they are more tolerable to the market. After consulting with some experts, I’ve come up with a combination of approaches to try first. Technologically, I know it will work. It doesn’t depend on cutting-edge developments. Will the market accept the results? Will the new approach be better than the old one? Or will it trip over some deal-breaker, like so many products before it?

    I don’t know. I feel pretty good about my hypothesis. But I won’t know until real learning designers test it on real projects.

    We have a dearth of practical, medium-difficulty experiments with real users right now. That is a big, big problem. It doesn’t matter how impressive the technology is if its capabilities aren’t the ones the users need to solve real-world problems. You can’t fix this gap with symposia, research grants, or even EdTech companies that have the skills but not necessarily the platform or business model you need.

    The only way to do it is to get down into the weeds. Try to solve practical problems. Get real humans to tell you what does and doesn’t work for them in your first, second, third, fourth, and fifth tries. That’s what the ALDA project is all about. It’s not primarily about the end product. I am hopeful that ALDA itself will prove to be useful. But I’m not doing it because I want to commercialize a product. I’m doing it to teach and learn about product/market fit skills with AI in education. We need many more experiments like this.

    We put too much faith in the miracle, forgetting the grind and the wall are out there waiting for us. Folks in the education sector spend too much time staring at the sky, waiting for the EdTech space aliens to come and take us all to paradise.

    I suggest that at least some of us should focus on solving today’s problems with today’s technology, getting it done today, while we wait for the aliens to arrive.

    Source link

  • 3 Ways To Take New Photos For Your Website

    You want a new photo of yourself for your personal website. What’s the best way to take new photos? Here are 3 ways for you to get beautiful photos for your website.

    Hi! I’m Jennifer van Alstyne. Welcome to my blog about understanding your digital footprint in research and Higher Education. I help professors have an amazing online presence through my company The Academic Designer LLC. I empower academics to feel confident when showing up online.


    Take a selfie

    A woman wearing a hijab and a green sweater holds her phone and smiles at the camera while taking a selfie photo.

    A selfie is a photo that you take of yourself. You can use selfies on your personal website! The most popular way to take a selfie is with your phone camera. You can use the front camera of your phone, but you’ll take higher quality photos if you use the rear camera.

    Look for a space with good lighting. You may find great natural lighting outside in the last hour before sunset. Or, if you’re an early riser (unlike me), try just before sunrise. Photographers call these periods the ‘golden hour’ because the light can be perfect.

    Avoid a distracting background in your photo. When in doubt, you can take a photo against a neutral background like a wall. Then you can edit the photo to remove the background like I did for the selfie above.

    Use a cleaning cloth to gently wipe the camera lens clean before you start your photo shoot.

    Try holding the phone at different angles (slightly higher, slightly lower). You can also hold the phone out to your side and turn your head towards it. Don’t forget to smile.

    Your phone camera may have settings to help you take a better selfie. For instance, there may be a portrait mode. Or, an option to take a wide selfie.

    You can use your phone’s self-timer setting to take hands-free and full body selfies. You may want to prop up your phone using books. You could also invest in an inexpensive tripod or selfie stick.

    A black and white dog wearing blue sunglasses lies on its back on the grass, taking a selfie with a phone it holds in its paw.

    Devices For Taking Selfies

    1. Mobile phone
    2. Camera
    3. High resolution web cam

    Other Tools To Help

    • Selfie stick
    • Tripod
    • Remote
    • Ring light
    • Books, a box, or other items to prop up your phone

    Ask a friend or family member

    A man holds a phone taking a photo of his friend who is smiling. They are in Los Angeles. The Hollywood sign is visible in the background, but unreadable because the background of the photo is very blurry.

    One of the best ways to get a new photo of yourself for your website is to ask a friend or family member to help.

    If you have a friend who is good at taking photos, great! You may not. That’s OK. What’s more important is that you have someone with you that you feel comfortable with.

    Set expectations by letting them know it may take a bit of time and that you want to try out different poses (or locations). Ask them to take photos from different angles. Try watching a YouTube video together about creative portrait photography with a phone.

    My best tip is to ask your friend to take photos holding the phone vertically and horizontally. That gives you flexibility with what you use the photos for.

    It’s possible that this photo shoot may not result in the photos you want. So don’t put too much pressure on your friend. Do encourage them to take lots of photos so you can choose the best ones. If you find photos you like in the set, ask your friend if they’d like to be credited as photographer when you share the photo online.


    Hire a professional photographer

    Photography and video production studio with a yellow chair, plant, and pink backdrop. There are large lights on tripods.

    Hiring a professional portrait photographer is the best option if you can afford it. The mistake you want to avoid is asking for a headshot when booking your photo shoot.

    Professional photographers have the gear, skills, and experience to take your new website photos. Look for an experienced local photographer.

    You may have a place to take your photos in mind. Professionals often include 1 location in your package, or shoot in their studio. Be sure to ask for their recommendations.

    Photographs are the intellectual property of the photographer. You’ll need to sign a contract or licensing agreement for permission to use the photos. Help ensure you get the right license by sharing how you want to use your photos when booking your shoot. Most photographers have professional liability insurance.

    You’re looking for a portrait photographer, not a headshot photographer. Be clear about how many photos you’re looking for, as your photographer may offer a ‘brand photo shoot’ or a ‘website photo package.’

    Portrait photographers typically range from $100 to $400 per hour in the United States. Local Facebook groups are often an effective way to find recommendations for photographers in your area.

    Here are 3 ways to get new photos for your personal website

    A graphic for 'New Photos For Your Website' with tips including take a selfie, ask a friend for help, and hire a professional photographer. On the graphic are a cutout of a Sony professional camera, the backs of 2 cell phone cameras, and a selfie photo of Jennifer van Alstyne smiling.


    My name is Jennifer van Alstyne. I’ve been helping professors make beautiful personal websites to share their research since 2018. I’d love to help you!

    Let’s talk on a no pressure Zoom call about working with me on your online presence and website.


    Source link

  • Suicide Prevention and Awareness: Four Ways HR Can Lead the Conversation – CUPA-HR

    by CUPA-HR | August 31, 2022

    This blog post was contributed by Maureen De Armond, Executive Director, Human Resources at Drake University.

    In higher education, we must plan for many worst-case scenarios, including tornados, fires, active-shooter situations, and, as we now know, pandemics. Among this wide range of difficult scenarios that could present themselves on our campuses at any time, suicide is one that deserves more attention and discussion.

    Like other scenarios, suicide prevention and planning should contain at least these components: awareness and prevention at the front end; crisis-response protocols to deploy in the moment; and post-incident support and debriefing.

    Here are four ways HR can take the lead on awareness and prevention efforts:

    Normalize Mental Health Conversations

    HR can set the example in normalizing conversations about mental health. From new employee orientation to leadership trainings to trainings offered during open enrollment, make mental health as normal a topic to discuss as being sick with the flu or needing rehab due to an injured back. We know that mental health carries a stigma; openly discussing mental health helps chip away at that stigma.

    Coordinate Messaging

    Tailor communications to your institution’s practices and use more than one channel for communication. If your institution sends newsletters, plan articles for each week of September. Consider emails as well. Be sure to provide your leadership teams with prepared messages and information they can share with their teams. Point them to helplines, training opportunities, reminders about EAPs, and tips for what to do and where to go if they or someone they know is having a mental health crisis.

    Collaborative messaging sent from campus and community partners can also create a widespread impact. Consider reaching out to student services, the provost’s office, Title IX/Equal Opportunity, campus safety, student senate, faculty senate, student counseling, faculty subject matter experts, and your institution’s employee assistance program (EAP) providers and health plan partners to team up on mental health messaging throughout the month.

    Train, Train, Train

    Offer learning and development opportunities that focus on mental health awareness as well as suicide prevention. This fall semester, Drake University is offering Question, Persuade and Refer suicide prevention training in addition to Mental Health First Aid for Higher Education for faculty and staff. Faculty partners are facilitating these sessions. We’ve found that having faculty-led sessions can help attract faculty attendees, leverage internal expertise and offer faculty additional forms of service to the institution.

    Inventory Resources, Benefits and Policies

    Take a fresh look at your well-being/wellness programming. Does it appropriately address mental health? Explore what resources and trainings may be available through your existing EAP contracts. Does your health plan offer virtual doctor’s visits for mental health care? If so, shine a spotlight on those resources. Making mental health care as accessible as possible may mean more people will consider using it. Review sick, personal and other paid-time-off leave policies to ensure mental health is clearly addressed. This includes handbook and web language, too.

    While suicide awareness and prevention shouldn’t be a once-a-year conversation, September is a great month for HR to demonstrate leadership in normalizing conversations about mental health and suicide prevention and planning.

    Related resources:

    Reassessing Your Institution’s EAP: Steps for HR Pros to Increase Awareness and Accessibility (The Higher Ed Workplace Blog)

    HEERF Funds Can Be Used to Support Mental Health Resources (The Higher Ed Workplace Blog)

    Mental Health Month Focus: Higher Ed Campus Culture (The Higher Ed Workplace Blog)



    Source link

  • CUPA-HR Participates in Hill Meetings With House Ways and Means Committee Member Offices – CUPA-HR

    by CUPA-HR | May 10, 2022

    Over the last month, CUPA-HR’s government relations team joined the American Council on Education (ACE) and other higher education organizations in virtual Capitol Hill meetings to discuss tax priorities for the higher education community. Meetings have been held with staffers of Members of the House Ways and Means Committee to advocate for tax policies and proposals to alleviate various burdens placed on students, employees and institutions alike.

    Specifically, the meetings have allowed the higher education community to encourage members’ action on the following issues:

    • Supporting the extension and expansion of the universal, non-itemizer charitable deduction;
    • Repealing the taxability of scholarships and grant aid, specifically for the Pell Grant and other scholarships for graduate and medical students;
    • Enhancing higher ed tax credits like the American Opportunity Tax Credit and the Lifetime Learning Credit;
    • Repealing the endowment tax;
    • Expanding and modernizing tax-free employer-provided educational assistance as granted under Section 127 of the Internal Revenue Code (IRC);
    • Reinstating advance refunding of tax-exempt bonds and expanding debt issuance with a Direct Pay Bonds program;
    • Creating “lifelong learning and training accounts” to provide workers and employers the opportunity to make tax-free contributions to pay for future training and credentials; and
    • Repealing the unrelated business income tax “basketing” provision.

    In June 2021, ACE sent a letter to House Ways and Means Committee and Senate Finance Committee leadership requesting these proposals and others be included in the American Jobs and American Families Plans. CUPA-HR signed onto this letter, along with several other higher education groups.

    CUPA-HR joined the most recent meetings specifically to advocate for the Section 127 expansion and modernization. Section 127 of the IRC is an educational assistance program that allows employers to pay or reimburse an employee’s tuition or student loan payments on a tax-free basis up to $5,250. CUPA-HR previously advocated for the program to include student loan repayments, which was granted under the 2020 CARES Act and the Consolidated Appropriations Act of 2021. CUPA-HR has also advocated to increase the annual exclusion cap of $5,250 to an amount closer to $12,000, to expand coverage to employees’ partners and dependents, and to expand coverage to gig workers and independent contractors, all of which were a focus during the meetings.

    CUPA-HR will continue to participate in these meetings and will keep members apprised of any legislative proposals that result from these meetings.



    Source link

  • 4 Ways HR Pros Can Support DEI During Black History Month and Beyond – CUPA-HR

    by CUPA-HR | February 2, 2022

    “Black History Month is an opportunity to understand the stories of Black Americans as something more than a history of racism and strife. It’s a time to recognize their undeniable impact on our country and culture.” – BestColleges.com

    Since 1976, U.S. presidents have officially designated February as Black History Month. This month-long celebration of the historic contributions of the Black community is the legacy of historian and scholar Carter G. Woodson, who worked tirelessly to reform the way Black history is taught in schools.

    Today, higher ed institutions recognize and honor Black History Month in myriad ways, but the work required to create and sustain equality, an inclusive workplace culture and a sense of belonging on our campuses is ongoing.

    The CUPA-HR resources listed below provide insights and tools to help individuals and institutions build on their understanding of the issues and take action to bring about change. The current 21-Day Equity Habit Building Challenge, in particular, offers a one-of-a-kind opportunity to learn from institutions that are making meaningful strides in this work.

    • Participate in a 21-Day Challenge. The concept of the 21-day challenge was introduced several years ago by diversity expert Eddie Moore, Jr. to create greater understanding of the intersections of race, power, privilege, supremacy, oppression and equity. There are several challenges to choose from.
    • Watch the on-demand webinar, Measurements That Matter: Using HR Data to Advance DEI Goals. In this webinar, presenters from the USC Race and Equity Center shared their insights and strategies for increasing diversity in campus workforces. You’ll learn what types of data to collect, how to use that data to get a better sense of your institution’s workforce diversity gaps and how to provide equitable responses to any issues uncovered.
    • Add DEI resources to your toolbelt. Explore the Diversity, Equity and Inclusion Toolkit (CUPA-HR members-only resource). Here you’ll find resources, trainings, policies, forms and templates, and much more to help support your institution’s DEI efforts.
    • Watch the recording of the virtual town hall, Partners in Justice, We Will Not Be Silent! This discussion features voices from our higher ed HR community and explores what it means to move beyond existing DEI initiatives to create real systemic and cultural change at our colleges and universities.

    Throughout the month, take time to dive into one or more of these resources (individually or with a group) and explore new ways to take action now and throughout the year.



    Source link