Tag: Technology

  • Using Technology to Restore Trust in Testing

    • Francesca Woodward is Group Managing Director for English at Cambridge University Press & Assessment.

    Anyone who has ever taken English language tests to advance in their studies or work knows how important it is to have confidence in their accuracy, fairness and transparency. 

    Trust is fundamental to English proficiency tests. But at a time of digital disruption, with remote testing on the rise and AI tools evolving rapidly, the integrity of English language testing is under pressure.

    Applied proportionally and ethically, technology can boost our trust in the exam process – adapting flexibly to test-takers’ skill levels, for instance, or allowing quicker marking and delivery of results. The indiscriminate use of technology, however, is likely to have unintended and undesirable consequences.

    Technology is not the problem. Overreliance on technology can be. A case in point is the shift to remote language testing that removes substantial human supervision from the process.

    During the pandemic, many educational institutions and test providers were forced to move to online-only delivery. Universities and employers adapted to the exceptional circumstances by recognising results from some newer and untried providers.

    The consequences of rushed digital adoption are becoming clear. Students arriving at UK universities after passing newer at-home tests have been found to be poorly equipped, relative to their peers – and more prone to academic misconduct. Students were simply not being set up to succeed.

    Some new at-home tests have since been de-recognised by universities amid reports that they have enabled fraud in the UK. Elsewhere, students have been paying proxies to sit online exams remotely. Online, videos explaining how to cheat on some of the newer tests have become ubiquitous.

    So how can universities mitigate these risks, while ensuring that genuine test-takers thrive academically?

    When it comes to teaching and learning a language – as well as assessing a learner’s proficiency – human expertise cannot be replaced. This is clear to experts – including researchers at Cambridge, which has been delivering innovation in language learning and testing for more than a century. 

    Cambridge is one of the forces behind IELTS, the world’s most trusted English test. We also deliver Cambridge English Qualifications, Linguaskill and other major assessments. Our experience tells us that people must play a critical role at every step of teaching, assessment and qualification.

    While some may be excited by the prospect of an “AI-first” model of testing, we should pursue the best of both worlds – human oversight prioritised and empowered by AI. This means, for instance, human-proctored tests delivered in test centres that use tried and proven tech tools.

    In language testing – particularly high-stakes language testing, such as for university or immigration purposes – one size does not fit all. While an online test taken at home may be suitable and even secure for some situations for some learners, others prefer or need to be assessed in test centres, where help is on hand and the technology can be consistently relied upon. For test-takers and universities, choice and flexibility are crucial.

    Cambridge has been using and experimenting with AI for decades. We know that in some circumstances AI can be transformative in improving users’ experience. For the highest-stakes assessments, innovation alone is no alternative to real human teaching, learning and understanding. And the higher the stakes, the more important human oversight becomes.

    The sector must reaffirm its commitment to quality, rigour and fairness in English language testing. This means resisting shortcuts and challenging providers that are all too ready to compromise on standards. It means investing in human expertise. It means using technology to enhance, not undermine, trust.

    This is not the time to “move fast and break things”. Every test provider, every university and every policymaker must play their part.

  • Machine learning technology is transforming how institutions make sense of student feedback

    Institutions spend a lot of time surveying students for their feedback on their learning experience, but once you have crunched the numbers the hard bit is working out the “why.”

    The qualitative information institutions collect is a goldmine of insight about the sentiments and specific experiences that are driving the headline feedback numbers. When students are especially positive, it helps to know why, to spread that good practice and apply it in different learning contexts. When students score some aspect of their experience negatively, it’s critical to know the exact nature of the perceived gap, omission or injustice so that it can be fixed.

    Any conscientious module leader will run their eye down the student comments in a module feedback survey – but once you start looking across modules to programme or cohort level, or to large-scale surveys like NSS, PRES or PTES, the scale of the qualitative data becomes overwhelming for the naked eye. Even the most conscientious reader will find that bias sets in, as comments that are interesting or unexpected tend to be foregrounded as having greater explanatory power over those that seem run of the mill.

    Traditional coding methods for qualitative data require someone – or ideally more than one person – to manually break down comments into clauses or statements that can be coded for theme and sentiment. It’s robust, but incredibly laborious. For student survey work, where the goal might be to respond to feedback and make improvements at pace, institutions admit that this kind of robust analysis is rarely, if ever, standard practice. Especially as resources become more constrained, devoting hours to this kind of detailed methodological work is rarely a priority.

    Let me blow your mind

    That is where machine learning technology can genuinely change the game. Student Voice AI was founded by Stuart Grey, an academic at the University of Strathclyde (now working at the University of Glasgow), initially to help analyse student comments for large engineering courses. Working with Advance HE, he was able to train the machine learning model on national PTES and PRES datasets. Now, after further training the algorithm on NSS data, Student Voice AI offers same-day analysis of student comments for NSS results for subscribing institutions.

    Put the words “AI” and “student feedback” in the same sentence and some people’s hackles will immediately rise. So Stuart spends quite a lot of time explaining how the analysis works. The term he uses to describe the version of machine learning Student Voice AI deploys is “supervised learning” – humans manually label categories in datasets and “teach” the machine about sentiment and topic. The larger the available dataset, the more examples the machine is exposed to and the more sophisticated it becomes. Through this process Student Voice AI has landed on a discrete set of comment themes and categories for taught students, and a parallel set for postgraduate research students, into which the majority of student comments consistently fall – trained on, and distinctive to, UK higher education student data. Stuart adds that the categories can and do evolve:

    “The categories are based on what students are saying, not what we think they might be talking about – or what we’d like them to be talking about. There could be more categories if we wanted them, but it’s about what’s digestible for a normal person.”
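    The mechanics Stuart describes can be illustrated with a toy example. The sketch below is purely hypothetical – the themes, example comments and the naive Bayes method are my own illustration, not Student Voice AI’s actual model or categories – but it shows the supervised-learning idea: humans supply labelled examples, and the model learns which words signal which theme.

```python
from collections import Counter, defaultdict
import math

# Hypothetical human-labelled training comments: (text, theme).
LABELLED = [
    ("feedback on my essays was fast and detailed", "assessment_feedback"),
    ("marking took weeks and comments were vague", "assessment_feedback"),
    ("lectures were engaging and well organised", "teaching"),
    ("the lecturer explained concepts clearly", "teaching"),
    ("the library lacked key textbooks", "resources"),
    ("online resources were easy to access", "resources"),
]

def train(examples):
    """Count how often each word appears under each human-assigned theme."""
    word_counts = defaultdict(Counter)   # theme -> word frequencies
    theme_counts = Counter()             # theme -> number of examples
    for text, theme in examples:
        theme_counts[theme] += 1
        word_counts[theme].update(text.lower().split())
    return word_counts, theme_counts

def classify(text, word_counts, theme_counts):
    """Pick the most likely theme via naive Bayes with add-one smoothing."""
    total = sum(theme_counts.values())
    vocab = {w for counts in word_counts.values() for w in counts}
    best_theme, best_score = None, float("-inf")
    for theme, n in theme_counts.items():
        score = math.log(n / total)  # log prior
        denom = sum(word_counts[theme].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[theme][word] + 1) / denom)
        if score > best_score:
            best_theme, best_score = theme, score
    return best_theme

word_counts, theme_counts = train(LABELLED)
print(classify("marking and feedback were very slow", word_counts, theme_counts))
```

    A production system would use a far larger labelled dataset and a more sophisticated model, but the principle is the same: the categories come from human judgement, and more labelled examples make the classifier more reliable.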

    In practice that means that institutions can see a quantitative representation of their student comments, sorted by category and sentiment. You can look at student views of feedback, for example, and see the balance of positive, neutral and negative sentiment overall, segment it into departments or subject areas, or years of study, then click through to the relevant comments to see what’s driving that feedback. That’s significantly different from, say, dumping your student comments into a third-party generative AI platform (sharing confidential data with a third party while you’re at it) and asking it to summarise. There’s value in the time and effort saved, but also in the removal of individual personal bias, and in the potential for aggregation and segmentation for different stakeholders in the system. And it also becomes possible to compare student qualitative feedback across institutions.
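    Once comments have been coded, the reporting layer is conceptually just grouping and counting. Here is a minimal sketch of that aggregate-then-drill-down pattern, using invented departments, themes and comments (this is not the actual Student Voice AI or evasys data model):

```python
from collections import Counter

# Hypothetical pre-coded comments: (department, theme, sentiment, text).
CODED = [
    ("Physics", "feedback", "negative", "Marking was far too slow."),
    ("Physics", "feedback", "positive", "Comments on drafts were really helpful."),
    ("History", "feedback", "negative", "I never understood why I lost marks."),
    ("History", "teaching", "positive", "Seminars were lively and well run."),
    ("Physics", "teaching", "neutral",  "Lectures were fine."),
]

def sentiment_breakdown(comments, theme, department=None):
    """Count sentiment for one theme, optionally within one department."""
    counts = Counter(
        sentiment
        for dept, t, sentiment, _ in comments
        if t == theme and (department is None or dept == department)
    )
    return dict(counts)

def drill_down(comments, theme, sentiment):
    """Return the raw comments driving a given theme/sentiment cell."""
    return [text for _, t, s, text in comments if t == theme and s == sentiment]

print(sentiment_breakdown(CODED, "feedback"))            # institution-wide balance
print(sentiment_breakdown(CODED, "feedback", "Physics")) # segmented by department
print(drill_down(CODED, "feedback", "negative"))         # click through to comments
```

    The same grouping logic extends naturally to years of study or subject areas, and keeping the raw comment attached to every coded row is what makes the click-through to actual student voices possible.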

    Now, Student Voice AI is partnering with student insight platform evasys to bring machine learning technology to qualitative data collected via the evasys platform. And evasys and Student Voice AI have been commissioned by Advance HE to code and analyse open comments from the 2025 PRES and PTES surveys – creating opportunities to drill down into a national dataset that can be segmented by subject discipline and theme as well as by institution.

    Bruce Johnson, managing director at evasys, is enthused about the potential for the technology to drive culture change in how student feedback is used to inform insight and action across institutions:

    “When you’re thinking about how to create actionable insight from survey data the key question is, to whom? Is it to a module leader? Is it to a programme director of a collection of modules? Is it to a head of department or a pro vice chancellor or the planning or quality teams? All of these are completely different stakeholders who need different ways of looking at the data. And it’s also about how the data is presented – most of my customers want, not only quality of insight, but the ability to harvest that in a visually engaging way.”

    “Coming from higher education it seems obvious to me that different stakeholders have very different uses for student feedback data,” says Stuart Grey. “Those teaching at the coalface are interested in student engagement; at the strategic level the interest is in trends and sentiment analysis; and there are also various stakeholder groups in professional services who never normally get to see this stuff, but we can generate the reports that show them what students are saying about their area. Frequently the data tells them something they knew anyway, but it gives them the ammunition to be able to make change.”

    The results are in

    Duncan Berryman, student surveys officer at Queen’s University Belfast, sums up the value of AI analysis for his small team: “It makes our life a lot easier, and the schools get the data and trends quicker.” Previously schools had been supplied with Excel spreadsheets – and his team were spending a lot of time explaining and working through with colleagues how to make sense of the data on those spreadsheets. Being able to see a straightforward visualisation of student sentiment on the various themes means that, as Duncan observes rather wryly, “if change isn’t happening it’s not just because people don’t know what student surveys are saying.”

    Parama Chaudhury, professor of economics and pro vice provost education (student academic experience) at University College London, explains where qualitative data analysis sits in the wider ecosystem for quality enhancement of teaching and learning. In her view, for enhancement purposes, comparing your quantitative student feedback scores to those of another department is not particularly useful – essentially it’s comparing apples with oranges. Yet the apparent ease of comparability of quantitative data, compared with the sense of overwhelm at the volume and complexity of student comments, can mean that people spend time trying to explain the numerical differences, rather than mining the qualitative data for more robust and actionable explanations that can give context to their own scores.

    It’s not that people weren’t working hard on enhancement, in other words, but they didn’t always have the best possible information to guide that work. “When I came into this role quite a lot of people were saying ‘we don’t understand why the qualitative data is telling us this, we’ve done all these things,’” says Parama. “I’ve been in the sector a long time and have received my share of summaries of module evaluations and have always questioned those summaries because it’s just someone’s ‘read.’ Having that really objective view, from a well-trained algorithm makes a difference.”

    UCL has tested two-page summaries of student comments for specific departments this academic year, and plans to roll out a version for every department this summer. The data is not assessed in a vacuum; it forms part of the wider institutional quality assurance and enhancement processes, which include data on a range of different perspectives on areas for development. Encouragingly, so far the data from students is consistent with what has emerged from internal reviews, giving the departments that have had the opportunity to engage with it greater confidence in their processes and action plans.

    None of this stops anyone from going and looking at specific student comments, sense-checking the algorithm’s analysis and/or triangulating against other data. At the University of Edinburgh, head of academic planning Marianne Brown says that the value of the AI analysis is in the speed of turnaround – the institution carries out a manual reviewing process to be sure that any unexpected comments are picked up. But being able to share the headline insight at pace (in this case via a PowerBI interface) means that leaders receive the feedback while the information is still fresh, and the lead time to effect change is longer than if time had been lost to manual coding.

    The University of Edinburgh is known for its cutting-edge AI research, and boasts the Edinburgh (access to) Language Models (ELM), a platform that gives staff and students access to generative AI tools without sharing data with third parties, keeping all user data onsite and secured. Marianne is clear that even a closed system like ELM is not appropriate for unfettered student comment analysis. Generative AI platforms offer the illusion of a thematic analysis, but it is far from robust because generative AI operates through sophisticated guesswork rather than analysis of the implications of actual data. “Being able to put responses from NSS or our internal student survey into ELM to give summaries was great, until you started to interrogate those summaries. Robust validation of any output is still required,” says Marianne. Similarly, Duncan Berryman observes: “If you asked a gen-AI tool to show you the comments related to the themes it had picked out, it would not refer back to actual comments. Or it would have pulled this supposed common theme from just one comment.”

    The holy grail of student survey practice is creating a virtuous circle: student engagement in feedback creates actionable data, which leads to education enhancement, and students gain confidence that the process is authentic and are further motivated to share their feedback. In that quest, AI, deployed appropriately, can be an institutional ally and resource-multiplier, giving fast and robust access to aggregated student views and opinions. “The end result should be to make teaching and learning better,” says Stuart Grey. “And hopefully what we’re doing is saving time on the manual boring part, and freeing up time to make real change.”

  • Trump cuts could expose student data to cyber threats

    When hackers hit a school district, they can expose Social Security numbers, home addresses, and even disability and disciplinary records. Now, cybersecurity advocates warn that the Trump administration’s budget and personnel cuts, along with rule changes, are stripping away key defenses that schools need.

    “Cyberattacks on schools are escalating and just when we need federal support the most, it’s being pulled away,” said Keith Krueger, chief executive officer of the Consortium for School Networking, an association of technology officials in K-12 schools. 

    The stakes are high. Schools are a top target in ransomware attacks, and cyber criminals have sometimes succeeded in shutting down whole school districts. The largest such incident occurred in December, when hackers stole personal student and teacher data from PowerSchool, a company that runs student information systems and stores report cards. The theft included data from more than 60 million students and almost 10 million teachers. PowerSchool paid an undisclosed ransom, but the criminals didn’t stop. Now, in a second round of extortion, the same cyber criminals are demanding ransoms from school districts.  

    The federal government has been stepping up efforts to help schools, particularly since a 2022 cyberattack on the Los Angeles Unified School District, the nation’s second-largest. Now this urgently needed assistance is under threat. 

    Warning service

    Of chief concern is a cybersecurity service known as MS-ISAC, which stands for Multi-State Information Sharing and Analysis Center. It warns more than 5,700 schools around the country that have signed up for the service about malware and other threats and recommends security patches. This technical service is free to schools, but is funded by an annual congressional appropriation of $27 million through the Cybersecurity and Infrastructure Security Agency (CISA), an agency within the Department of Homeland Security.

    On March 6, the Trump administration announced a $10 million funding cut as part of broader budget and staffing cuts throughout CISA. That was ultimately negotiated down to $8.3 million, but the service still lost more than half of its remaining $15.7 million budget for the year. The non-profit organization that runs it, the Center for Internet Security, is digging into its reserves to keep it operating. But those funds are expected to run out in the coming weeks, and it is unclear how the service will continue operating without charging user fees to schools.

    “Many districts don’t have the budget or resources to do this themselves, so not having access to the no-cost services we offer is a big issue,” said Kelly Lynch Wyland, a spokeswoman for the Center for Internet Security.

    Sharing threat information

    Another concern is the effective disbanding of the Government Coordinating Council, which helps schools address ransomware attacks and other threats through policy advice, including how to respond to ransom requests, whom to inform when an attack happens and good practices for preventing attacks. This coordinating council was formed only a year ago by the Department of Education and CISA. It brings together 13 nonprofit school organizations representing superintendents, state education leaders, technology officers and others. The council met frequently after the PowerSchool data breach to share information. 

    Now, amid the second round of extortions, school leaders have not been able to meet because of a change in rules governing open meetings. The group was originally exempt from meeting publicly because it was discussing critical infrastructure threats. But the Department of Homeland Security, under the Trump administration, reinstated open meeting rules for certain advisory committees, including this one. That makes it difficult to speak frankly about efforts to thwart criminal activity.

    Non-governmental organizations are working to resurrect the council, but it would be in a diminished form without government participation.

    “The FBI really comes in when there’s been an incident to find out who did it, and they have advice on whether you should pay or not pay your ransom,” said Krueger of the school network consortium. 

    A federal role

    A third concern is the elimination in March of the Education Department’s Office of Educational Technology. This seven-person office dealt with education technology policies — including cybersecurity. It issued cybersecurity guidance to schools and held webinars and meetings to explain how schools could improve and shore up their defenses. It also ran a biweekly meeting to talk about K-12 cybersecurity across the Education Department, including offices that serve students with disabilities and English learners.

    Eliminating this office has hampered efforts to decide which security controls, such as encryption or multi-factor authentication, should be in educational software and student information systems. 

    Many educators worry that without this federal coordination, student privacy is at risk. “My biggest concern is all the data that’s up in the cloud,” said Steve Smith, the founder of the Student Data Privacy Consortium and the former chief information officer for Cambridge Public Schools in Massachusetts. “Probably 80 to 90 percent of student data isn’t on school-district controlled services. It’s being shared with ed tech providers and hosted on their information systems.”

    Security controls

    “How do we ensure that those third-party providers are providing adequate security against breaches and cyber attacks?” said Smith. “The office of ed tech was trying to bring people together to move toward an agreed upon national standard. They weren’t going to mandate a data standard, but there were efforts to bring people together and start having conversations about the expected minimum controls.”

    That federal effort ended, Smith said, with the new administration. But his consortium is still working on it. 

    In an era when policymakers are seeking to decrease the federal government’s involvement in education, arguing for a centralized, federal role may not be popular. But there’s long been a federal role for student data privacy, including making sure that school employees don’t mishandle and accidentally expose students’ personal information. The Family Educational Rights and Privacy Act, commonly known as FERPA, protects student data. The Education Department continues to provide technical assistance to schools to comply with this law. Advocates for school cybersecurity say that the same assistance is needed to help schools prevent and defend against cyber crimes.

    “We don’t expect every town to stand up their own army to protect themselves against China or Russia,” said Michael Klein, senior director for preparedness and response at the Institute for Security and Technology, a nonpartisan think tank. Klein was a senior advisor for cybersecurity in the Education Department during the previous administration. “In the same way, I don’t think we should expect every school district to stand up their own cyber-defense army to protect themselves against ransomware attacks from major criminal groups.” 

    And it’s not financially practical. According to the school network consortium, only a third of school districts have a full-time employee or the equivalent dedicated to cybersecurity.

    Budget storms ahead

    Some federal programs to help schools with cybersecurity are still running. The Federal Communications Commission launched a $200 million pilot program to support cybersecurity efforts by schools and libraries. FEMA funds cybersecurity for state and local governments, which includes public schools. Through these funds, schools can obtain phishing training and malware detection. But with budget battles ahead, many educators fear these programs could also be cut. 

    Perhaps the biggest risk is the end of the entire E-Rate program that helps schools pay for internet access. The Supreme Court is slated to decide this term whether the funding structure is an unconstitutional tax.

    “If that money goes away, they’re going to have to pull money from somewhere,” said Smith of the Student Data Privacy Consortium. “They’re going to try to preserve teaching and learning, as they should. Cybersecurity budgets are things that are probably more likely to get cut.

    “It’s taken a long time to get to the point where we see privacy and cybersecurity as critical pieces,” Smith said. “I would hate for us to go back a few years and not be giving them the attention they should.”

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about student cybersecurity was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


  • Microsoft and FFA help students use smart sensors and AI to learn about the future of farming and technology

    Microsoft Corp. and the National FFA Organization on Tuesday announced the national expansion of FarmBeats for Students, a cutting-edge educational program integrating smart sensors, data science and artificial intelligence (AI) to teach precision agriculture in classrooms. Starting today, FFA teachers and students throughout the United States, including FFA chapters in 185 middle and high schools, will receive a classroom set of FarmBeats for Students kits free of charge. The kits include ready-to-use sensor systems along with curriculum for teachers and are designed for classrooms of all kinds; no prior technical experience is required.

    More and more farmers are adopting advanced technology, including automating systems such as tractors and harvesters and using drones and data analysis to intervene early against pests and disease, to maximize crop yield, optimize resource usage, and adjust to changing weather patterns. Gaining hands-on experience with machine automation, data science and AI will help American agricultural students remain competitive in the global market.

    Using the FarmBeats for Students kits and free curriculum, students build environmental sensor systems and use AI to monitor soil moisture and detect nutrient deficiencies — allowing them to understand what is happening with their plants and make data-driven decisions in real time. Students can adapt the kit to challenges unique to their region — such as drought, frost and pests — providing them with practical experience in tackling real-world issues in their hometowns.

    “Microsoft is committed to ensuring students and teachers have the tools they need to succeed in today’s tech-driven world, and that includes giving students hands-on experience with precision farming, data science and AI,” said Mary Snapp, Microsoft vice president, Strategic Initiatives. “By teaming up with FFA to bring FarmBeats for Students to students across the country, we hope to inspire the next generation of agriculture leaders and equip them with the skills to tackle any and all challenges as they guide us into the future.”

    “Our partnership with Microsoft exemplifies the power of collaboration in addressing industry needs while fostering personal and professional growth among students,” said Christine White, chief program officer, National FFA Organization. “Supporting agricultural education and leadership development is crucial for shaping the next generation of innovators and problem solvers. Programs like this equip students with technical knowledge, confidence and adaptability to thrive in diverse and evolving industries. Investing in these young minds today sets the stage for a more sustainable, innovative and resilient agricultural future.”

    In addition, teachers, students or parents interested in FarmBeats for Students can purchase a kit for $35 and receive free training at Microsoft Learn.

    Any educator interested in implementing the FarmBeats for Students program can now access a new, free comprehensive course on the Microsoft Educator Learn Center, providing training on precision agriculture, data science and AI, allowing teachers to earn professional development hours and badges. 

    FarmBeats for Students was co-developed by Microsoft, FFA and agriculture educators. The program aligns with the AI for K-12 initiative guidelines; Agriculture, Food and Natural Resources career standards; Computer Science Teachers Association standards; and Common Core math standards.

    For more information about FarmBeats for Students, visit aka.ms/FBFS.

    Kevin Hogan

  • A big reason why students with math anxiety underperform — they just don’t do enough math

    Math anxiety isn’t just about feeling nervous before a math test. It’s been well-known for decades that students who are anxious about math tend to do worse on math tests and in math classes.

    But recently, some of us who research math anxiety have started to realize that we may have overlooked a simple yet important reason why students who are anxious about math underperform: They don’t like doing math, and as a result, they don’t do enough of it.

    We wanted to get a better idea of just what kind of impact math anxiety could have on academic choices and academic success throughout college. In one of our studies, we measured math anxiety levels right when students started their postsecondary education. We then followed them throughout their college career, tracking what classes they took and how well they did in them.

    We found that highly math-anxious students went on to perform worse not just in math classes, but also in STEM classes more broadly. This means that math anxiety is not something that only math teachers need to care about — science, technology and engineering educators need to have math anxiety on their radar, too.

    We also found that students who were anxious about math tended to avoid taking STEM classes altogether if they could. They would get their math and science general education credits out of the way early on in college and never look at another STEM class again. So not only is math anxiety affecting how well students do when they step into a STEM classroom, it makes it less likely that they’ll step into that classroom in the first place.

    This means that math anxiety is causing many students to self-sort out of the STEM career pipeline early, closing off career paths that would likely be fulfilling (and lucrative).

    Our study’s third major finding was the most surprising. When it came to predicting how well students would do in STEM classes, math anxiety mattered even more than math ability. Our results showed that if you were a freshman in college and you wanted to do well in your STEM classes, you would likely be better off reducing your math anxiety than improving your math ability.

    We wondered: How could that be? How could math anxiety — how you feel about math — matter more for your academic performance than how good you are at it? Our best guess: avoidance.

    If something makes you anxious, you tend to avoid doing it if you can. Both in our research and in that of other researchers, there’s been a growing understanding that in addition to its other effects, math anxiety means that you’ll do your very best to engage with math as little as possible in situations where you can’t avoid it entirely.

    This might mean putting in less effort during a math test, paying less attention in math class and doing fewer practice problems while studying. In the case of adults, this kind of math avoidance might look like pulling out a calculator whenever the need to do math arises just to avoid doing it yourself.

    In some of our other work, we found that math-anxious students were less interested in doing everyday activities precisely to the degree that they thought those activities involved math. The more a math-anxious student thought an activity involved math, the less they wanted to do it.

    If math anxiety is causing students to consistently avoid spending time and effort on their classes that involve math, this would explain why their STEM grades suffer.

    What does all of this mean for educators? Teachers need to be aware that students who are anxious about math are less likely to engage with math during class, and they’re less likely to put in the effort to study effectively. All of this avoidance means missed opportunities for practice, and that may be the key reason why many math-anxious students struggle not only in math class, but also in science and engineering classes that require some math.

    Related: Experts share the latest research on how teachers can overcome math anxiety

    Researchers are only beginning to understand how to help students who are anxious about math stop avoiding it, but they have already made some promising suggestions for teachers. One study showed that a direct focus on study skills could help math-anxious students.

    Giving students clear structure on how they should be studying (trying lots of practice problems) and how often they should be studying (spaced out over multiple days, not just the night before a test) was effective at helping students overcome their math anxiety and perform better.

    Especially heartening was the fact that the effects seen during the study persisted in semesters beyond the intervention; these students tended to make use of the new skills into the future.

    Math anxiety researchers will continue to explore new ways to help math-anxious students fight their math-avoidant proclivities. In the meantime, educators should do what they can to help their students struggling with math anxiety overcome this avoidance tendency — it could be one of the most powerful ways a math teacher can help shape their students’ futures.

    Rich Daker is a researcher and founder of Pinpoint Learning, an education company that makes research-backed tools to help educators identify why their students make mistakes. Ian Lyons is an associate professor in Georgetown University’s Department of Psychology and principal investigator for the Math Brain Lab.

    Contact the opinion editor at [email protected].

    This story about math anxiety was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

  • Balancing Technology and Connection in College Recruitment

    Balancing Technology and Connection in College Recruitment

    Let’s be real: college planning is not the only thing on your prospective students’ minds. They’re juggling school, jobs, relationships, social media, and, you know, just trying to figure out life. So, when we talk about AI in college planning, it’s crucial to remember that it’s just one piece of a much larger puzzle.

    At RNL, we’re constantly looking at the trends shaping higher education, and AI is definitely a big one. But here’s the thing: it’s not a one-size-fits-all solution. To truly connect with students, you need to understand how they’re using (or not using) these tools, and meet them where they are.

    It’s all about personas

    Our latest research dives deep into student attitudes toward AI in college planning, and the results are fascinating. We’ve identified four key “AI Adoption Personas” that can help you tailor your outreach and messaging:

    Pioneers (early adopters, enthusiastic users): These digital natives are all-in on AI, using it for everything from college research to essay writing.

    • Key takeaway: Pioneers are already on board but value human guidance. 76% would feel more comfortable if a school advisor explained the benefits and risks of AI.

    Aspirers (interested but cautious adopters): Aspirers see the potential of AI but need a little nudge.

    • Key takeaway: Show them the value! 51% are motivated by easy access to free AI tools, and 41% want to see success stories from other students.

    Fence Sitters (uncertain, passive users): These students are on the fence about AI, often lacking confidence in their current college planning approach.

    • Key takeaway: Don’t overwhelm them. 40% haven’t even used online college planning tools! Focus on highlighting the potential of AI and offering advisor support.

    Resistors (skeptical, avoid AI in college planning): Resistors are the most reluctant to embrace AI, preferring traditional methods like guidance counselors and college websites.

    • Key takeaway: Respect their preferences, but don’t write them off entirely. 48% would feel more comfortable with an advisor explaining AI, even if they’re not ready to use it themselves.

    Beyond the bots: human connection still matters

    No matter which persona your students fall into, one thing is clear: human connection still matters. While AI can provide valuable information and streamline certain tasks, it can’t replace the empathy, guidance, and personalized support students crave.

    Think about it: choosing a college is a huge life decision, and students want to feel understood and supported throughout the process.

    Our research shows that students use a variety of resources for college planning, and these often involve human interaction:

    • College websites (often reviewed with parents or counselors)
    • Parents/family (a trusted source of advice and support)
    • Social media (connecting with current students and alumni)
    • Guidance counselors (providing expert advice and personalized recommendations)
    • Friends/peers (sharing experiences and offering encouragement)
    • Books/online articles (supplementing their knowledge and exploring different options)

    AI is just one tool in their toolbox. It’s a powerful tool, no doubt, but it works best when it complements these other resources, rather than replacing them.

    What does this mean for you?

    It means your staff—admissions counselors, enrollment specialists, and marketing team—are more important than ever. They are the human face of your institution, building relationships with prospective students, answering their questions, and alleviating their anxieties.

    The good news is that institutions already know this. Our 2025 Marketing Practices For Undergraduate Students Report confirms that “human-based” enrollment strategies are consistently rated highly effective, and often more effective than they were just two years ago.

    For example, the report shows that:

    • In-person meetings remain a top strategy across all institution types (4-year private, 4-year public, and 2-year), with effectiveness ratings consistently at or near 100%.
    • Personalized videos sent directly to students have seen a significant rise in effectiveness, particularly for 4-year institutions.
    • Even with the rise of digital tools, strategies like SMS, social media, and email communications remain foundational and highly effective, largely because they enable personalized, one-on-one communication.

    These findings underscore that in an increasingly digital world, the human touch truly sets institutions apart.

    Here are a few ways to bring that human touch to your college planning efforts:

    • Invest in training for your staff. Ensure they understand AI’s benefits and limitations, and how to integrate it ethically and effectively into their work.
    • Encourage personalized communication. Don’t rely solely on automated emails and chatbots. Encourage your staff to contact students individually, offering tailored advice and support.
    • Create opportunities for connection. Host virtual or in-person events where students meet current students, faculty, and staff.
    • Highlight the human stories. Share stories of successful alumni, dedicated faculty, and supportive staff. Show prospective students what makes your institution unique.

    Ultimately, success in today’s ever-evolving higher education landscape hinges on a delicate balance: embracing the power of technology like AI while never losing sight of the fundamental importance of human connection.

    By deeply understanding your students – their individual needs, their preferred college planning resources, and their unique “AI Adoption Persona” – and leveraging data to personalize their experience, you can create an effective and genuinely human recruitment and enrollment strategy.

    It’s about blending the efficiency of AI with the empathy and guidance that only your dedicated staff can provide, ensuring that every student feels seen, supported, and confident in their college journey.

    Ready to dive deeper?

    Do you want to learn more about AI in college planning and how to connect with today’s students?

    3 Reasons to Attend the RNL National Conference

    Join us in Atlanta July 22-24 for the most comprehensive conference on enrollment and student success.

    1. Choose from more than 120 sessions on recruitment, retention, financial aid, and more.
    2. Hear the keynote from former Secretary of Education Dr. Miguel Cardona on the future of higher education.
    3. Interact with campus professionals and national experts about ways you can achieve your goals.

    See all the details

  • St. Catherine University Partners with Collegis Education to Advance Technology Strategy and Student Experience

    St. Catherine University Partners with Collegis Education to Advance Technology Strategy and Student Experience

    The strategic partnership will strengthen the University’s student-centered mission through agile technology, operational innovation, and a shared commitment to community.

    St. Paul, Minn. – (May 5, 2025) St. Catherine University (St. Kate’s) and Collegis Education announced today that they have entered into a strategic partnership to enhance the University’s delivery of IT services.

    The decision to seek external IT support was driven by the University’s growing need to accelerate progress on strategic technology initiatives that had slowed within the existing tech infrastructure. The University recognized the need for a partner with the expertise, agility, and shared mission to help build a more responsive, future-ready infrastructure.

    “We realized that the pace of change in technology—and the expectations of our students—were outpacing what our internal systems and structures could support,” said Latisha Dawson, Vice President of Human Resources and Project Lead. “Our institution is centered around student connection and academic excellence. But to uphold that mission, we needed a partner with the technical expertise and scalability to move faster, innovate more nimbly, and help us deliver a modern student experience. Collegis allows us to do just that, so we can spend less time managing systems and more time serving our students.”

    In this partnership, Collegis will provide day-to-day IT operational support, a dedicated Chief Information Officer (CIO), and technological infrastructure that supports the university’s forward progress on strategic projects, while upholding strong data governance and enabling real-time responsiveness.

    As part of the deal, St. Kate’s will gain access to Collegis Education’s Connected Core®, a secure, composable data platform powered by Google Cloud. As a tech-agnostic solution, Connected Core unifies siloed systems and data sets, enables real-time and actionable institutional intelligence, produces AI-powered data strategies, and delivers proven solutions that enhance recruitment, retention, operations, and student experiences — driving measurable impact across the entire student lifecycle.

    St. Kate’s selected Collegis following a thorough evaluation of potential partners. “A lot of vendors can fill a gap, but that’s not what we were looking for,” said Dawson. “We were looking for someone to meet us where we are, grow with us, and truly enable us to excel. The real differentiator with Collegis was the spirit of partnership, and beyond that, community. From the beginning, they didn’t feel like an outsider. The team has become part of our community, and a part of helping us advance our mission.”

    “Collegis is honored to join the St. Kate’s community in a shared commitment to the future of higher education,” said Kim Fahey, President and CEO of Collegis Education. “We see technology not as an end but as an enabler, an extension of the institution’s mission to educate women to lead and influence. This partnership is about building agile systems that empower faculty, enrich the student experience, and keep the University ahead of what’s next.”

    The partnership also reflects St. Kate’s strategic priority to build a more nimble technology foundation that shortens the timeline between priority-setting and implementation. The transition enables the university to move away from legacy systems and toward a model that supports real-time innovation, strategic flexibility, and long-term sustainability.

    “Our partnership with Collegis is rooted in our values,” said Marcheta Evans, PhD, President of St. Catherine University. “It allows us to remain focused on our mission while bringing in trusted expertise to support the evolving needs of our students, faculty, and staff.”

    Dawson concludes, “We’ve always been guided by the principle of meeting the needs of the time. Embracing this next level of technology ensures we can continue nurturing the powerful, personal connection between our faculty and students, which is what makes us uniquely St. Kate’s.”

    About Collegis Education

    As a mission-oriented, tech-enabled services provider, Collegis Education partners with higher education institutions to align operations and drive transformative impact across the entire student lifecycle. With over 25 years as an industry pioneer, Collegis has proven how to leverage data, technology, and talent to optimize institutions’ business processes and enhance the student experience. With strategic expertise that rivals the leading consultancies, a full suite of proven service lines (including marketing, enrollment, retention, and IT), and its world-class Connected Core® data platform, Collegis helps its partners drive revenue, growth, and innovation. Learn more at CollegisEducation.com or via LinkedIn.

    About St. Catherine University

    Sustained by a legacy of visionary women, St. Catherine University educates women to lead and influence. We are a diverse community of learners dedicated to academic rigor, core Catholic values, and a heartfelt commitment to social justice. St. Kate’s offers degrees at all levels in the humanities, arts, sciences, healthcare, and business fields that engage women in uncovering positive ways of transforming the world. St. Kate’s students learn and discern wisely, and live and lead justly — all to power lives of meaning. Discover more at stkate.edu. 

    Media Contacts:

    Collegis Education

    Alyssa Miller

    [email protected]

    973-615-1292

    St. Catherine University

    Sarah Voigt

    [email protected]

    651-690-8756

  • AI in Education: Beyond the Hype Cycle

    AI in Education: Beyond the Hype Cycle

    We just can’t get away from it. AI continues to take the oxygen out of every edtech conversation. Even the Trump administration, while actively destroying federal involvement in public education, jumped on the bandwagon this week.

    Who better to puncture this overused acronym than edtech legend Gary Stager. In this conversation, he offers a pragmatic perspective on AI in education, cutting through both fear and hype. Gary argues that educators should view AI as simply another useful technology rather than something to either fear or blindly embrace. He criticizes the rush to create AI policies and curricula by administrators with limited understanding of the technology, suggesting instead that schools adopt minimal, flexible policies while encouraging hands-on experimentation. Have a listen: