Curriculum assessment and evaluation projects may not always spark immediate enthusiasm, but a recent well-organized team project in our higher education program demonstrated the power of collaboration. Thanks to the incredible participation of our faculty, improvements in student learning were not just planned but successfully implemented. The changes made as part of this academic assessment and evaluation project to enhance student success yielded significant improvements, and seeing that positive impact on student learning made the work rewarding.
Assessment measures are crucial to our curriculum assessment and evaluation, program accreditation, and appraisal of evidence of student success. A key aspect of our approach is the use of both indirect and direct measures to gauge student learning within program and course assessments. This method allows us to redesign course assessments, incorporating authentic direct and indirect measures to foster student success.
What is a Direct Measure Assessment?
Our program faculty evaluated and updated our program curriculum, including assessment measures, as an aspect of continuous quality improvement within our educational assessment program. A direct measure objectively quantifies student learning (Bugeja & Garrett, 2019). It is crucial that these direct measures, which include course assignments (for example, papers, demonstrations, worksheets, presentations, virtual simulations, live simulations), examinations, capstone projects, student portfolios, and internship evaluations (Bugeja & Garrett, 2019), align with learning outcomes. This alignment ensures that the assessment accurately reflects the intended learning goals.
Well-constructed rubrics further enhance the quality of the assessment measure by communicating the criteria to the student and increasing the objectivity of the measure (Postmes et al., 2023). Linking examinations to learning outcomes and blueprints (the list of topics the examination assesses) increases authenticity (Abdellatif & Al-Shahrani, 2019). See the sample alignment template included at the end of this article.
What is an Indirect Measure Assessment?
An indirect measure is a subjective perception of the experience (Bugeja & Garrett, 2019). Examples of indirect measures include surveys (student, employer), focus groups, and exit interviews (Bugeja & Garrett, 2019). Indirect measures should not solely measure a course outcome or competency but should support a direct assessment measure. For example, in a major course assignment or project such as a Change Theory Application Project paper or poster presentation (direct measure), the student also completes a reflective self-evaluation survey of the project specific to learning outcomes or competencies (indirect measure). The indirect measure is valuable because it provides a different perspective but supports the direct measure.
Implementing Direct and Indirect Assessments
Program-level outcomes, also known as end-of-program or terminal program outcomes, measure student success at program graduation. Course-level outcomes should build toward attainment of the program-level outcomes.
After reviewing program-level assessment plans, the focus becomes the course level. We recommend reviewing and updating an existing alignment table or creating one that includes the course outcomes, module/unit objectives, competencies, learning activities, and assessments (modify the items per your school). We have included a sample alignment table at the end of this article. The alignment table map helps check the Revised Bloom’s Taxonomy Leveling and ensures all course outcomes and competencies link to course assessments (direct and indirect measures).
Next, examine the current course assessments. Identify course assessments worth updating and revising, or develop a new assessment measure. Use mostly direct assessments with a few supporting indirect assessments, such as exit evaluations, focus groups, and alumni and employer surveys. Remember, an indirect assessment should only add another viewpoint that supports the direct assessment measure.
Map the course outcomes, module learning objectives, course-assigned knowledge, skills, attitudes/values, competencies, and learning activities (for example, readings, videos, posters) in an alignment table. Add the final assessments to the alignment table after updating course assessment measures. Check that most measures are direct and that all course outcomes, objectives, and competencies link to a direct measure. Use only a few supporting indirect measures to safeguard genuine assessment of learning.
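For programs that keep this alignment table in a spreadsheet, a short script can flag gaps before faculty review. The following is a minimal sketch under assumed column names ("outcome" and "measure_type") and a hypothetical file name; it is an illustration of the checking step described above, not part of our project.

```python
# Minimal sketch: flag course outcomes that lack at least one direct measure.
# Column names and the file name are hypothetical, for illustration only.
import csv
from collections import defaultdict

def outcomes_missing_direct_measures(rows):
    """rows: dicts with 'outcome' and 'measure_type' keys ('direct' or 'indirect')."""
    measure_types = defaultdict(set)
    for row in rows:
        measure_types[row["outcome"]].add(row["measure_type"].strip().lower())
    return [outcome for outcome, types in measure_types.items() if "direct" not in types]

if __name__ == "__main__":
    with open("alignment_table.csv", newline="") as handle:  # hypothetical export of the table
        rows = list(csv.DictReader(handle))
    for outcome in outcomes_missing_direct_measures(rows):
        print(f"Course outcome '{outcome}' is not linked to any direct measure.")
```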
Summary
Educational assessment is valuable for informing curricula, demonstrating student learning, and determining program effectiveness for academic accreditation. Using direct and indirect measures in education provides data about student learning and program evaluation. Alignment mapping can help plan and check for consistency in the course. Multiple data measures, such as direct measures and a few supporting indirect measures, increase confidence in the findings and analysis, known as triangulation (Demeter et al., 2019). After completing course revisions with a particular emphasis on using direct and indirect assessment measures, we have noted improvements in student learning assessment at the course level.
Course Level Alignment Table Map (column headings): Course Outcomes | Module Objectives | Course Competencies | Learning Activities | Course Learning Assessments (Identify Direct and Indirect Measures)
Course Level Alignment Table Map Tips:
Add and delete rows and columns as needed.
The verb levels in the outcomes, objectives, and competencies should align with the learning activities and assessment measures.
Not all courses will have competencies; this may vary depending on the discipline.
Colleges and universities vary in the word use of outcomes and objectives, so modify the headings in the alignment table to match your school.
Nancyruth Leibold, EdD, RN, MSN, PHN, CNE, AHN-BC, is an Associate Professor of Nursing at Minnesota State University, Mankato. Nancyruth is the Health Sciences Editor at the Multimedia Educational Resource for Learning and Online Teaching (MERLOT).
Laura Schwarz, DNP, RN, PHN, CNE, AHN-BC, is a Professor at Minnesota State University, Mankato. Laura is a Certified Peer Reviewer for Quality Matters.
References
Abdellatif, H., & Al-Shahrani, A. M. (2019). Effect of blueprinting methods on test difficulty, discrimination, and reliability indices: cross-sectional study in an integrated learning program. Advances in Medical Education and Practice, 10, 23–30. https://doi.org/10.2147/AMEP.S190827
Bugeja, M., & Garrett, M. (2019). “Making the connection”: Aggregate internship data as direct and indirect measure informing curricula and assessment. Journalism & Mass Communication Educator, 74(1), 17–30. https://doi.org/10.1177/1077695817749077
Demeter, E., Robinson, C., & Frederick, J. G. (2019). Holistically assessing critical thinking and written communication learning outcomes with direct and indirect measures. Research & Practice in Assessment, 14(1), 41–51.
Postmes, L., Bouwmeester, R., de Kleijn, R., & van der Schaaf, M. (2023). Supervisors’ untrained postgraduate rubric use for formative and summative purposes. Assessment and Evaluation in Higher Education, 48(1), 41–55. https://doi.org/10.1080/02602938.2021.2021390
But while it recognises that students have diverse and changeable views about their interests, it is still significant that it characterises these as “the student interest” rather than “students’ interests”.
The reason for doing this is that it is much more rhetorically powerful to claim you are doing something in relation to an interest that is definitive, rather than interests which are multifarious and shifting.
And to be clear, the OfS’s proposed strategy shows a huge appetite to intervene in higher education in the name of “the student interest”.
Much talk, no sources
In the draft, OfS boasts that it has done a great deal of work to renew its understanding of the student interest – polling students, holding focus groups, hosting engagement sessions and talking to their own student panel.
But two things are particularly noticeable about this work. First, whilst a lot of other sources are referenced in their strategy consultation, this is one area where no evidence is provided.
This means the OfS interpretation of the outcomes of this consultation cannot be interrogated in any way. Clearly OfS knows best how to interpret this interest and isn’t interested in collective conversations to explore its ambiguities and complexities.
Second, none of this work involves open-ended engagement with students and their representative organisations (who appear to have been excluded completely, or at least their involvement is not detailed). They are all forms of consultation in which OfS would have framed the terms and agenda of the discussions (non-decision-making power, as Steven Lukes would have it). It’s consultation – but within tightly defined limits of what can legitimately be said.
This seems to explain the remarkable number of priorities in the strategy (freedom of speech, mental health, sexual harassment) that are said to be in the student interest but previously appeared in ministerial letters outlining the strategic priorities of the OfS.
Get a job
Perhaps most concerning is that the government/treasury logic that the only real reason for going to university is to get a well-paid job is now central to the student interest. Sometimes this is done more subtly by positioning it in the (never-)popular student language of “a return on investment”:
…in return for their investment of time, money and hard work they [students] expect that education to continue to provide value into the longer-term, including in ways that they may not be able to anticipate while they study (p.12).
At other times, we are left in no doubt that the primary function of higher education is to serve the economy:
Our proposals…will support a higher education system equipped to cultivate the skills the country needs and increase employer confidence in the value of English higher education qualifications. High quality higher education will be accessible to more people, and students from all backgrounds will be better able to engage with and benefit from high quality higher education, supporting a more equal society which makes better use of untapped talent and latent potential. The supply of skilled graduates will support local and national economies alike, while the ‘public goods’ associated with high quality higher education will accrue to a wide range of individuals and communities. Public goods include economic growth, a more equal society and greater knowledge understanding (OfS 2024 p.30-31).
So what we are left with is a proposed strategy that makes powerful claims to be grounded in the student interest – but which could have easily formed part of the last government’s response to the Augar review.
Whose priorities?
Through its consultation on its proposed strategy, OfS has presented the priorities of the previous government as if they are drawn straight from its engagement with students.
We don’t yet know the higher education priorities of the current government, but given the proposed strategy was published under their watch it looks like we are moving in a depressingly familiar direction.
It is worth reflecting on the profound injustice of this. Students are expected to pay back the cost of their higher education and now have the previous government’s priorities presented as their interest so that OfS can intervene in higher education.
Yes, you have to pay – but the government and its friendly neighbourhood regulator are here to tell you why you want to pay! It seems that despite the excoriating criticism of the House of Lords Committee, OfS have not really learned how to engage with students or to reflect and reconcile their interests.
This week on the podcast Minister of State for Skills Jacqui Smith helped launch a pamphlet on whether universities are “worth it” – and was notably cold on extra money. But does she mean outlay or eventual return to the Treasury?
Plus there are changes afoot in Scotland, UKVI is cracking down on attendance for international students and students are still feeling the pinch financially – is a return to maintenance grants a lost possibility?
With Ben Vulliamy, Executive Director at the Association of Heads of University Administration, Dani Payne, Senior Researcher at the Social Market Foundation, Michael Salmon, News Editor at Wonkhe and presented by Jim Dickinson, Associate Editor at Wonkhe.
Personalized, timely, and relevant communication is key to engaging prospective students and meeting enrollment goals in higher education.
Effective strategies rely on immediacy, relevance, automation, and trackability, ensuring impactful and consistent interactions.
Omnichannel outreach, using a mix of email, SMS, print, and digital platforms, enhances visibility and builds trust by meeting students where they are.
Connecting with prospective undergraduate students in meaningful ways requires a thoughtful blend of strategy, immediacy, and personalization. Gone are the days when generic messaging could effectively spark interest or drive engagement. Today’s prospective students expect communications that reflect an understanding of their individual needs, aspirations, and priorities, as well as their value to your institution.
Institutions aiming to enhance their enrollment strategies must adopt a more data-informed and strategic approach to communication. This means reaching out with the right message, at the right time, and through the right channels.
Laying the Foundation for Communication Success
Effective communication with students is built on four key principles: immediacy, relevance, automation, and trackability. Each element plays a critical role in ensuring that interactions resonate with students and influence their decision-making process.
Immediacy: Quick and timely responses that change as students’ behaviors change demonstrate attentiveness and can make a significant impression on prospective students. Delays in following up on inquiries or campus visits risk the loss of momentum and interest. The school that responds to an inquiry first is often the one that ultimately convinces that student to enroll.
Relevance: Tailored, personalized communication should go beyond basic name inclusion. Students expect messages that address their specific interests. Misaligned content, such as sending information unrelated to a student’s expressed major, can quickly undermine trust.
Automation: Streamlined, automated workflows keep communication consistent and dependable, even during staff transitions or times of high demand. Manual processes, such as college fair follow-ups that sit unprocessed for long periods, can derail engagement. Automation prevents these bottlenecks, enabling timely responses even when staff are unavailable.
Trackability: Monitoring communication effectiveness helps institutions refine their strategies and optimize ROI.
By integrating these principles, higher education institutions can deliver a cohesive and impactful communication strategy that strengthens student engagement and builds trust.
The Importance of Omnichannel Outreach
While email has long been—and remains—a cornerstone of communication, relying on it exclusively is no longer sufficient. The sheer volume of emails students receive daily makes it easy for even the most well-crafted messages to be overlooked. To stand out, institutions must adopt an omnichannel approach with campaigns that combine email with print materials, SMS messaging, voice blasts, digital ads, social media engagement, and microsites, all tailored to student interests.
Each channel serves a unique purpose for student engagement in higher education. Print materials, for example, are particularly effective at involving families in the decision-making process. A well-designed brochure placed on a kitchen table can spark conversations among family members, especially parents, who are often key influencers in the college selection process.
Similarly, integrating consistent, tailored messaging across multiple channels ensures that students receive a seamless experience. Whether they encounter an institution on social media, via a targeted ad, by SMS message, or through an email campaign, the message should feel cohesive and tailored to their interests. Omnichannel strategies, timed appropriately through the enrollment timeline, not only improve visibility but also demonstrate an institution’s commitment to meeting students where they are, thus building trust and rapport.
Leveraging Data for Personalization
Modern communication strategies must be rooted in data. By analyzing student preferences and behaviors, institutions can craft messages that resonate on an individual level. With data-informed insights, institutions can identify what matters most to prospective students—whether that’s career outcomes, financial aid, or specific academic opportunities—and address those priorities directly.
For example, students interested in STEM programs may be more receptive to communications highlighting research opportunities and faculty expertise, while first-generation students may appreciate messages emphasizing affordability and support services.
To further maximize impact, institutions can use surveys and initial engagement data to tailor their outreach strategies, which allows them to deploy resources efficiently while maintaining relevance. For example, expensive print materials can be reserved for students who show strong interest in particular programs, while a social media campaign may be more appropriate for inquiries earlier in the enrollment cycle.
Real-time data tracking lets institutions segment their strategies dynamically. If a particular campaign underperforms across the board or for certain cohorts of students, modifications can be made immediately to better align with student preferences. This agility is essential for maintaining relevance and impact throughout the recruitment cycle.
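As a rough illustration of that kind of dynamic check (not a feature of any particular platform; the segment names, counts, and benchmark below are assumptions), response rates by segment can be compared against a threshold so underperforming cohorts surface quickly:

```python
# Minimal sketch: flag campaign segments whose response rate falls below a benchmark.
# Segment names, counts, and the 5% benchmark are hypothetical.
def underperforming_segments(stats, benchmark=0.05):
    """stats maps a segment name to (responses, messages_sent)."""
    flagged = {}
    for segment, (responses, sent) in stats.items():
        rate = responses / sent if sent else 0.0
        if rate < benchmark:
            flagged[segment] = rate
    return flagged

campaign = {
    "STEM inquiries": (120, 1800),    # 6.7% -- above benchmark
    "First-generation": (40, 1500),   # 2.7% -- flagged
    "Transfer prospects": (15, 900),  # 1.7% -- flagged
}
for segment, rate in underperforming_segments(campaign).items():
    print(f"{segment}: {rate:.1%} response rate; consider adjusting content or channel")
```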
Building a Sustainable Communication Infrastructure
Sustainable communication strategies rely on the integration of advanced tools and technologies. While a customer relationship management (CRM) system lays a strong foundation, institutions often need more specialized solutions to elevate their outreach efforts. Liaison offers a suite of products designed to enhance and streamline communication and enrollment strategies, including:
Enrollment Marketing (EM): Liaison’s EM software and marketing services help institutions manage and analyze personalized, automated omnichannel campaigns, ensuring consistent and effective messaging across multiple channels.
Othot: This AI-driven tool leverages predictive and prescriptive analytics to optimize communication strategies and enrollment decisions, tailoring outreach to align with student behavior and institutional goals.
Centralized Application Service (CAS): By simplifying the admissions process for students and providing institutions with tools for marketing, data management, and application processing, CAS supports efficient communication with applicants.
By incorporating these technologies, along with Liaison’s CRMs, institutions can maintain a seamless and unified communication flow so that prospective students receive timely, relevant, and personalized messages. These solutions also allow institutions to monitor campaign performance and adjust strategies in real-time, maximizing the effectiveness of resources and making messaging more impactful for target audiences. This integration reduces reliance on fragmented workflows, preventing gaps or delays caused by disconnected platforms.
Aligning tools and strategies across departments using Liaison’s technologies keeps messaging consistent and impactful, even as prospective students engage with multiple touchpoints throughout their journey.
Achieving Long-Term Engagement
Effective communication with students is about building relationships that extend beyond the initial stages of recruitment. Institutions that invest in understanding and addressing the unique needs of their prospective students position themselves as partners in their academic journey.
By delivering personalized, timely, and relevant messages through multiple channels, institutions can foster deeper connections and enhance student engagement in higher education. As the competitive landscape of enrollment continues to shift, adopting a strategic and data-informed approach to communication will remain essential for success.
Ready to elevate your communication strategies? Discover how Liaison’s advanced tools and technologies can transform how you connect with prospective students. From personalized, omnichannel campaigns to data-driven insights, our solutions help you engage students meaningfully and meet your enrollment goals. Contact us today to learn more.
About the Author
Craig Cornell is the Vice President for Enrollment Strategy at Liaison. In that capacity, he oversees a team of enrollment strategists and brings best practices, consultation, and data trends to campuses across the country in all things enrollment management. Craig also serves as the dedicated resource to NASH (National Association of Higher Education Systems) and works closely with the higher education system that Liaison supports. Before joining Liaison in 2023, Craig served for over 30 years in multiple higher education executive enrollment management positions. During his tenure, the campuses he served often received national recognition for enrollment growth, effective financial aid leveraging, marketing enhancements, and innovative enrollment strategies.
Dive Brief:
Enrollment of first-year students grew 5.5% in fall 2024 compared to the year before, representing an increase of about 130,000 students, according to a final tally from the National Student Clearinghouse Research Center.
The figure is a striking reversal from the clearinghouse’s preliminary findings in October, which erroneously reported a decline in first-year students. Earlier this month, the clearinghouse said the early data contained a research error and suspended its preliminary enrollment reports, which use different methodologies to determine first-year student counts than the research center’s reports on final enrollment figures.
College enrollment overall grew 4.5% in fall 2024 compared to the year before, according to the final data, rebounding to levels seen before the coronavirus pandemic caused widespread declines.
Dive Insight:
The new data is promising for higher education institutions, many of which have weathered steep enrollment declines in the wake of the pandemic.
“It is encouraging to see the total number of postsecondary students rising above the pre-pandemic level for the first time this fall,” Doug Shapiro, the research center’s executive director, said in a Wednesday statement.
Undergraduate enrollment surged 4.7% this fall, representing an increase of about 716,000 students. Graduate enrollment likewise grew 3.3%, representing an uptick of about 100,000 students.
All sectors enjoyed enrollment increases. For-profit, four-year institutions had the largest enrollment growth, with headcounts rising 7.5% in fall 2024 compared to the year before. Public two-year institutions and public primarily associate-degree granting baccalaureate institutions, or PABs, saw similar levels of growth — 5.8% and 6.3%, respectively.
Enrollment also increased at four-year nonprofits. Overall headcounts grew 3.8% at private colleges and 3.1% at public institutions.
Older students largely drove the growth in first-year students. Enrollment of first-year students ages 21 to 24 surged 16.7% in fall 2024, while headcounts of students 25 and older spiked by a whopping 19.7%.
Enrollment of younger first-year students also increased, though the growth was more muted.
Headcounts of 18-year-old students grew 3.4%. However, this group of first-year students has still not recovered to pre-pandemic levels, Shapiro said in a statement.
Similarly, enrollment of first-year students ages 19 to 20 increased 4.5%.
Two-year public colleges and public PABs enjoyed strong increases in their first-year student population, with 6.8% and 8.4% growth, respectively. However, for-profit, four-year colleges saw the largest increase, 26.1%, according to the new data.
Headcounts of first-year students also spiked at four-year nonprofits, rising 3.3% at public institutions and 2.8% at private colleges.
Shapiro addressed the research center’s methodological error during a call Wednesday with reporters. The erroneous preliminary report found that first-year enrollment had declined by 5% — over 10 percentage points lower than what the final data showed.
“I think our sensitivity to abnormally large changes was somewhat reduced because we had a host of kind of ready explanations for why we might be seeing these declines,” Shapiro said, citing issues with the federal student aid form, growing concerns with student debt and changes in the labor market.
The research center staff has been investigating its other publications to see if the issue crept into them.
So far, they have discovered that the flawed methodology also affected a February 2024 report on transfer students. The clearinghouse will correct that data when it issues its next transfer report in February.
The research center previously announced that the error affected other reports in its “Stay Informed” series, which shares preliminary enrollment data. It has halted those reports — which launched at the height of the pandemic — until it vets a new methodology.
Authenticity has become a cornerstone of successful education marketing campaigns. Nothing speaks louder to prospective students than real experiences shared by current students. That’s why we recommend the combined use of two powerful tools: student ambassador programs and user-generated content (UGC).
These strategies harness the voices of your students to create compelling, authentic narratives that resonate. In this blog, we’ll explore the enrollment-boosting potential of student ambassadors and UGC for education marketing, the benefits they offer, and actionable steps to integrate them into your strategy. Let’s get started!
Struggling with enrollment?
Our expert digital marketing services can help you attract and enroll more students!
Understanding the Role of a Student Ambassador
What is a student ambassador? A student ambassador is a current student who represents your institution in various capacities, from marketing and recruitment to campus events. These individuals are typically chosen for their enthusiasm, communication skills, and ability to connect with diverse audiences.
What do student ambassadors do? As the face of your school, student ambassadors embody its culture and values, offering prospective students and their families an authentic glimpse into campus life.
The roles of student ambassadors are varied. They may host campus tours, participate in Q&A sessions during open houses, or even create content for your social media platforms. By sharing their personal experiences, they help humanize your institution, breaking down barriers and building trust.
Source: University of Waterloo
Example: On its website, the University of Waterloo has a dedicated page for members of its community who are interested in its student ambassador program. This page details the role of a student ambassador, the requirements for candidates, their workload, and compensation. When you launch your student ambassador program, use site content to provide vital information to potential candidates and the students they’ll support in their roles. Use social media to keep your audience updated on the application process and involve student ambassadors in content creation to establish a relationship between them and the rest of your student body.
Reach out for help implementing effective enrollment-boosting digital marketing strategies!
What Is User-Generated Content (UGC)?
User-generated content (UGC) refers to any content created by your students, alumni, or even staff, rather than your marketing team. This can include photos, videos, testimonials, social media posts, or blogs that showcase their authentic experiences. Unlike polished advertising campaigns, UGC is often raw and unfiltered, making it highly relatable and trustworthy.
Now that audiences are bombarded with promotional material, UGC stands out. It delivers a level of authenticity that professionally crafted content simply cannot replicate. For prospective students, seeing someone “just like them” thriving at your institution can be the deciding factor in their enrollment journey.
Source: University of Oxford | TikTok
Example: Take a look at the comments on this TikTok video. The bottom one shows how many prospective students are turning to current students for advice and insights into their journey with your institution. This “day in the life” video from a University of Oxford student offers a glimpse into campus life from a personal perspective. Videos shared on a student’s personal page often feel more genuine since they don’t come across as promotional content.
That’s not to say your school shouldn’t engage with these posts! Use hashtags, like #universityofoxford, to find UGC created by your community and reshare it on your school’s profile. To encourage more of this content, promote specific hashtags and even run contests or challenges to inspire creativity and engagement.
The Benefits of Student Ambassadors and UGC
Though their methods differ, both student ambassador programs and UGC help to tell your school’s unique story authentically.
These methods are particularly effective at humanizing your school’s brand. Discover some more of the unique benefits you can see when you combine these strategies correctly.
Authenticity and Trust: Both student ambassadors and UGC provide unfiltered insights into your institution. Prospective students are more likely to trust the words of a peer than a marketing brochure. When real students share their stories, it creates a sense of transparency and trust.
Increased Engagement: Content created by student ambassadors and peers often performs better on social media platforms. Audiences are more likely to engage with posts that feel genuine and relatable. This increased engagement can translate to higher visibility for your institution.
Cost-Effectiveness: Leveraging the voices of your students can reduce the need for extensive advertising budgets. While there may be costs associated with training or compensating ambassadors, the return on investment through increased applications and enrollment often outweighs the initial expenditure.
Community Building: By involving students in your marketing efforts, you foster a sense of pride and belonging. Ambassadors feel more connected to your institution, and their enthusiasm is infectious, positively influencing both their peers and prospective students.
How to Build a Successful Student Ambassador Program
Building a student ambassador program involves creating a structured initiative that aligns with your school’s marketing goals and fosters authentic engagement. A successful program requires careful planning, clear objectives, and ongoing support to empower ambassadors as true representatives of your institution. Here, we’ll walk you through the essential steps to design and implement a program that connects with prospective students and amplifies your school’s story.
Define Clear Objectives
Clear objectives are the cornerstone of a student ambassador program, aligning with your marketing goals and guiding ambassadors toward success. Start by clearly outlining the program’s purpose: for example, increasing applications, enhancing campus tour experiences, or boosting social media engagement.
This clarity of intent should be paired with measurable goals to help ambassadors understand what success looks like. Measurable goals could include increasing tour attendance by 20% or generating a set number of social media posts each month.
Tailor these objectives to match the unique strengths of each ambassador, assigning roles that play to their talents, such as public speaking for campus tours or storytelling for blog posts and videos. Providing a clear role description that details their responsibilities, tasks, and time commitments is equally critical to avoid confusion and set expectations.
To foster motivation, explain the “why” behind their tasks, helping them see how their efforts impact prospective students, build trust in the institution, and contribute to enrollment goals. Regular check-ins or feedback sessions can also ensure ambassadors stay on track, allowing for adjustments and maintaining engagement. With clearly defined objectives and the right support, ambassadors can confidently represent your institution and drive meaningful results.
Recruit the Right Ambassadors
Select ambassadors who reflect the diversity and values of your institution. Look for individuals who are enthusiastic, articulate, and comfortable sharing their experiences. Peer recommendations, faculty referrals, and application processes can help identify the best candidates.
Foster Collaboration
Facilitate collaboration between ambassadors and your marketing team. Regular meetings can help align their content with your broader campaigns while maintaining authenticity. Ambassadors should feel supported but not micromanaged.
Source: University of Windsor
Example: The University of Windsor demonstrates trust in its student ambassadors with a unique feature on its website. It allows current and prospective students to select an ambassador to chat with for answers to their school-related questions. To replicate this success, implement a comprehensive training program to ensure consistency and quality. Clear expectations enable your ambassadors to take on key responsibilities confidently, delivering a strong return on your investment.
Provide Comprehensive Training
Familiarize Ambassadors with Your Institution’s Key Messaging and Values
Begin by familiarizing ambassadors with your institution’s key messaging and values. This includes providing them with a clear understanding of your school’s mission, vision, and what sets it apart from competitors. Equip them with talking points about academics, extracurricular offerings, campus facilities, and student life, ensuring consistency in how they communicate your brand. Role-playing exercises can be particularly effective here, helping ambassadors practice delivering messages in a variety of scenarios, such as open houses, campus tours, or online Q&A sessions.
Train Ambassadors on Social Media Best Practices
Training should also include social media best practices, especially if ambassadors are creating content for your platforms. Teach them how to craft posts that are engaging and aligned with your school’s tone and style. Provide guidelines on appropriate language, photo and video quality, and compliance with privacy policies.
Develop Public Speaking Skills
Since many ambassadors will engage with prospective students and families in person, public speaking training is invaluable. Help them refine their communication skills with workshops that focus on clarity, confidence, and storytelling. Encourage them to share personal anecdotes about their experiences at your school, as these authentic stories are often the most memorable. Practice sessions with constructive feedback can significantly boost their comfort in delivering presentations or handling impromptu questions.
Build Soft Skills for Diverse Audiences
Effective training also involves building soft skills like empathy, adaptability, and cultural awareness, especially for ambassadors interacting with diverse audiences.
Include scenarios that challenge them to navigate different cultural perspectives or address sensitive questions tactfully. By fostering these skills, you ensure ambassadors can create welcoming and inclusive experiences for prospective students and their families.
Incorporate Interactive Training Methods
To make training engaging and practical, use a mix of interactive methods such as role-playing, group discussions, and hands-on activities. Incorporate real-world examples and success stories from past ambassadors to inspire new recruits and show them what’s possible. Providing a training manual or digital resource hub can also serve as a handy reference for ambassadors as they grow into their roles.
Provide Ongoing Support and Refreshers
Finally, ongoing support and refreshers are critical. Schedule periodic check-ins to provide additional guidance, address challenges, and celebrate successes. The more prepared they are, the more effectively they’ll represent your school.
Empower Ambassadors to Create
Empowering student ambassadors to create their own content is one of the most effective ways to showcase the authentic, lived experiences that resonate with prospective students. By trusting ambassadors with creative freedom, you enable them to craft content that feels genuine and relatable—qualities that polished marketing campaigns often struggle to replicate.
Start by encouraging ambassadors to focus on their personal experiences and unique perspectives. Heartfelt testimonials are another powerful form of content. Whether it’s a written story, a video, or a social media post, ambassadors sharing their personal journeys—why they chose your school, how it’s impacted their lives, and what they’ve learned—can create an emotional connection with viewers.
To provide inspiration and structure, consider giving student ambassadors a content calendar – a detailed content plan that outlines the where, what, and when of your posts. Highlighting diverse voices within your ambassador team ensures a broad range of experiences and perspectives are represented, appealing to a wider audience.
Celebrate Their Contributions
Recognize and reward your ambassadors for their efforts. This can range from financial compensation to exclusive perks like access to networking events or career development opportunities. Publicly celebrating their work reinforces their value and motivates others to get involved.
Source: New York University
Example: Here, New York University’s School of Global Public Health welcomes a new student ambassador, celebrating her accomplishments in the field, describing her role in the NYU community, and directing the audience to her student blog post. In addition to monetary rewards, student ambassadors appreciate public acknowledgments of their contributions.
Measure Success
Track the impact of your ambassador program using metrics such as social media engagement, website traffic, and application rates. Use this data to refine your approach, ensuring continuous improvement.
Incorporating UGC into Your Marketing Strategy
A UGC marketing campaign can be a goldmine for schools looking to leverage their communities to tell their story. By encouraging students to share their experiences, you tap into a wealth of relatable and engaging material that resonates with prospective students. Let’s explore how to integrate UGC into your marketing strategy for maximum impact.
Create Opportunities for UGC
Encourage your students to share their experiences by hosting contests, themed hashtag campaigns, or student takeovers on social media. The more accessible you make the process, the more likely students are to participate.
Source: Caleontwins | TikTok
Example: Here, Humber College has paid well-known influencers to promote a contest called Humber Bring It. The aim was to showcase all the unique skills students brought to their community. In their video, the Caleon twins shared the essential details of the contest, such as the deadline, the prizes for winners (a $5,000 tuition credit or a laptop), and the hashtag each contestant should use. Contests like this are the perfect way to create a UGC buzz around your institution.
Showcase UGC Across Platforms
To maximize the impact of user-generated content (UGC), feature it prominently across your marketing platforms. Incorporate student stories, photos, and videos on your website’s homepage, within program pages, and in blog posts to provide a genuine glimpse into campus life. Social media channels are another natural home for UGC, where it can drive engagement and create relatable touchpoints with prospective students. Consider integrating this content into admissions brochures, emails, and campus tour presentations to ensure consistent messaging.
Before sharing any UGC, prioritize student consent. Always seek permission from contributors, clearly explaining where and how their content will be used. Providing written guidelines and gaining explicit agreement ensures transparency and builds trust. By celebrating your students’ experiences respectfully and prominently, you showcase your school’s vibrant community and also create a foundation of authenticity and ethical storytelling that resonates with your audience.
Maintain Quality Control
While UGC is inherently less polished, maintaining a level of quality ensures it aligns with your institution’s values and messaging. Begin by establishing clear guidelines for students contributing content.
These guidelines should outline your school’s tone, branding, and expectations for appropriateness, while still encouraging creativity and individuality. For example, provide tips on photography and video basics, such as lighting and framing, to enhance visual appeal without compromising authenticity.
Review content before publication to ensure it represents your school positively. This doesn’t mean heavily editing or sanitizing the material—rather, it’s about ensuring the content reflects your institution’s culture, is free of inappropriate language or imagery, and avoids unintentional misrepresentation.
Offering feedback to students can also be a valuable learning experience, helping them refine their work while staying true to their voice. By balancing authenticity with quality, you showcase the best of your community in a way that’s both relatable and professional.
Engage with UGC Creators
Show appreciation for students who contribute content by engaging with their posts, sharing their work, or even spotlighting them in dedicated campaigns. This not only boosts their morale but also encourages others to participate.
Use UGC to Tell Stories
Go beyond individual posts by weaving UGC into cohesive narratives. For example, compile videos and testimonials into a series showcasing different aspects of campus life. Storytelling adds depth and emotional resonance to your campaigns.
Bringing It All Together
Student ambassador programs and UGC are avenues for building authentic connections with your audience. By leveraging the voices of your students, you showcase your institution’s unique story in a way that resonates deeply with prospective students and their families.
At Higher Education Marketing, we specialize in helping schools like yours unlock the potential of these strategies and many others. Whether you’re just starting or looking to refine your approach, our expertise ensures your campaigns drive meaningful engagement and results.
Your students are your greatest storytellers. Let their voices elevate your brand and inspire the next generation to join your community.
Frequently Asked Questions
What is a student ambassador?
A student ambassador is a current student who represents your institution in various capacities, from marketing and recruitment to campus events.
What do student ambassadors do?
As the face of your school, student ambassadors embody its culture and values, offering prospective students and their families an authentic glimpse into campus life.
Ricardo Torres, the CEO of the National Student Clearinghouse, is retiring next month after 17 years at the helm. His last few weeks on the job have not been quiet.
On Jan. 13, the clearinghouse’s research team announced they had found a significant error in their October enrollment report: Instead of freshman enrollment falling by 5 percent, it actually seemed to have increased; the clearinghouse is releasing its more complete enrollment report tomorrow. In the meantime, researchers, college officials and policymakers are re-evaluating their understanding of how 2024’s marquee events, like the bungled FAFSA rollout, influenced enrollment; some are questioning their reliance on clearinghouse research.
It’s come as a difficult setback at the end of Torres’s tenure. He established the research center in 2010, two years after becoming CEO, and helped guide it to prominence as one of the most widely used and trusted sources of postsecondary student data.
The clearinghouse only began releasing the preliminary enrollment report, called the “Stay Informed” report, in 2020 as a kind of “emergency measure” to gauge the pandemic’s impact on enrollment, Torres told Inside Higher Ed. The methodological error in October’s report, which the research team discovered this month, had been present in every iteration since. And a spokesperson for the clearinghouse said that, upon review, the “Transfer and Progress” report, which has been released every February since 2023, was also affected by the miscounting error; the 2025 report will be corrected, but the last two were skewed.
Torres said the clearinghouse is exploring discontinuing the “Stay Informed” report entirely.
Such a consequential snafu would put a damper on anyone’s retirement and threaten to tarnish their legacy. But Torres is used to a little turbulence: He oversaw the clearinghouse through a crucial period of transformation, from an arm of the student lending sector to a research powerhouse. He said the pressure on higher ed researchers is only going to get more intense in the years ahead, given the surging demand for enrollment and outcomes data from anxious college leaders and ambitious lawmakers. Transparency and integrity, he cautioned, will be paramount.
His conversation with Inside Higher Ed, edited for length and clarity, is below.
Q: You’ve led the clearinghouse since 2008, when higher ed was a very different sector. How does it feel to be leaving?
A: It’s a bit bittersweet, but I feel like we’ve accomplished something during my tenure that can be built upon. I came into the job not really knowing about higher ed; it was a small company, a $13 million operation serving the student lending industry. We were designed to support their fundamental need to understand who’s enrolled and who isn’t, for the purposes of monitoring student loans. As a matter of fact, the original name of the organization was the National Student Loan Clearinghouse. When you think about what happened when things began to evolve and opportunities began to present themselves, we’ve done a lot.
Q: Tell me more about how the organization has changed since the days of the Student Loan Clearinghouse.
A: Frankly, the role and purpose of the clearinghouse and its main activities have not changed in about 15 years. The need was to have a trusted, centralized location where schools could send their information that then could be used to validate loan status based on enrollments. The process, prior to the clearinghouse, was loaded with paperwork. The registrars that are out there now get this almost PTSD effect when they go back in time before the clearinghouse. If a student was enrolled in School A, transferred to School B and had a loan, by the time everybody figured out that you were enrolled someplace else, you were in default on your loan. We were set up to fix that problem.
What made our database unique at that time was that when a school sent us enrollment data, they had to send all of the learners because they actually didn’t know who had a previous loan and who didn’t. That allowed us to build a holistic, comprehensive view of the whole lending environment. So we began experimenting with what else we could do with the data.
Our first observation was how great a need there was for this data. Policy formulation at almost every level—federal, state, regional—for improving learner outcomes lacked the real-time data to figure out what was going on. Still, democratizing the data alone was insufficient because you need to convert that insight into action of some kind that is meaningful. What I found as I was meeting schools and individuals was that the ability and the skill sets required to convert data to action were mostly available in the wealthiest institutions. They had all the analysts in the world to figure out what the hell was going on, and the small publics were just scraping by. That was the second observation, the inequity.
The third came around 2009 to 2012, when there was an extensive effort to make data an important part of decision-making across the country. The side effect of that, though, was that not all the data sets were created equal, which made answering questions about what works and what doesn’t that much more difficult.
The fourth observation, and I think it’s still very relevant today, is that the majority of our postsecondary constituencies are struggling to work with the increasing demands they’re getting from regulators: from the feds, from the states, from their accreditors, the demand for reports is increasing. The demand for feedback is increasing. Your big institutions, your flagships, might see this as a pain in the neck, but I would suggest that your smaller publics and smaller private schools are asking, “Oh my gosh, how are we even going to do this?” Our data helps.
Q: What was the clearinghouse doing differently in terms of data collection?
A: From the postsecondary standpoint, our first set of reports that we released in 2011 focused on two types of learners that at most were anecdotally referred to: transfer students and part-time students. The fact that we included part-time students, which [the Integrated Postsecondary Education Data System] did not, was a huge change. And our first completion report, I believe, said that over 50 percent of baccalaureate recipients had some community college in their background. That was eye-popping for the country to see and really catalyzed a lot of thinking about transfer pathways.
We also helped spur the rise of these third-party academic-oriented organizations like Lumina and enabled them to help learners by using our data. One of our obligations as a data aggregator was to find ways to make this data useful for the field, and I think we accomplished that. Now, of course, demand is rising with artificial intelligence; people want to do more. We understand that, but we also think we have a huge responsibility as a data custodian to do that responsibly. People who work with us realize how seriously we take that custodial relationship with the data. That has been one of the hallmarks of our tenure as an organization.
Q: Speaking of custodial responsibility, people are questioning the clearinghouse’s research credibility after last week’s revelation of the data error in your preliminary enrollment report. Are you worried it will undo the years of trust building you just described? How do you take accountability?
A: No. 1: The data itself, which we receive from institutions, is reliable, current and accurate. We make best efforts to ensure that it accurately represents what the institutions have within their own systems before any data is merged into the clearinghouse data system.
When we first formed the Research Center, we had to show how you can get from the IPEDS number to the clearinghouse number and show people our data was something they could count on. We spent 15 years building this reputation. The key to any research-related error like this is, first, you have to take ownership of it and hold yourself accountable. As soon as I found out about this we were already making moves to [make it public]—we’re talking 48 hours. That’s the first step in maintaining trust.
That being said, there’s an element of risk built into this work. Part of what the clearinghouse brings to the table is the ability to responsibly advance the dialogue of what’s happening in education and student pathways. There are things that are happening out there, such as students stopping out and coming back many years later, that basically defy conventional wisdom. And so the risk in all of this is that you shy away from that work and decide to stick with the knitting. But your obligation is, if you’re going to report those things, to be very transparent. As long as we can thread that needle, I think the clearinghouse will play an important role in helping to advance the dialogue.
We’re taking this very seriously and understand the importance of the integrity of our reports considering how the field is dependent on the information we provide. Frankly, one of the things we’re going to take a look at is, what is the need for the preliminary report at the end of the day? Or do we need to pair it with more analysis—is it just enough to say that total enrollments are up X or down Y?
Q: Are you saying you may discontinue the preliminary report entirely?
A: That’s certainly an option. I think we need to assess the field’s need for an early report—what questions are we trying to answer and why is it important that those questions be answered by a certain time? I’ll be honest; this is the first time something like this has happened, where it’s been that dramatic. That’s where the introspection starts, saying, “Well, this was working before; what the heck happened?”
When we released the first [preliminary enrollment] report [in 2020], we thought it’d be a one-time thing. Now, we’ve issued other reports that we thought were going to be one-time and ended up being a really big deal, like “Some College, No Credential.” We’re going to continue to look for opportunities to provide those types of insights. But I think any research entity needs to take a look at what you’re producing to make sure there’s still a need or a demand, or maybe what you’re providing needs to pivot slightly. That’s a process that’s going to be undertaken over the next few months as we evaluate this report and other reports we do.
Q: How did this happen, exactly? Have you found the source of the imputation error?
A: The research team is looking into it. In order to ensure for this particular report that we don’t extrapolate this to a whole bunch of other things, you just need to make sure that you know you’ve got your bases covered analytically.
There was an error in how we imputed a particular category of dual-enrolled students versus freshmen. But if you look at the report, the total number of learners wasn’t impacted by that. These preliminary reports were designed to meet a need after COVID, to understand what the impact was going to be. We basically designed a report on an emergency basis, and by default, when you don’t have complete information, there’s imputation. There’s been a lot of pressure on getting the preliminary fall report out. That being said, you learn your lesson—you gotta own it and then you keep going. This was very unfortunate, and you can imagine the amount of soul searching to ensure that this never happens again.
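To make the mechanics concrete, here is a purely hypothetical sketch, not the clearinghouse's actual methodology or data, of how imputing some first-year students into a dual-enrollment category can produce an apparent decline while leaving the total headcount unchanged. Every number below is invented, chosen only to mirror the direction of the correction described in this interview.

```python
# Hypothetical illustration only -- not the clearinghouse's actual method or data.
# Misassigning some freshmen to the dual-enrolled category flips the apparent
# year-over-year change in freshmen while the total headcount stays the same.
PRIOR_FRESHMEN = 2_400_000  # invented baseline for the previous fall

def year_over_year(current, prior):
    return (current - prior) / prior

scenarios = {
    # (freshmen, dual_enrolled, all_other_students)
    "erroneous imputation": (2_280_000, 952_000, 14_000_000),
    "corrected imputation": (2_532_000, 700_000, 14_000_000),
}

for label, (freshmen, dual, other) in scenarios.items():
    total = freshmen + dual + other
    change = year_over_year(freshmen, PRIOR_FRESHMEN)
    print(f"{label}: freshmen {change:+.1%}, total headcount {total:,}")
```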
Q: Do you think demand for more postsecondary data is driving some irresponsible analytic practices?
A: I can tell you that new types of demands are going to be put out there on student success data, looking at nondegree credentials, looking at microcredentials. And there’s going to be a lot of spitballing. Just look at how ROI is trying to be calculated right now; I could talk for hours about the ins and outs of ROI methodology. For example, if a graduate makes $80,000 after graduating but transferred first from a community college, what kind of attribution does the community college get for that salary outcome versus the four-year school? Hell, it could be due to a third-party boot camp done after earning a degree. Research on these topics is going to be full of outstanding questions.
Q: What comes next for the clearinghouse’s research after you leave?
A: I’m excited about where it’s going. I’m very excited about how artificial intelligence can be appropriately leveraged, though I think we’re still trying to figure out how to do that. I can only hope that the clearinghouse will continue its journey of support. Because while we don’t directly impact learner trajectories, we can create the tools that help people who support learners every year impact those trajectories. Looking back on my time here, that’s what I’m most proud of.
This one is going to take a hot minute to dissect. Minnesota Public Radio (MPR) has the story.
The plot contours are easy. A PhD student at the University of Minnesota was accused of using AI on a required pre-dissertation exam and removed from the program. He denies that allegation and has sued the school — and one of his professors — for due process violations and defamation respectively.
Starting the case.
The coverage reports that:
all four faculty graders of his exam expressed “significant concerns” that it was not written in his voice. They noted answers that seemed irrelevant or involved subjects not covered in coursework. Two instructors then generated their own responses in ChatGPT to compare against his and submitted those as evidence against Yang. At the resulting disciplinary hearing, Yang says those professors also shared results from AI detection software.
Personally, when I see that four members of the faculty unanimously questioned the authenticity of his work, I am out. I trust teachers.
I know what a serious thing it is to accuse someone of cheating; I know teachers do not take such things lightly. When four go on the record to say so, I’m convinced. Barring some personal grievance or prejudice, which could happen, it’s hard for me to believe that all four subject-matter experts were just wrong here. Also, if there was bias or petty politics at play, it probably would have shown up before the student’s third year, not just before starting his dissertation.
Moreover, at least as far as the coverage is concerned, the student does not allege bias or program politics. His complaint is based on due process and inaccuracy of the underlying accusation.
Let me also say quickly that asking ChatGPT for answers you plan to compare to suspicious work may be interesting, but it’s far from convincing — in my opinion. ChatGPT makes stuff up. I’m not saying that answer comparison is a waste, I just would not build a case on it. Here, the university didn’t. It may have added to the case, but it was not the case. Adding also that the similarities between the faculty-created answers and the student’s — both are included in the article — are more compelling than I expected.
Then you add detection software, which the article later shares showed high likelihood of AI text, and the case is pretty tight. Four professors, similar answers, AI detection flags — feels like a heavy case.
Denied it.
The article continues that Yang, the student:
denies using AI for this exam and says the professors have a flawed approach to determining whether AI was used. He said methods used to detect AI are known to be unreliable and biased, particularly against people whose first language isn’t English. Yang grew up speaking Southern Min, a Chinese dialect.
Although it’s not specified, it is likely that Yang is referring to the research from Stanford that has been — or at least ought to be — entirely discredited (see Issue 216 and Issue 251). For the love of research integrity, the paper has invented citations — sources that go to papers or news coverage that are not at all related to what the paper says they are.
Does anyone actually read those things?
Back to Minnesota, Yang says that as a result of the findings against him and being removed from the program, he lost his American study visa. Yang called it “a death penalty.”
With friends like these.
Also interesting is that, according to the coverage:
His academic advisor Bryan Dowd spoke in Yang’s defense at the November hearing, telling panelists that expulsion, effectively a deportation, was “an odd punishment for something that is as difficult to establish as a correspondence between ChatGPT and a student’s answer.”
That would be a fair point except that the next paragraph is:
Dowd is a professor in health policy and management with over 40 years of teaching at the U of M. He told MPR News he lets students in his courses use generative AI because, in his opinion, it’s impossible to prevent or detect AI use. Dowd himself has never used ChatGPT, but he relies on Microsoft Word’s auto-correction and search engines like Google Scholar and finds those comparable.
That’s ridiculous. I’m sorry, it is. The dude who lets students use AI because he thinks AI is “impossible to prevent or detect,” the guy who has never used ChatGPT himself, and who thinks that Google Scholar and auto-correct are “comparable” to AI — that’s the person speaking up for the guy who says he did not use AI. Wow.
That guy says:
“I think he’s quite an excellent student. He’s certainly, I think, one of the best-read students I’ve ever encountered”
Time out. Is it not at least possible that professor Dowd thinks student Yang is an excellent student because Yang was using AI all along, and our professor doesn’t care to ascertain the difference? Also, mind you, as far as we can learn from this news story, Dowd does not even say Yang is innocent. He says the punishment is “odd,” that the case is hard to establish, and that Yang was a good student who did not need to use AI. Although, again, I’m not sure how the good professor would know.
As further evidence of Yang’s scholastic ability, Dowd also points out that Yang has a paper under consideration at a top academic journal.
You know what I am going to say.
To me, that entire Dowd diversion is mostly funny.
More evidence.
Back on track, we get even more detail, such as that the exam in question was:
an eight-hour preliminary exam that Yang took online. Instructions he shared show the exam was open-book, meaning test takers could use notes, papers and textbooks, but AI was explicitly prohibited.
Exam graders argued the AI use was obvious enough. Yang disagrees.
Weeks after the exam, associate professor Ezra Golberstein submitted a complaint to the U of M saying the four faculty reviewers agreed that Yang’s exam was not in his voice and recommending he be dismissed from the program. Yang had been in at least one class with all of them, so they compared his responses against two other writing samples.
So, the exam expressly banned AI. And we learn that, as part of the determination of the professors, they compared his exam answers with past writing.
I say all the time, there is no substitute for knowing your students. If the initial four faculty who flagged Yang’s work had him in classes and compared suspicious work to past work, what more can we want? It does not get much better than that.
Then there’s even more evidence:
Yang also objects to professors using AI detection software to make their case at the November hearing.
He shared the U of M’s presentation showing findings from running his writing through GPTZero, which purports to determine the percentage of writing done by AI. The software was highly confident a human wrote Yang’s writing sample from two years ago. It was uncertain about his exam responses from August, assigning 89 percent probability of AI having generated his answer to one question and 19 percent probability for another.
“Imagine the AI detector can claim that their accuracy rate is 99%. What does it mean?” asked Yang, who argued that the error rate could unfairly tarnish a student who didn’t use AI to do the work.
First, GPTZero is junk. It’s reliably among the worst available detection systems. Even so, 89% is a high number. And most importantly, the case against Yang is not built on AI detection software alone, as no case should ever be. It’s confirmation, not conviction. Also, Yang, who the paper says already has one PhD, knows exactly what an accuracy rate of 99% means. Be serious.
A pattern.
Then we get this, buried in the news coverage:
Yang suggests the U of M may have had an unjust motive to kick him out. When prompted, he shared documentation of at least three other instances of accusations raised by others against him that did not result in disciplinary action but that he thinks may have factored in his expulsion.
He does not include this concern in his lawsuits. These allegations are also not explicitly listed as factors in the complaint against him, nor letters explaining the decision to expel Yang or rejecting his appeal. But one incident was mentioned at his hearing: in October 2023, Yang had been suspected of using AI on a homework assignment for a graduate-level course.
In a written statement shared with panelists, associate professor Susan Mason said Yang had turned in an assignment where he wrote “re write it, make it more casual, like a foreign student write but no ai.” She recorded the Zoom meeting where she said Yang denied using AI and told her he uses ChatGPT to check his English.
She asked if he had a problem with people believing his writing was too formal and said he responded that he meant his answer was too long and he wanted ChatGPT to shorten it. “I did not find this explanation convincing,” she wrote.
I’m sorry — what now?
Yang says he was accused of using AI in academic work in “at least three other instances.” For which he was, of course, not disciplined. In one of those cases, Yang literally turned in a paper with this:
“re write it, make it more casual, like a foreign student write but no ai.”
He said he used ChatGPT to check his English and asked ChatGPT to shorten his writing. But he did not use AI. How does that work?
For that one where he left in the prompts to ChatGPT:
the Office of Community Standards sent Yang a letter warning that the case was dropped but it may be taken into consideration on any future violations.
Yang was warned, in writing.
If you’re still here, we have four professors who agree that Yang’s exam likely used AI, in violation of exam rules. All four had Yang in classes previously and compared his exam work to his earlier writing samples. His exam answers had similarities with ChatGPT output. An AI detector said, in at least one place, his exam was 89% likely to be generated with AI. Yang was accused of using AI in academic work at least three other times, by a fifth professor, including one case in which it appears he may have left in his instructions to the AI bot.
On the other hand, he did say he did not do it.
Findings, review.
Further:
But the range of evidence was sufficient for the U of M. In the final ruling, the panel — comprised of several professors and graduate students from other departments — said they trusted the professors’ ability to identify AI-generated papers.
Several professors and students agreed with the accusations. Yang appealed and the school upheld the decision. Yang was gone. The appeal officer wrote:
“PhD research is, by definition, exploring new ideas and often involves development of new methods. There are many opportunities for an individual to falsify data and/or analysis of data. Consequently, the academy has no tolerance for academic dishonesty in PhD programs or among faculty. A finding of dishonesty not only casts doubt on the veracity of everything that the individual has done or will do in the future, it also causes the broader community to distrust the discipline as a whole.”
Slow clap.
And slow clap for the University of Minnesota. The process is hard. Doing the review, examining the evidence, making an accusation — they are all hard. Sticking by it is hard too.
Seriously, integrity is not a statement. It is action. Integrity is making the hard choice.
MPR, spare me.
Minnesota Public Radio is a credible news organization. Which makes it difficult to understand why they chose — as so many news outlets do — to not interview one single expert on academic integrity for a story about academic integrity. It’s downright baffling.
Worse, MPR, for no specific reason whatsoever, decides to take prolonged shots at AI detection systems, with passages such as:
Computer science researchers say detection software can have significant margins of error in finding instances of AI-generated text. OpenAI, the company behind ChatGPT, shut down its own detection tool last year citing a “low rate of accuracy.” Reports suggest AI detectors have misclassified work by non-native English writers, neurodivergent students and people who use tools like Grammarly or Microsoft Editor to improve their writing.
“As an educator, one has to also think about the anxiety that students might develop,” said Manjeet Rege, a University of St. Thomas professor who has studied machine learning for more than two decades.
We covered the OpenAI deception — and it was deception — in Issue 241, and in other issues. We covered the non-native English thing. And the neurodivergent thing. And the Grammarly thing. All of which MPR wraps up in the passive and deflecting “reports suggest.” No analysis. No skepticism.
That’s just bad journalism.
And, of course — anxiety. Rege, who please note has studied machine learning and not academic integrity, is predictable, but not credible here. He says, for example:
it’s important to find the balance between academic integrity and embracing AI innovation. But rather than relying on AI detection software, he advocates for evaluating students by designing assignments hard for AI to complete — like personal reflections, project-based learnings, oral presentations — or integrating AI into the instructions.
Absolute joke.
I am not sorry — if you use the word “balance” in conjunction with the word “integrity,” you should not be teaching. Especially if what you’re weighing against lying and fraud is the value of embracing innovation. And if you needed further evidence for his absurdity, we get the “personal reflections and project-based learnings” buffoonery (see Issue 323). But, again, the error here is MPR quoting a professor of machine learning about course design and integrity.
MPR also quotes a student who says:
she and many other students live in fear of AI detection software.
“AI and its lack of dependability for detection of itself could be the difference between a degree and going home,” she said.
Nope. Please, please tell me I don’t need to go through all the reasons that’s absurd. Find me one single case in which an AI detector alone sent a student home. One.
Two final bits.
The MPR story shares:
In the 2023-24 school year, the University of Minnesota found 188 students responsible of scholastic dishonesty because of AI use, reflecting about half of all confirmed cases of dishonesty on the Twin Cities campus.
Just noteworthy. Also, it is interesting that 188 were “responsible.” Considering how rare it is to be caught, and for formal processes to be initiated and upheld, 188 feels like a real number. Again, good for U of M.
The MPR article wraps up that Yang:
found his life in disarray. He said he would lose access to datasets essential for his dissertation and other projects he was working on with his U of M account, and was forced to leave research responsibilities to others at short notice. He fears how this will impact his academic career
Stating the obvious, like the University of Minnesota, I could not bring myself to trust Yang’s data. And I do actually hope that being kicked out of a university for cheating would impact his academic career.
And finally:
“Probably I should think to do something, selling potatoes on the streets or something else,” he said.
Dude has a PhD in economics from Utah State University. Selling potatoes on the streets. Come on.
The University of Texas at Dallas has a troubling history of trying to silence students. Now those students are fighting back.
Today, the editors of The Retrograde published their first print edition, marking a triumphant return for journalism on campus in the face of administrative efforts to quash student press.
Headlines above the fold of the first issue of The Retrograde, a new independent student newspaper at UT Dallas.
Why call the newspaper The Retrograde? Because it’s replacing the former student newspaper, The Mercury, which ran into trouble when it covered the pro-Palestinian encampments on campus and shed light on UT Dallas’s use of state troopers (the same force that broke up UT Austin’s encampment just one week prior) and other efforts to quash even peaceful protest. As the student journalists kept reporting, their relationship with the administration deteriorated. University officials demoted the newspaper’s advisor and even removed copies of the paper from newsstands. At the center of this interference were Lydia Lum, director of student media, and Jenni Huffenberger, senior director of marketing and student media, whose actions came to embody the university’s resistance to editorial freedom.
The conflict between the paper and the administration came to a head when Lum called for a meeting of the Student Media Oversight Board, a university body which has the power to remove student leaders, accusing The Mercury’s editor-in-chief, Gregorio Olivares Gutierrez, of violating student media bylaws by having another form of employment, exceeding printing costs, and “bypassing advisor involvement.” Yet rather than follow those same bylaws, which offer detailed instructions for removing a student editor, Lum told board members from other student media outlets not to attend the meeting. A short-handed board then voted to oust Gutierrez. Adding insult to injury, Huffenberger unilaterally denied Gutierrez’s appeal, again ignoring the bylaws, which require the full board to consider any termination appeals.
In response, The Mercury’s staff went on strike, demanding Gutierrez’s reinstatement. To help in that effort, FIRE and the Student Press Law Center joined forces to pen a Nov. 12, 2024 letter calling for UT Dallas to honor the rights of the student journalists. We also asked them to pay the students the money they earned for the time they worked prior to the strike.
UT Dallas refused to listen. Instead of embracing freedom of the press, the administration doubled down on censorship, ignoring both the students’ and our calls for justice.
FIRE took out a full page ad in support of The Retrograde at UT Dallas.
In our letter, we argued that the university’s firing of Gutierrez was in retaliation for The Mercury’s unflattering coverage of the way administrators had handled the encampments. This is not even the first time UT Dallas has chosen censorship as the “best solution”: look no further than late 2023, when it removed the “Spirit Rocks” students used to express themselves. Unfortunately, the university ignored both the students’ exhortations and FIRE’s demands, leaving UT Dallas without its newspaper.
But FIRE’s Student Press Freedom Initiative is here to make sure censorship never gets the last word.
Students established The Retrograde, a fully independent newspaper. Without university resources, they have had to crowdfund and source their own equipment, working spaces, a new website, and everything else necessary to provide quality student-led journalism to the UT Dallas community. They succeeded, and FIRE is proud to support their efforts, placing a full-page ad in this week’s inaugural issue of The Retrograde.
The fight for press freedom at UT Dallas is far from over — but we need your help to make a difference.
Demand accountability from UT Dallas. The student journalists of The Retrograde have shown incredible spirit. With your help, we can ensure their efforts — and the rights of all student journalists — are respected.
Back in April you’ll recall that UKVI shared a draft “remote delivery” policy with higher education providers for consultation.
That process is complete – and UKVI has now written to providers to confirm the details of the new arrangements.
Little has changed in the proposal from last Spring – there are some clarifications on how it will apply, but the main impact is going to be on providers and students who depend, one way or another, on some of their teaching not being accessed “in person”.
The backstory here is that technically, all teaching for international students right now is supposed to be in-person. That was relaxed during the pandemic for obvious reasons – and since then, rapid innovation in the ways students can access teaching (either synchronously or asynchronously) has raised questions about how realistic and desirable that position remains.
Politics swirls around this too. The worry/allegation is that students arrive and then disappear – and with a mixture of relaxed attendance regulation (UKVI stopped demanding a specific number of contact points from universities a few years ago) and suspicions that some students are faking or bypassing the attendance systems that are in place, the time has come, it seems, to tighten a little – “formalising the boundaries in which institutions can use online teaching methods to deliver courses to international students”, as UKVI puts it.
Its recent burst of compliance monitoring (with now public naming and shaming of universities “subject to an action plan”) seems to have been a factor too – with tales reaching us of officials asking often quite difficult questions about both how many students a provider thinks are on campus, and then how many actually are, on a given day or across a week.
The balance being struck is designed, says UKVI, to “empower the sector to utilise advances in education technology” by delivering elements of courses remotely whilst setting “necessary thresholds” to provide clarity and ensure there is “no compromise” of immigration control.
Remote or “optional”?
The policy that will be introduced is broadly as described back in April – first, that two types of “teaching delivery” are to be defined as follows:
Remote delivery is defined as “timetabled delivery of learning where there is no need for the student to attend the premises of the student sponsor or partner institution which would otherwise take place live in-person at the sponsor or partner institution site.”
Face-to-face delivery is defined as “timetabled learning that takes place in-person and on the premises of the student sponsor or a partner institution.”
You’ll see that that difference isn’t (necessarily) between teaching designed as in-person or designed as remote – it’s between hours that a student is required to be on campus for, and hours that they either specifically aren’t expected to come in for, or have the option to not come in for. That’s an important distinction:
Where the student has an option of online or in-person learning, this should count as a remote element for this purpose.
Then with those definitions set, we get a ratio.
As a baseline, providers (with a track record of compliance) will be allowed to deliver up to 20 per cent of the taught elements of any course at degree level or above remotely.
Then if a provider is able to demonstrate how the higher usage is consistent with the requirements of the relevant educational quality standards body (OfS in England, QAA in Wales and Scotland) and remains consistent with the principles of the student route, they’ll be able to have a different ratio – up to 40 per cent of the teaching will be allowed to be in that “remote” category.
Providers keen to use that higher limit will need to apply to do so via the annual CAS allocation process – and almost by definition will attract additional scrutiny as a result, if only to monitor how the policy is panning out. They’ll also have to list all courses provided to sponsored students that include remote delivery within that higher band – and provide justification for the higher proportion of remote learning based on educational value.
(For those not immersed in immigration compliance, a CAS (Confirmation of Acceptance for Studies) is an electronic document issued by a UK provider to an international student that serves as proof of admission, and is required when applying for a student visa. The CAS includes a unique reference number, details of the course, tuition fees, and the institution’s sponsorship license information – and will soon have to detail if an international agent is involved too.)
One question plenty of people have asked is whether this changes things for disabled students – UKVI makes clear that, by exception, remote delivery can be permitted on courses of any academic level studied at a student sponsor in circumstances where requiring face-to-face delivery would constitute discrimination on the basis of a student’s protected characteristics under the Equality Act 2010.
A concern about that was that providers might not know in advance that a student needs the exception – UKVI says it will trust providers to judge individual students’ extenuating circumstances and to justify those judgements during audits. The requirement to state protected characteristics on the CAS will be withdrawn.
Oh – and sponsors will also be permitted to use remote delivery where continuity of education provision would otherwise be interrupted by unforeseen circumstances – things like industrial action, extreme weather, periods of travel restriction and so on.
Notably, courses at levels 4 and 5 won’t be able to offer “remote delivery” at all – UKVI reckons they are “more vulnerable to abuse” from “non-genuine students”, so it’s resolved to link the more limited freedoms provided by Band 1 of the existing academic engagement policy to this provision of “remote” elements – degree level and above.
Yes but what is teaching?
A head-scratcher when the draft went out for consultation was what “counts” as teaching. Some will still raise questions with the answer – but UKVI says that activities like writing dissertations, conducting research, undertaking fieldwork, carrying out work placements and sitting exams are not “taught elements” – and are not therefore in scope.
Another way of looking at that is basically – if it’s timetabled, it probably counts.
Some providers have also been confused about modules – given that students on most courses are able to routinely choose elective modules (which themselves might contain different percentages of teaching in the two categories) after the CAS is assigned.
UKVI says that sponsors should calculate the remote delivery percentage on the assumption that the student will elect to attend all possible remote elements online. So where elective modules form part of the course delivery, the highest possible remote delivery percentage will have to be stated (!). And where hours in the timetable are optional, providers will have to calculate remote delivery by assuming that students will participate in all optional remote elements online.
The good news when managing all of that is that the percentage won’t have to be calculated on the basis of module or year – it’s the entire course that counts. And where the course is a joint programme with a partner institution based overseas, only elements of the course taking place in the UK will be taken into account.
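To make the arithmetic concrete, here is a minimal illustrative sketch in Python of how a provider might work out that worst-case figure. It is not anything UKVI prescribes: the Module structure, the worst_case_remote_percentage helper and every module name and hour count are invented for illustration. The logic simply follows the rules above – optional hours count as remote, each elective slot is assumed to be filled with its most-remote option, overseas partner delivery is excluded, and the percentage is taken over the whole course rather than per module or year.

```python
# Illustrative sketch only: a hypothetical worst-case calculation of the
# remote delivery percentage for a whole course, following the stated rules:
#   - hours delivered overseas by a partner are out of scope
#   - "optional" attendance hours are counted as remote
#   - for each elective slot, assume the student picks the most-remote option
# All module names and hour counts are invented for illustration.

from dataclasses import dataclass

@dataclass
class Module:
    name: str
    in_person_hours: float   # timetabled hours that must be attended on site
    remote_hours: float      # timetabled hours with no on-site requirement
    optional_hours: float    # timetabled hours where attendance mode is the student's choice
    overseas: bool = False   # delivered by an overseas partner (excluded from the calculation)

def worst_case_remote_percentage(core: list[Module],
                                 elective_groups: list[list[Module]]) -> float:
    """Return the highest possible remote share of UK taught hours for the course."""
    chosen = list(core)
    # For each elective group, assume the student picks the option with the
    # largest remote + optional share (the "highest possible" figure UKVI wants stated).
    for group in elective_groups:
        chosen.append(max(group, key=lambda m: m.remote_hours + m.optional_hours))

    uk_modules = [m for m in chosen if not m.overseas]
    total = sum(m.in_person_hours + m.remote_hours + m.optional_hours for m in uk_modules)
    remote = sum(m.remote_hours + m.optional_hours for m in uk_modules)  # optional counts as remote
    return 100 * remote / total if total else 0.0

core_modules = [
    Module("Research Methods", in_person_hours=40, remote_hours=8, optional_hours=0),
    Module("Dissertation Prep", in_person_hours=20, remote_hours=0, optional_hours=10),
]
electives = [[
    Module("Health Economics", in_person_hours=30, remote_hours=0, optional_hours=0),
    Module("Data Analytics (online-heavy)", in_person_hours=10, remote_hours=20, optional_hours=0),
]]

pct = worst_case_remote_percentage(core_modules, electives)
print(f"Worst-case remote delivery: {pct:.1f}%")
print("Within baseline 20% band" if pct <= 20 else
      "Needs the 40% band (with justification)" if pct <= 40 else
      "Exceeds permitted thresholds")
```

Run against that hypothetical course, the worst-case figure lands in the mid-30s – which would push the provider into the 40 per cent band and so into the extra justification and CAS allocation scrutiny described above, even though most students may never actually take the online-heavy elective.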
What’s next
There’s no specific date yet on implementation – IT changes to the sponsor management system are required, and new fields will be added to the CAS and annual CAS allocation request forms first. The “spring” is the target, and there’s also a commitment to reviewing the policy after 12 months.
In any event, any university intending to utilise (any) remote delivery will need to have updated their internal academic engagement (ie attendance) policy ahead of submitting their next annual CAS allocation request – and UKVI may even require the policy to be submitted before deciding on the next CAS allocation request, and definitely by September 2025.
During the consultation, a number of providers raised the issue of equity – how would one justify international and home students being treated differently? UKVI says that distinctions are reasonable because international students require permission to attend a course in the UK:
If attendance is no longer necessary, the validity of holding such permission must be reassessed.
There’s no doubt that – notwithstanding that providers are also under pressure to produce (in many cases for the first time) home student attendance policies because of concerns about attendance and student loan entitlements – the new policy will cause some equity issues between home and international students.
In some cases those will be no different to the issues that exist now – some providers in some departments simply harmonise their requirements, some apply different regs by visa status, and some apply different rules for home students in different departments or courses depending on the relative proportion of international students in that basket. That may all have to be revisited.
The big change – for some providers, but not all – is those definitions. The idea of a student never turning up for anything until they “cram” for their “finals” is built into many an apocryphal student life tale – that definitely won’t be allowed for international students, and it’s hard to see a provider getting away with it in the home student attendance policy that SFE/SFW/SAAS now demand either.
Some providers won’t be keen to admit as much, but the idea of 100 per cent attendance at the hours of teaching in that 80 per cent basket is going to cause a capacity problem in some lecture theatres and teaching spaces that will now need to be resolved. Module choice (and design) is also likely to need a careful look.
And the wider questions of the way in which students use “optional” attendance and/or recorded lectures to manage their health and time – with all the challenges relating to part-time work and commuting/travelling in the mix – may result in a need to accelerate timetable reform to reduce the overall number of now very-much “required” visits to campus.
One other thing not mentioned here is the reality that UKVI is setting a percentage of a number of hours that is itself not specified – some providers could simply reduce the overall number of taught hours to make the percentages add up. Neither in the domestic version of this agenda nor in this international version do we have an attempt at defining what “full-time” really means in terms of overall taught hours – perhaps necessarily, given programme diversity – but it’ll be a worry for some.
Add all of this up – mixing in UKVI stepping up compliance monitoring and stories of students sharing QR codes for teaching rooms on WhatsApp to evade attendance monitoring systems – and for some providers and some students, the change will be quite dramatic.
The consultation on the arrangements has been carried out quite confidentially so far – I’d tentatively suggest here that any revision to arrangements implemented locally should very much aim to switch that trend away from “UKVI said so” towards detailed discussion with (international) student representatives, with a consideration of wider timetabling, housing, travel and other support arrangements in the mix.