On Jan. 7, the House Oversight Committee approved a subpoena for Seth Harp, an investigative journalist and contributing editor at Rolling Stone, for posting information about a Delta Force commander. Congress has broad authority to issue subpoenas. But it must show far more restraint when aiming them at journalists without any evidence of wrongdoing.
In early January, Harp reposted a screenshot identifying a commander involved in the U.S. capture of Nicolás Maduro, Venezuela’s former dictator. X reportedly locked Harp’s account until he deleted the post. The House Oversight Committee then voted to approve a subpoena “for leaking classified information.” Republican Rep. Anna Paulina Luna of Florida’s 13th congressional district, who introduced the motion to subpoena Harp, said, “Putting a service member and their family in danger is dishonorable and feckless. Leaking classified information demands explanation and a criminal investigation.”
But publishing the news, even when the news contains classified information, is exactly the role of a journalist. And Rep. Luna did not cite any evidence that Harp broke the law to obtain the information.
Can Congress subpoena Harp over his reporting?
Congress has a broad subpoena power, subject to some constitutional limits.
Congress does have broad investigative authority tied to its legislative power, and subpoenas are a standard tool of that authority. It cannot investigate without the ability to compel people to share information.
But that authority still has limits. In Watkins v. United States (1957), a McCarthy-era congressional subpoena case, the Supreme Court held that while it is “unquestionably the duty” of citizens to cooperate with such subpoenas, the power to issue subpoenas at all “assumes that the constitutional rights of witnesses will be respected by the Congress as they are in a court of justice. The Bill of Rights is applicable to investigations as to all forms of governmental action.” The First Amendment prohibits government retaliation for engaging in protected speech. So under Watkins’ rationale, Congress should not subpoena a journalist merely because it dislikes their reporting.
If Congress abuses its subpoena power, will courts stop it?
In practice, the Speech or Debate Clause weakens Watkins’ constitutional limit on congressional subpoenas.
Even after Watkins, abusive congressional subpoenas are difficult to preemptively fight in court. One reason is the Speech or Debate Clause, which gives members of Congress immunity for legislative acts or statements, including subpoenas.
In Eastland v. U.S. Servicemen’s Fund (1975), the Senate investigated the Servicemen’s Fund (including with a subpoena for its bank records) after it distributed anti-Vietnam War publications to the military. When the Servicemen’s Fund challenged the subpoena all the way to the Supreme Court, the Court held that the subpoena fell within Congress’s “legitimate legislative sphere” of investigating the “effect of subversive activities.” Because the committee acted within its investigatory powers, the Court concluded, the Speech or Debate Clause immunized the committee and its staff from suit. The subpoena remained on the books.
Eastland thus stands for the proposition that courts may not “look behind” a subpoena for constitutionally improper motives. It would be unconstitutional for Congress to investigate a nonprofit’s bank accounts, or a reporter’s sources, based on First Amendment-protected expression. But so long as Congress can prove it acted within the bounds of its power, any remedy for the constitutional violation must be found outside the courts.
Even if Congress can use its subpoena power to make an end run around the First Amendment, should it?
Standardless subpoenas against reporters risk chilling journalism.
Even when Congress has facially legitimate (if arguably pretextual) grounds for its investigation, forced investigative questioning is a direct threat to the conditions that make journalistic inquiry possible. Freedom of the press — and of speech — requires the ability to pursue knowledge and ideas without fear of retribution. Otherwise, our knowledge grows stale, and our ability to assess the truth trends toward the state’s mandated line. As the Supreme Court noted in Sweezy v. New Hampshire (1957), “scholarship cannot flourish in an atmosphere of suspicion and distrust.” Replace “scholarship” with “protesting” or “reporting,” and the principle remains the same.
Floyd Abrams, who has spent his career litigating press cases, puts it plainly — such legal battles “cost an enormous amount of money, have enormous disruptive effects” and represent “an institutional threat to the behavior of a newspaper.” Subpoenas signal to sources that talking to the press could put them under a governmental spotlight. They force reporters and editors to ask: This story is accurate, but can we afford the cost of printing it?
FIRE’s recent work shows that when Congress goes overboard with investigations, it can scare people into silence — even when their speech is perfectly legal. Tyler Coward, lead government counsel at FIRE, condemned congressional investigations into student groups and nonprofits associated with pro-Palestine protests as “fishing expedition[s]” based on groups’ viewpoint.
Likewise, John Coleman, legislative counsel at FIRE, criticized the House’s investigation of Stanford researchers studying “misinformation.” Targeting protected academic inquiry might serve some legitimate congressional objective, Coleman argued, but such investigations deter future inquiry. For reporters, the same lesson is obvious: even if a subpoena is ultimately narrowed or withdrawn, if you want to avoid the risk, avoid the subject.
The issuance of speech-chilling subpoenas knows no partisan bounds, either. Republicans led the investigations into pro-Palestine groups and Stanford researchers. But in 2021, the House Select Committee on January 6th — chaired by Democratic Rep. Bennie Thompson — subpoenaed a photojournalist’s phone records from Verizon. At the time, the Reporters Committee for Freedom of the Press called on Thompson to withdraw the subpoena, calling it a “direct threat to newsgathering.” In Seth Harp’s case, the House Oversight Committee’s top Democrat, Rep. Robert Garcia, supported Rep. Luna’s motion to subpoena, and it was approved unanimously.
If courts are unlikely to stop Congress, who will protect journalists?
Even if constitutional, Congress should refrain from issuing standardless subpoenas against journalists.
Although the Speech or Debate Clause largely immunizes Congress when it issues subpoenas, Congress has an independent obligation to follow the Constitution. Recall that Eastland held that courts may not “look behind” a subpoena to test whether the real aim was retaliation or harassment. Facially legitimate subpoenas will stand, even when their real aim is constitutionally suspect. That means Congress itself is the main check on subpoenas meant to retaliate against or harass reporters. And Congress must better police its subpoena process — otherwise it imperils not only our free press, but also free speech and our collective pursuit of truth and knowledge.
Congress should keep Watkins in mind when crafting subpoenas. At a base level, that means Congress should not issue subpoenas to journalists for merely reporting the news. Beyond that, Congress should ensure that there are no other means to obtain the requested information. It should tailor requests to avoid sweeping in things like sources, editorial deliberations, or other discussions essential to the newsgathering process. These suggestions are modest but vital institutional firewalls against congressional abuse of its oversight power.
The public cannot be informed — cannot check officials, evaluate policy, or hold politicians accountable — without strong protections for the press’s freedom to share information and to criticize without fear of retaliation. Alexander Hamilton warned that constitutional safeguards are often not enough. The freedom of the press, he wrote in Federalist No. 84, rested not in “fine declarations” but rather in the “general spirit of the people and of the government.”
Congress must take it upon itself, in Harp’s case and others, to embody that spirit of a free press and refrain from investigating journalists for merely doing their job.
Google Analytics 4 (GA4) has reshaped how colleges and universities track prospective student behaviour online. With the retirement of Universal Analytics (UA) in 2023, GA4 is now the default analytics platform, and for many higher ed marketers, the transition has been disorienting. Gone are the familiar sessions and pageviews; in their place is an event-based model, a redesigned interface, and new metrics that require a shift in thinking.
But while the learning curve is real, so are the opportunities. GA4 offers deeper insights into student intent, behaviour, and engagement, insights that, when used effectively, can support measurable enrollment growth.
This guide breaks down GA4 in a practical, approachable way. We’ll walk through how to use its core features at each stage of the student recruitment funnel: Discovery, Engagement, Decision-Making, and Enrollment. You’ll learn which reports matter, which metrics to ignore, and how to use GA4’s exploration tools to uncover new conversion opportunities. Throughout, we’ll also highlight how Higher Education Marketing (HEM) can help you make the most of GA4, from free audits to CRM integration support.
Let’s start by shifting our perspective on what analytics can do, and then dive into how GA4 can support every phase of your student journey.
GA4’s Event-Based Mindset vs. Universal Analytics
The most significant shift from Universal Analytics (UA) to Google Analytics 4 (GA4) is the underlying measurement model. UA was centred on sessions and pageviews, essentially counting a sequence of “hits” during a user’s visit. GA4, by contrast, is entirely event-based. Every interaction, whether it’s a pageview, a button click, a form submission, or a video play, is captured as an event. This model allows for a more flexible, granular view of user behaviour across devices and platforms, reflecting the idea that “everything is an event that signals user intent.”
Crucially, GA4 is built for today’s privacy-first, multi-device world. It can track a single user’s journey across devices using User IDs or Google Signals and relies less on cookies, instead using machine learning to fill in data gaps, helping you stay compliant with emerging privacy standards.
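As a concrete illustration, here is a minimal sketch of what User ID tracking might look like on a site that already loads the standard gtag.js snippet. The measurement ID and the crmContactId identifier below are placeholders for this article, not values from any real setup.

```typescript
// Minimal sketch: assigning a GA4 User ID once a prospect signs in.
// Assumes the standard gtag.js snippet is already on the page.
// "G-XXXXXXXX" is a placeholder measurement ID, and `crmContactId` is a
// hypothetical identifier exposed by your application portal or CRM.
declare function gtag(...args: unknown[]): void;

function identifyProspect(crmContactId: string): void {
  // With a User ID set, GA4 can stitch this person's sessions across
  // devices instead of counting them as separate anonymous users.
  gtag('config', 'G-XXXXXXXX', { user_id: crmContactId });
}

// Example usage after a successful portal login:
identifyProspect('lead-12345');
```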
For higher ed marketers, this opens up richer insight into the prospective student journey. GA4 for student recruitment automatically tracks many common interactions (like scrolls and file downloads) and lets you define custom events aligned to your goals.
New metrics also reflect this shift. Engagement Rate replaces bounce rate, highlighting sessions that last 10+ seconds, include 2+ pageviews, or trigger a conversion. Other core metrics include Engaged Sessions per User and Average Engagement Time, which are helpful indicators of whether your content holds attention or needs refinement.
GA4 also brings predictive capabilities. With built-in machine learning, it can surface emerging trends or flag anomalies in student behaviour. While some advanced features like Predictive Metrics may feel out of reach initially, knowing they exist helps future-proof your analytics approach.
It’s true: GA4 isn’t just an upgrade; it’s an entirely new platform. Many familiar reports have been retired or redesigned, and the interface now favours customizable dashboards over static reports. But don’t let the overhaul overwhelm you.
The key is to focus on the metrics that support your enrollment goals. In the next section, we’ll show how GA4’s event-based model aligns with each stage of the student journey, from first visit to application.
If you need support getting started, HEM offers a free GA4 audit to help identify top-performing lead sources, evaluate your marketing ROI, and ensure your setup is recruitment-ready.
Mapping GA4 to the Student Journey Stages
Every prospective student moves through distinct phases on the path to enrollment. GA4 can provide actionable insights at each stage if you know where to look. Below, we break down how to use GA4 effectively across the four stages of the student journey: Discovery, Engagement, Decision-Making, and Enrollment. We’ll also highlight key metrics to prioritize and reports you can skip to avoid analysis paralysis.
Stage 1: Discovery: Awareness & Early Interest
What it is: At this stage, prospective students are just beginning to explore postsecondary options. They may land on your site via a Google search, a digital ad, or a social post. They’re not ready to apply yet, but they’re starting to investigate. Your goal is to attract the right audiences and create a strong first impression.
What to use in GA4: Focus on the Acquisition reports under Life cycle > Acquisition:
User Acquisition Report: Shows how new users first arrive, by channel, campaign, or source. This answers, “Where are our new prospects coming from?” and helps assess brand awareness performance.
Traffic Acquisition Report: Tracks sessions from all users (new and returning). Use it to evaluate which traffic sources deliver engaged sessions and prompt interaction.
Key metrics to monitor:
Engaged Sessions per User: Are visitors exploring more than one page?
Engagement Rate: What percentage of sessions include meaningful interaction?
Event Count per Session: Are users watching videos, downloading brochures, or clicking calls-to-action?
These metrics reflect traffic quality, not just quantity. For example, if organic search traffic has a 75% engagement rate while paid social sits at 25%, that’s a clear sign of where to invest.
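If you would rather pull these channel-quality numbers programmatically than read them in the interface, the GA4 Data API exposes the same metrics. Below is a rough sketch using Google’s Node client; it assumes the @google-analytics/data package is installed and credentials are configured, and the property ID is a placeholder.

```typescript
// Rough sketch: engaged sessions and engagement rate by default channel group,
// pulled through the GA4 Data API Node client. Assumes @google-analytics/data
// is installed and application credentials are configured; the property ID
// passed in below is a placeholder.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const analyticsClient = new BetaAnalyticsDataClient();

async function engagementByChannel(propertyId: string): Promise<void> {
  const [report] = await analyticsClient.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: '30daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'sessionDefaultChannelGroup' }],
    metrics: [{ name: 'engagedSessions' }, { name: 'engagementRate' }],
  });

  // Print one line per channel, e.g. "Organic Search: 75% engagement rate".
  for (const row of report.rows ?? []) {
    const channel = row.dimensionValues?.[0]?.value ?? '(unknown)';
    const rate = Number(row.metricValues?.[1]?.value ?? 0) * 100;
    console.log(`${channel}: ${rate.toFixed(0)}% engagement rate`);
  }
}

engagementByChannel('YOUR_GA4_PROPERTY_ID').catch(console.error);
```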
Landing Pages: Your Digital First Impression
Check Engagement > Pages and Screens to see which pages users land on most. Are your program or admissions pages pulling in traffic? Are they generating long engagement times? If so, that’s a signal they’re working. If top landing pages show low engagement, it’s time to refine content, CTAs, or UX.
What to skip:
Demographics and Tech Reports: Too broad to act on for now.
Real-time Report: Interesting, but not useful for strategic planning.
Pro tip: HEM’s free GA4 assessment can help you identify your highest-quality channels and flag low-performing ones so you can optimize marketing spend and attract better-fit prospects.
Stage 2: Engagement: Consideration & Nurturing
Once prospective students are aware of your institution and begin browsing your site in earnest, they enter the engagement or consideration stage. Here, they’re comparing programs, evaluating fit, and building interest, but may not yet be ready to contact you. Your goal is to nurture their intent by providing relevant content, encouraging micro-conversions, and guiding them toward decision-making.
GA4 Focus: Engagement & Behaviour Reports
In GA4, shift your attention to the Engagement reports under Life cycle > Engagement. These include:
Pages and Screens
Events
Conversions
Landing Pages
As HEM notes, “Engagement reports are all about what prospects do after landing on your site”, whether they go deeper or drop off.
1. Pages and Screens Report
This is your new “Top Pages” view. Use it to identify high-interest pages such as:
Program descriptions
Tuition and aid
Admissions criteria
Campus life
Key metrics:
Average Engagement Time
Conversions per Page
User Navigation Paths (Where users go next)
If your BBA program page has high engagement and links to “Schedule a Tour,” make sure the CTA is prominent and functional. If engagement is low, revise the content or layout.
2. Events Report
GA4 automatically tracks events like:
Scroll depth (90%)
File downloads
Outbound clicks
Video plays
You should also configure custom events for micro-conversions, such as:
“Request Info” form submissions
Brochure downloads
“Schedule a Visit” or “Start Application” clicks
These are the mid-funnel signals that indicate increasing interest. Mark them as Conversions in GA4 to elevate their importance in reporting.
Pro tip: Track 3–5 key events that correlate strongly with application intent.
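As an example of how one of the micro-conversions listed above might be wired up, here is a minimal gtag.js sketch for a “Request Info” form submission. The event name, parameters, and form selector are illustrative choices for this article, not GA4 defaults.

```typescript
// Minimal sketch: sending a custom "request_info" event when an RFI form is
// submitted. The event name, parameters, and form selector are illustrative
// choices, not GA4 defaults; assumes gtag.js is already loaded on the page.
declare function gtag(...args: unknown[]): void;

const rfiForm = document.querySelector<HTMLFormElement>('#request-info-form');

rfiForm?.addEventListener('submit', () => {
  gtag('event', 'request_info', {
    program: 'bba',               // which program the prospect asked about
    form_location: 'program_page' // where on the site the form was completed
  });
});
```

Once the event is flowing, you can mark request_info as a conversion in GA4’s admin settings so it appears in the Conversions report covered next.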
3. Conversions Report
Once key events are marked as conversions, the report will show:
Total conversions by event type
Event frequency over time
Value (if assigned)
This helps determine which micro-conversions are driving engagement and which campaigns or pages are most effective.
4. Path Exploration
GA4’s Explorations > Path Exploration lets you visualize what users do after key pages or events. For example, if many students visit the “Admissions FAQ” after reading a program page, that suggests rising intent. Use this to improve internal linking and user flow.
What to Skip
Avoid advanced GA4 reports like:
Cohort Analysis
User Lifetime
User Explorer
These are often too detailed or irrelevant for short-term funnel optimization. Also, don’t feel obligated to use every Exploration template; build your own around your specific enrollment steps instead.
HEM Insight: Unsure if your GA4 is tracking these mid-funnel behaviours correctly? HEM offers audits, event configuration, and CRM integration support, ensuring that when a student requests info, that action is tracked, stored, and acted upon.
Ready for the next stage? Let’s move on to how GA4 supports Decision-Making.
Stage 3: Decision-Making: High Intent & Lead Conversion
In the decision-making stage, prospective students move from casual interest to serious consideration. They’re comparing programs, costs, outcomes, and culture. By now, they’ve likely returned to your site several times. The goal here is clear: convert an engaged visitor into a lead or applicant.
GA4 Focus: Conversion Tracking & Funnel Analysis
This is where your earlier GA4 setup pays off. With key conversion events (e.g., “Request Info,” “Submit Application”) defined, you can now analyze how and where those conversions happen. GA4’s Traffic Acquisition, Explorations, and Conversions tools are central at this stage.
Conversions by Source/Medium
To understand which marketing channels drive high-intent actions, use the Traffic Acquisition report and add columns for specific conversions (e.g., “Request Info count” and conversion rate). Alternatively, build an Exploration with source/medium as the dimension and conversion events as metrics.
HEM’s webinar emphasizes looking beyond raw volume: ask “Which sources deliver my highest-intent leads?” For example:
Organic Search: 30 info requests, 10 applications
Paid Social: 5 info requests, 0 applications
This data helps optimize channel strategy. If certain channels underperform in lead quality, revisit targeting, messaging, or landing pages.
Funnel Exploration
GA4’s Funnel Exploration is ideal for visualizing conversion paths. You can define steps like:
View Program Page
Click “Request Info”
Submit RFI Form
Start Application
Submit Application
Example funnel insight:
1,000 users view program pages
200 click “Inquire” (20%)
50 submit forms (25% of clicks)
30 start applications
20 submit applications (67% of starters)
This highlights where friction occurs, perhaps a clunky form (25% completion) or weak CTAs (20% inquiry rate). Use this to improve form UX, reinforce CTAs, or add nurturing touchpoints.
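If you export step counts like these from Funnel Exploration, a few lines of code can turn them into the step-to-step rates quoted above, which makes the friction points easy to spot. The sketch below simply reuses the illustrative numbers from this section.

```typescript
// Sketch: turn funnel step counts into step-to-step conversion rates.
// The step names and counts reuse the illustrative example above.
interface FunnelStep {
  name: string;
  users: number;
}

const funnel: FunnelStep[] = [
  { name: 'View Program Page', users: 1000 },
  { name: 'Click "Inquire"', users: 200 },
  { name: 'Submit RFI Form', users: 50 },
  { name: 'Start Application', users: 30 },
  { name: 'Submit Application', users: 20 },
];

funnel.forEach((step, i) => {
  if (i === 0) {
    console.log(`${step.name}: ${step.users} users`);
    return;
  }
  const previous = funnel[i - 1];
  const rate = ((step.users / previous.users) * 100).toFixed(0);
  // e.g. "Submit RFI Form: 50 users (25% of previous step)"
  console.log(`${step.name}: ${step.users} users (${rate}% of previous step)`);
});
```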
You can also segment student recruitment funnels by device or user type (e.g., international vs. domestic). If drop-off is worse on mobile, consider layout changes; if international students abandon applications, address barriers like unclear visa info.
Path Exploration
GA4’s Path Exploration can show common user journeys leading to conversion. Start with “Application Submitted” and trace backward. If scholarship pages, FAQs, or department overviews frequently appear in these paths, you’ve identified key conversion content.
Conversely, if users loop across pages without converting, that may signal confusion. Use these insights to surface critical info sooner or rework unclear sections.
User Explorer: Qualitative Insights
While not scalable, inspecting User Explorer for select journeys (e.g., converters vs. non-converters) can offer qualitative insight. One user might watch webinars and return five times before applying, proving content value. Others bounce after one visit, highlighting the need for nurturing.
Metrics That Matter
Focus on:
Conversion counts and rates per channel and funnel stage
Engaged sessions per user
Average engagement time for converters
Example: applicants average 5 sessions and 10 engagement minutes; non-converters average 1 session and 2 minutes. Clearly, repeat engagement correlates with conversion, and nurturing campaigns (email, retargeting) are essential.
What to Skip
Avoid getting distracted by:
Cohort Analysis or User Lifetime
Attribution modelling (unless you’re running major ad campaigns)
Default GA4 templates that don’t fit your student recruitment funnel
Stick with the custom funnel and path reports that reflect your application process.
Pro Tip: Not confident in GA4 setup? HEM’s experts can build your funnels, configure conversion tracking, and connect GA4 to your CRM, giving you clear, enrollment-focused dashboards and team training to act on the insights confidently.
Stage 4: Enrollment: Application to Enrollment (Bottom of Funnel)
The enrollment stage is the final stretch, transforming applicants into enrolled students. While much of this process shifts to admissions and offline workflows (e.g., application review, acceptance, deposit), digital analytics still play a critical role. GA4 helps marketing teams identify friction points, evaluate channel performance, and inform efforts that influence yield. It also closes the loop on campaign effectiveness, especially if tied to downstream outcomes.
GA4 Focus: Funnel Completion, Attribution, and Post-Application Insights
Application Funnel Completion
Using Funnel Exploration, ensure your funnel captures key milestones like “Apply Clicked” and “Application Submitted.” If many click “Apply” but few complete the form, GA4 highlights a clear drop-off. For instance, if desktop converts at 30% but mobile only 10%, there may be UX issues on mobile or a third-party form that isn’t optimized. This insight can guide IT discussions or quick fixes (e.g., warning banners or responsive design improvements).
Attribution Paths
GA4’s Advertising > Attribution > Conversion Paths report reveals the sequence of marketing touches that lead to applications. Common patterns in higher ed include:
Organic Search → Direct → Conversion
Paid Search → Organic → Direct → Conversion
Email → Direct → Conversion
These paths underscore that enrollment isn’t a single-touch journey. For instance, Organic Search may start the process, while Direct or Email closes it. If you frequently see Email leading to conversions, it validates your nurture sequences. Also, keep an eye on new referral sources, like “Chat” or “Perplexity”, which may signal traffic from AI tools, as teased in HEM’s presentation.
Post-Application Engagement
Some schools track events beyond submission (e.g., clicking an admitted student portal link, viewing housing or financial aid info). While GA4 may not capture yield or melt directly, it can show post-application interest signals. Continued engagement, like visiting tuition or residence life pages, suggests intent to enroll or lingering questions that marketing content can address.
Benchmarking and Outcomes
Use GA4 to evaluate ROI by channel. For example, if Paid Search generates 10 applications from $5,000 in ad spend, while Organic Search drives 30 at no direct ad cost, that’s a critical insight. While GA4 doesn’t include media spend (unless connected to Google Ads), you can overlay cost data offline to calculate rough efficiency.
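Because GA4 holds no media spend unless you link Google Ads, this overlay is often just a quick offline calculation. The sketch below reuses the hypothetical figures above; swap in your own spend and application counts.

```typescript
// Sketch: rough cost-per-application by channel, combining GA4 application
// counts with spend figures kept outside GA4. The numbers mirror the
// hypothetical example above.
interface ChannelOutcome {
  channel: string;
  spend: number;        // media spend in dollars (0 for unpaid channels)
  applications: number; // application conversions reported in GA4
}

const outcomes: ChannelOutcome[] = [
  { channel: 'Paid Search', spend: 5000, applications: 10 },
  { channel: 'Organic Search', spend: 0, applications: 30 },
];

for (const outcome of outcomes) {
  const costPerApp = outcome.spend > 0
    ? `$${(outcome.spend / outcome.applications).toFixed(0)} per application`
    : 'no direct ad cost';
  console.log(`${outcome.channel}: ${outcome.applications} applications, ${costPerApp}`);
}
```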
You can also segment Applicants vs. Non-Applicants using GA4’s Explorations. Let’s say applicants averaged 8 sessions while non-applicants averaged 2. That suggests high engagement correlates with conversion, reinforcing the value of remarketing, email campaigns, and sticky content.
Research supports this: EAB found that highly engaged users (multiple sessions, longer duration) were significantly more likely to apply.
What to Skip
Once a student applies, most enrollment decisions move to CRM or SIS platforms, not GA4. Don’t expect GA4 to tell you who enrolled, who melted, or who was denied. Similarly, ignore reports like Predictive Metrics, User Lifetime, and Cohort Analysis, which are less actionable for enrollment marketing. Focus instead on your core funnel, attribution, and engagement data.
Final Takeaway
By now, your GA4 setup should illuminate your recruitment funnel: how students find you, how they behave, when they convert, and where they fall off. This data is crucial for optimizing spend, improving user experience, and shaping strategic decisions.
Priority GA4 Reports:
Traffic & User Acquisition (channel quality)
Pages and Screens (top content, engagement)
Events & Conversions (key actions)
Funnel & Path Explorations (journey analysis)
Attribution Paths (multi-touch influence)
Reports to Skip:
Demographics & Tech (unless troubleshooting)
Realtime (not strategic)
Cohorts, LTV, Default Templates (too advanced or unfocused)
Pro tip: HEM can help you build enrollment-specific GA4 funnels, connect data to your CRM, and surface dashboards that show “visits → inquiries → apps → yield” at a glance, so you can finally act on your data with confidence.
Real-World Examples: GA4 Insights Driving Enrollment in Higher Ed (from various colleges & universities)
Clemson University (College of Business). Clemson’s Wilbur O. and Ann Powers College of Business leveraged targeted digital campaigns and GA4 event tracking to dramatically increase prospective student engagement.
The college saw a 207% increase in page engagement and a 222% growth in program page views for a key graduate program after the campaign. In just a two-month push, GA4 recorded 498 users requesting information and 44 clicking “Apply” to begin their applications.
University College Dublin (UCD). This university fully transitioned to GA4 and implemented a unified analytics dashboard via a data warehouse for all its websites. The new GA4-powered reporting interface, featuring Overview, Page Performance, and User Engagement reports, loads much faster and retains up to two years of data.
This enables UCD’s faculties and departments to easily track user behaviour across the university’s web presence, gaining insights into what content is engaging visitors and where improvements can be made.
Boise State University. Boise State created a centralized GA4 “Comprehensive Dashboard” accessible to campus stakeholders and paired it with training tutorials on common GA4 tasks. Their web team produced self-paced video guides on how to filter GA4 data to answer specific questions (such as finding top pages, viewing traffic sources, or seeing visitor geolocation).
This approach empowers individual departments to slice the raw GA4 data for their own needs and quickly get answers about user behaviour, for example, identifying the most popular pages or where visitors are coming from, without needing advanced technical skills.
UC Riverside. UC Riverside moved all its many departmental and unit websites to GA4 under a centralized analytics structure. The university’s web team built a curated “Web Analytics for Campus Partners” GA4 dashboard with custom reports, including a Broken Links report and a Top Landing Pages report.
These tailored GA4 dashboards help site owners across campus quickly spot issues (e.g. finding and fixing 404 error pages) and identify content that attracts new traffic. By giving each department actionable insights, such as which pages are bringing in the most new visitors, UCR has improved user experience and informed content strategy across dozens of sites in its domain.
Texas A&M University. Texas A&M established an Analytics Community of Practice that meets monthly, bringing together marketers and communicators from different colleges and units to share GA4 insights and techniques.
In these sessions, participants discuss recent findings (for example, which pages on their sites show unusually high engagement rates, or how referral traffic patterns are shifting) in a collaborative forum. This ongoing knowledge exchange ensures continuous learning and helps cultivate a data-informed culture campus-wide.
Embracing GA4’s event-based, student-centric model can reshape how your team drives recruitment outcomes. By moving beyond vanity metrics like pageviews, GA4 prompts higher ed marketers to focus on real indicators of student intent, such as engaged sessions, application clicks, and program page sequences. Across each funnel stage, GA4 reveals which channels attract interest, what content sustains it, and which actions convert it.
This clarity empowers you to refine campaign targeting, improve website performance, and simplify the inquiry or application path. GA4 also bridges the long-standing gap between marketing and admissions by giving both teams shared metrics and a common funnel narrative. Instead of saying, “We got 10,000 visits,” marketing can report: “We drove 300 info requests and 50 applications, and here’s what influenced them.”
It’s true that GA4 can feel overwhelming at first. But by focusing on core engagement metrics, key conversion events, and simple funnel analyses, you can avoid the noise and surface what truly matters. Start small, then grow into more advanced insights as you gain confidence.
Don’t worry that GA4 stops tracking at the application stage, and avoid attributing to it things it can’t measure: GA4 won’t tell you ‘admitted vs. denied’ or ‘enrolled vs. melt’; that’s outside its scope. Focus on what it can concretely tell you about the marketing funnel leading up to enrollment.
Above all, GA4 is most powerful when used collaboratively. Share funnel data with admissions. Highlight high-performing content to your copy team. Use insights to inform international recruitment or retargeting campaigns. And if needed, partner with specialists. At HEM, we help institutions build clear, actionable GA4 setups, from audits and event tracking to CRM integrations, so your analytics directly support enrollment.
GA4 isn’t just an upgrade; it’s a strategic advantage. When aligned with your funnel, it can become your most effective tool for enrollment growth.
FAQs
What makes GA4 different from Universal Analytics for higher ed marketers? Higher ed marketers accustomed to UA’s pageviews and sessions are now confronted with a new event-based model, a slew of unfamiliar reports, and an interface that looks nothing like the old Google Analytics. GA4 offers richer insights into student behaviour and intent, which can directly fuel enrollment growth.
What should higher ed marketers avoid focusing on in GA4? Don’t worry if GA4 isn’t tracking beyond the application. Also, avoid misattributing things to GA4 that it can’t measure, e.g., GA4 won’t tell you ‘admitted vs. denied’ or ‘enrolled vs. melt’, that’s outside its scope. Focus on what GA4 can concretely tell you about the marketing funnel leading up to enrollment.
Which GA4 reports should we prioritize for enrollment marketing? Focus on the critical reports:
Traffic Acquisition & User Acquisition (for awareness channel quality)
Engagement > Pages and Screens (for top content and engagement per page)
Events & Conversions (for key recruitment actions)
Funnel & Path Explorations (for journey analysis)
Attribution Paths (for multi-touch influence)
Carnegie announced a continued commitment to higher education that places student connection at the center of institutional strategy, aligning research, strategy, storytelling, media, and technology to help colleges and universities navigate today’s interconnected challenges. The update reflects an evolution in how Carnegie supports enrollment, trust, relevance, and student success in an era shaped by demographic change and AI-driven discovery.
A Moment of Change for Higher Education
As colleges and universities confront a period of sustained pressure, rising scrutiny, and rapid change, Carnegie today announced a continued commitment to how it supports higher education—placing student connection at the center of institutional strategy, decision-making, and long-term success.
The Announcement at the 2026 Carnegie Conference
The announcement was made on stage at the opening of the 2026 Carnegie Conference, where more than 400 higher education leaders and professionals gathered to examine the forces reshaping enrollment, reputation, strategy, and the student experience.
More Than a Brand Update—A Strategic Evolution
While Carnegie introduced an updated brand identity as part of the moment, company leaders emphasized that the announcement reflects a broader evolution in how the company is responding to the realities facing institutions today.
Carnegie is aligning its strategy around integrated, innovative approaches—bringing together research, data, AI-enabled technology, and strategy—to help leaders address challenges that are increasingly interconnected and complex.
Why This Shift Matters Now
“Higher education leaders are operating in an environment where the stakes are higher and the margin for error is smaller,” said Gary Colen, chief executive officer of Carnegie. “Our responsibility is to innovate with purpose—delivering clarity, focus, and solutions that help institutions make decisions that lead to better outcomes for students.”
Student Connection as a Strategic Imperative
Carnegie’s work is grounded in a single belief: when students succeed, higher education thrives—and the world wins.
As demographic shifts, changing learner expectations, technological disruption, and public accountability reshape the sector, Carnegie has aligned its strategy around helping higher ed institutions build meaningful, lasting connections with today’s diverse learners.
Meeting the Moment Higher Education Leaders Are Facing
According to Michael Mish, Chief Growth Officer, the timing of the announcement reflects what the company is hearing from campus leaders. “Higher education leaders need partners who deliver strategic expertise and forward-thinking innovation,” Mish said. “Our evolution is about connecting strategy and innovation in practical ways—so institutions can address today’s challenges while preparing for what’s next.”
What the Updated Carnegie Brand Represents
The updated brand brings greater cohesion to how Carnegie delivers research, strategy, storytelling, media, and technology—reinforcing its role as a strategic higher education partner focused on trust, relevance, and results rather than short-term wins.
A More Integrated Approach to Research, Strategy, and Execution
“Our intent wasn’t to make a statement about ourselves,” said Tyler Borders, Chief Brand Officer. “It was to be more precise about our role and our responsibility in this moment. The brand reflects how our work has evolved and the standard we expect of ourselves as a partner to higher education.”
What’s Launching Next
As part of the rollout, Carnegie has launched an updated digital experience and will introduce new research, offerings, and insights.
New Research and Insights
This week, the company is releasing a comprehensive research report focused on online learners. In February, Carnegie will debut an updated Carnegie Intelligence newsletter, expanding how it shares perspective and practical guidance with higher education leaders.
Introducing Answer Engine Optimization (AEO)
Carnegie is also introducing a new Answer Engine Optimization (AEO) solution designed to help higher education institutions improve visibility in AI-powered search experiences—ensuring institutions are accurately represented as students increasingly rely on AI to answer questions about programs, outcomes, cost, and fit.
Navigating the Now and the Next—Together
“This is ongoing work,” Colen added. “Our commitment is to keep earning trust—by helping institutions navigate what’s next without losing sight of what matters most: changing students’ lives for good.”
For every college and university facing urgent and complex challenges, Carnegie is the student connection company that helps you navigate the now and the next in higher education. Our experts design custom strategies fueled by data, technology, and insights—empowering you to connect with today’s diverse learners and stay focused on what matters most: changing students’ lives for good.
Frequently Asked Questions About Carnegie and Student Connection
Who is Carnegie in higher education?
Carnegie is a strategic partner to colleges and universities focused on enrollment, reputation, strategy, and student success. The company helps institutions navigate complex, interconnected challenges by aligning research, strategy, storytelling, media, and technology around what matters most: students.
What does it mean to be a “student connection company”?
Being a student connection company means helping institutions build meaningful, lasting relationships with today’s diverse learners. Carnegie focuses on connecting strategy, data, storytelling, and execution so institutions can support student success, institutional relevance, and long-term impact.
What prompted Carnegie’s updated brand and renewed commitment?
Carnegie’s updated brand reflects an evolution in how the company responds to the realities facing higher education today, including demographic shifts, technological disruption, and increased public accountability. The refresh clarifies Carnegie’s role as a strategic partner helping institutions navigate these interconnected challenges without losing focus on students.
How does Carnegie help colleges and universities navigate change?
Carnegie supports institutions through integrated research, strategic planning, brand and storytelling, media and digital marketing, and technology-enabled solutions. This approach helps leaders align enrollment goals, reputation, data, and execution to drive meaningful outcomes.
What is Carnegie’s Answer Engine Optimization (AEO) solution?
Carnegie’s Answer Engine Optimization (AEO) solution helps colleges and universities improve how they are represented in AI-powered search environments like ChatGPT, Google AI Overviews, and other answer engines. The solution focuses on content clarity, factual alignment, and structured optimization so institutions are trusted sources when students ask AI-driven questions.
“This is freedom of speech. This is America, right?”
Those were the incredulous words of Raquel Pacheco, a U.S. Army veteran and three-time candidate for local office. She made the remark while being questioned by police at her Miami Beach home last week for criticizing her mayor on Facebook.
On Jan. 6, Miami Beach Mayor Steven Meiner posted a message on his official Facebook page saying, among other things, that “Miami Beach is a safe haven for everyone” and that the city “is consistently ranked by a broad spectrum of groups as being the most tolerant in the nation.”
That is, apparently, unless you criticize him. Pacheco’s response — accusing Meiner of “consistently call[ing] for the death of all Palestinians,” trying “to shut down a theater for showing a movie that hurt his feelings,” and “REFUS[ING] to stand up for the LGBTQ community in any way” — appears to have been too much free speech for the mayor to tolerate.
Six days later, two Miami Police officers knocked on Pacheco’s door, claiming they were there “to have a conversation” and to confirm that she was the one who had made those comments. In a video of the interaction, the officers justify their visit by saying they wanted to prevent “somebody else getting agitated or agreeing with” Pacheco’s post. They added that the line about Meiner’s views on Palestinians “can probably incite somebody to do something radical,” and advised her “to refrain from posting things like that because that could get something incited.”
What occurred at Pacheco’s home raises serious concerns in a free society. Her statements fall well short of the legal threshold for incitement, which applies only to speech directed to inciting imminent lawless action and likely to produce it. A careful reading of her post reveals no call for illegal activity, nor any indication that it would prompt others to act unlawfully.
Residents of the United Kingdom are all too familiar with police interventions over social media content. In September, blogger Pete North was arrested for posting a meme displaying the text “F— Palestine F— Hamas F— Islam… Want to protest? F— off to a Muslim country & protest.” That same month, Deborah Anderson, an American who had been living in England for years, was visited by police for Facebook posts that “upset someone.” And last January, a couple were arrested on suspicion of harassment, evidently for comments as mild as describing an employee at their daughter’s school as a control freak in a parents’ WhatsApp chat. Sadly, such incidents are just a fraction of longstanding limitations on speech in the UK.
These examples demonstrate why the First Amendment sets the bar so high for its few, narrow exceptions. Democracy requires ample breathing room to speak about public issues. If sharp but non-threatening criticism and political commentary can be treated as unlawful incitement, freedom of speech ceases to exist in any meaningful sense.
Such cases highlight the need to safeguard free expression in both the U.S. and the UK. Censorious practices that take hold in one place often spread elsewhere, and across the West, police responses to online criticism are becoming more common. Without vigilance, such interventions will only become more routine. The principle is clear: free expression must be protected.
Join HEPI Director Nick Hillman OBE and SUMS Consulting at 11am tomorrow (22nd January 2026) for a webinar based on the report ‘University Lands: Mapping Risks and Opportunities for the HE Sector’. Sign up for the webinar here. Read the blogs HEPI has published on the report here, here and here.
This blog was kindly authored by Beverley Orr-Ewing, Consultant and Student Mobility Lead, Cormack Consulting Group, and Professor Sally Wheeler, Vice-Chancellor, Birkbeck, University of London.
As the UK edges closer to a return to Erasmus+, attention is turning towards what this might mean for the Turing Scheme. For many universities, particularly those with deep European partnerships and strong Modern Language provision, the prospect of rejoining Erasmus is genuinely welcome. Politically, it is also an attractive signal – a step towards reversing some of the damage caused to relationships by Brexit and restoring a sense of connection with our European partners.
Against that backdrop, our concern is less about a return to Erasmus itself, and more about the assumption that Erasmus can simply take on the role Turing currently plays. The real question, therefore, is not which scheme is better but what we will lose if Turing disappears.
Different schemes, different problems
Erasmus+ and Turing were created to address different policy challenges, at different moments in time and in very different political contexts. Erasmus+ emerged in the late 1980s as part of a broader project of European social integration, designed to support long-term cooperation through reciprocal partnerships between largely publicly funded higher education systems.
By contrast, Turing was designed in a post-Brexit landscape where the loss of Erasmus made a contraction in outward student mobility all but inevitable unless a new mechanism was put in place. From the outset, it placed greater emphasis on two areas Erasmus historically struggled with: widening participation and global reach. Turing was never intended to be a like-for-like replacement for Erasmus+, and treating it as such risks misunderstanding both its purpose and its value.
What Turing has enabled
A useful starting point is what Turing has enabled, and how it has reshaped and added value to student mobility.
First, a genuinely global approach. The Turing Scheme has supported tens of thousands of UK students each year to undertake study and work placements overseas. In the 2023–24 funding round alone, nearly 23,000 higher-education students were supported, with placements spanning more than 160 countries worldwide. Alongside European destinations, the top host countries also show how non-European mobility has become central to Turing. In 2023–24, six of the ten most common higher-education destination countries were outside the EU (the United States, Australia, Canada, Japan, South Korea and China), and these non-EU destinations accounted for just over half of learners across the top ten. This reflects a shift away from a primarily Europe-centred model towards mobility that aligns more closely with student interests and the increasingly global outlook of UK higher education.
Second, a more flexible, student-focused funding model. Turing places fewer structural requirements on the form mobility must take, allowing funding to follow the student rather than being shaped by institutional exchange frameworks. This has made it easier to support a wider range of mobility types, including short programmes, work placements and volunteering. While Erasmus also supports work and volunteering, Turing’s design has provided greater flexibility in contexts where traditional exchange models are difficult to sustain or risk excluding particular groups of students.
Third, a clear priority around widening participation. From its inception, Turing was explicitly framed around improving access to mobility for students who have historically been less likely to participate. Funding outcomes for 2023–24 indicate that close to half of higher-education Turing participants were from under-represented or disadvantaged backgrounds, reflecting the scheme’s prioritisation of access. Its support for shorter, more flexible forms of mobility has been particularly important in widening participation, creating opportunities that are more manageable alongside work, family and financial commitments.
This focus reflects purposeful choices shaped by both the diversity of today’s student body and the way UK universities now operate within a competitive global higher education environment.
Equity and access
The question, then, is how well Erasmus can sustain the broader patterns of participation that Turing has helped to establish. While Erasmus offers many strengths, its traditional models work best for students who are already well placed to participate – those who can commit to longer periods abroad, manage higher upfront costs and navigate study in another European language or academic system.
Turing marked a deliberate shift away from that default. It was designed not just to increase the number of students going abroad, but to ensure that students from less financially secure backgrounds could access similar international opportunities to their peers. Removing it risks a return to mobility models that remain open in principle, but are easier to take up for students with greater financial flexibility and fewer competing pressures. If widening participation is to be meaningful, that risk requires careful consideration.
Global breadth and strategic reach
One of Erasmus’s greatest strengths is the stability secured through its long-term institutional partnerships, but Turing’s flexibility has enabled UK student mobility well beyond Europe at a time when global competition for talent, partnerships and influence is intensifying.
Erasmus does allow some third-country mobility through KA171, but this remains capped and partnership-led, with limited flexibility and scale. If Erasmus were to become the sole mobility mechanism, the geographic frame and shape of UK student mobility would inevitably narrow, at a time when UK universities are being required to think more globally.
A different operating environment
There is also a more fundamental structural issue at play. UK universities increasingly operate within a funding and policy environment that differs markedly from that of many European counterparts. They are less directly publicly funded, more dependent on international engagement, and therefore more globally oriented in both strategy and outlook. For many institutions, this is not a matter of ambition alone but of long-term sustainability.
Erasmus was designed to support publicly funded systems with strong regional integration. For EU member states, it also sits within a policy framework that they are able to shape and influence over time. Turing, by contrast, aligns more closely with the strategic reality of UK institutions, particularly those seeking to build sustainable engagement in growth regions such as India, Africa and Southeast Asia. In practice, universities were only beginning to realise how Turing could be used as a strategic lever alongside wider international priorities – removing it now risks cutting off that line of development just as it was starting to take shape.
Disruption and uncertainty
There is also a practical reality to consider. Over the past five years, universities have rebuilt systems, processes and partnerships around Turing. Removing it would create partnership instability, impose significant transition costs and erode institutional capacity at a point when resources are already under pressure.
Alongside this sits a wider issue of political volatility. If student mobility funding is subject to repeated policy shifts, institutions are forced into short-term planning, designing programmes that may not survive the next change of government. That environment makes sustained investment, partnership-building and long-term student opportunity much harder to achieve.
Design and delivery: what can be improved
None of this is to deny that Turing has significant shortcomings. Late funding announcements, heavy administrative requirements and a lack of certainty over funding from one year to the next have created real challenges for institutions, which have not been able to embed Turing fully into long-term strategy.
There are also limits that stem from Turing’s underlying design. Most notably, unlike Erasmus, which embeds exchange as a core principle, Turing does not provide funded reciprocal mobility; it was deliberately constructed as an outward-only scheme. This has made it harder for institutions to sustain balanced international partnerships, but it reflects a conscious policy choice.
Challenges around timing, administration and predictability are matters of delivery rather than principle. With clearer commitment, earlier confirmation of funding and greater multi-year certainty – drawing on the planning cycles familiar from Erasmus – many of these could be substantially mitigated. The experience of recent years suggests not that Turing lacks purpose, but that it has not yet been given the conditions it needs to thrive.
A question of balance
The most sustainable outcome is not a choice between Erasmus or Turing, but an approach that recognises the distinct value of both. Together, they support different dimensions of student mobility: European depth alongside global reach; long-standing partnerships alongside flexibility; stability alongside responsiveness.
In a relatively short period of time, Turing has begun to reshape who participates in mobility, where students are able to go, and how international experience fits into their studies. It has opened doors to experiences previously beyond the reach of many students, supported more inclusive forms of participation and given UK institutions a tool that better reflects the global realities in which they operate. It would be a loss to see that progress curtailed.
Rejoining Erasmus may be both desirable and beneficial. But allowing it to replace Turing entirely would mean stepping back from gains in widening participation and global engagement that align closely with the strategic direction of UK higher education and with the needs of students.
When Linda Zaugg’s baby caught a high fever in January, it took an hour and a half to walk him to the hospital — a journey that usually takes 10 minutes. But this was Davos, Switzerland, during the week of the World Economic Forum (WEF). Some 3,000 politicians and business leaders from around the world had descended on the town to discuss important political and economic issues.
Zaugg is a member of the local parliament in Davos, a town in the Swiss Alps with a permanent population of about 11,000 people. She has been spearheading a campaign to raise awareness of the local impacts the conference has and find ways to mitigate them.
During the forum, traffic becomes so bad, she said, that ambulances have trouble finding their way through the streets of Davos, causing response times to increase significantly.
Traffic isn’t the only problem. During this time, Davos experiences a massive influx of people, causing rents to rise as much as tenfold.
“This is the real problem with the WEF,” she said. “Not the conference itself, but all the people and companies that come along with it to make money and advertise.”
Economic effects of an economic forum
Albert Kruker, the tourism director of Davos, warned that these price increases may cause a price spiral which would affect the town year-round.
During the forum, local businesses go into overdrive trying to supply the politicians, journalists and other attendees with everything they require. When asked about it, the owner of a local bakery, Bäckerei Weber, said that it is one of the most profitable but also intense weeks of the year.
“During the conference you get all these catering companies coming in and the hotels are full, so we have a lot more orders,” he told us. “During the conference we work 24 hours a day. Because of the security, we usually start delivering at two o’clock in the morning.”
During this time, many other business owners and homeowners either stock up on their goods or rent out their buildings at exorbitant prices. A banker living in Zurich with an apartment in Davos said that he can rent out his apartment for a single week during the conference for approximately three months’ rent.
In an apartment block right next to the conference hall, many inhabitants move out during the week. These apartments are then rented by journalists, attendees and large companies.
Disruption in Davos
One resident of an apartment block told us that he is never home during the WEF. “I rent out my apartment and go on holiday during this time,” he said.
The housing crunch during the forum is so intense that to accommodate attendees, some renters and families are forced out of their homes for the duration of the conference.
Zaugg said that some landlords even include a clause in the renter’s agreement dictating that the renters must leave during this period. A side effect of this is that many children must live temporarily outside the city and cannot attend school.
This problem is worsened by the constant congestion and by drivers who aren’t used to Davos.
These drivers often ignore speed limits and pedestrian-only zones, requiring even more attention from pedestrians. That is especially difficult and dangerous for children and the elderly, who aren’t used to this amount of traffic.
Additionally, the public transportation system is bogged down during this time, once again causing confusion among society’s most vulnerable.
Crowds and congestion
Stephan Büchli, a local bus driver, said that there are no fixed schedules during this time as the traffic is simply too unpredictable. Additionally, they must use smaller buses, as the streets are too congested to allow the manoeuvring of the traditional ones.
Furthermore, the new drivers often also park in restricted zones, further impacting public transport.
“Last year I saw an old man at the local bus station during the conference. He was crying very heavily and was confused. It really made me angry,” Zaugg told us.
The congestion brings other problems with it.
All this traffic creates substantial emissions. In 2023, the private jets attending the forum alone generated 7,500 tons of CO2, roughly equivalent to the yearly emissions of 5,000 cars.
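As a rough check on that comparison, the two figures imply a per-car footprint of about 1.5 tonnes of CO2 a year, which is broadly in line with figures often cited for an average European passenger car driven a typical annual distance:
\[ \frac{7{,}500\ \text{t CO}_2}{5{,}000\ \text{cars}} = 1.5\ \text{t CO}_2\ \text{per car per year} \]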
Minimising the carbon footprint
Part of the problem, Büchli said, is that limousines, trucks and taxis often leave their engines running while standing still, sometimes for upwards of half an hour. He has frequently seen cars idling like this while stuck in traffic.
As a high-profile event, the WEF requires a lot of temporary structures, interior furnishings and food to function. Every year the temporary structures are erected in late December and taken down again afterwards; some are used for a single week and then thrown away.
The same is true for interior furnishings such as carpets, shelves, computers and TV screens, as well as leftover food. Several residents told us that heaps of electronic equipment get thrown away after the WEF.
Still, while many residents feel the effects, most keep their irritation to themselves for fear of being labelled a WEF hater.
While there are several key problems with the Forum in its current form, the organisers aren’t sitting idle. Over the past few years, they have taken several steps to lessen these problems.
The road ahead
The most obvious of these steps is the reduction in waste. The organisers of the conference and the government of Davos have issued regulations on the number of temporary structures and their reusability, and the number of such structures has noticeably decreased over the last few editions.
Old furniture and electronic devices are sold to the local inhabitants at reduced prices and spare food is offered to the residents for free, further contributing to making the WEF more sustainable.
To keep travel times manageable, the municipality has also set up extra shuttle trains that run from one end of the town to the other. Entry into Davos by car was also restricted this year for visitors and tourists.
One of the most impactful changes was the installation of temporary ambulance stations scattered across Davos, allowing crews to respond quickly to emergencies and save lives.
Over the last few years, both the WEF organisation and Davos itself have taken a range of measures to lessen the conference’s negative impacts. These issues persist, however, and still demand solutions.
Only time will tell whether the people who organise a conference meant to bring people together to improve the state of the world can also improve, for that one week of the year, the lives of the people who live in this small town in the Alps.
“You truly notice how the ideological part of the WEF, the bringing together of people, gets pushed into the background in favour of economic reasons,” Zaugg said.
Questions to consider:
1. What is the World Economic Forum?
2. In what ways is the town of Davos negatively affected by the WEF?
3. Is there an event that disrupts life near where you live? How do people deal with it?
Dive Brief:
Nearly all senior higher education leaders — 98% — reported that federal policymaking has introduced uncertainty into institutional planning, according to the latest pulse survey from the American Council on Education.
Topping the list of senior leaders’ most pressing concerns is state and federal interference with colleges’ autonomy. Over 70% of leaders said they were either extremely or moderately concerned about threats to independence and academic freedom.
“Uncertainty around research funding, immigration and international engagement, academic freedom, and student aid policy are shaping institutional decision-making and straining long-term planning efforts,” the report’s authors wrote.
Dive Insight:
After 2025’s many policy upheavals, it would be shocking only if college leaders didn’t report some uncertainty.
Last year, President Donald Trump and his administration upended many of the sector’s longstanding precedents and fundamental assumptions. Trump’s executive branch attacked everything from the U.S. Department of Education as a whole, to research funding, to the visa system for international students, to individual colleges, many of which became targets of civil rights investigations and political pressure campaigns.
All of that tumult is clearly weighing on the minds of college leaders.
Nearly three in four senior leaders described their level of uncertainty about the federal policy environment and its impact on planning as “extreme” or “moderate,” according to the poll. Another 19% reported “some” uncertainty and 7% described it as “slight.”
Trump’s impact on international student enrollment — with recent studies showing dips in graduate and new students from abroad — also loomed large for many leaders. Sixty percent said they were extremely or moderately concerned about immigration restrictions and visa revocations.
Academic freedom and institutional autonomy are also arguably more at risk than they have been in generations.
Trump’s government has tried to force policy changes at colleges through federal investigations, research funding cuts and his compact for higher education. In some cases, the administration has wrested payments and policy changes from institutions under pressure.
But many colleges and universities are also losing their independence through new state laws that aim to weaken governance, direct course content, and banish diversity, equity and inclusion efforts.
In a recent report, the free expression group PEN America described 2025 as a “catastrophe” for higher ed. The group counted 21 bills across 15 states enacted in 2025 that it says censor higher education and were the “result of a relentless, years-old campaign to exert ideological control over college and university campuses.”
College leaders also flagged perennial challenges among their concerns in the ACE poll. That includes fiscal pressures, with 44% reporting either extreme or moderate concern about long-term financial viability. Enrollment, the mental health of students and perceptions about higher education’s value were all among leaders’ most pressing concerns as well. Over 75% reported extreme or moderate concern around what the public and policymakers thought about the sector.
The ACE report drew from a December survey of 386 senior leaders from colleges nationwide.
It’s truly incredible how much new technology has made its way into the classroom. Where once teaching consisted primarily of whiteboards and textbooks, you can now find tablets, smart screens, AI assistants, and a trove of learning apps designed to foster inquiry and maximize student growth.
While these new tools are certainly helpful, the flood of options means that educators can struggle to discern truly useful resources from one-time gimmicks. As a result, some of the best tools for sparking curiosity, creativity, and critical thinking often go overlooked.
Personally, I believe 3D printing is one such tool that doesn’t get nearly enough consideration for the way it transforms a classroom.
3D printing is the process of making a physical object from a three-dimensional digital model, typically by laying down many thin layers of material using a specialized printer. Using 3D printing, a teacher could make a model of a fossil to share with students, trophies for inter-class competitions, or even supplies for construction activities.
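To make the idea of a “three-dimensional digital model” concrete, here is a minimal sketch, assuming nothing beyond standard Python: it writes a small square-based pyramid as an ASCII STL file, the plain-text format most 3D-printing slicer programs accept. The file name, dimensions and shape are arbitrary choices for illustration, not part of any particular lesson plan.

import math

def facet(v0, v1, v2):
    # One ASCII-STL facet; the normal is computed from the counter-clockwise
    # vertex winding (right-hand rule), so it points out of the solid.
    ux, uy, uz = (v1[i] - v0[i] for i in range(3))
    wx, wy, wz = (v2[i] - v0[i] for i in range(3))
    nx, ny, nz = uy * wz - uz * wy, uz * wx - ux * wz, ux * wy - uy * wx
    length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
    lines = ["  facet normal %.6f %.6f %.6f" % (nx / length, ny / length, nz / length),
             "    outer loop"]
    lines += ["      vertex %.6f %.6f %.6f" % v for v in (v0, v1, v2)]
    lines += ["    endloop", "  endfacet"]
    return "\n".join(lines)

# A 20 mm square base with an apex 15 mm above its centre (hypothetical dimensions).
base = [(0.0, 0.0, 0.0), (20.0, 0.0, 0.0), (20.0, 20.0, 0.0), (0.0, 20.0, 0.0)]
apex = (10.0, 10.0, 15.0)
triangles = [
    (base[0], base[3], base[2]),  # bottom face, split into two triangles
    (base[0], base[2], base[1]),
    (base[0], base[1], apex),     # four sloping sides
    (base[1], base[2], apex),
    (base[2], base[3], apex),
    (base[3], base[0], apex),
]

with open("pyramid.stl", "w") as f:
    f.write("solid pyramid\n")
    f.write("\n".join(facet(*t) for t in triangles))
    f.write("\nendsolid pyramid\n")

Opening the resulting pyramid.stl in a slicer or 3D viewer shows the same pyramid a student could then print, which is the full “digital model to physical object” loop described above.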
At first glance, this might not seem all that revolutionary. However, 3D printing offers three distinct educational advantages that have the potential to transform K–12 learning:
It develops success skills: 3D printing encourages students to build a variety of success skills that prepare them for challenges outside the classroom. For starters, it creates opportunities for students to practice communication, collaboration, and other social-emotional skills. The process of moving from an idea to a physical, printed prototype fosters creativity, while every print, successful or not, builds perseverance and problem-solving confidence. This is the type of hands-on, inquiry-based learning that students remember.
It creates cross-curricular connections: 3D printing is intrinsically cross-curricular. Professional scientists, engineers, and technicians often use 3D printing to create product models or build prototypes for testing their hypotheses. This process involves documentation, symbolism, color theory, understanding of narrative, and countless other disciplines. It doesn’t take much imagination to see how these could also be beneficial to classroom learning. Students can observe for themselves how subjects connect, while teachers transform abstract concepts into tangible points of understanding.
It’s aligned with engineering and NGSS: 3D printing aligns perfectly with the Next Generation Science Standards (NGSS). By focusing on the engineering design process (define, imagine, plan, create, improve), students learn to think and act like real scientists to overcome obstacles. This approach also emphasizes iteration and evidence-based conclusions. What better way to facilitate student engagement, hands-on inquiry, and creative expression?
3D printing might not be the flashiest educational tool, but its potential is undeniable. This flexible resource can give students something tangible to work with while sparking wonder and pushing them to explore new horizons.
So, take a moment to familiarize yourself with the technology. Maybe try running a few experiments of your own. When used with purpose, 3D printing transforms from a common classroom tool into a launchpad for student discovery.
Jon Oosterman, Van Andel Institute for Education
Jon Oosterman is a Learning Specialist at Van Andel Institute for Education, a Michigan-based education nonprofit dedicated to creating classrooms where curiosity, creativity, and critical thinking thrive.
Looking back on my lifelong history of learning experiences, the ones I would rank as most effective and memorable were those in which the instructor truly saw me, understood my motivations and encouraged me to apply the learning to my own circumstances. This critical aspect of teaching and learning is included in nearly every meaningful pedagogical approach. We commonly recognize that the best practices of our field include a sensitivity to and understanding of the learner’s experiences, motivations and goals. Without responding to the learner’s needs, we fall short of the common goal of helping learners internalize what they are taught.
Some might believe that AI, as a computer-based system, merely addresses the facts, formulas and figures of quantitative learning rather than engaging with the learner in an emotionally intelligent way. That may have been true in its initial development; however, AI has since developed the ability to recognize and respond to the emotional aspects of a learner’s responses.
In September 2024, the South-East Europe Design Automation, Computer Engineering, Computer Networks and Social Media Conference included research by four professors from the University of West Attica in Egaleo, Greece—Theofanis Tasoulas, Christos Troussas, Phivos Mylonas and Cleo Sgouropoulou—titled “Affective Computing in Intelligent Tutoring Systems: Exploring Insights and Innovations.” The authors described the importance of including affective engagement into developing learning systems:
“Integrating intelligent tutoring systems (ITS) into education has significantly enriched personalized learning experiences for students and educators alike. However, these systems often neglect the critical role of emotions in the learning process. By integrating affective computing, which empowers computers to recognize and respond to emotions, ITS can foster more engaging and impactful learning environments. This paper explores the utilization of affective computing techniques, such as facial expression analysis and voice modulation, to enhance ITS functionality. Case studies and existing systems have been scrutinized to comprehend design decisions, outcomes, and guidelines for effective integration, thereby enhancing learning outcomes and user engagement. Furthermore, this study underscores the necessity of considering emotional aspects in the development and deployment of educational technology to optimize its influence on student learning and well-being. A major conclusion of this research is that integration of affective computing into ITS empowers educators to customize learning experiences to students’ emotional states, thereby enhancing educational effectiveness.”
“Affective intelligent tutoring systems (ATSs) are gaining recognition for their role in personalized learning through adaptive automated education based on students’ affective states. This scoping review evaluates recent advancements and the educational impact of ATSs, following PRISMA guidelines for article selection and analysis. A structured search of the Web of Science (WoS) and Scopus databases resulted in 30 studies covering 27 distinct ATSs. These studies assess the effectiveness of ATSs in meeting learners’ emotional and cognitive needs. This review examines the technical and pedagogical aspects of ATSs, focusing on how emotional recognition technologies are used to customize educational content and feedback, enhancing learning experiences. The primary characteristics of the selected studies are described, emphasizing key technical features and their implications for educational outcomes. The discussion highlights the importance of emotional intelligence in educational environments and the potential of ATSs to improve learning processes.”
“Agents will be able to gather data from multiple sources to assess a student’s progress across multiple courses. If the student starts falling behind, processes could kick in to help them catch up. Agents can relieve teachers and administrators from time-consuming chores such as grading multiple-choice tests and monitoring attendance. The idea is catching on. Andrew Ng, co-founder of Coursera, launched a startup called Kira Learning to ease burdens on overworked teachers. ‘Kira’s AI tutor works alongside teachers as an intelligent co-educator, adapting in real-time to each student’s learning style and emotional state,’ Andrea Pasinetti, Kira Learning’s CEO, says in an interview with The Observer.”
We are no longer limited to transactional chatbots that respond to questions from students without regard to their background, whether academic, experiential or even emotional. Using the capabilities of advanced AI, our tools can analyze, identify and adapt to a range of learner emotions. These capabilities are often the hallmark of excellent, experienced faculty members, who do not teach only to the median of the class but instead offer personalized responses to meet the interests and needs of individual students.
As we look ahead to the last half of this semester and to succeeding semesters, we can expect enhanced technology to help us serve our learners better. We will be able to identify growing frustration when it arises, or accelerate the pace of the learning experience when learners show comfort with the material and a readiness to advance ahead of others in the class.
We all recognize that this field is moving very rapidly. It is important that we have leaders at all levels who are prepared to experiment with emerging technologies, demonstrate their capabilities and lead discussions on potential implementations. The results can be most rewarding, with a higher percentage of learners reaching their goals more comfortably. Are you prepared to take the lead in demonstrating these technologies to your colleagues?
Faculty overwhelmingly agree that generative artificial intelligence will have an impact on teaching and learning in higher education, but whether that impact is positive or negative is still up for debate.
Nine in 10 faculty members say that generative AI will diminish students’ critical thinking skills, and 95 percent say it will increase students’ overreliance on AI tools over time, according to a report out today from the American Association of Colleges and Universities and Elon University.
In November, the groups surveyed 1,057 faculty members at U.S. institutions about their thoughts on generative AI’s impact. Eighty-three percent of faculty said the technology will decrease students’ attention spans, and 79 percent said they think the typical teaching model in their department will be affected by AI.
Most professors—86 percent—said that the impact of AI on teachers will be “significant and transformative or at least noticeable,” the report states. Only 4 percent said that AI’s effect on teaching will “not amount to much.” About half of faculty respondents said AI will have a negative effect on students’ careers over the next five years, while 20 percent said it will have a positive effect and another 20 percent said it will be equally negative and positive.
Faculty are largely unprepared for AI in the classroom, the report shows. About 68 percent of faculty said their institutions have not prepared them to use AI in teaching, student mentorship and scholarship. In professors’ eyes, most recent graduates are underprepared, too: 63 percent said that last spring’s graduates were not very or not at all prepared to use generative AI at work, and 71 percent said the graduates were not prepared to understand ethical issues related to AI use.
About a quarter of faculty don’t use any AI tools at all, and about a third don’t use them in teaching, according to the report. This faculty resistance is a challenge, survey respondents say. About 82 percent of faculty said that resistance to AI or unfamiliarity with AI are hurdles in adopting the tools in their departments.
“These findings explain why nearly half of surveyed faculty view the future impact of GenAI in their fields as more negative than positive, while only one in five see it as more positive than negative,” Lynn Pasquerella, president of the AAC&U, wrote in her introduction to the report. “Yet, this is not a story of simple resistance to change. It is, instead, a portrait of a profession grappling seriously with how to uphold educational values in a rapidly shifting technological landscape.”
While most professors (78 percent) said AI-driven cheating is on the rise, they are split about what exactly constitutes cheating. Just over half of faculty said it’s cheating for a student to follow a detailed AI-generated outline when writing a paper, while just under half said it is either a legitimate use of AI or they’re not sure. On a separate question, 45 percent of faculty said that using generative AI to edit a paper is a legitimate use of the tool, while the remaining 55 percent said it was illegitimate or they were unsure.
Despite their agreement on generative AI’s overall impact, faculty are split on whether AI literacy is important for students. About half of professors said AI literacy is “extremely or very important” to their students’ success, while 11 percent said it’s slightly important and 13 percent said it’s irrelevant.
Professors held a few hopeful predictions about generative AI. Sixty-one percent of respondents said it will improve and customize learning in the future. Four in 10 professors said it will increase the ability of students to write clearly, and 41 percent said it will improve students’ research skills.