Tag: Data

  • Federal judge gives DOGE access to education data

    Federal judge gives DOGE access to education data

    The University of California Student Association’s request to block Department of Government Efficiency staffers from accessing student data at the Department of Education was denied Monday by a federal district judge. 

    The lawsuit, filed earlier this month, accused the department of illegally sharing confidential student data, arguing it violated the 1974 Privacy Act and confidentiality provisions of the Internal Revenue Code by giving DOGE access to records that contain tax information. 

    But Judge Randolph D. Moss of the District Court for the District of Columbia said there wasn’t an immediate threat, citing testimony from Adam Ramada, a DOGE staffer, who said that he and his team were only assisting the department with auditing for waste, fraud and abuse and that DOGE staffers understood the need to comply with data privacy laws. 

    “None of those initiatives should involve disclosure of any sensitive, personal information about any UCSA members,” Moss, an Obama appointee, wrote in his ruling. “The future injuries that UCSA’s members fear are, therefore, far from likely, let alone certain and great.”

    Other higher education groups have raised concerns about DOGE’s access to education data, as the department’s databases house students’ personal information, including dates of birth, contact information and Social Security numbers. Some student advocates worry the data could be illegally shared with other agencies and used for immigration enforcement. Moss, however, called those harms “entirely conjectural,” saying Ramada had attested that the data was not being used in such ways.

    Although the temporary restraining order was denied, the overall lawsuit will continue to work its way through the courts, and other legal challenges are emerging, The Washington Post reported.

    A coalition of labor unions, including the American Federation of Teachers, is also suing to block DOGE’s access to the sensitive data. This latest lawsuit argues that agencies—including Education, Labor and Personnel Management—are improperly disclosing the records of millions of Americans in violation of the Privacy Act.


  • Embracing a growth mindset when reviewing student data

    Embracing a growth mindset when reviewing student data


In the words of Carol Dweck, “Becoming is better than being.” As novice sixth grade math and English teachers, we’ve learned to approach our mid-year benchmark assessments not as final judgments but as tools for reflection and growth. Many of our students entered the school year below grade level, and while achieving grade-level mastery is challenging, a growth mindset allows us to see their potential, celebrate progress, and plan for further successes. This perspective transforms data analysis into an empowering process: data becomes a tool for improvement rather than a measure of failure.

    A growth mindset is the belief that abilities grow through effort and persistence. This mindset shapes how we view data. Instead of focusing on what students can’t do, we emphasize what they can achieve. For us, this means turning gaps into opportunities for growth and modeling optimism and resilience for our students. When reviewing data, we don’t dwell on weaknesses. We set small and achievable goals to help students move forward to build confidence and momentum.

Celebrating progress is vital. Even small wins (e.g., moving from a kindergarten level to a 1st- or 2nd-grade level, or significant growth in one domain) are causes for recognition. Highlighting these successes motivates students and shows them that effort leads to results.

Involving students in the process is also advantageous. At student-led conferences, our students presented their data via slideshows that they created after they reviewed their growth, identified their strengths, and generated next steps with their teachers. This gave them tremendous ownership over their learning. In addition, interdisciplinary collaboration at our weekly professional learning communities (PLCs) has strengthened this process. To support our students who struggle in English and math, we work together to address overlapping challenges (e.g., teaching math vocabulary and chunking word problems) to ensure students build skills in connected and meaningful ways.

We also address the social-emotional side of learning. Many students come to us with fixed mindsets, believing they’re just “bad at math” or “not good readers.” We counter this by celebrating effort, normalizing struggle, and creating a safe and supportive environment where mistakes are part of learning. Progress is often slow, but it’s real. Students may not reach grade-level standards in one year, but gains in confidence, skills, and mindset set the stage for future success, as evidenced by our students’ mid-year benchmark results. We emphasize having a “growth mindset” because, in the words of Denzel Washington, “The road to success is always under construction.” By embracing growth and seeing potential in every student, we make room for improvement, resilience, hope, and a brighter future.



  • Data, Decisions, and Disruptions: Inside the World of University Rankings

    Data, Decisions, and Disruptions: Inside the World of University Rankings

University rankings are pretty much everywhere. The earliest university rankings in the U.S. date back to the early 1900s, and the modern ones to the 1983 debut of the U.S. News & World Report rankings. The kind of rankings we tend to talk about now, international or global rankings, really only date back to 2003 with the creation of the Shanghai Academic Ranking of World Universities.

Over the decade that followed that first publication, a triumvirate emerged at the top of the rankings pyramid: the Shanghai Rankings, run by a group of academics at Shanghai Jiao Tong University; the Quacquarelli Symonds, or QS, Rankings; and the Times Higher Education World University Rankings. Between them, these three rankings producers, particularly QS and Times Higher, created a bewildering array of new rankings, dividing the world up by geography and field of study, mainly based on metrics relating to research.

Joining me today is the former Chief Data Officer of the Times Higher Education Rankings, Duncan Ross. He took over those rankings at a time when it seemed like the higher education world might be running out of things to rank. Under his stewardship, though, the THE Impact Rankings, which are based around the 17 UN Sustainable Development Goals, were developed. And that’s created a genuinely new hierarchy in world higher education, at least among those institutions that choose to submit to the rankings.

My discussion with Duncan today covers a wide range of topics related to his time at THE. But the most enjoyable bit by far, for me anyway, was the one about the genesis of the Impact Rankings. Listen especially for the part where Duncan talks about how the Impact Rankings came about because THE realized that its industry rankings weren’t very reliable. Fun fact: around that time I got into a very public debate with Phil Baty, the editor of the Times Higher, on exactly that subject. Which means maybe, just maybe, I’m kind of a godparent to the Impact Rankings. But that’s just me. You may well find other points of interest in this very compelling interview. Let’s hand things over to Duncan.


    The World of Higher Education Podcast
    Episode 3.20 | Data, Decisions, and Disruptions: Inside the World of University Rankings 

    Transcript

    Alex Usher: So, Duncan, let’s start at the beginning. I’m curious—what got you into university rankings in the first place? How did you end up at Times Higher Education in 2015?

    Duncan Ross: I think it was almost by chance. I had been working in the tech sector for a large data warehousing company, which meant I was working across many industries—almost every industry except higher education. I was looking for a new challenge, something completely different. Then a friend approached me and mentioned a role that might interest me. So I started talking to Times Higher Education, and it turned out it really was a great fit.

    Alex Usher: So when you arrived at Times Higher in 2015, the company already had a pretty full set of rankings products, right? They had the global rankings, the regional rankings, which I think started around 2010, and then the subject or field of study rankings came a couple of years later. When you looked at all of that, what did you think? What did you feel needed to be improved?

    Duncan Ross: Well, the first thing I had to do was actually bring all of that production in-house. At the time, even though Times Higher had rankings, they were produced by Clarivate—well, Thomson Reuters, as it was then. They were doing a perfectly good job, but if you’re not in control of the data yourself, there’s a limit to what you can do with it.

    Another key issue was that, while it looked like Times Higher had many rankings, in reality, they had just one: the World University Rankings. The other rankings were simply different cuts of that same data. And even within the World University Rankings, only 400 universities were included, with a strong bias toward Europe and North America. About 26 or 27 percent of those institutions were from the U.S., which didn’t truly reflect the global landscape of higher education.

    So the challenge was: how could we broaden our scope and truly capture the world of higher education beyond the usual suspects? And beyond that, were there other aspects of universities that we could measure, rather than just relying on research-centered metrics? There are good reasons why international rankings tend to focus on research—it’s the most consistent data available—but as you know, it’s certainly not the only way to define excellence in higher education.

    Alex Usher: Oh, yeah. So how did you address the issue of geographic diversity? Was it as simple as saying, “We’re not going to limit it to 400 universities—we’re going to expand it”? I think the ranking now includes over a thousand institutions, right? I’ve forgotten the exact number.

    Duncan Ross: It’s actually around 2,100 or so, and in practice, the number is even larger because, about two years ago, we introduced the concept of reporter institutions. These are institutions that haven’t yet met the criteria to be fully ranked but are already providing data.

    The World University Rankings have an artificial limit because there’s a threshold for participation based on the number of research articles published. That threshold is set at 1,000 papers over a five-year period. If we look at how many universities could potentially meet that criterion, it’s probably around 3,000, and that number keeps growing. But even that is just a fraction of the higher education institutions worldwide. There are likely 30,000—maybe even 40,000—higher education institutions globally, and that’s before we even consider community colleges.

    So, expanding the rankings was about removing artificial boundaries. We needed to reach out to institutions in parts of the world that weren’t well represented and think about higher education in a way that wasn’t so Anglo-centric.

    One of the biggest challenges I’ve encountered—and it’s something people inevitably fall into—is that we tend to view higher education through the lens of our own experiences. But higher education doesn’t function the same way everywhere. It’s easy to assume that all universities should look like those in Canada, the U.S., or the UK—but that’s simply not the case.

    To improve the rankings, we had to be open-minded, engage with institutions globally, and carefully navigate the challenges of collecting data on such a large scale. As a result, Times Higher Education now has data on around 5,000 to 6,000 universities—a huge step up from the original 400. Still, it’s just a fraction of the institutions that exist worldwide.

    Alex Usher: Well, that’s exactly the mission of this podcast—to get people to think beyond an Anglo-centric view of the world. So I take your point that, in your first couple of years at Times Higher Education, most of what you were doing was working with a single set of data and slicing it in different ways.

    But even with that, collecting data for rankings isn’t simple, right? It’s tricky, you have to make a lot of decisions, especially about inclusion—what to include and how to weight different factors. And I think you’ve had to deal with a couple of major issues over the years—one in your first few years and another more recently.

    One was about fractional counting of articles, which I remember went on for quite a while. There was that big surge of CERN-related articles, mostly coming out of Switzerland but with thousands of authors from around the world, which affected the weighting. That led to a move toward fractional weighting, which in theory equalized things a bit—but not everyone agreed.

    More recently, you’ve had an issue with voting, right? What I think was called a cartel of voters in the Middle East, related to the reputation rankings. Can you talk a bit about how you handle these kinds of challenges?

    Duncan Ross: Well, I think the starting point is that we’re always trying to evaluate things in a fair and consistent way. But inevitably, we’re dealing with a very noisy and messy world.

    The two cases you mentioned are actually quite different. One is about adjusting to the norms of the higher education sector, particularly in publishing. A lot of academics, especially those working within a single discipline, assume that publishing works the same way across all fields—that you can create a universal set of rules that apply to everyone. But that’s simply not the case.

    For example, the concept of a first author doesn’t exist in every discipline. Likewise, in some fields, the principal investigator (PI) is always listed at the end of the author list, while in others, that’s not the norm.

    One of the biggest challenges we faced was in fields dealing with big science—large-scale research projects involving hundreds or even thousands of contributors. In high-energy physics, for example, a decision was made back in the 1920s: everyone who participates in an experiment above a certain threshold is listed as an author in alphabetical order. They even have a committee to determine who meets that threshold—because, of course, it’s academia, so there has to be a committee.

    But when you have 5,000 authors on a single paper, that distorts the rankings. So we had to develop a mechanism to handle that. Ideally, we’d have a single metric that works in all cases—just like in physics, where we don’t use one model of gravity in some situations and a different one in others. But sometimes, you have to make exceptions. Now, Times Higher Education is moving toward more sophisticated bibliometric measures to address these challenges in a better way.

    The second issue you mentioned—the voting behavior in reputation rankings—is completely different because it involves inappropriate behavior. And this kind of issue isn’t just institutional; sometimes, it’s at the individual academic level.

    We’re seeing this in publishing as well, where some academics are somehow producing over 200 articles a year. Impressive productivity, sure—but is it actually viable? In cases like this, the approach has to be different. It’s about identifying and penalizing misbehavior.

    At the same time, we don’t want to be judge and jury. It’s difficult because, often, we can see statistical patterns that strongly suggest something is happening, but we don’t always have a smoking gun. So our goal is always to be as fair and equitable as possible while putting safeguards in place to maintain the integrity of the rankings.

    Alex Usher: Duncan, you hinted at this earlier, but I want to turn now to the Impact Rankings. This was the big initiative you introduced at Times Higher Education. Tell us about the genesis of those rankings—where did the idea come from? Why focus on impact? And why the SDGs?

    Duncan Ross: It actually didn’t start out as a sustainability-focused project. The idea came from my colleague, Phil Baty, who had always been concerned that the World University Rankings didn’t include enough measurement around technology transfer.

    So, we set out to collect data from universities on that—looking at things like income from consultancy and university spin-offs. But when the data came back, it was a complete mess—totally inconsistent and fundamentally unusable. So, I had to go back to the drawing board.

    That’s when I came across SDG 9—Industry, Innovation, and Infrastructure. I looked at it and thought, This is interesting. It was compelling because it provided an external framework.

    One of the challenges with ranking models is that people always question them—Is this really a good model for excellence? But with an external framework like the SDGs, if someone challenges it, I can just point to the United Nations and say, Take it up with them.

At that point, I had done some data science work and was familiar with the German tank problem, so I jokingly assumed there were probably 13 to 18 SDGs out there. (That’s a data science joke—those don’t land well 99% of the time.) But as it turned out, there were more SDGs, and exploring them was a real light bulb moment.

    The SDGs provided a powerful framework for understanding the most positive role universities can play in the world today. We all know—well, at least those of us outside the U.S. know—that we’re facing a climate catastrophe. Higher education has a crucial role to play in addressing it.

    So, the question became: How can we support that? How can we measure it? How can we encourage better behavior in this incredibly important sector?

    Alex Usher: The Impact Rankings are very different in that roughly half of the indicators—about 240 to 250 across all 17 SDGs—aren’t naturally quantifiable. Instead, they’re based on stories.

    For example, an institution might submit, This is how we combat organized crime or This is how we ensure our food sourcing is organic. These responses are scored based on institutional submissions.

    Now, I don’t know exactly how Times Higher Education evaluates them, but there has to be a system in place. How do you ensure that these institutional answers—maybe 120 to 130 per institution at most—are scored fairly and consistently when you’re dealing with hundreds of institutions?

    Duncan Ross: Well, I can tell you that this year, over 2,500 institutions submitted approved data—so it’s grown significantly. One thing to clarify, though, is that these aren’t written-up reports like the UK’s Teaching Excellence Framework, where universities can submit an essay justifying why they didn’t score as well as expected—what I like to call the dog ate my student statistics paper excuse. Instead, we ask for evidence of the work institutions have done. That evidence can take different forms—sometimes policies, sometimes procedures, sometimes concrete examples of their initiatives. The scoring process itself is relatively straightforward. First, we give some credit if an institution says they’re doing something. Then, we assess the evidence they provide to determine whether it actually supports their claim. But the third and most important part is that institutions receive extra credit if the evidence is publicly available. If you publish your policies or reports, you open yourself up to scrutiny, which adds accountability.

    A great example is SDG 5—Gender Equality—specifically around gender pay equity. If an institution claims to have a policy on gender pay equity, we check: Do you publish it? If so, and you’re not actually living up to it, I’d hope—and expect—that women within the institution will challenge you on it. That’s part of the balancing mechanism in this process.

    Now, how do we evaluate all this? Until this year, we relied on a team of assessors. We brought in people, trained them, supported them with our regular staff, and implemented a layer of checks—such as cross-referencing responses against previous years. Ultimately, human assessors were making the decisions.

    This year, as you might expect, we’re introducing AI to assist with the process. AI helps us filter out straightforward cases, leaving the more complex ones for human assessors. It also ensures that we don’t run into assessor fatigue. When someone has reviewed 15 different answers to the same question from various universities, the process can get a bit tedious—AI helps mitigate that.

    Alex Usher: Yeah, it’s like that experiment with Israeli judges, right? You don’t want to be the last case before lunch—you get a much harsher sentence if the judge is making decisions on an empty stomach. I imagine you must have similar issues to deal with in rankings.

    I’ve been really impressed by how enthusiastically institutions have embraced the Impact Rankings. Canadian universities, in particular, have really taken to them. I think we had four of the top ten last year and three of the top ten this year, which is rare for us. But the uptake hasn’t been as strong—at least not yet—in China or the United States, which are arguably the two biggest national players in research-based university rankings. Maybe that’s changing this year, but why do you think the reception has been so different in different parts of the world? And what does that say about how different regions view the purpose of universities?

    Duncan Ross: I think there’s definitely a case that different countries and regions have different approaches to the SDGs. In China, as you might expect, interest in the rankings depends on how well they align with current Communist Party priorities. You could argue that something similar happens in the U.S. The incoming administration has made it fairly clear that SDG 10 (Reduced Inequalities) and SDG 5 (Gender Equality) are not going to be top priorities—probably not SDG 1 (No Poverty), either. So in some cases, a country’s level of engagement reflects its political landscape.

    But sometimes, it also reflects the economic structure of the higher education system itself. In the U.S., where universities rely heavily on high tuition fees, rankings are all about attracting students. And the dominant ranking in that market is U.S. News & World Report—the 600-pound gorilla. If I were in their position, I’d focus on that, too, because it’s the ranking that brings in applications.

    In other parts of the world, though, rankings serve a different purpose. This ties back to our earlier discussion about different priorities in different regions. Take Indonesia, for example. There are over 4,000 universities in the country. If you’re an institution like ITS (Institut Teknologi Sepuluh Nopember), how do you stand out? How do you show that you’re different from other universities?

    For them, the Impact Rankings provided an opportunity to showcase the important work they’re doing—work that might not have been recognized in traditional rankings. And that’s something I’m particularly proud of with the Impact Rankings. Unlike the World University Rankings or the Teaching Rankings, it’s not just the usual suspects at the top.

    One of my favorite examples is Western Sydney University. It’s a fantastic institution. If you’re ever in Sydney, take the train out there. Stay on the train—it’s a long way from the city center—but go visit them. Look at the incredible work they’re doing, not just in sustainability but also in their engagement with Aboriginal and Torres Strait Islander communities. They’re making a real impact, and I’m so pleased that we’ve been able to raise the profile of institutions like Western Sydney—universities that might not otherwise get the recognition they truly deserve.

    Alex Usher: But you’re still left with the problem that many institutions that do really well in research rankings have, in effect, boycotted the Impact Rankings—simply because they’re not guaranteed to come first.

    A lot of them seem to take the attitude of, Why would I participate in a ranking if I don’t know I’ll be at the top?

    I know you initially faced that issue with LERU (the League of European Research Universities), and I guess the U.S. is still a challenge, with lower participation numbers.

    Do you think Times Higher Education will eventually crack that? It’s a tough nut to crack. I mean, even the OECD ran into the same resistance—it was the same people saying, Rankings are terrible, and we don’t want better ones.

    What’s your take on that?

    Duncan Ross: Well, I’ve got a brief anecdote about this whole rankings boycott approach. There’s one university—I’m not going to name them—that made a very public statement about withdrawing from the Times Higher Education World University Rankings. And just to be clear, that’s something you can do, because participation is voluntary—not all rankings are. So, they made this big announcement about pulling out. Then, about a month later, we got an email from their graduate studies department asking, Can we get a copy of your rankings? We use them to evaluate applicants for interviews. So, there’s definitely some odd thinking at play here. But when it comes to the Impact Rankings, I’m pretty relaxed about it. Sure, it would be nice to have Oxford or Harvard participate—but MIT does, and they’re a reasonably good school, I hear. Spiderman applied there, so it’s got to be decent. The way I see it, the so-called top universities already have plenty of rankings they can focus on. If we say there are 300 top universities in the world, what about the other 36,000 institutions?

    Alex Usher: I just want to end on a slightly different note. While doing some background research for this interview, I came across your involvement in DataKind—a data charity that, if I understand correctly, you founded. I’ve never heard of a data charity before, and I find the idea fascinating—intriguing enough that I’m even thinking about starting one here. Tell us about DataKind—what does it do?

    Duncan Ross: Thank you! So, DataKind was actually founded in the U.S. by Jake Porway. I first came across it at one of the early big data conferences—O’Reilly’s Strata Conference in New York. Jake was talking about how data could be used for good, and at the time, I had been involved in leadership roles at several UK charities. It was a light bulb moment. I went up to Jake and said, Let me start a UK equivalent! At first, he was noncommittal—he said, Yeah, sure… someday. But I just kept nagging him until eventually, he gave in and said yes. Together with an amazing group of people in the UK—Fran Bennett, Caitlin Thaney, and Stuart Townsend—we set up DataKind UK.

    The concept is simple: we often talk about how businesses—whether in telecom, retail, or finance—use data to operate more effectively. The same is true in the nonprofit sector. The difference is that banks can afford to hire data scientists—charities often can’t. So, DataKind was created to connect data scientists with nonprofit organizations, allowing them to volunteer their skills.

    Of course, for this to work, a charity needs a few things:

    1. Leadership willing to embrace data-driven decision-making.
    2. A well-defined problem that can be analyzed.
    3. Access to data—because without data, we can’t do much.

    Over the years, DataKind—both in the U.S. and worldwide—has done incredible work. We’ve helped nonprofits understand what their data is telling them, improve their use of resources, and ultimately, do more for the communities they serve. I stepped down from DataKind UK in 2020 because I believe that the true test of something successful is whether it can continue to thrive without you. And I’m happy to say it’s still going strong. I kind of hope the Impact Rankings continue to thrive at Times Higher Education now that I’ve moved on as well.

    Alex Usher: Yeah. Well, thank you for joining us today, Duncan.

    Duncan Ross: It’s been a pleasure.

Alex Usher: And it just remains for me to thank our excellent producers, Sam Pufek and Tiffany MacLennan. And you, our viewers, listeners, and readers for joining us today. If you have any questions or comments about today’s episode, please don’t hesitate to get in touch with us at [email protected]. Worried about missing an episode of the World of Higher Education? There’s a solution for that. Go to our YouTube page and subscribe. Next week, our guest will be Jim Dickinson. He’s an associate editor at Wonkhe in the UK, and he’s also maybe the world expert on comparative student politics. And he joins us to talk about the events in Serbia where the student movement is challenging the populist government of the day. Bye for now.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service.


  • Fun with Participation Rate Data

    Fun with Participation Rate Data

    Just a quick one today, mostly charts.

    Back in the fall, StatsCan released a mess of data from the Labour Force Survey looking at education participation rates—that is, the percentage of any given age cohort that is attending education—over the past 25 years. So, let’s go see what it says.

Figure 1 shows total education participation rates, across all levels of education, from age 15 to 29, for selected years over the past quarter century. At the two ends of the graph, the numbers look pretty similar. At age 15, we’ve always had 95%+ of our population enrolled in school (almost exclusively secondary education), and from age 26 and above, we’ve always been in the low teens or high single digits. The falling-off in participation is fairly steady: for every age-year above 17, about 10% of the population exits education up until the age of 26. The big increase in education enrolments that we’ve seen over the past couple of decades has really occurred in the 18-24 range, where participation rates (almost exclusively in universities, as we shall see) have increased enormously.

    Figure 1: Participation rates in Education (all institutions) by Age, Canada, select years 1999-00 to 2023-24
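For anyone who wants to reproduce this kind of chart, here is a minimal sketch of the underlying calculation. It uses made-up microdata and assumed column names (age, enrolled, weight) rather than the actual Labour Force Survey layout; the point is simply that a participation rate is the weighted share of each age cohort that reports being enrolled.

```python
# Minimal sketch with hypothetical microdata and column names,
# not the actual Labour Force Survey file layout.
import pandas as pd

lfs = pd.DataFrame({
    "age":      [15, 15, 18, 18, 18, 22, 22, 27, 27],
    "enrolled": [1,  1,  1,  0,  1,  1,  0,  0,  1],          # attending any level of education
    "weight":   [310, 295, 280, 300, 290, 305, 315, 320, 310],  # survey person weights
})

# Participation rate = weighted share of each age cohort that is enrolled.
enrolled_weight = (lfs["enrolled"] * lfs["weight"]).groupby(lfs["age"]).sum()
total_weight = lfs.groupby("age")["weight"].sum()
participation_rate = (enrolled_weight / total_weight).round(3)

print(participation_rate)
```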

Figure 2 shows current participation rates by age and type of postsecondary institution. People sometimes have the impression that colleges cater to an “older” clientele, but in fact, at any given age under 30, Canadians are much more likely to be enrolled in universities than in colleges. Colleges have a very high base in the teens because of the way the CEGEP system works in Quebec (I’ll come back to regional diversity in a minute), and it is certainly true that there is a very wide gap in favour of universities among Canadians in their mid-20s. But while the participation rate gap narrows substantially at about age 25, it is never the case that the college participation rate surpasses the university one.

    Figure 2: Participation Rates by Age and Institution Type, Canada, 2023-24

    Figure 3 shows college participation rates by age over time. What you should take from this is that there has been a slight decline in college participation rates over time in the 19-23 age range, but beyond that not much has changed.

    Figure 3: College Participation Rates by Age, Selected Years, 1999-2000 to 2023-24

Figure 4 uses the same lens as Figure 3, only for universities. And it’s about as different as it can be. In 1999, fewer than one in ten Canadians aged 18 was in university; now it is three in ten. In 1999, only one in four 21-year-olds was in university; now it is four in ten. These aren’t purely the effects of increased demand: the elimination of grade 13 in Ontario had a lot to do with the changes for 18-year-olds, and Alberta and British Columbia converting a number of their institutions from colleges to universities in the late 00s probably juices these numbers a bit, too. But on the whole, what we’ve seen is a significant increase in the rate at which young people are choosing to attend universities between the ages of 18 and 24. However, beyond those ages the growth is less pronounced. There was certainly growth in older student participation rates between 1999-00 and 2011-12, but since then none at all.

    Figure 4: University Participation Rates by Age, Selected Years, 1999-2000 to 2023-24

So much for the national numbers: what’s going on at the provincial level? Well, because this is the Labour Force Survey, which unlike administrative data has sample size issues, we can’t quite get the same level of granularity of information. We can’t look at individual ages, but we can see age ranges, in this case ages 20-24. In figures 5 and 6 (I broke them up so they are a bit easier to read), I show each province’s university and college participation rates in 2000 vs. 2023.

    Figure 5: University Participation Rates for 20-24 Year-olds, Four Largest Provinces, 2000-01 vs. 2023-24

    Figure 6: University Participation Rates for 20-24 Year-olds, Six Remaining Provinces, 2000-01 vs. 2023-24

    Some key facts emerge from these two graphs:

    • The highest participation rates in the country are in Ontario, Quebec, and British Columbia.
• In all provinces, the participation rate in universities is higher than it is for colleges, ranging from 2.5x in Quebec to over 4x in Saskatchewan.
    • Over the past quarter century, overall postsecondary participation rates and university participation rates have gone up in all provinces; Alberta and British Columbia alone have seen a decline in college participation rates, due to the aforementioned decision to convert certain colleges to university status in the 00s.
• Growth in participation rates since 2000 has been universal but has been more significant in the country’s four largest provinces, where the average gain has been nine percentage points, than in the country’s six smaller provinces, where the gain has been just under five percentage points.
• Over twenty-five years, British Columbia has gone from ninth to second in the country in terms of university participation rates, while Nova Scotia has gone from second to ninth.
    • New Brunswick has consistently been in last place for overall participation rates for the entire century.

    Just think: three minutes ago, you probably knew very little about participation rates in Canada by age and geography, now you know almost everything there is to know about participation rates in Canada by age and geography. Is this a great way to start your day or what?


  • DOGE temporarily blocked from accessing Education Department student aid data

    DOGE temporarily blocked from accessing Education Department student aid data


    UPDATE: Feb. 12, 2025: The U.S. Department of Education on Tuesday agreed to temporarily block staffers of the Department of Government Efficiency, or DOGE, from accessing student aid information and other data systems until at least Feb. 17. 

    On that date, a federal judge overseeing the case is expected to rule on a student group’s request for a temporary restraining order to block the agency from sharing sensitive data with DOGE. 

    Dive Brief: 

    •  A group representing University of California students filed a lawsuit Friday to block the Elon Musk-led Department of Government Efficiency from accessing federal financial aid data.  
    • The University of California Student Association cited reports that DOGE members gained access to federal student loan data, which includes information such as Social Security numbers, birth dates, account information and driver’s license numbers. 
    • The complaint accuses the U.S. Department of Education of violating federal privacy laws and regulations by granting DOGE staffers access to the data. “The scale of intrusion into individuals’ privacy is enormous and unprecedented,” the lawsuit says. 

    Dive Insight: 

    President Donald Trump created DOGE through executive order on the first day of his second term, tasking the team, led by Tesla co-founder and Trump adviser Musk, with rooting out what the new administration deems as government waste. 

    DOGE has since accessed the data of several government agencies, sparking concerns that its staffers are violating privacy laws and overstepping the executive branch’s power. With the new lawsuit, the University of California Student Association joins the growing chorus of groups that say DOGE is flouting federal statutes. 

    One of those groups — 19 state attorneys general — scored a victory over the weekend. On Saturday, a federal judge temporarily blocked DOGE from accessing the Treasury Department’s payments and data system, which disburses Social Security benefits, tax returns and federal employee salaries. 

    The University of California Student Association has likewise asked the judge to temporarily block the Education Department from sharing sensitive data with DOGE staffers and to retrieve any information that has already been transferred to them. 

    The group argues that the Education Department is violating the Privacy Act of 1974, which says that government agencies may not disclose an individual’s data “to any person, or to another agency,” without their consent, except in limited circumstances. The Internal Revenue Code has similar protections for personal information. 

    “None of the targeted exceptions in these laws allows individuals associated with DOGE, or anyone else, to obtain or access students’ personal information, except for specific purposes — purposes not implicated here,” the lawsuit says. 

The Washington Post reported on Feb. 3 that some DOGE team members had in fact gained access to “multiple sensitive internal systems, including federal financial aid data,” as part of larger plans to carry out Trump’s goal to eventually eliminate the Education Department. 

    “ED did not publicly announce this new policy — what is known is based on media reporting — or attempt to justify it,” Friday’s lawsuit says. “Rather, ED secretly decided to allow individuals with no role in the federal student aid program to root around millions of students’ sensitive records.”

    In response to the Post’s Feb. 3 reporting, Musk on the same day posted on X that Trump “will succeed” in dismantling the agency. 

Later that week, the Post reported that DOGE staffers were feeding sensitive Education Department data into artificial intelligence software to analyze the agency’s spending. 

The moves have also attracted lawmakers’ attention. Virginia Rep. Bobby Scott, the top-ranking Democrat on the House’s education committee, asked the Government Accountability Office on Friday to probe the security of information technology systems at the Education Department and several other agencies. 

    An Education Department spokesperson said Monday that the agency does not comment on pending litigation. 


  • Building tiny chips that can handle enormous data

    Building tiny chips that can handle enormous data

In the not-so-distant future, really big disasters, such as wildfires in California, floods in Spain or an earthquake in Japan, will be monitored and perhaps anticipated by a technology so small it is difficult to even imagine.

    This new technology, called quantum computing, is enabled by nanotechnology — a way of designing technology by manipulating atoms and molecules. Paradoxically, this ultra small technology enables the processing of massively large data sets needed for complex artificial intelligence algorithms.

    There is a growing consensus that AI will quickly change almost everything in the world.

The AIU cluster, a collection of computer resources used to develop, test and deploy AI models at the IBM Research Center in upstate New York. (Credit: Enrique Shore)

The AI many people already use — such as ChatGPT, Perplexity and now DeepSeek — is based on traditional computers. Processing the data needed to answer the questions put to these AI programs and to handle the tasks assigned to them takes an enormous amount of energy. For example, the energy OpenAI consumes to handle ChatGPT’s prompts in the United States costs some $139.7 million per year.

    Several large private companies, including Google, Microsoft and IBM, are leading the way in this development. The International Business Machines Corp., known as IBM, currently manages the largest industrial research organization, with specialized labs located all over the world.

    Glimpsing the world’s most powerful computers

    The global headquarters of IBM Research are located in the company’s Thomas J. Watson Research Center. Located about one hour north of New York City, it is an impressive building designed in 1961 by Eero Saarinen, an iconic Finnish-American architect who also designed the Dulles International Airport in Washington, D.C., the Swedish Theater in Helsinki and the U.S. Embassy in Oslo.

    A sign at the front door at IBM's research headquarters: “Inventing what’s next”.

    At the entrance of the IBM research headquarters a simple statement sums up what research scientists are trying to achieve at IBM: “Inventing what’s next.”

    At the heart of the IBM Research Center is a “Think Lab” where researchers test AI hardware advancements using the latest and most powerful quantum computers. News Decoder recently toured these facilities.

    There, Shawn Holiday, a product manager at the lab’s Artificial Intelligence Unit (AIU) said the challenge is scaling the size of semiconductors to not only increase performance but also improve power efficiency.

IBM was the first to develop a new transistor geometry called gate-all-around. Basically, each transistor has multiple channels that are parallel to the surface, and each of those channels has a thickness of about two nanometers. To grasp how small this is, consider that one nanometer is a billionth of a meter.

This new technology is not just a faster or better version of traditional computers but a totally new way of processing information. It is not based on the traditional bits that are the basis of modern binary computers (meaning bits can be either in the state zero or one) but on qubits, or quantum bits, a different and more complex concept.

    The IBM Quantum System Two

    The IBM Quantum System Two, a powerful quantum computer, operating in the IBM Research Center in Yorktown Heights in upstate New York. (Credit: Enrique Shore)

    A quantum processor with more gates can handle more complex quantum algorithms by allowing for a greater variety of operations to be applied to the qubits within a single computation.

    A new way of processing data

The change is much more than a new stage in the evolution of computers. Nanotechnology has enabled, for the first time, an entirely new branch of computing: not just a faster or better version of traditional computers, but a fundamentally different way of processing information.

     

A replica of IBM’s first integrated quantum computer

A replica of the IBM Quantum System One, the world’s first integrated quantum computing system for commercial use, on display at the IBM Research Center in Yorktown Heights, New York. (Credit: Enrique Shore)

    The quantum bit is a basic unit of quantum information that can have many more possibilities, including being in all states simultaneously — a state called superposition — and combining with others, called entanglement, where the state of one qubit is intimately connected with another. This is, of course, a simplified description of a complex process that could hold massively more processing power than traditional computers.
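To make those two ideas a little more concrete, here is a minimal sketch, using plain NumPy state vectors rather than any real quantum hardware or IBM software, of what superposition and entanglement look like mathematically. The variable names and the toy example are illustrative only.

```python
# Minimal sketch of superposition and entanglement with NumPy state vectors.
import numpy as np

# A classical bit is either 0 or 1. A qubit is a length-2 complex vector of
# amplitudes; squared magnitudes give the measurement probabilities.
zero = np.array([1, 0], dtype=complex)            # |0>
one = np.array([0, 1], dtype=complex)             # |1>

# Equal superposition: the qubit is in both states at once until measured.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)                          # [0.5, 0.5]

# Entanglement: a two-qubit Bell state. The joint state cannot be factored
# into two independent qubits, so measuring one fixes the outcome of the other.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)                          # [0.5, 0, 0, 0.5]
```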

The current architecture of quantum computers requires costly, large and complex devices that are refrigerated at extremely low temperatures, close to absolute zero (-459°F, or -273°C), in order to function correctly. That extremely low temperature is required so that certain materials become superconducting, conducting electricity with practically zero resistance and no noise.

    Even though there are some prototypes of desktop quantum computers with limited capabilities that could eventually operate at room temperature, they won’t likely replace traditional computers in the foreseeable future, but rather they will operate jointly with them.

IBM Research is a growing global network of interconnected laboratories.

    While IBM is focused on having what they call a hybrid, open and flexible cloud, meaning open-source platforms that can interact with many different systems and vendors, it is also pushing its own technological developments in semiconductor research, an area where its goal is to push the absolute limits of transistor scaling.

    Shrinking down to the quantum realm

    At the lowest level of a chip, you have transistors. You can think of them as switches. Almost like a light switch, they can be off or they can be on. But instead of a mechanical switch, you use a voltage to turn them on and off — when they’re off, they’re at zero and when they’re on, they’re at one.

    A 133-qubit tunable-coupler quantum processor

IBM Heron, a 133-qubit tunable-coupler quantum processor (Credit: Enrique Shore)

    This is the basis of all digital computation. What’s driven this industry for the last 60 years is a constant shrinking of the size of transistors to fit more of them on a chip, thereby increasing the processing power of the chip.

    IBM produces wafers in partnership with foundry partners like Samsung and a Japanese startup called Rapidus. Consider that the two-nanometer semiconductor chips which Rapidus is aiming to produce are expected to have up to 45% better performance and use 75% less energy compared to seven-nanometer chips on the market in 2022.

     

    George Tulevski stands next to a world map

    Dr. George Tulevski, IBM Research scientist and manager of the IBM Think Lab, stands next to a world map showing their different labs at the IBM Research Center in Yorktown Heights in New York. (Credit: Enrique Shore)

IBM predicts that there will be about a trillion transistors on a single die by the early 2030s. To understand that, consider that Apple’s M4 chip for its latest iPad Pro has 28 billion transistors. (A die is the square of silicon containing an integrated circuit that has been cut out of a wafer.)

    There may be a physical limit to the shrinking of transistors, but if they can no longer be made smaller, they could be stacked in a way that the density per area goes up.

With each new technology node, there is always a tradeoff between power and performance. Depending on whether you tune for power or for performance, you get either roughly a 50% increase in efficiency or a 50% increase in performance.

    A roadmap for advanced technology

    The bottom line is that doubling the transistor count means being able to do more computations with the same area and the same power.

    Dr. Jay M. Gambetta.

Dr. Jay M. Gambetta, IBM’s Vice President in charge of IBM’s overall Quantum initiative, explains the expected quantum development roadmap. (Credit: Enrique Shore)

    The roadmap of this acceleration is impressive. Dr. Jay Gambetta, IBM’s vice president in charge of IBM’s overall quantum initiative, showed us a table that forecasts the processing capabilities increasing from the current 5,000 gates to an estimated 100 million gates by 2029, reaching possibly one billion gates by 2033.

    A quantum gate is a basic quantum circuit operating on a small number of qubits. Quantum logic gates are the building blocks of quantum circuits, like classical logic gates are for conventional digital circuits.
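As a rough illustration of that analogy, here is a small sketch, again in plain NumPy rather than real hardware or any particular quantum SDK, showing quantum gates as matrices that act on state vectors. Applying a Hadamard gate and then a controlled-NOT to two qubits in the zero state produces the entangled Bell state described earlier.

```python
# Minimal sketch: quantum logic gates as matrices acting on state vectors.
import numpy as np

zero = np.array([1, 0], dtype=complex)                   # |0>

X = np.array([[0, 1],
              [1, 0]], dtype=complex)                     # quantum NOT gate
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)       # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)            # controlled-NOT

# Applying a gate is a matrix-vector multiplication.
print(X @ zero)                                           # |1>

# A tiny circuit: Hadamard on the first qubit, then CNOT, turns two |0>
# qubits into the entangled Bell state (|00> + |11>) / sqrt(2).
state = np.kron(H @ zero, zero)
print(CNOT @ state)
```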

But energy consumption will radically diminish with the new, more efficient quantum computers, so the old assumption that more capacity requires more power is being revised; efficiency will improve greatly in the near future — otherwise this development would not be sustainable.

    A practical example of a current project made possible thanks to quantum computing and AI is Prithvi, a groundbreaking geospatial AI foundation model designed for satellite data by IBM and NASA.

The model supports tracking changes in land use, monitoring disasters and predicting crop yields worldwide. At 600 million parameters, its current version 2.0, introduced in December 2024, is already six times bigger than its predecessor, first released in August 2023.

    It has practical uses like analyzing the recent fires in California, the floods in Spain and the crops in Africa — just a few examples of how Prithvi can help understand complex current issues at a rate that was simply impossible before.

    The impossible isn’t just possible. It is happening now.


     

    Three questions to consider:

    1. How is quantum computing different from traditional computing?
    2. What is the benefit of shrinking the size of a transistor?
    3. If you had access to a supercomputer, what big problem would you want it to solve?



  • DOGE’s access to Education Department data raises concerns

    DOGE’s access to Education Department data raises concerns

    Just last month, Lorena Tule-Romain was encouraging families with mixed citizenship to fill out the Free Application for Federal Student Aid. She and her staff at ImmSchools, a nonprofit dedicated to improving educational access for immigrants in Dallas, walked students and parents through the complicated federal aid process. Along the way, they offered reassurance that information revealing their undocumented status would be securely held by the Department of Education alone.

    Two weeks ago, ImmSchools stopped offering those services. And Tule-Romain said they’re no longer recommending families fill out the FAFSA. 

That’s because the Department of Government Efficiency, a White House office run by Elon Musk, now has access to Education Department data systems, potentially including sensitive student loan and financial aid information for millions of students, according to sources both outside and within the department who spoke with Inside Higher Ed.

    With immigration officers conducting a blitz of deportations over the past few weeks—and the new possibility of ICE raids at public schools and college campuses—Tule-Romain is worried that applying for federal aid could put undocumented families in jeopardy. Instead of answering parents’ questions about the FAFSA contributor form, she’s hosting Know Your Rights workshops to prepare them for ICE raids.

    “Before, we were doing all we could to encourage families to apply for federal aid, to empower students to break cycles and go to college,” she said. “Now we are not in a position to give that advice. It’s heartbreaking.”

Student data is technically protected by the Privacy Act of 1974, which prevents departments from sharing personally identifying information unless strict exceptions are met or a law is passed to allow it. The FUTURE Act, for example, gave the Education Department access to IRS tax data to simplify the FAFSA process. 

    Karen McCarthy, vice president of public policy and federal relations at the National Association of Student Financial Aid Administrators, told Inside Higher Ed that because DOGE has not said why they might be interested in department data or what data they have access to, it’s unclear if they’re acting in accordance with the law.

    In the past, that law has been strictly enforced for federal employees. In 2010, nine people were accused of accessing President Barack Obama’s student loan records while employed for an Education Department contractor in Iowa. The charges levied against them in federal court were punishable by up to one year in prison and a fine of up to $100,000, according to the Associated Press.   

On Thursday, Democratic Representative Bobby Scott of Virginia wrote to the Government Accountability Office requesting a review of the Education Department’s information technology security and DOGE’s interventions in the department in order to determine their legality and the “potential impact on children.” On Friday, a group of students at the University of California sued department officials for allowing potential Privacy Act violations. 

    “The scale of the intrusion into individuals’ privacy is massive, unprecedented, and dangerous,” the plaintiffs wrote. 

In recent days, labor unions and other groups have sued to block DOGE’s access to databases at several federal agencies and have secured some wins. Early Saturday morning, a federal judge prohibited DOGE from accessing Treasury Department data, ordering Musk’s team to “immediately destroy any and all copies of material” from the department’s systems.

    Concerns about DOGE’s use of private student data come as Musk and his staff take a hacksaw to agencies and departments across the federal government, seeking to cut spending and eliminate large portions of the federal workforce. The Trump administration has singled out the Education Department in particular, threatening to gut its administrative capacity or eliminate the department all together. 

    Spokespeople for DOGE did not respond to a list of questions from Inside Higher Ed. Madi Biederman, the Education Department’s deputy assistant secretary for communications, wrote in an email that DOGE staff “have the necessary background checks and clearances” to view department data and are “focused on making the department more cost-efficient, effective and accountable to the taxpayers.”

    “There is nothing inappropriate or nefarious going on,” she added. She did not respond to questions about what data DOGE has access to or how they plan to use it.

    A ‘Gaping Hole’ in Data Security 

The Education Department’s student financial aid systems contain unique private information that families submit through the FAFSA: not only Social Security numbers but also addresses of relatives, property taxes, sources of income and more. The National Student Loan Data System, which tracks loan borrowers’ repayment history and which DOGE may also have access to, includes a wealth of personally identifying information for many more millions of current and former students. 

A current department staffer provided Inside Higher Ed with a screenshot from the department’s email address catalog containing the names of 25 DOGE employees who may have access to student data—including a 19-year-old who, according to a Bloomberg report, was once fired by a cybersecurity firm for allegedly leaking internal data. And The Washington Post reported that DOGE employees fed sensitive Education Department data through artificial intelligence software.

    “It could become a gaping hole in our cybersecurity infrastructure,” a former department official said. “I cannot stress enough how unusual it is to just give people access willy-nilly.”

    Two former department officials told Inside Higher Ed it is unclear how the DOGE officials could have legally gained access to department data. McCarthy compared DOGE’s murky activity in the department to a “massive data breach within the federal government.”

    “Normally, there’d be a paper trail telling us what they’ve requested access to and why,” she said. “We don’t have that, so there’s a lot of uncertainty and fear.”

A current department official told Inside Higher Ed that DOGE staff have been given access to PartnerConnect, which includes information about college programs that receive federal financial aid funding, and that they have read-only access to a financial system. Neither of those databases contains personally identifying information, but the official wasn’t sure DOGE’s access was limited to those sources—and said department staff are worried sensitive student information could be illegally accessed and disclosed. 

    “It just creates a kind of shadow over the work that everyone’s doing,” a prior department official said. 

    Fears of a FAFSA ‘Chilling Effect’

    Families with mixed citizenship status were some of the hardest hit by the error-riddled FAFSA rollout last year, with many reporting glitches that prevented them from applying for aid until late last summer. 

    Tule-Romain said mixed-status families in her community had only just begun to feel comfortable with the federal aid form. In the past few weeks that progress has evaporated, she said, and high school counselors working with ImmSchools report a concerning decline in requests for FAFSA consultations from mixed-status students. 

    “If they weren’t already hesitant, they are extremely hesitant now,” Tule-Romain said. 

    It’s not just mixed-status families who could be affected if data is shared or leaked. McCarthy said that concerns about privacy could have a widespread “chilling effect” on federal aid applications.

    “There have always been parents who are reluctant to share their information and the counterargument we always fall back on is the privacy laws,” she said. “A lot of Pell money could get left on the table, or students could be discouraged from going to college altogether.”

    Kim Cook, CEO of the National College Attainment Network, said that after last year’s bungled FAFSA rollout, community organizations and government officials had worked hard to rebuild trust in the system and get completion rates back to normal. She worries that fears about privacy could set back those efforts significantly. 

    “Chaos and uncertainty won’t give us the FAFSA rebound we need,” she said. 

    The confusion could also affect current college students who need to renew their FAFSA soon. Tule-Romain said one undocumented parent who filled out her first form with ImmSchools last year came back a few weeks ago asking for advice. 

    She was torn: on the one hand, she didn’t trust Musk and Trump’s White House not to use the information on the form to deport her. On the other, if her son didn’t receive federal aid, he’d have to drop out of college. Ultimately, she chose to renew the application.

    “If you came [to America] for a better life, you cannot let fear stop you from pursuing that,” Tule-Romain said. “Instead, you arm yourself with knowledge and you move forward—maybe with fear, but you move forward anyway.”

    Source link

  • The Role of Data Analytics in Higher Education

    The Role of Data Analytics in Higher Education

    Reading Time: 8 minutes

    Data analytics has become the cornerstone of effective decision-making across industries, including higher education marketing. As a school administrator or marketer, you’re likely aware that competition for student enrollment is fiercer than ever. 

    To stand out, leveraging data analytics can transform your marketing strategy, enabling you to make informed decisions, optimize resources, and maximize ROI. But what does data analytics mean in the context of higher education marketing, and how can you apply it to achieve tangible results? Keep reading to understand the impact of data analytics on your school’s marketing campaigns, some benefits you can expect, and how to implement them.

    Struggling with enrollment?

    Our expert digital marketing services can help you attract and enroll more students!

    The Significance of Data Analytics in Education Marketing

    What is the role of data analysis in education marketing? Data analytics involves collecting, processing, and interpreting data to uncover patterns, trends, and actionable insights. In higher education marketing, data analytics enables you to understand your target audience—prospective students, parents, alumni, and other stakeholders—better and craft strategies that resonate with them.

    Data analytics goes beyond tracking website visits or social media likes. It involves deep-diving into metrics such as application trends, conversion rates, engagement levels, and even predictive modelling to anticipate future behaviour. For example, analyzing prospective students’ journey from initial interaction with your website to applying can reveal opportunities to refine your marketing campaigns. Data analytics equips you to attract and retain the right students by more effectively addressing their needs.

    [Image: HEM 1]

    Source: HEM

    Do you need support as you create a more data-driven higher education marketing campaign? Reach out to learn more about our specialized digital marketing services. 

    Benefits of a Data-Driven Marketing Campaign

    What are the benefits of big data analytics in higher education marketing? A data-driven approach to marketing offers several advantages that can elevate your institution’s performance and visibility. First, it enhances decision-making. With access to real-time and historical data, you can base your decisions on evidence rather than assumptions. For example, if you notice that email campaigns targeting a particular geographic region yield a higher application rate, you can allocate more resources to similar efforts.

    Second, data analytics in higher education enables personalization. Prospective students now expect tailored experiences that speak to their unique aspirations and challenges. By leveraging data, you can segment your audience and deliver content that resonates deeply with each group. This level of personalization increases engagement and fosters trust and loyalty.

    Additionally, data analytics optimizes your budget. In the past, marketing efforts often involved a degree of guesswork, leading to wasted resources. With data, you can pinpoint what works and what doesn’t, ensuring every dollar you spend contributes to your goals. For instance, if a social media ad targeting international students outperforms others, you can reallocate funds to expand that campaign.

    Finally, data analytics offers the ability to measure success with precision. By setting key performance indicators (KPIs) and tracking them over time, you gain a clear understanding of what’s driving results. Whether it’s the number of inquiries generated by a digital ad or the completion rate of an online application form, data analytics provides you with the tools to evaluate and refine your strategies continuously.

    [Image: HEM 2]

    Source: HEM

    Example: Our clients have access to our specialized performance-tracking services. The information in the image above, coupled with the school’s specific objectives, allows us to assess what is working and what needs changing. It informs our strategy, provides valuable insight into how new strategies are performing, and highlights the changes that can be made for optimal results. 

    Types of Data Analytics Tools for Higher Education Marketers

    The many data analytics tools available can seem overwhelming, but selecting the right ones can significantly improve your marketing efforts. These tools generally fall into a few key categories.

    Web analytics platforms, such as Google Analytics, allow you to track user behaviour on your website. From page views to time spent on specific pages, these tools help you understand how prospective students interact with your digital presence. For instance, if many visitors drop off on your application page, it may indicate a need to simplify the process.
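
    As a purely illustrative sketch of that kind of drop-off check, the snippet below works from a hypothetical pageview export; the file name, page paths and column names are placeholders rather than a real Google Analytics schema.

    ```python
    # Minimal sketch: estimating drop-off on an application page from a
    # hypothetical analytics export with one row per pageview.
    # Assumed columns (not a vendor schema): session_id, page_path.
    import pandas as pd

    views = pd.read_csv("pageviews.csv")

    # Sessions that reached the application page at all
    reached_apply = set(views.loc[views["page_path"] == "/apply", "session_id"])

    # Sessions that also reached the confirmation page (i.e. submitted the form)
    completed = set(views.loc[views["page_path"] == "/apply/confirmation", "session_id"])

    started = len(reached_apply)
    finished = len(reached_apply & completed)
    drop_off = 1 - finished / started if started else 0.0

    print(f"{started} sessions reached /apply; {finished} completed ({drop_off:.0%} drop-off)")
    ```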

    Customer relationship management (CRM) systems, like our system, Mautic, help you manage and analyze interactions with prospective and current students. CRMs help you organize your outreach efforts, track the progress of leads through the enrollment funnel, and identify trends in student engagement. 

    As a higher education institution, a system like our Student Portal will guide your prospects down the enrollment funnel. The Student Portal keeps track of vital student information such as their names, contact information, and relationship with your school. You need these data points to retarget students effectively through ads and email campaigns.

    [Image: HEM 4]

    Source: HEM | Student Portal

    Example: Here, you see how our SIS (Student Information System) tracks the progress of school applications, complete with insights like each prospect’s program of interest and location. This data is vital for creating and timing marketing materials, such as email campaigns based on each contact’s current needs, guiding them to the next phase of the enrollment funnel.  

    Social media analytics tools, including platforms like Hootsuite or Sprout Social, provide insights into your social media performance. These tools can reveal which types of content resonate most with your audience, enabling you to fine-tune your messaging.

    [Image: HEM 5]

    Source: Sprout Social

    Example: Social media is a powerful tool for a higher education institution, particularly when targeting Gen Z prospects. Like any marketing tactic, optimizing social media platforms requires measuring post performance. A tool like Sprout Social, pictured above, tracks paid and organic performance, streamlining reports and even offering insights into competitor data. 

    Predictive analytics platforms, such as Tableau or SAS, take your efforts further by using historical data to forecast future outcomes. These tools can help you identify at-risk students who may not complete the enrollment process or predict which programs are likely to see increased interest based on current trends.
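
    The platforms named above are point-and-click tools, but the underlying idea can be sketched in a few lines of Python. The example below is a hedged illustration, not a Tableau or SAS workflow: the file names, column names and model choice are all assumptions made for the sake of the sketch.

    ```python
    # Minimal sketch of an "at risk of not completing enrolment" model trained on
    # hypothetical historical funnel data (funnel_history.csv) with assumed columns:
    # emails_opened, events_attended, days_since_last_login, enrolled (0/1).
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    history = pd.read_csv("funnel_history.csv")
    features = ["emails_opened", "events_attended", "days_since_last_login"]

    X_train, X_test, y_train, y_test = train_test_split(
        history[features], history["enrolled"], test_size=0.2, random_state=42
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("Hold-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

    # Score current applicants and flag those least likely to complete enrolment
    current = pd.read_csv("current_applicants.csv")
    current["p_enrol"] = model.predict_proba(current[features])[:, 1]
    at_risk = current.sort_values("p_enrol").head(50)
    ```

    A shortlist like at_risk would then feed the kind of targeted outreach described earlier in this piece.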

    Use These Actionable Tips for Optimizing ROI Using Data Analytics

    Clearly define your goals to maximize the impact of data analytics in education marketing campaigns. Whether you aim to increase enrollment in a specific program, boost alumni engagement, or expand your reach internationally, having a clear objective will guide your efforts and help you measure success effectively.

    Next, ensure that you’re collecting the right data. Too often, institutions fall into the trap of gathering vast amounts of data without a clear plan for its use. Focus on metrics that align with your goals, such as lead generation, conversion rates, and engagement levels. Regularly audit your data collection processes to ensure they remain relevant and efficient.

    Once you’ve gathered your data, prioritize analysis. This step involves identifying patterns and trends that can inform your strategy. For instance, if your data shows that most applications come from mobile devices, optimizing your website for mobile users becomes a top priority. Similarly, if you notice that email open rates are highest on Tuesdays, you can adjust your sending schedule accordingly.
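
    Both of the patterns mentioned above become one-line summaries once the data sits in a dataframe. The sketch below assumes two hypothetical exports; the file and column names are illustrative only.

    ```python
    # Minimal sketch: device mix of applications and email open rates by weekday,
    # against hypothetical exports (column names are assumptions, not a vendor schema).
    import pandas as pd

    # Share of started applications by device category
    apps = pd.read_csv("applications.csv")           # columns: application_id, device
    device_share = apps["device"].value_counts(normalize=True)
    print(device_share)                               # e.g. mobile vs desktop split

    # Email open rate by day of week
    emails = pd.read_csv("email_sends.csv")           # columns: sent_at, opened (0/1)
    emails["sent_at"] = pd.to_datetime(emails["sent_at"])
    open_rate = emails.groupby(emails["sent_at"].dt.day_name())["opened"].mean()
    print(open_rate.sort_values(ascending=False))     # best day suggests a send time
    ```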

    Another key aspect of optimizing ROI is experimentation. Use your data to test different strategies, such as varying your ad copy, targeting different demographics, or experimenting with new platforms. Over time, you’ll better understand what resonates with your audience.
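
    When comparing variants, it also helps to check that an apparent winner is not just noise. Here is a minimal sketch of that check using a standard two-proportion z-test; the click and impression counts are made-up placeholders.

    ```python
    # Minimal sketch: is ad variant B really outperforming variant A?
    # The counts below are illustrative placeholders, not real campaign data.
    from math import sqrt
    from scipy.stats import norm

    clicks_a, impressions_a = 120, 10_000    # variant A
    clicks_b, impressions_b = 165, 10_000    # variant B

    p_a, p_b = clicks_a / impressions_a, clicks_b / impressions_b
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))

    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))             # two-sided test

    print(f"CTR A={p_a:.2%}, CTR B={p_b:.2%}, z={z:.2f}, p={p_value:.4f}")
    # A small p-value (say < 0.05) suggests the difference is unlikely to be noise.
    ```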

    Don’t overlook the importance of collaboration. Data analytics should be integrated across departments. By sharing insights with admissions, student services, and academic departments, you can create a more cohesive and impactful strategy and carve an efficient path toward the desired results. For example, if your analytics reveal a growing interest in STEM programs, your academic team can develop targeted resources to meet that demand.

    Finally, invest in ongoing education and training. Data analytics constantly evolves, and staying up-to-date on the latest tools and techniques is essential. Encourage your team to participate in workshops, webinars, and courses to enhance their skills and bring fresh insights to your campaigns.

    How We Help Clients to Leverage Data Analytics Solutions: A Case Study with Western University

    The transformative potential of data analytics is best illustrated through real-world examples. Western University of Health Sciences, a leading graduate school for health professionals in California, partnered with us to optimize its data analytics strategy. The collaboration highlights how implementing tailored data solutions can drive meaningful results.

    HEM began by conducting program- and service-specific interviews with Western University staff to identify the analytics needs of managers across the institution. These discussions revealed unique departmental needs, prompting the creation of tailored analytics profiles and corresponding website objectives. Subsequently, data was segmented and collected in alignment with these tailored profiles, ensuring actionable insights for each group.

    A comprehensive technical audit of Western’s web ecosystem revealed several challenges in implementing analytics tools. HEM recommended and implemented a series of changes through a custom analytics implementation guide. These changes included the university’s web team developing and installing cross- and subdomain tracking codes and creating data filters, such as internal traffic exclusion.
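
    The actual work described here was done through tracking-code changes and analytics filters installed by Western's web team. Purely as an illustration of the internal-traffic-exclusion idea, the sketch below applies the same kind of filter to a hypothetical exported hit log; the IP ranges and column names are placeholders, not the real configuration.

    ```python
    # Minimal sketch of "internal traffic exclusion" applied to an exported hit log.
    # CAMPUS_RANGES and the column names are illustrative placeholders.
    import ipaddress
    import pandas as pd

    CAMPUS_RANGES = [ipaddress.ip_network(c) for c in ("10.0.0.0/8", "192.168.0.0/16")]

    def is_internal(ip: str) -> bool:
        addr = ipaddress.ip_address(ip)
        return any(addr in network for network in CAMPUS_RANGES)

    hits = pd.read_csv("hit_log.csv")                  # columns: visitor_ip, page, ...
    external_only = hits[~hits["visitor_ip"].map(is_internal)]
    print(f"Removed {len(hits) - len(external_only)} internal hits of {len(hits)}")
    ```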

    One of the highest priorities was tracking student registration behaviour. To address this, HEM developed a custom “apply now” registration funnel that integrated seamlessly with Western’s SunGard Banner registration pages. This funnel provided a clear view of prospect and registrant behaviour across the main website and its subdomains, offering valuable insights into the user journey.

    Over three months, HEM implemented these solutions and provided custom monthly reports to program managers. These reports verified the successful integration of changes, including the application of filters and cross-domain tracking. As a result, Western’s managers gained the ability to fully track student registrations, monitor library download behaviour, and make data-informed decisions to enhance student services.

    Western University’s Director of Instructional Technology praised HEM’s efforts, noting that the refined tracking capabilities clarified how prospective students navigated the site. The successful collaboration demonstrates the significant impact of data analytics solutions on improving user experience and institutional efficiency.

    [Image: HEM 6]

    Source: HEM

    HEM continues to build data-driven marketing campaigns for clients, streamlining their workflows, providing deep insights, increasing engagement, and boosting enrollment. 

    Higher ed data analytics is necessary for building effective marketing campaigns. By understanding its role and potential, you can craft data-driven strategies that elevate your institution’s visibility, improve engagement, and optimize ROI. As you embrace data analytics, remember that its true power lies in its ability to guide informed decision-making and foster continuous improvement. Whether you aim to attract more students, enhance retention, or build stronger alumni relationships, data analytics provides the roadmap to success. Start leveraging its insights today and position your institution as a leader in an increasingly competitive landscape.

    Frequently Asked Questions 

    What is the role of data analysis in education marketing?

    Data analytics involves collecting, processing, and interpreting data to uncover patterns, trends, and actionable insights. In higher education marketing, data analytics enables you to better understand your target audience—prospective students, parents, alumni, and other stakeholders—and craft strategies that resonate with them.

    What are the benefits of big data analytics in higher education marketing? 

    A data-driven approach to marketing offers several advantages that can elevate your institution’s performance and visibility, including:

    • Decision-making
    • Personalization 
    • Cost efficiency 
    • The ability to track results

    Source link

  • Why unified data and technology is critical to student experience and university success

    Why unified data and technology is critical to student experience and university success

    The Australian higher education sector continues to evolve rapidly, with hybrid learning, non-linear education, and the current skills shortage all shaping how universities operate.

    At the same time, universities are grappling with rising operational costs and decreased funding, leading to fierce competition for new enrolments.

    Amidst the dynamic landscape of higher education, the student experience has become a crucial factor in attracting and retaining students.

    The student experience encompasses a wide array of interactions, from how students first learn about an institution through to the enrolment process, coursework, social activities, wellbeing support and career connections. With so many student touchpoints to manage, institutions are turning to data and technology integrations to help streamline communications and improve their adaptability to change.

    Download the white paper: Why Unifying Data and Technology is Critical to the Success and Future of Universities

    Enhancing institutional efficiency and effectiveness
    Universities face an increasingly fragmented IT landscape, with siloed data and legacy systems making it difficult to support growth ambitions and improve student experiences.

    By integrating systems and data, institutions are starting to align digital and business strategies so that they can meet operational goals while providing more connected, seamless and personalised experiences for students.

    One of the most effective ways universities can achieve this is by consolidating disparate systems into a cloud-based Customer Relationship Management (CRM) solution, such as Salesforce.

    Optimising admissions and enhancing student engagement
    In recent years, there have been significant fluctuations in the enrolment of higher education students for numerous reasons – Covid-19 restrictions, declining domestic student numbers, high cost of living, proposed international student caps, and volatile labour market conditions being just a few.

    To better capture the attention of prospective students, institutions are now focusing on delivering more personalised and targeted engagement strategies. Integrated CRM and marketing automation is increasingly being used to attract more prospective students with tailored, well-timed communication.

    Universities are also using CRM tools to support student retention and minimise attrition. According to a Forrester study, students are 15 per cent more likely to stay with an institution when Salesforce is used to provide communications, learning resources and support services.

    Streamlining communication and collaboration
    By creating a centralised system of engagement, universities can not only support students throughout their academic journey, but also oversee their wellbeing.

    For example, a leading university in Sydney has developed a system that provides a comprehensive view of students and their needs, allowing for integrated and holistic support and transforming its incident reporting and case management.

    Fostering stronger alumni and industry relations
    Another area where CRM systems play a pivotal role is in building alumni and industry relationships. Alumni who feel valued by their university – through personalised engagement – are more likely to return when seeking upskilling, or to lend financial support.

    Personalising communication to industry partners can also help strengthen relationships, potentially leading to sponsored research, grants, and donations, as well as internships and career placements.

    University of Technology Sydney, for example, adopted a centralised data-led strategy for Corporate Relations to change how it works with strategic partners, significantly strengthening its partner network across the university.

    Unlocking the value of data and integration

    With unified data and digital technology driving personalised student interactions, university ICT departments can empower faculty and staff to exceed enrolment goals, foster lifelong student relationships and drive institutional growth.

    To learn more about the strategies and technologies to maximise institutional business value, download the white paper.

    Source link

  • UCAS End of Cycle provider data, 2024

    UCAS End of Cycle provider data, 2024

    Chat to anyone involved in sector admissions and you will hear a similar story.

    And the story appears to be true.

    It is now clear “high tariff” providers have been lowering their entry tariff (often substantially) in order to grow recruitment – meaning students with less-than-stellar grades have been ending up in prestigious institutions, and the kinds of places students like this would more usually attend have been struggling to recruit as a result.

    In other words, the 2024 cycle looks a lot like a lockdown cycle (without the examnishambles and Zoom pub quizzes).

    Any major dude will tell you

    We noted, at a sector level, the rise in the number of offers made by high-tariff providers – it was the highest number on record. There was no parallel rise in A level attainment, which suggests a strategic decision, made early on, to widen access.

    Today’s release of UCAS End of Cycle data for 2024 at provider level illustrates that this picture is a generalisation. Some high-tariff providers have acted in the way described above, others have pursued alternative strategies. And other providers have hit on other ways to drive undergraduate recruitment.

    Starting with my favourite chart, we can think about these individual strategies in more detail. This scatter plot shows the year-on-year change in the number of applications along the horizontal axis and the year-on-year change in acceptances on the vertical. There are filters for gender, domicile, age group and subject group (at the top level) – and I’ve provided a choice of comparator years if you want to look at changes over a longer term. The size of the dots represents the total recruitment by that provider in 2024, given the parameters we can see.

    [Embedded interactive chart]
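
    For anyone who wants to rebuild a static version of this view from the published end-of-cycle CSVs, a minimal sketch follows; the file name and column names are assumptions about how a provider-level extract might be laid out, not the actual UCAS field names.

    ```python
    # Minimal sketch of the applications-vs-acceptances scatter, assuming a
    # hypothetical provider-level extract with columns: provider,
    # applications_2023, applications_2024, acceptances_2023, acceptances_2024.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("ucas_provider_eoc.csv")
    df["d_applications"] = df["applications_2024"] - df["applications_2023"]
    df["d_acceptances"] = df["acceptances_2024"] - df["acceptances_2023"]

    plt.scatter(
        df["d_applications"],
        df["d_acceptances"],
        s=df["acceptances_2024"] / 50,   # dot size scaled by total 2024 recruitment
        alpha=0.6,
    )
    plt.axhline(0, linewidth=0.5)
    plt.axvline(0, linewidth=0.5)
    plt.xlabel("Change in applications, 2023 to 2024")
    plt.ylabel("Change in acceptances, 2023 to 2024")
    plt.title("Popularity vs selectivity, by provider")
    plt.show()
    ```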

    In essence this illustrates popularity (among applicants) and selectivity. What we can see here for 2024 (defaulting to UK 18 year olds applying to all subjects compared to 2023) is that pretty much the entire Russell Group has made significant (c. 500 or above) increases in recruitment, whether or not they saw a corresponding growth in applications.

    It’s not the full story – the picture for other pre-92 and post-92 providers is more mixed, with some providers able to leverage popularity (or desperation) to find growth.

    My old school

    We can’t look directly at provider behaviour by tariff, but we can examine what qualifications the students placed at each provider hold – here a key indicator might be an increase in the number of students entering without A levels (a group that tends to have lower tariffs overall).

    [Embedded interactive chart]
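
    A rough sketch of how you might compute that indicator yourself, assuming a hypothetical long-format extract of placed applicants by provider, year and qualification group (again, the field names are illustrative rather than the UCAS ones):

    ```python
    # Minimal sketch: share of placed applicants entering without A levels, by
    # provider and year, from a hypothetical extract with assumed columns:
    # provider, year (integer), qualification_group, acceptances.
    import pandas as pd

    df = pd.read_csv("ucas_quals_by_provider.csv")
    totals = df.groupby(["provider", "year"])["acceptances"].sum()
    no_a_level = (
        df[~df["qualification_group"].str.contains("A level", case=False)]
        .groupby(["provider", "year"])["acceptances"].sum()
    )
    share = (no_a_level / totals).unstack("year").fillna(0)
    share["change"] = share[2024] - share[2023]
    print(share.sort_values("change", ascending=False).head(10))
    ```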

    The trouble is, A level entry rates have also increased – pretty much anyone who wants to and can do A levels is now doing A levels. With the decline in BTEC popularity, and the still uncertain interest in T levels, this is to be expected. All this means most providers have seen an increase or steady state in the number of students entering with A levels (when you include the A level plus project options). In Scotland – and recall we don’t get the complete picture of Scottish applications from UCAS because of a wonderful little thing called intercalation – it’s SQA pretty much all the way.

    Everything you did

    If you are wondering whether a change in age groups placed as undergraduates could also have an impact on recruitment patterns, it looks as if the pattern of low and slowly falling mature recruitment continues for most providers. For larger universities most of the action is around 18 year old home recruitment – and specialist providers that focus on mature students (often via part-time or flexible study) tend to struggle.

    [Embedded interactive chart]

    The other key factor is domicile – the changes to visa arrangements this time last year had a huge impact on international applications (particularly from countries like India and Nigeria that have become important for lower tariff providers) and coupled with some of the changes described above this has resulted in some providers seeing undergraduate international admissions fall off a cliff.

    [Embedded interactive chart]

    As always, undergraduate isn’t the full story – we’ve still no reliable way of understanding postgraduate recruitment in the round until we get the HESA data long after the academic year in question has finished. I just hope that regulators with new duties to understand the financial stability of the sector have more of a clue.

    Any world that I’m welcome to

    With some providers stuffed to the seams and beyond with students they wouldn’t usually accept – many with support needs it is not clear they can meet – it is unclear who exactly benefits from this new state of affairs. The claim we regularly hear is that universities lose money on educating home students, and that these must be cross-subsidised by international recruitment.

    The corollary is that in times when international student recruitment is restricted you would expect the number of home students at providers reliant on that income to fall – after all, if you lose money on every home student, the more you recruit the more money you lose. Though measures to widen access and participation are important (and indeed, we see welcome evidence of contextual admissions at selective providers in the chart below), the fact is that you need to spend money to support students without the cultural capital to succeed.
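
    To make that arithmetic concrete, here is a deliberately toy calculation; every figure in it is a made-up placeholder rather than sector data.

    ```python
    # Illustrative arithmetic only - all figures below are hypothetical placeholders.
    home_fee, home_cost = 9_250, 11_000   # per-student income vs cost of delivery
    intl_margin = 8_000                   # assumed surplus per international student

    extra_home = 500                      # extra home students recruited
    lost_intl = 300                       # international places not filled

    change_in_surplus = extra_home * (home_fee - home_cost) - lost_intl * intl_margin
    print(f"Change in annual surplus: £{change_in_surplus:,.0f}")
    # On these assumptions, growing home numbers while losing international students
    # deepens the deficit - unless spend per student falls.
    ```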

    [Embedded interactive chart]

    The rather painful conclusion I reach is that the only way to make this year’s sums add up is a reduction in spend per student – and thus, most likely, in the quality of the student experience among precisely the students who would have been overjoyed to get a place at a famous university. We should keep a close eye on continuation metrics and the National Student Survey this year.

    Source link