Tag: Data

  • Building tiny chips that can handle enormous data

In the not-so-distant future, really big disasters, such as wildfires in California, floods in Spain or earthquakes in Japan, will be monitored and perhaps anticipated by a technology so small it is difficult even to imagine.

This new technology, called quantum computing, is enabled by nanotechnology — a way of designing technology by manipulating atoms and molecules. Paradoxically, this ultra-small technology enables the processing of the massive data sets needed for complex artificial intelligence algorithms.

    There is a growing consensus that AI will quickly change almost everything in the world.

The AIU cluster, a collection of computer resources used to develop, test and deploy AI models at the IBM Research Center in upstate New York. (Credit: Enrique Shore)

The AI many people already use — such as ChatGPT, Perplexity and now DeepSeek — runs on traditional computers. Processing the data needed to answer the questions put to these AI programs, and to handle the tasks assigned to them, takes an enormous amount of energy. For example, the electricity OpenAI currently uses to handle ChatGPT’s prompts in the United States costs some $139.7 million per year.

    Several large private companies, including Google, Microsoft and IBM, are leading the way in this development. The International Business Machines Corp., known as IBM, currently manages the largest industrial research organization, with specialized labs located all over the world.

    Glimpsing the world’s most powerful computers

The global headquarters of IBM Research is the company’s Thomas J. Watson Research Center, about one hour north of New York City. It is an impressive building designed in 1961 by Eero Saarinen, the iconic Finnish-American architect who also designed Dulles International Airport near Washington, D.C., the Swedish Theater in Helsinki and the U.S. Embassy in Oslo.

A sign at the front door of IBM's research headquarters: “Inventing what’s next”.

    At the entrance of the IBM research headquarters a simple statement sums up what research scientists are trying to achieve at IBM: “Inventing what’s next.”

    At the heart of the IBM Research Center is a “Think Lab” where researchers test AI hardware advancements using the latest and most powerful quantum computers. News Decoder recently toured these facilities.

There, Shawn Holiday, a product manager for the lab’s Artificial Intelligence Unit (AIU), said the challenge is shrinking semiconductors to not only increase performance but also improve power efficiency.

IBM was the first to develop a new transistor geometry known as gate-all-around. Basically, each transistor has multiple channels that run parallel to the surface, and each of those channels is only about two nanometers thick. To grasp how small this is, consider that one nanometer is a billionth of a meter.

This new technology is not just a faster or better version of traditional computers but a totally new way of processing information. It is based not on the traditional bits that underpin modern binary computers (bits can be in either the state zero or one) but on qubits, short for quantum bits, a different and more complex concept.

    The IBM Quantum System Two, a powerful quantum computer, operating in the IBM Research Center in Yorktown Heights in upstate New York. (Credit: Enrique Shore)

    A quantum processor with more gates can handle more complex quantum algorithms by allowing for a greater variety of operations to be applied to the qubits within a single computation.

    A new way of processing data

The change is much more than a new stage in the evolution of computers. Nanotechnology has enabled, for the first time, an entirely new branch of computing: not just a faster or better version of traditional machines but a totally new way of processing information.

     

A replica of the IBM Quantum System One, the company’s first integrated commercial quantum computer, on display at the IBM Research Center in Yorktown Heights, New York. (Credit: Enrique Shore)

The quantum bit is the basic unit of quantum information. It has many more possibilities than a classical bit, including being in multiple states simultaneously — a condition called superposition — and combining with other qubits through entanglement, in which the state of one qubit is intimately connected with that of another. This is, of course, a simplified description of a complex process, one that could offer massively more processing power than traditional computers.
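
To make superposition and entanglement slightly more concrete, here is a minimal sketch in Python with NumPy (our illustration, not something from the article): a qubit is represented as a vector of two complex amplitudes, and the squared magnitudes give the probabilities of measuring 0 or 1.

import numpy as np

# A classical bit is 0 or 1. A qubit is a length-2 complex vector whose
# squared amplitudes give the probability of each measurement outcome.
zero = np.array([1, 0], dtype=complex)  # the |0> state
one = np.array([0, 1], dtype=complex)   # the |1> state

# Equal superposition: the qubit is a blend of 0 and 1 until measured.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)  # [0.5 0.5] -> a 50/50 chance of reading 0 or 1

# A simple entangled pair (Bell state): the two outcomes are perfectly correlated.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)  # 00 and 11 each have probability 0.5; 01 and 10 never occur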

The current architecture of quantum computers requires costly, large and complex devices that are refrigerated to extremely low temperatures, close to absolute zero (about -459°F, or -273°C), in order to function correctly. That extremely low temperature changes the state of certain materials so that they conduct electricity with practically zero resistance and noise.

Even though there are prototypes of desktop quantum computers with limited capabilities that could eventually operate at room temperature, quantum computers are unlikely to replace traditional computers in the foreseeable future; rather, they will operate alongside them.

IBM Research is a growing global network of interconnected laboratories.

While IBM is focused on what it calls a hybrid, open and flexible cloud, meaning open-source platforms that can interact with many different systems and vendors, it is also advancing its own semiconductor research, an area where its goal is to push the absolute limits of transistor scaling.

    Shrinking down to the quantum realm

    At the lowest level of a chip, you have transistors. You can think of them as switches. Almost like a light switch, they can be off or they can be on. But instead of a mechanical switch, you use a voltage to turn them on and off — when they’re off, they’re at zero and when they’re on, they’re at one.

IBM Heron, a 133-qubit tunable-coupler quantum processor. (Credit: Enrique Shore)

    This is the basis of all digital computation. What’s driven this industry for the last 60 years is a constant shrinking of the size of transistors to fit more of them on a chip, thereby increasing the processing power of the chip.

IBM produces wafers with foundry partners such as Samsung and a Japanese startup called Rapidus. Consider that the two-nanometer semiconductor chips Rapidus aims to produce are expected to deliver up to 45% better performance and use 75% less energy than the seven-nanometer chips on the market in 2022.

     

Dr. George Tulevski, IBM Research scientist and manager of the IBM Think Lab, stands next to a world map showing the company’s labs, at the IBM Research Center in Yorktown Heights, New York. (Credit: Enrique Shore)

IBM predicts that there will be about a trillion transistors on a single die by the early 2030s. (A die is the square of silicon, cut from a wafer, that contains an integrated circuit.) For comparison, Apple’s M4 chip for its latest iPad Pro has 28 billion transistors.
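
A rough way to picture that jump (our arithmetic, not IBM’s): getting from 28 billion to about a trillion transistors takes roughly five doublings, since 28 billion × 2^5 ≈ 900 billion.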

There may be a physical limit to how small transistors can shrink, but if they can no longer be made smaller, they could be stacked so that the density per unit area keeps going up.

Each doubling of transistor density involves a tradeoff between power and performance. Depending on whether you tune for power or for performance, each new technology node delivers either roughly a 50% increase in energy efficiency or a 50% increase in performance.

    A roadmap for advanced technology

    The bottom line is that doubling the transistor count means being able to do more computations with the same area and the same power.

Dr. Jay M. Gambetta, the IBM vice president in charge of the company’s overall quantum initiative, explains the expected quantum development roadmap. (Credit: Enrique Shore)

The roadmap for this acceleration is impressive. Dr. Jay Gambetta, IBM’s vice president in charge of the company’s overall quantum initiative, showed us a table that forecasts processing capabilities increasing from the current 5,000 gates to an estimated 100 million gates by 2029, and possibly one billion gates by 2033.

A quantum gate is a basic quantum circuit operation acting on a small number of qubits. Quantum logic gates are the building blocks of quantum circuits, just as classical logic gates are for conventional digital circuits.
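
To make the idea concrete, here is a minimal sketch in Python using IBM’s open-source Qiskit library (our illustration, not an example from the article): it builds a two-qubit circuit from two gates, a Hadamard followed by a CNOT, and counts the gate operations, the same quantity the roadmap above measures in the thousands and, eventually, millions.

from qiskit import QuantumCircuit

# Two qubits, two gates: a Hadamard puts qubit 0 into superposition,
# then a CNOT entangles qubit 1 with it.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

print(qc.size())       # total number of gate operations in the circuit: 2
print(qc.count_ops())  # count broken down by gate type
print(qc.draw())       # text diagram of the circuit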

But that energy demand should diminish radically with new, more efficient quantum computers, so the old assumption that more capacity requires more power is being revised and should be greatly improved upon in the near future — otherwise this development would not be sustainable.

A practical example of a current project made possible by this kind of computing power and AI is Prithvi, a groundbreaking geospatial AI foundation model designed for satellite data by IBM and NASA.

The model supports tracking changes in land use, monitoring disasters and predicting crop yields worldwide. At 600 million parameters, its current version, 2.0, introduced in December 2024, is already six times bigger than its predecessor, first released in August 2023.

It has practical uses such as analyzing the recent fires in California, the floods in Spain and crop conditions in Africa — just a few examples of how Prithvi can help us understand complex current issues at a speed that was simply impossible before.

    The impossible isn’t just possible. It is happening now.


     

    Three questions to consider:

    1. How is quantum computing different from traditional computing?
    2. What is the benefit of shrinking the size of a transistor?
    3. If you had access to a supercomputer, what big problem would you want it to solve?


    Source link

  • DOGE’s access to Education Department data raises concerns

    Just last month, Lorena Tule-Romain was encouraging families with mixed citizenship to fill out the Free Application for Federal Student Aid. She and her staff at ImmSchools, a nonprofit dedicated to improving educational access for immigrants in Dallas, walked students and parents through the complicated federal aid process. Along the way, they offered reassurance that information revealing their undocumented status would be securely held by the Department of Education alone.

    Two weeks ago, ImmSchools stopped offering those services. And Tule-Romain said they’re no longer recommending families fill out the FAFSA. 

That’s because the Department of Government Efficiency, a White House office run by Elon Musk, now has access to Education Department data systems, potentially including sensitive student loan and financial aid information for millions of students, according to sources both outside and within the department who spoke with Inside Higher Ed.

    With immigration officers conducting a blitz of deportations over the past few weeks—and the new possibility of ICE raids at public schools and college campuses—Tule-Romain is worried that applying for federal aid could put undocumented families in jeopardy. Instead of answering parents’ questions about the FAFSA contributor form, she’s hosting Know Your Rights workshops to prepare them for ICE raids.

    “Before, we were doing all we could to encourage families to apply for federal aid, to empower students to break cycles and go to college,” she said. “Now we are not in a position to give that advice. It’s heartbreaking.”

    Student data is technically protected by the Privacy Act of 1974, which prevents departments from sharing personally identifying information unless strict exceptions are met or a law is passed to allow it. The FUTURE Act, for example, gave the IRS access to financial aid data to simplify the FAFSA process. 

    Karen McCarthy, vice president of public policy and federal relations at the National Association of Student Financial Aid Administrators, told Inside Higher Ed that because DOGE has not said why they might be interested in department data or what data they have access to, it’s unclear if they’re acting in accordance with the law.

In the past, that law has been strictly enforced for federal employees. In 2010, nine people were accused of accessing President Barack Obama’s student loan records while employed by an Education Department contractor in Iowa. The charges levied against them in federal court were punishable by up to one year in prison and a fine of up to $100,000, according to the Associated Press.

On Thursday, Democratic Representative Bobby Scott of Virginia wrote to the Government Accountability Office requesting a review of the Education Department’s information technology security and DOGE’s interventions in the department in order to determine their legality and the “potential impact on children.” On Friday, a group of students at the University of California sued department officials for allowing potential Privacy Act violations.

    “The scale of the intrusion into individuals’ privacy is massive, unprecedented, and dangerous,” the plaintiffs wrote. 

In recent days, labor unions and other groups have sued to block DOGE’s access to databases at several federal agencies and have secured some wins. Early Saturday morning, a federal judge prohibited DOGE from accessing Treasury Department data, ordering Musk’s team to “immediately destroy any and all copies of material” from the department’s systems.

Concerns about DOGE’s use of private student data come as Musk and his staff take a hacksaw to agencies and departments across the federal government, seeking to cut spending and eliminate large portions of the federal workforce. The Trump administration has singled out the Education Department in particular, threatening to gut its administrative capacity or eliminate the department altogether.

    Spokespeople for DOGE did not respond to a list of questions from Inside Higher Ed. Madi Biederman, the Education Department’s deputy assistant secretary for communications, wrote in an email that DOGE staff “have the necessary background checks and clearances” to view department data and are “focused on making the department more cost-efficient, effective and accountable to the taxpayers.”

    “There is nothing inappropriate or nefarious going on,” she added. She did not respond to questions about what data DOGE has access to or how they plan to use it.

    A ‘Gaping Hole’ in Data Security 

The Education Department’s student financial aid systems contain unique private information that families submit through the FAFSA: not only Social Security numbers but also addresses of relatives, property taxes, sources of income and more. The National Student Loan Database, which tracks loan borrowers’ repayment history and which DOGE may also have access to, includes a wealth of personally identifying information for many millions more current and former students.

A current department staffer provided Inside Higher Ed with a screenshot from the department’s email address catalog containing the names of 25 DOGE employees who may have access to student data—including a 19-year-old who, according to a Bloomberg report, was once fired by a cybersecurity firm for allegedly leaking internal data. And The Washington Post reported that DOGE employees fed sensitive Education Department data through artificial intelligence software.

    “It could become a gaping hole in our cybersecurity infrastructure,” a former department official said. “I cannot stress enough how unusual it is to just give people access willy-nilly.”

    Two former department officials told Inside Higher Ed it is unclear how the DOGE officials could have legally gained access to department data. McCarthy compared DOGE’s murky activity in the department to a “massive data breach within the federal government.”

    “Normally, there’d be a paper trail telling us what they’ve requested access to and why,” she said. “We don’t have that, so there’s a lot of uncertainty and fear.”

A current department official told Inside Higher Ed that DOGE staff have been given access to PartnerConnect, which includes information about college programs that receive federal financial aid funding, and that they have read-only access to a financial system. Neither of those databases contains personally identifying information, but the official wasn’t sure DOGE’s access was limited to those sources—and said department staff are worried sensitive student information could be illegally accessed and disclosed.

    “It just creates a kind of shadow over the work that everyone’s doing,” a prior department official said. 

    Fears of a FAFSA ‘Chilling Effect’

    Families with mixed citizenship status were some of the hardest hit by the error-riddled FAFSA rollout last year, with many reporting glitches that prevented them from applying for aid until late last summer. 

    Tule-Romain said mixed-status families in her community had only just begun to feel comfortable with the federal aid form. In the past few weeks that progress has evaporated, she said, and high school counselors working with ImmSchools report a concerning decline in requests for FAFSA consultations from mixed-status students. 

    “If they weren’t already hesitant, they are extremely hesitant now,” Tule-Romain said. 

It’s not just mixed-status families who could be affected if data is shared or leaked. McCarthy said that concerns about privacy could have a widespread “chilling effect” on federal aid applications.

    “There have always been parents who are reluctant to share their information and the counterargument we always fall back on are the privacy laws,” she said. “A lot of Pell money could get left on the table, or students could be discouraged from going to college altogether.”

    Kim Cook, CEO of the National College Attainment Network, said that after last year’s bungled FAFSA rollout, community organizations and government officials had worked hard to rebuild trust in the system and get completion rates back to normal. She worries that fears about privacy could set back those efforts significantly. 

    “Chaos and uncertainty won’t give us the FAFSA rebound we need,” she said. 

    The confusion could also affect current college students who need to renew their FAFSA soon. Tule-Romain said one undocumented parent who filled out her first form with ImmSchools last year came back a few weeks ago asking for advice. 

    She was torn: on the one hand, she didn’t trust Musk and Trump’s White House not to use the information on the form to deport her. On the other, if her son didn’t receive federal aid, he’d have to drop out of college. Ultimately, she chose to renew the application.

    “If you came [to America] for a better life, you cannot let fear stop you from pursuing that,” Tule-Romain said. “Instead, you arm yourself with knowledge and you move forward—maybe with fear, but you move forward anyway.”

    Source link

  • The Role of Data Analytics in Higher Education

    Data analytics has become the cornerstone of effective decision-making across industries, including higher education marketing. As a school administrator or marketer, you’re likely aware that competition for student enrollment is fiercer than ever. 

    To stand out, leveraging data analytics can transform your marketing strategy, enabling you to make informed decisions, optimize resources, and maximize ROI. But what does data analytics mean in the context of higher education marketing, and how can you apply it to achieve tangible results? Keep reading to understand the impact of data analytics on your school’s marketing campaigns, some benefits you can expect, and how to implement them.

    Struggling with enrollment?

    Our expert digital marketing services can help you attract and enroll more students!

    The Significance of Data Analytics in Education Marketing

What is the role of data analysis in education marketing? Data analytics involves collecting, processing, and interpreting data to uncover patterns, trends, and actionable insights. In higher education marketing, data analytics enables you to better understand your target audience—prospective students, parents, alumni, and other stakeholders—and craft strategies that resonate with them.

    Data analytics goes beyond tracking website visits or social media likes. It involves deep-diving into metrics such as application trends, conversion rates, engagement levels, and even predictive modelling to anticipate future behaviour. For example, analyzing prospective students’ journey from initial interaction with your website to applying can reveal opportunities to refine your marketing campaigns. Data analytics equips you to attract and retain the right students by more effectively addressing their needs.

    Source: HEM

    Do you need support as you create a more data-driven higher education marketing campaign? Reach out to learn more about our specialized digital marketing services. 

    Benefits of a Data-Driven Marketing Campaign

    What are the benefits of big data analytics in higher education marketing? A data-driven approach to marketing offers several advantages that can elevate your institution’s performance and visibility. First, it enhances decision-making. With access to real-time and historical data, you can base your decisions on evidence rather than assumptions. For example, if you notice that email campaigns targeting a particular geographic region yield a higher application rate, you can allocate more resources to similar efforts.

    Second, data analytics in higher education enables personalization. Prospective students now expect tailored experiences that speak to their unique aspirations and challenges. By leveraging data, you can segment your audience and deliver content that resonates deeply with each group. This level of personalization increases engagement and fosters trust and loyalty.

    Additionally, data analytics optimizes your budget. In the past, marketing efforts often involved a degree of guesswork, leading to wasted resources. With data, you can pinpoint what works and what doesn’t, ensuring every dollar you spend contributes to your goals. For instance, if a social media ad targeting international students outperforms others, you can reallocate funds to expand that campaign.

Finally, data analytics offers the ability to measure success with precision. By setting key performance indicators (KPIs) and tracking them over time, you gain a clear understanding of what’s driving results. Whether it’s the number of inquiries generated by a digital ad or the completion rate of an online application form, data analytics provides you with the tools to evaluate and refine your strategies continuously.

    Source: HEM

    Example: Our clients have access to our specialized performance-tracking services. The information in the image above, coupled with the school’s specific objectives, allows us to assess what is working and what needs changing. It informs our strategy, provides valuable insights into how new strategies are performing, and offers detailed insights into the changes that can be made for optimal results. 

    Types of Data Analytics Tools for Higher Education Marketers

    The many data analytics tools available can seem overwhelming, but selecting the right ones can significantly improve your marketing efforts. These tools generally fall into a few key categories.

    Web analytics platforms, such as Google Analytics, allow you to track user behaviour on your website. From page views to time spent on specific pages, these tools help you understand how prospective students interact with your digital presence. For instance, if many visitors drop off on your application page, it may indicate a need to simplify the process.

    Customer relationship management (CRM) systems, like our system, Mautic, help you manage and analyze interactions with prospective and current students. CRMs help you organize your outreach efforts, track the progress of leads through the enrollment funnel, and identify trends in student engagement. 

For a higher education institution, a system like our Student Portal will guide your prospects down the enrollment funnel. The Student Portal keeps track of vital student information such as names, contact information, and each prospect’s relationship with your school. You need these data points to retarget students effectively through ads and email campaigns.

    Source: HEM | Student Portal

    Example: Here, you see how our SIS (Student Information System) tracks the progress of school applications, complete with insights like each prospect’s program of interest and location. This data is vital for creating and timing marketing materials, such as email campaigns based on each contact’s current needs, guiding them to the next phase of the enrollment funnel.  

    Social media analytics tools, including platforms like Hootsuite or Sprout Social, provide insights into your social media performance. These tools can reveal which types of content resonate most with your audience, enabling you to fine-tune your messaging.

    Source: Sprout Social

    Example: Social media is a powerful tool for a higher education institution, particularly when targeting Gen-Z prospects. Like any marketing tactic, optimizing social media platforms requires measuring post-performance. A tool like Sprout Social, pictured above, tracks paid and organic performance, streamlining reports and even offering insights into competitor data. 

    Predictive analytics platforms, such as Tableau or SAS, take your efforts further by using historical data to forecast future outcomes. These tools can help you identify at-risk students who may not complete the enrollment process or predict which programs are likely to see increased interest based on current trends.
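
By way of illustration only (the article’s examples refer to commercial platforms such as Tableau or SAS, and the column names and figures below are invented), the kind of prediction described here can be sketched in a few lines of Python with pandas and scikit-learn:

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical data: one row per past applicant.
df = pd.DataFrame({
    "campus_visits":   [0, 2, 1, 3, 0, 4, 1, 2],
    "emails_opened":   [1, 5, 2, 7, 0, 8, 3, 6],
    "days_to_respond": [14, 2, 9, 1, 20, 1, 6, 3],
    "enrolled":        [0, 1, 0, 1, 0, 1, 0, 1],   # 1 = completed enrollment
})

X = df[["campus_visits", "emails_opened", "days_to_respond"]]
y = df["enrolled"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Probability that each held-out applicant completes enrollment; low scores
# flag prospects who may need extra outreach.
print(model.predict_proba(X_test)[:, 1])

In practice the inputs would be a real export from your CRM or student information system, and the model would be validated on far more data before being used to prioritise outreach.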

    Use These Actionable Tips for Optimizing ROI Using Data Analytics

    Clearly define your goals to maximize the impact of data analytics in education marketing campaigns. Whether you aim to increase enrollment in a specific program, boost alumni engagement, or expand your reach internationally, having a clear objective will guide your efforts and help you measure success effectively.

    Next, ensure that you’re collecting the right data. Too often, institutions fall into the trap of gathering vast amounts of data without a clear plan for its use. Focus on metrics that align with your goals, such as lead generation, conversion rates, and engagement levels. Regularly audit your data collection processes to ensure they remain relevant and efficient.

    Once you’ve gathered your data, prioritize analysis. This step involves identifying patterns and trends that can inform your strategy. For instance, if your data shows that most applications come from mobile devices, optimizing your website for mobile users becomes a top priority. Similarly, if you notice that email open rates are highest on Tuesdays, you can adjust your sending schedule accordingly.
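
As a hypothetical sketch (the data below is invented, not drawn from the article), those two patterns could be surfaced from exported campaign data with a few lines of Python and pandas:

import pandas as pd

# Hypothetical exports from a web analytics tool and an email platform.
applications = pd.DataFrame({
    "device":  ["mobile", "desktop", "mobile", "tablet", "mobile", "desktop"],
    "applied": [1, 0, 1, 0, 1, 1],
})
emails = pd.DataFrame({
    "weekday": ["Mon", "Tue", "Tue", "Wed", "Thu", "Tue"],
    "opened":  [0, 1, 1, 0, 0, 1],
})

# Share of completed applications by device: a high mobile share is a signal
# to prioritise mobile optimisation.
print(applications.groupby("device")["applied"].mean())

# Email open rate by send day: if Tuesday leads, shift the sending schedule.
print(emails.groupby("weekday")["opened"].mean())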

    Another key aspect of optimizing ROI is experimentation. Use your data to test different strategies, such as varying your ad copy, targeting different demographics, or experimenting with new platforms. Over time, you’ll better understand what resonates with your audience.

    Don’t overlook the importance of collaboration. Data analytics should be integrated across departments. By sharing insights with admissions, student services, and academic departments, you can create a more cohesive and impactful strategy and carve an efficient path toward the desired results. For example, if your analytics reveal a growing interest in STEM programs, your academic team can develop targeted resources to meet that demand.

    Finally, invest in ongoing education and training. Data analytics constantly evolves, and staying up-to-date on the latest tools and techniques is essential. Encourage your team to participate in workshops, webinars, and courses to enhance their skills and bring fresh insights to your campaigns.

    How We Help Clients to Leverage Data Analytics Solutions: A Case Study with Western University

    The transformative potential of data analytics is best illustrated through real-world examples. Western University of Health Sciences, a leading graduate school for health professionals in California, partnered with us to optimize its data analytics strategy. The collaboration highlights how implementing tailored data solutions can drive meaningful results.

HEM began by conducting program- and service-specific interviews with Western University staff to identify the analytics needs of managers across the institution. These discussions revealed unique departmental needs, prompting the creation of tailored analytics profiles and corresponding website objectives. Subsequently, data was segmented and collected in alignment with these tailored profiles, ensuring actionable insights for each group.

    A comprehensive technical audit of Western’s web ecosystem revealed several challenges in implementing analytics tools. HEM recommended and implemented a series of changes through a custom analytics implementation guide. These changes included the university’s web team developing and installing cross- and subdomain tracking codes and creating data filters, such as internal traffic exclusion.

One of the highest priorities was tracking student registration behaviour. To address this, HEM developed a custom “apply now” registration funnel that integrated seamlessly with Western’s SunGard Banner registration pages. This funnel provided a clear view of prospect and registrant behaviour across the main website and its subdomains, offering valuable insights into the user journey.

    Over three months, HEM implemented these solutions and provided custom monthly reports to program managers. These reports verified the successful integration of changes, including the application of filters and cross-domain tracking. As a result, Western’s managers gained the ability to fully track student registrations, monitor library download behaviour, and make data-informed decisions to enhance student services.

    Western University’s Director of Instructional Technology praised HEM’s efforts, noting that the refined tracking capabilities clarified how prospective students navigated the site. The successful collaboration demonstrates the significant impact of data analytics solutions on improving user experience and institutional efficiency.

    Source: HEM

    HEM continues to build data-driven marketing campaigns for clients, streamlining their workflows, providing deep insights, increasing engagement, and boosting enrollment. 

    Higher ed data analytics is necessary for building effective marketing campaigns. By understanding its role and potential, you can craft data-driven strategies that elevate your institution’s visibility, improve engagement, and optimize ROI. As you embrace data analytics, remember that its true power lies in its ability to guide informed decision-making and foster continuous improvement. Whether you aim to attract more students, enhance retention, or build stronger alumni relationships, data analytics provides the roadmap to success. Start leveraging its insights today and position your institution as a leader in an increasingly competitive landscape.


    Frequently Asked Questions 

    What is the role of data analysis in education marketing?

    Data analytics involves collecting, processing, and interpreting data to uncover patterns, trends, and actionable insights. In higher education marketing, data analytics enables you to better understand your target audience—prospective students, parents, alumni, and other stakeholders—and craft strategies that resonate with them.

    What are the benefits of big data analytics in higher education marketing? 

    A data-driven approach to marketing offers several advantages that can elevate your institution’s performance and visibility, including:

    • Decision-making
    • Personalization 
    • Cost efficiency 
    • The ability to track results

    Source link

  • Why unified data and technology is critical to student experience and university success

The Australian higher education sector continues to evolve rapidly, with hybrid learning, non-linear education, and the current skills shortage all shaping how universities operate.

    At the same time, universities are grappling with rising operational costs and decreased funding, leading to fierce competition for new enrolments.

    Amidst the dynamic landscape of higher education, the student experience has become a crucial factor in attracting and retaining students.

    The student experience encompasses a wide array of interactions, from how students first learn about an institution through to the enrolment process, coursework, social activities, wellbeing support and career connections. With so many student touchpoints to manage, institutions are turning to data and technology integrations to help streamline communications and improve their adaptability to change.

    Download the white paper: Why Unifying Data and Technology is Critical to the Success and Future of Universities

Enhancing institutional efficiency and effectiveness

Universities face an increasingly fragmented IT landscape, with siloed data and legacy systems making it difficult to support growth ambitions and improve student experiences.

    By integrating systems and data, institutions are starting to align digital and business strategies so that they can meet operational goals while providing more connected, seamless and personalised experiences for students.

    One of the most effective ways universities can achieve this is by consolidating disparate systems into a cloud-based Customer Relationship Management (CRM) solution, such as Salesforce.

Optimising admissions and enhancing student engagement

In recent years, there have been significant fluctuations in the enrolment of higher education students for numerous reasons – Covid-19 restrictions, declining domestic student numbers, high cost of living, proposed international student caps, and volatile labour market conditions being just a few.

    To better capture the attention of prospective students, institutions are now focusing on delivering more personalised and targeted engagement strategies. Integrated CRM and marketing automation is increasingly being used to attract more prospective students with tailored, well-timed communication.

    Universities are also using CRM tools to support student retention and minimise attrition. According to a Forrester study, students are 15 per cent more likely to stay with an institution when Salesforce is used to provide communications, learning resources and support services.

Streamlining communication and collaboration

By creating a centralised system of engagement, universities can not only support students throughout their academic journey, but also oversee their wellbeing.

    For example, a leading university in Sydney has developed a system that provides a comprehensive view of students and their needs, allowing for integrated and holistic support and transforming its incident reporting and case management.

Fostering stronger alumni and industry relations

Another area where CRM systems play a pivotal role is in building alumni and industry relationships. Alumni who feel valued by their university – through personalised engagement – are more likely to return when seeking upskilling, or to lend financial support.

    Personalising communication to industry partners can also help strengthen relationships, potentially leading to sponsored research, grants, and donations, as well as internships and career placements.

    University of Technology Sydney, for example, adopted a centralised data-led strategy for Corporate Relations to change how it works with strategic partners, significantly strengthening its partner network across the university.

    Unlocking the value of data and integration

    With unified data and digital technology driving personalised student interactions, university ICT departments can empower faculty and staff to exceed enrolment goals, foster lifelong student relationships and drive institutional growth.

    To learn more about the strategies and technologies to maximise institutional business value, download the white paper.


    Source link

  • UCAS End of Cycle provider data, 2024

    Chat to anyone involved in sector admissions and you will hear a similar story.

    And the story appears to be true.

    It is now clear “high tariff” providers have been lowering their entry tariff (often substantially) in order to grow recruitment – meaning students with less-than-stellar grades have been ending up in prestigious institutions, and the kinds of places students like this would more usually attend have been struggling to recruit as a result.

In other words, the 2024 cycle looks a lot like a lockdown cycle (without the examnishambles and Zoom pub quizzes).

    Any major dude will tell you

    We noted, at a sector level, the rise in the number of offers made by high-tariff providers – it was the highest number on record. There was no parallel rise in A level attainment, which suggests a strategic decision, made early on, to widen access.

    Today’s release of UCAS End of Cycle data for 2024 at provider level illustrates that this picture is a generalisation. Some high-tariff providers have acted in the way described above, others have pursued alternative strategies. And other providers have hit on other ways to drive undergraduate recruitment.

Starting with my favourite chart, we can think about these individual strategies in more detail. This scatter plot shows the year-on-year change in the number of applications along the horizontal axis and the year-on-year change in acceptances on the vertical. There are filters for gender, domicile, age group and subject group (at the top level) – and I’ve provided a choice of comparator years if you want to look at changes over a longer term. The size of the dots represents the total recruitment by that provider in 2024, given the parameters we can see.

    [Full screen]

In essence this illustrates popularity (among applicants) and selectivity. What we can see here for 2024 (defaulting to UK 18 year olds applying to all subjects, compared to 2023) is that pretty much the entire Russell Group has made significant (c.500 or above) increases in recruitment, whether or not they saw a corresponding growth in applications.

    It’s not the full story – the picture for other pre-92 and post-92 providers is more mixed, with some providers able to leverage popularity (or desperation) to find growth.

    My old school

    We can’t look directly at provider behaviour by tariff, but we can examine what qualifications students placed at the provider have – here a key indicator might be an increase in the number of students entering without A levels (a group that tends to have lower tariffs overall).

    [Full screen]

The trouble is, A level entry rates have also increased – pretty much anyone who wants to and can do A levels is now doing A levels. With the decline in BTEC popularity, and the still uncertain interest in T levels, this is to be expected. All this means most providers have seen an increase or steady state in the number of students entering with A levels (when you include the A level plus project options). In Scotland – and recall we don’t get the complete picture of Scottish applications from UCAS because of a wonderful little thing called intercalation – it’s SQA pretty much all the way.

    Everything you did

    If you are wondering whether a change in age groups placed as undergraduates could also have an impact on recruitment patterns, it looks as if the pattern of low and slowly falling mature recruitment continues for most providers. For larger universities most of the action is around 18 year old home recruitment – and specialist providers that focus on mature students (often via part-time or flexible study) tend to struggle.

    [Full screen]

The other key factor is domicile – the changes to visa arrangements this time last year had a huge impact on international applications (particularly from countries like India and Nigeria that have become important for lower tariff providers), and coupled with some of the changes described above this has resulted in some providers seeing undergraduate international admissions fall off a cliff.

    [Full screen]

    As always, undergraduate isn’t the full story – we’ve still no reliable way of understanding postgraduate recruitment in the round until we get the HESA data long after the academic year in question has finished. I just hope that regulators with new duties to understand the financial stability of the sector have more of a clue.

    Any world that I’m welcome to

With some providers stuffed to the seams and beyond with students they wouldn’t usually accept – many with support needs it is not clear they are able to meet – it is unclear who exactly benefits from this new state of affairs. The claim we regularly hear is that universities lose money on educating home students, and that these losses must be cross-subsidised by international recruitment.

The corollary is that in times when international student recruitment is restricted you would expect to see the number of home students fall at providers reliant on this income – after all, if you lose money on every home student, the more you recruit the more money you lose. Though measures to widen access and participation are important (and indeed, we see welcome evidence of contextual admissions at selective providers in the chart below), the fact is that you need to spend money to support students without the cultural capital to succeed.

    [Full screen]

The rather painful conclusion I reach is that the only way to make this year’s sums add up is a reduction in spend per student – and thus, most likely, in the quality of the student experience for precisely the students who would have been overjoyed to get a place at a famous university. We should keep a close eye on continuation metrics and the National Student Survey this year.

    Source link

  • Institutions may be holding themselves back by not sharing enough data

    Wonkhe readers need little persuasion that information flows are vital to the higher education sector. But without properly considering those flows and how to minimise the risk of something going wrong, institutions can find themselves at risk of substantial fines, claims and reputational damage. These risks need organisational focus from the top down as well as regular review.

    Information flows in higher education occur not only in teaching and research but in every other area of activity such as accommodation arrangements, student support, alumni relations, fundraising, staff and student complaints and disciplinary matters. Sometimes these flows are within organisations, sometimes they involve sharing data externally.

    Universities hold both highly sensitive research information and personal data. Examples of the latter include information about individuals’ physical and mental health, family circumstances, care background, religion, financial information and a huge range of other personal information.

The public narrative on risks around data tends to focus on examples of inadvertently sharing protected information – such as the Information Commissioner’s recent decision to fine the Police Service of Northern Ireland £750,000 in relation to the inadvertent disclosure of personal information about over 9,000 officers and staff in response to a freedom of information request. The same breach has also resulted in individuals bringing legal claims against the PSNI, with media reports suggesting a potential bill of up to £240m.

    There is also the issue of higher education institutions being a target for cyber attack by criminal and state actors. Loss of data through such attacks again has the potential to result in fines and other regulatory action as well as claims by those affected.

    Oversharing and undersharing

But inadvertent sharing of information and cyberattacks are not the only areas of risk. In some circumstances a failure to ensure that information is properly collected and lawfully shared may also create risk. And ensuring effective and appropriate flows of information to the governing body is key to its ability to fulfil its oversight function.

    One aspect of the tragic circumstances mentioned in the High Court appeal ruling in the case concerning Natasha Abrahart is the finding that there had been a failure to pass on information about a suicide attempt to key members of staff, which might have enabled action to be taken to remove pressure on Natasha.

Another area of focus concerns the sharing of information related to complaints of sexual harassment and misconduct and subsequent investigations. OfS condition E6 and its accompanying guidance, which come fully into effect on 1 August 2025, include measures on matters such as reporting potential complaints and the sensitive handling and fair use of information. The condition and guidance require the provider to set out comprehensively, and in an easy to understand manner, how it ensures that those “directly affected” by decisions are directly informed about those decisions and the reasons for them.

    There are also potential information flows concerning measures intended to protect students from any actual or potential abuse of power or conflict of interest in respect of what the condition refers to as “intimate personal relationships” between “relevant staff members” and students.

    All of these data flows are highly sensitive and institutions will need to ensure that appropriate thought is given to policies, procedures and systems security as well as identifying the legal basis for collecting, holding and sharing information, taking appropriate account of individual rights.

    A blanket approach will not serve

    Whilst there are some important broad principles in data protection law that should be applied when determining the legal basis for processing personal data, in sensitive cases like allegations of sexual harassment the question of exactly what information can be shared with another person involved in the process often needs to be considered against the particular circumstances.

Broadly speaking, in most cases where sexual harassment or mental health support is concerned, the legislation will require at minimum both a lawful basis and a condition for processing “special category” data and/or data that includes potential allegations of a criminal act. Criminal offences and allegations data and special category data (which includes data relating to an individual’s health, sex life and sexual orientation) are subject to heightened controls under the legislation.

    Without getting into the fine detail it can often be necessary to consider individuals’ rights and interests in light of the specific circumstances. This is brought into sharp focus when considering matters such as:

    • Sharing information with an emergency contact in scenarios that might fall short of a clear “life or death” situation.
    • Considering what information to provide to a student who has made a complaint about sexual harassment by another student or staff member in relation to the outcome of their complaint and of any sanction imposed.

It’s also important not to forget other legal frameworks that may be relevant to data flows. This includes express or implied duties of confidentiality that can arise where sensitive information is concerned. Careful thought needs to be given to making clear in relevant policies and documents when it is envisaged that information might need to be shared, provided the law permits it.

    A range of other legal frameworks can also be relevant, such as consumer law, equality law and freedom of information obligations. And of course, aside from the legal issues, there will be potential reputational and institutional risks if something does go wrong. It’s important that senior management and governing bodies have sufficient oversight and involvement to encourage a culture of organisational awareness and compliance across the range of information governance issues that can arise.

    Managing the flow of information

    Institutions ought to have processes to keep their data governance under review, including measures that map out the flows and uses of data in accordance with relevant legal frameworks. The responsibility for oversight of data governance lies not only with any Data Protection Officer, but also with senior management and governors who can play a key part in ensuring a good data governance culture within institutions.

Compliance mechanisms also need regular review and refresh, including matters such as how privacy information is provided to individuals in a clear and timely way. Data governance needs to be embedded throughout the lifecycle of each item of data. And where new activities, policies or technologies are being considered, data governance needs to be a central part of project plans at the earliest stages, to ensure that appropriate due diligence is undertaken and that other compliance requirements, such as data processing agreements or data protection impact assessments, are in place.

    Effective management of the flow ensures that the right data gets in front of the right people, at the right time – and means everyone can be confident the right balance has been struck between maintaining privacy and sharing vital information.

    This article is published in association with Mills & Reeve.

    Source link

  • Data futures, reviewed | Wonkhe

    As a sector, we should really have a handle on how many students we have and what they are like.

    Data Futures – the multi-year programme that was designed to modernise the collection of student data – has become, among higher education data professionals, a byword for delays, stress, and mixed messages.

It was designed to deliver in-year data (so 2024-25 data arriving within the 2024-25 academic year) three times a year, drive efficiency in data collection (by allowing for process streamlining and automation), and remove “data duplication” (becoming a single collection that could be used for multiple purposes by statutory customers and others). To date it has achieved none of these benefits, and has instead (for 2022-23 data) driven one of the sector’s most fundamental pieces of data infrastructure into such chaos that all forward uses of the data require heavy caveats.

    The problem with the future

    In short – after seven years of work (at the point the review was first mooted), and substantial investment, we are left with more problems than we started with. Most commentary has focused on four key difficulties:

• The development of the data collection platform, starting with Civica in 2016 and later taken over by Jisc, has been fraught with difficulties, frequently delayed, and subject to numerous changes in scope
    • The documentation and user experience of the data collection platform has been lacking. Rapid changes have not resulted in updates for those who use the platform within providers, or those who support those providers (the HESA Liaison team). The error handling and automated quality rules have caused particular issues – indeed the current iteration of the platform still struggles with fields that require responses involving decimal fractions.
• The behaviour of some statutory customers – frequently modifying requirements, changing deadlines, and putting unhelpful regulatory pressure on providers – has not helped matters.
    • The preparedness of the sector has been inconsistent between providers and between software vendors. This level of preparedness has not been fully understood – in part because of a nervousness among providers around regulatory consequences for late submissions.

    These four interlinked strands have been exacerbated by an underlying fifth issue:

    • The quality of programme management, programme delivery, and programme documentation has not been of the standards required for a major infrastructure project. Parts of this have been due to problems in staffing, and problems in programme governance – but there are also reasonable questions to be asked about the underlying programme management process.

    Decisions to be made

    An independent review was originally announced in November 2023, overlapping a parallel internal Jisc investigation. The results we have may not be timely – the review didn’t even appear to start until early 2024 – but even the final report merely represents a starting point for some of the fundamental discussions that need to happen about sector data.

I say a “starting point” because many of the issues raised by the review concern decisions about the projected benefits of Data Futures. As none of the original benefits of the programme have been realised in any meaningful way, the future of the programme (if it has one) needs to be focused on what people actually want to see happen.

The headline is in-year data collection. To the external observer, it is embarrassing that other parts of the education sector can return data on a near-real-time basis while higher education cannot – universities update the records they hold on students regularly, so it should not be impossible to update external data too. It should not come as a surprise, then, that the review recommends:

    As a priority, following completion of the 2023-24 data collection, the Statutory Customers (with the help of Jisc) should revisit the initial statement of benefits… in order to ascertain whether a move to in-year data collection is a critical dependent in order to deliver on the benefits of the data futures programme.

This isn’t just an opportunity for regulators to consider their shopping list – a decision to continue needs to be swiftly followed by a cost-benefit analysis, reassessing the value of in-year collection and determining whether or when to pursue it. And the decision is that there will, one day, be in-year student data. In a joint statement the four statutory customers said:

    After careful consideration, we intend to take forward the collection of in-year student data

    highlighting the need for data to contribute to “robust and timely regulation”, and reminding institutions that they will need “adequate systems in place to record and submit student data on time”.

    The bit that interests me here is the implications for programme management.

    Managing successful programmes

    If you look at the government’s recent record in delivering large and complex programmes you may be surprised to learn of the existence of a Government Functional Standard covering portfolio, programme, and project management. What’s a programme? Well:

    A programme is a unique, temporary, flexible organisation created to co-ordinate, direct and oversee the implementation of a set of projects and other related work components to deliver outcomes and benefits related to a set of strategic objectives

Language like this, and the concepts underpinning it, come from what remains the gold-standard programme management methodology, Managing Successful Programmes (MSP). If you are more familiar with the world of project management (project: “a unique temporary management environment, undertaken in stages, created for the purpose of delivering one or more business products or outcomes”), it bears a familial resemblance to PRINCE2.

    If you do manage projects for a living, you might be wondering where I have been for the last decade or so. The cool kids these days are into a suite of methodologies that come under the general description of “agile” – PRINCE2 is now seen primarily as a cautionary tale: a “waterfall” (top-down, documentation-centred, deadline-focused) management practice rather than an “iterative” (emergent, development-centred, short-term) one.

    Each approach has strengths and weaknesses. Waterfall methods are great if you want to develop something that meets a clearly defined need against clear milestones and a well understood specification. Agile methods are a nice way to avoid writing reports and updating documentation.

    Data futures as a case study

    In the real world, the distinction is less clear cut. Most large programmes in the public sector use elements of waterfall methods (regular project reports, milestones, risk and benefits management, senior responsible owners, formal governance) as a scaffold within which agile elements sit at a more junior level (short development cycles, regular “releases” of “product” prioritised above documentation). While this can be done well, it is very easy for the two ideologically separate approaches to drift apart – and it doesn’t take much to read this into what the independent review of Data Futures reveals.

    Recommendation B1 calls, essentially, for clarity:

    • Clarity of roles and responsibilities
    • Clarity of purpose for the programme
    • Clarity on the timetable, and on how and when the scope of the programme can be changed

    This is amplified by recommendation C1, which looks for specific clarifications around “benefits realisation” – which itself underpins the central recommendation relating to in-year data.

    In classic programme management (like MSP) the business case will include a map of programme benefits: that is, all of the good things that will come about as a result of the hard work of the programme. Like the business case’s risk register (a list of all the bad things that might happen and what could be done if they did) it is supposed to be regularly updated and signed off by the Programme Board – which is made up of the most senior staff responsible for the work of the programme: the Senior Responsible Owners, in the lingo.

    The statement of benefits languished for some time without a full update (there was an incomplete attempt in February 2023, and a promise to make another after the completed 2022-23 collection – we are not told whether that second update ever happened). In proper, grown-up programme management this is supposed to be done systematically: at every programme board meeting you review the benefits and the risk register. It’s dull (most of the time!) but it is important. The board needs to keep an eye on whether the programme still offers value overall (based on an analysis of projected benefits), and if the scope needed to change, the board would have the final say on that.

    The issue with Data Futures was a lack of clarity over whether this level of governance actually had the power to do these things, and – if not – who was actually doing them. The Office for Students latterly put together quite a complex and unwieldy governance structure, with a Quarterly Review Group (QRG) having oversight of the main programme board. The QRG was made up of very senior staff at the statutory customers (OfS, HEFCW, SFC, DoE(NI)), Jisc, and HESA (plus one Margaret Monckton – now chair of this independent review! – as an external voice).

    The QRG oversaw the work of the programme board – meaning that decisions made by the senior staff nominally responsible for the direction of the programme were often second-guessed by their direct line managers. The programme board was supposed to have its own assurance function and an independent observer – it had neither (despite the budget being there for it).

    Stop and go

    Another role of the board is to make what are more generally called “stop-go” decisions, described here as “approval to proceed”. This is an important way of making sure the programme is still on track – you’d set (in advance) the criteria that needed to be fulfilled in terms of delivery (was the platform ready, had the testing been done) before you moved on to the next work package. Below this, incremental approvals are made by line managers or senior staff as required, but reported upwards to the board.
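    To make the idea concrete, here is a minimal, purely illustrative sketch of what an “approval to proceed” gate might look like if the criteria were written down as data rather than prose. The criteria, owners and dates below are invented for illustration and are not drawn from the programme’s actual documentation – the point is simply that every condition carries an accountable owner and a deadline, so a conditional approval cannot quietly become a forgotten one.

```python
# Illustrative only: a stop-go ("approval to proceed") checklist expressed as data.
# The criteria, owners and dates are invented - the point is that every condition
# carries an accountable owner and a deadline, so nothing is waived into the void.
from dataclasses import dataclass
from datetime import date

@dataclass
class Condition:
    description: str
    owner: str       # accountable owner for meeting the condition
    deadline: date   # when it must be met
    met: bool = False

gate = [
    Condition("Platform passes load testing at expected submission volumes",
              owner="Delivery lead", deadline=date(2023, 5, 31)),
    Condition("Quality rules signed off against the frozen data model",
              owner="Data model lead", deadline=date(2023, 6, 30)),
]

def approval_to_proceed(conditions: list[Condition], today: date) -> str:
    outstanding = [c for c in conditions if not c.met]
    overdue = [c for c in outstanding if c.deadline < today]
    if not outstanding:
        return "approve"
    if overdue:
        return "stop: " + "; ".join(c.description for c in overdue)
    return "conditional approval: " + "; ".join(
        f"{c.description} ({c.owner}, by {c.deadline})" for c in outstanding
    )

print(approval_to_proceed(gate, date(2023, 6, 15)))
```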

    What seems to have happened a lot in the Data Futures programme is what’s called conditional approval – where some of these conditions were waived, based on assurances that the remaining work would be completed. This is fine as far as it goes (not everything lines up all the time), but as the report notes:

    While the conditions of the approvals were tracked in subsequent increment approval documents, they were not given a deadline, assignee or accountable owner for the conditions. Furthermore, there were cases where conditions were not met by the time of the subsequent approval

    Why would you do that? Well, you’d be tempted if you had another board above you – comprising very senior staff and key statutory customers – concerned about the very public problems with Data Futures and looking for progress. The QRG, as it turned out, only ended up making five decisions (and in three of those cases it simply punted the issue back down to the programme board – the other two, for completists, were decisions to delay plans for in-year collection).

    What it was meant to be doing was “providing assurance on progress”, “acting as an escalation point” and “approving external assurance activities”. As we’ve already seen, it didn’t really bother with external assurance. And on the other points the review is damning:

    From the minutes provided, the extent to which the members of the QRG actively challenged the programme’s progress and performance in the forum appears to be limited. There was not a clear delegation of responsibilities between the QRG, Programme Board and other stakeholders. In practice, there was a lack of clarity also on the role of the Data Futures governance structure and the role of the Statutory Customers separately to the Data Futures governance structure; some decisions around the data specification were taken outside of the governance structure.

    Little wonder that the section concludes:

    Overall, the Programme Board and QRG were unable to gain an independent, unbiased view on the progress and success of the project. If independent project assurance had been in place throughout the Data Futures project, this would have supported members of the Programme Board in oversight of progress and issues may have been raised and resolved sooner

    Resourcing issues

    Jisc, as developer, took on responsibility for technical delivery in late 2019. Incredibly, Jisc was not provided with funding to do this work until March 2020.

    As luck would have it, March 2020 saw the onset of a series of lockdowns and a huge upswing in demand for exactly the kind of technical and data skills needed to deliver a programme like Data Futures. Jisc struggled to fill key posts, most notably running for a substantial period of time without a testing lead in post.

    If you think back to the 2022-23 collection, the accepted explanation around the sector for what – at heart – had gone wrong was a failure to test “edge cases”. Students, it turns out, are complex and unpredictable things, with combinations of characteristics and registrations that you might not expect to find. A properly managed programme of testing would have focused on these edge cases, and there would have been fewer issues when the collection went live.
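    For readers closer to the testing side of this, a minimal sketch of what edge-case testing looks like in practice. The validation rule and the student records below are invented for illustration and have nothing to do with the actual Data Futures quality rules – the point is only that the awkward combinations get written down as test cases before go-live.

```python
# Purely illustrative: parameterised edge-case tests for a hypothetical
# student-record validation rule. Neither the rule nor the fields below come
# from the actual Data Futures specification.
import pytest

def validate_record(record: dict) -> list[str]:
    """Return quality-rule failures for a single student record (toy logic)."""
    errors = []
    if record.get("mode") == "dormant" and record.get("fte", 0) > 0:
        errors.append("dormant student with a positive FTE")
    start, end = record.get("start_date"), record.get("end_date")
    if start and end and end < start:
        errors.append("engagement ends before it starts")
    return errors

# The "edge cases": combinations you might not expect, but which real students produce.
EDGE_CASES = [
    ({"mode": "dormant", "fte": 0.5}, 1),                          # dormant, but still carrying FTE
    ({"start_date": "2023-01-09", "end_date": "2022-06-30"}, 1),   # dates the wrong way round
    ({"mode": "full-time", "fte": 1.0,
      "start_date": "2022-09-19", "end_date": "2023-06-16"}, 0),   # the happy path
]

@pytest.mark.parametrize("record,expected_failures", EDGE_CASES)
def test_edge_cases(record, expected_failures):
    assert len(validate_record(record)) == expected_failures
```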

    Underresourcing and understaffing are problems in their own right, but these were exacerbated by rapidly changing data model requirements, largely coming from statutory customers.

    To quote the detail from the report:

    The expected model for data collection under the Data Futures Programme has changed repeatedly and extensively, with ongoing changes over several years on the detail of the data model as well as the nature of collection and the planned number of in-year collections. Prior to 2020, these changes were driven by challenges with the initial implementation. The initial data model developed was changed substantially due to technical challenges after a number of institutions had expended significant time and resource working to develop and implement it. Since 2020, these changes were made to reflect evolving requirements of the return from Statutory Customers, ongoing enhancements to the data model and data specification and significantly, the ongoing development of quality rules and necessary technical changes determined as a result of bugs identified after the return had ‘gone live’. These changes have caused substantial challenges to delivery of the Data Futures Programme – specifically reducing sector confidence and engagement as well as resulting in a compressed timeline for software development.

    Sector readiness

    It’s not enough to conjure up a new data specification and platform – it is hugely important to be sure that your key people (“operational contacts”) within the universities and colleges that would be submitting data are ready.

    At a high level, this did happen – there were numerous surveys of provider readiness, and the programme also worked with the small number of software vendors that supply student information systems to the sector. This formal programme communication came alongside the more established links between the sector and the HESA Liaison team.

    However, such was the level of mistrust between universities and the Office for Students (which could technically have found struggling providers in breach of condition of registration F4) that it is widely understood that answers to these surveys were less than honest. As the report says:

    Institutions did not feel like they could answer the surveys honestly, especially in instances where the institution was not on track to submit data in line with the reporting requirements, due to the outputs of the surveys being accessible to regulators/funders and concerns about additional regulatory burden as a result.

    The decision to scrap a planned mandatory trial of the platform, taken in March 2022 by the Quarterly Review Group, was ostensibly about reducing burden – but, coupled with the unreliable survey responses, it meant that HESA was unable to identify cases where support was needed.

    This is precisely the kind of risk that should have been escalated to programme board level – a lack of transparency between Jisc and the board about readiness made it harder to take strategic action on the basis of evidence about where the sector really was. And the issue continued into live collection: because Liaison were not made aware of common problems (“known issues”, in fact), the team often struggled with out-of-date documentation, meaning that providers got conflicting messages from different parts of Jisc.

    Liaison, for their part, dealt with more than 39,000 messages between October and December 2023 (the peak of issues raised during the collection process) – and even given the problems noted above they resolved 61 per cent of queries on the first try. Given the level of stress in the sector (queries came in at all hours of the day) and the longstanding and special relationship that data professionals have with HESA Liaison, you could hardly criticise that team for making the best of a near-impossible situation.

    I am glad to see that the review notes:

    The need for additional staff, late working hours, and the pressure of user acceptance testing highlights the hidden costs and stress associated with the programme, both at institutions and at Jisc. Several institutions talked about teams not being able to take holidays over the summer period due to the volume of work to be delivered. Many of the institutions we spoke to indicated that members of their team had chosen to move into other roles at the institution, leave the sector altogether, experienced long term sickness absence or retired early as a result of their experiences, and whilst difficult to quantify, this will have a long-term impact on the sector’s capabilities in this complex and fairly niche area.

    Anyone who was even tangentially involved in the 2022-23 collection, or attended the “Data Futures Redux” session at the Festival of Higher Education last year, will find those words familiar.

    Moving forward

    The decision on in-year data has been made – it will not happen before the 2026-27 academic year, but it will happen. Programme delivery and governance will need to improve, and there are numerous detailed recommendations to that end: we should expect more detail, and a timeline, to follow.

    It does look as though there are more changes to the data model to come – though the recommendation is that the model should be frozen 18 months before the start of data collection, which by my reckoning would mean a confirmed data model printed out and on the walls of SROC members in the spring of 2026. A subset of institutions would make an early in-year submission, which may not be published in order to “allow for lower than ideal data quality”.

    On arrangements for the 2024-25 and 2025-26 collections there are no firm recommendations – the hope is that data model changes will be minimal and that the time will be used to ensure that the sector and Jisc are genuinely ready for the advent of the data future.

    Source link

  • Comparative Data on Race & Ethnicity in Education Abroad by Percentage of Students [2025]

    Comparative Data on Race & Ethnicity in Education Abroad by Percentage of Students [2025]

    References

    American Association of Community Colleges. (2024). AACC Fast Facts 2024. https://www.aacc.nche.edu/researchtrends/fast-facts/

    Fund for Education Abroad (FEA). (2024, December). Comparative Data on Race & Ethnicity of FEA Awards 2022-2023 by Percentage of Students. Data obtained from Joelle Leinbach, Program Manager at the Fund for Education Abroad. https://fundforeducationabroad.org/

    Institute of International Education. (2024). Profile of U.S. Study Abroad Students, 2024 Open Doors U.S. Student Data. https://opendoorsdata.org/data/us-study-abroad/student-profile/

    Institute of International Education. (2024). Student Characteristics: U.S. Students Studying Abroad at Associate’s Colleges, Data from the 2024 Open Doors Report. https://opendoorsdata.org/data/us-study-abroad/community-college-student-characteristics/

    Institute of International Education. (2022, May). A Legacy of Supporting Excellence and Opportunity in Study Abroad: 20-Year Impact Study, Comprehensive Report. Benjamin A. Gilman International Scholarship. https://www.gilmanscholarship.org/program/program-statistics/

    United States Census Bureau. (2020). DP1 | Profile of General Population and Housing Characteristics, 2020: DEC Demographic Profile. https://data.census.gov/table?g=010XX00US&d=DEC+Demographic+Profile

    U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. (2023, August). Characteristics of Postsecondary Students. https://nces.ed.gov/programs/coe/indicator/csb/postsecondarystudents

    Bibliography of Literature, Presentations & Curriculum Integration Projects Incorporating the Comparative Data Table on Race & Ethnicity in Education Abroad

    Comp, D. & Bakkum, N. (2025, January). Study Away/Abroad for All Students! – Who Studies Away/Abroad at Columbia College? Invited presentation for faculty at the Winter 2025 Faculty and Staff Development Days at Columbia College Chicago.

    Lorge, K. & Comp, D. (2024, April). A Case for Simple and Comparable Data to Assess Race and Ethnicity in Education Abroad. The Global Impact Exchange: Publication of Diversity Abroad. Spring 2024. https://www.diversityabroad.org/GlobalImpactExchange 

    Comp, D. (2019). Effective Utilization of Data for Strategic Planning and Reporting with Case Study: My Failed Advocacy Strategy. In A.C. Ogden, L.M. Alexander, & E. Mackintosh (Eds.), Education Abroad Operational Management: Strategies, Opportunities, and Innovations, A Report on ISA ThinkDen, 72-75. Austin, TX: International Studies Abroad. https://educationaltravel.worldstrides.com/rs/313-GJL-850/images/ISA%20ThinkDen%20Report%202018.pdf

    Comp, D. (2018, July). Effective Utilization of Data for Strategic Planning and Reporting in Education Abroad. Invited presentation at the 2018 ISA ThinkDen meeting, Boulder, CO.

    Comp, D. (2010). Comparative Data on Race and Ethnicity in Education Abroad. In Diversity in International Education Hands-On Workshop: Summary Report and Data from the Workshop held on September 21, 2010, National Press Club, Washington, D.C. (pp. 19-21). American Institute For Foreign Study. https://www.aifsabroad.com/publications/

    Stallman, E., Woodruff, G., Kasravi, J., & Comp, D. (2010, March). The Diversification of the Student Profile. In W.W. Hoffa & S. DePaul (Eds.). A History of US Study Abroad: 1965 to Present, 115-160. Carlisle, PA: The Forum on Education Abroad/Frontiers: The Interdisciplinary Journal of Study Abroad.

    Comp, D., & Woodruff, G.A. (2008, May). Data and Research on U.S. Multicultural Students in Study Abroad. Co-Chair and presentation at the 2008 NAFSA Annual Conference, Washington, D.C.

    Comp, D.  (2008, Spring). U.S. Heritage-Seeking Students Discover Minority Communities in Western Europe.  Journal of Studies in International Education, 12 (1), 29-37.

    Comp, D.  (2007). Tool for Institutions & Organizations to Assess Diversity of Participants in Education Abroad. Used by the University of Minnesota Curriculum Integration Project.

    Comp, D. (2006). Underrepresentation in Education Abroad – Comparative Data on Race and Ethnicity. Hosted on the NAFSA: Association of International Educators, “Year of Study Abroad” website.

    Comp, D. (2005, November). NAFSA: Association of International Educators Subcommittee on Underrepresentation in Education Abroad Newsletter, 1 (2), 6.

    Past IHEC Blog posts about the Comparative Data Table on Race & Ethnicity in Education Abroad

    Tool for Institutions & Organizations to Assess Diversity of Participants in Education Abroad [February 15, 2011]

    How Do We Diversify The U.S. Study Abroad Student Population? [September 21, 2010]

    How do we Diversify the U.S. Study Abroad Student Profile? [December 8, 2009]

    Source link

  • Crafting technology-driven IEPs

    Crafting technology-driven IEPs

    Key points:

    Individualized Education Plans (IEPs) have been the foundation of special education for decades, and the process by which these documents are written has evolved over the years.

    As technology has evolved, so has the way these documents are written. Before programs existed to streamline the process, creating an IEP was a daunting paper-and-pencil task. Not only has the process of writing the IEP evolved, but IEPs themselves are becoming technology-driven.

    Enhancing IEP goal progress with data-driven insights using technology: A variety of learning platforms can monitor a student’s performance in real time, tailoring content to their individual needs and flagging areas for intervention. Data from these programs can be used to create students’ annual IEP goals. This study mentions that the ReadWorks program, used for progress monitoring IEP goals, has 1.2 million teachers and 17 million students using its resources, which provide content, curricular support, and digital tools. ReadWorks provides all of its resources free of charge, with both printed and digital versions of the material available to teachers and students (Education Technology Nonprofit, 2021).
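    As a purely hypothetical illustration of this kind of data-driven insight (this is not ReadWorks’ actual data format or API, and the numbers are invented), a few lines of code can turn weekly progress-monitoring scores into a simple check of whether a student is on track for an annual goal:

```python
# Hypothetical example only (not ReadWorks' real data or API): turning weekly
# progress-monitoring scores into a simple "on track?" check for an annual IEP goal.
from statistics import linear_regression  # Python 3.10+

weekly_scores = [42, 45, 44, 48, 51, 50, 54]   # e.g. words correct per minute, one score per week
weeks = list(range(len(weekly_scores)))

goal_week, goal_score = 36, 90                 # annual goal: 90 wcpm by week 36

slope, intercept = linear_regression(weeks, weekly_scores)
projected = intercept + slope * goal_week

print(f"Trend: +{slope:.1f} per week; projected score at week {goal_week}: {projected:.0f}")
print("On track" if projected >= goal_score else "Progress review needed")
```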

    Student engagement and involvement with technology-driven IEPs: Technology-driven IEPs can also empower students to take an active role in their education plan. According to this study, research shows that special education students benefit from educational technology, especially in concept teaching and in practice-feedback instructional activities (Carter & Center, 2005; Hall, Hughes & Filbert, 2000; Hasselbring & Glaser, 2000). It is vital for students to take ownership of their learning, and when students on an IEP reach a certain age, it is important for them to take an active lead in their plan. Digital tools used for technology-driven IEPs can give students visual representations of their progress, such as dashboards or graphs, and when students can see their progress, their engagement and motivation increase.

    Technology-driven IEPs make learning fun: This study discusses technology-enhanced and game-based learning for children with special needs. Gamified programs, virtual reality (VR), and augmented reality (AR) change the learning experience from traditional to transformative. Gamified programs are intended to motivate students with rewards, personalized feedback, and competition through leaderboards and challenges, making learning feel like play. Virtual reality gives students immersive experiences they would otherwise only get outside the classroom, allowing for deep engagement and experiential learning via virtual field trips and simulations, without the risk of visiting dangerous places or the field trip fees that not all districts or students can afford. Augmented reality lets students visualize abstract concepts, such as anatomy or 3D shapes, in context. All of these technologies align with technology-driven IEPs by providing personalized, accessible, and measurable learning experiences that address diverse needs, and they can adapt to a student’s individual skill level, pace, and goals in support of their IEP.

    Challenges with technology-driven IEPs: Although there are many benefits to technology-driven IEPs, it is important to address the potential challenges to ensure equity across school districts. Access to technology in underfunded districts can be difficult without proper investment in infrastructure, devices, and network connectivity. Student privacy and data security must also be properly addressed: when using these technologies, school districts must take into consideration laws such as the Family Educational Rights and Privacy Act (FERPA).

    The integration of technology into the IEP process represents a shift from a traditional process to a transformative one. Technology-driven IEPs create more student-centered learning experiences by implementing digital tools, enhancing collaboration, and personalizing learning. These experiences enhance student engagement and motivation and allow students to take control of their own learning, making them leaders in their IEP process. However, as technology continues to evolve, it is important to address the equity gap that may arise in underfunded school districts.


    Source link

  • Why Data Alone Won’t Improve Retention – Faculty Focus

    Why Data Alone Won’t Improve Retention – Faculty Focus

    Source link