Science Minister Tim Ayres, Assistant Technology Minister Andrew Charlton, entrepreneur Rebecca Di-Noia and AI researcher Lailei Huang at The Factory grounds in Parramatta to announce the release of the National AI Plan. Picture: Simon Bullard.
Australia’s first national plan for artificial intelligence aims to upskill workers to boost productivity, but will leave the tech largely unregulated and without its own legislation to operate under.
What does the latest research tell us about students using AI for college planning?
If you have spent time with today’s high school students, you know their college search journey looks nothing like it did ten, or even five, years ago. A glossy brochure or a well-timed postcard still has a place. However, the first “hello” increasingly comes through a digital assistant, a TikTok video, or a quick artificial intelligence–powered search.
Let us not pretend artificial intelligence (AI) is everyone’s new best friend. Some students are eager, some are eye-rolling, and plenty are stuck in the “maybe” camp. That mix of excitement and hesitation is real, and it deserves as much attention as hype.
The data is clear: nearly half of students (45 percent) have already used a digital AI assistant on a college website, with usage peaking among 9th- and 10th-graders (RNL, Halda, & Modern Campus, 2025). At the same time, a full third of students nationwide have turned to tools like ChatGPT to explore colleges, scholarships, and even essay help (RNL & Teen Voice, 2025).
This trend is playing out nationwide, with major news outlets reporting that AI chatbots are becoming a common part of the college application process, assisting students with everything from brainstorming essays to navigating deadlines (Singer, 2023).
For many students, AI is not futuristic; it is already woven into how they imagine, explore, and narrow their choices. Recent reporting confirms that AI-driven college search platforms are helping more students, especially those without access to personalized guidance, find the right fit and expand their options beyond what they might have considered on their own (Greenberg, 2025).
Beyond RNL: What other research shows
The RNL findings fit a much bigger story about how AI changes education. Around the world, researchers are watching students test, tinker, and sometimes wrestle with what these tools mean for learning and planning.
One line of research looks at predictive modeling. Recent studies have shown that AI-driven platforms can analyze student data such as grades, extracurricular activities, and demographics to predict which students are likely to pursue college and which might need extra support (Eid, Mansouri, & Miled, 2024). By flagging students at risk of falling off the college pathway, these predictive systems allow counselors to intervene earlier, potentially changing a student’s trajectory.
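To make the idea concrete, here is a minimal sketch of the kind of classifier such platforms might use: a simple model trained on a few student features that flags who may need earlier counselor outreach. The features, data, and threshold are invented for illustration and are not drawn from the cited studies.

```python
# A hypothetical sketch of a predictive "college pathway" flag.
# Features (illustrative): GPA, attendance rate, number of extracurriculars,
# and whether the student has met with a counselor (1 = yes).
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([
    [3.8, 0.97, 3, 1],
    [2.1, 0.80, 0, 0],
    [3.2, 0.92, 1, 1],
    [1.9, 0.70, 0, 0],
    [3.5, 0.95, 2, 0],
    [2.4, 0.85, 1, 0],
])
# Label: 1 = went on to apply to college, 0 = did not (synthetic data).
y_train = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# Score a new student and flag for outreach if the predicted probability
# of staying on the college pathway is low (threshold is illustrative).
new_student = np.array([[2.3, 0.78, 0, 0]])
prob = model.predict_proba(new_student)[0, 1]
if prob < 0.5:
    print(f"Flag for early counselor outreach (p = {prob:.2f})")
```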
Another cluster of studies zeroes in on personalized guidance. Tools built around a student’s interests and goals can recommend classes, extracurriculars, and colleges that “fit” better than a generic list. This is especially important in schools where one counselor may juggle hundreds of students (Majjate et al., 2023).
Meanwhile, students are already using AI, sometimes in ways that make their teachers nervous. A Swedish study added some nuance: the most confident students use AI the most, while those who are already unsure of their skills tend to hold back (Klarin, 2024). That raises real equity questions about who benefits.
And not all students are fans. Some research highlights concerns about privacy, over-reliance, and losing the chance to build their problem-solving muscles. It is a reminder that skepticism is not resistance for resistance’s sake but a way of protecting what matters to them.
On the institutional side, surveys suggest that many colleges are preparing to use AI in admissions, whether for transcript analysis or essay review. Recent coverage underscores that admissions offices are increasingly turning to AI tools to streamline application review, identify best-fit students, and even personalize outreach (Barnard, 2024).
If all of this feels like a promise and a warning label, it is because it is. AI can democratize access to information, but it can also amplify bias. Students know that. And they want us to take their concerns seriously.
Meet the pioneers, aspirers, resistors, and fence-sitters
As revealed by our research in The AI Divide in College Planning (RNL & Teen Voice, 2025), not all students approach artificial intelligence the same way. Four personas stand out:
Pioneers are already deep in the mix, using artificial intelligence for research, essays, and scholarship searches. Many say it has opened doors to colleges they might not have even considered otherwise.
Aspirers are curious but want proof. They like the idea of scholarship searches or cost planning, but need easy, free tools and success stories before they commit.
Resistors lean on counselors and family. They are worried about accuracy and privacy, but might come around if an advisor they trust introduces the tool.
Fence-Sitters are classic “wait and see” students. A third might trust artificial intelligence to guide them through the application process, but the majority are still unsure.
The takeaway? There is no single “artificial intelligence student.” Institutions need flexible strategies that welcome the eager, reassure the cautious, and do not alienate the skeptics.
What happens after the chatbot says, “Hello”?
One of the most striking findings from the E-Expectations study is that students rarely stop at the chatbot (RNL, Halda, & Modern Campus, 2025). After engaging with an AI assistant, they move. Twenty-nine percent email admissions directly, 28% click deeper into the website, 27% fill out an inquiry form, and almost a quarter apply.
In other words, that little chat bubble is not just answering frequently asked questions. It is a launchpad.
Personalization meets privacy
Here is another twist. While most students (61%) want personalization, they want it on their terms. Nearly half prefer to filter and customize their content, while only 16% want the college to decide automatically (RNL, Halda, & Modern Campus, 2025).
That is the sweet spot for artificial intelligence: not deciding for students but giving them the levers to design their journey.
What this means for your enrollment teams
AI is not just a front-end feature but a funnel mover. Treat chatbot engagement like an inquiry. Have a system ready to respond quickly when a student shifts from chatting to acting.
Remember the personas. Pioneers want depth, Aspirers want reassurance, Resistors want trusted guides, and Fence-Sitters want time. Design communications that honor those differences instead of pushing one script for all.
Personalization is not about guessing. It is about giving students control. Build tools that let them filter, sort, and search, and resist the temptation to over-curate their journey for them.
AI is a natural fit for cost and scholarship exploration. If you want to hook Aspirers, put AI into your net price calculators or scholarship finders.
Virtual tours and event registration bots should not feel like gimmicks. When done well, they can bridge the gap between interest and visit, giving students confidence before setting foot on campus.
Download the complete reports from RNL and our partners to see what students are telling us directly:
Dr. Emmanuel Lalande
Historically Black Colleges and Universities (HBCUs) have always stood on the frontlines of educational equity, carving pathways to excellence for generations of Black students against overwhelming odds. Today, as higher education faces a shift driven by technology, declining enrollment, and resource disparities, a new opportunity emerges: the power of Artificial Intelligence (AI) to reshape, reimagine, and reinforce the mission of HBCUs.
From admissions automation and predictive analytics to personalized learning and AI-powered tutoring, artificial intelligence is no longer theoretical; it is operational. At large institutions, AI-driven chatbots and enrollment algorithms have already improved student engagement and reduced summer melt. Meanwhile, HBCUs, particularly smaller and underfunded ones, risk being left behind.
The imperative for HBCUs to act now is not about chasing trends; it is about survival, relevance, and reclaiming leadership in shaping the future of Black education.
AI as a Force Aligned with the HBCU Mission
Artificial intelligence, when developed and implemented with intention and ethics, can be one of the most powerful tools for educational justice. HBCUs already do more with less. They enroll 10% of Black students in higher education and produce nearly 20% of all Black graduates. These institutions are responsible for over 25% of Black graduates in STEM fields, and they produce a significant share of Black teachers, judges, engineers, and public servants.
The power of AI can amplify this legacy.
Predictive analytics can flag at-risk students based on attendance, financial aid gaps, and academic performance, helping retention teams intervene before a student drops out.
AI chatbots can provide round-the-clock support to students navigating complex enrollment, financial aid, or housing questions.
AI tutors and adaptive platforms can meet students where they are, especially for those in developmental math, science, or writing courses.
Smart scheduling and resource optimization tools can help HBCUs streamline operations, offering courses more efficiently and improving completion rates.
For small HBCUs with limited staff, outdated technology, and tuition-driven models, AI can serve as a strategic equalizer. But accessing these tools requires intentional partnerships, resources, and cultural buy-in.
The Philanthropic Moment: A Unique Opportunity
The recent announcement from the Bill & Melinda Gates Foundation that it plans to spend its entire $200 billion endowment by 2045 presents a monumental opportunity. The foundation has declared a sharpened focus on “unlocking opportunity” through education, including major investments in AI-powered innovations in K-12 and higher education, particularly in mathematics and student learning platforms.
One such investment is in Magma Math, an AI-driven platform that helps teachers deliver personalized math instruction. The foundation is also actively funding research and development around how AI can close opportunity gaps in postsecondary education and increase economic mobility. Their call for “AI for Equity” aligns with the HBCU mission like no other.
Now is the time for HBCUs to boldly approach philanthropic organizations like the Gates Foundation as strategic partners capable of leading equity-driven AI implementation.
Other foundations should follow suit. Lumina Foundation, Carnegie Corporation, Kresge Foundation, and Strada Education Network have all expressed interest in digital learning and postsecondary success. A targeted, collaborative initiative to equip HBCUs with AI infrastructure, training, and research capacity could be transformative.
Tech Industry Engagement: From Tokenism to True Partnership
The tech industry has begun investing in HBCUs, but more is needed.
OpenAI recently partnered with North Carolina Central University (NCCU) to support AI literacy through its Institute for Artificial Intelligence and Emerging Research. The vision includes scaling support to other HBCUs.
Intel has committed $750,000 to Morgan State University to advance research in AI, data science, and cybersecurity.
Amazon launched the Educator Enablement Program, supporting faculty at HBCUs in learning and teaching AI-related curricula.
Apple and Google have supported HBCU initiatives around coding, machine learning, and entrepreneurship, though these efforts are often episodic or branding-focused. What’s needed now is sustained, institutional investment.
Huston-Tillotson University hosted an inaugural HBCU AI Conference and Training Summit back in April, bringing together AI researchers, students, educators, and industry leaders from across the country. This gathering focused on building inclusive pathways in artificial intelligence, offering interactive workshops, recruiter engagement, and a platform for collaboration among HBCUs, community colleges, and major tech firms.
We call on Microsoft, Salesforce, Nvidia, Coursera, Anthropic, and other major EdTech firms to go beyond surface partnerships. HBCUs are fertile ground for workforce development, AI research, and inclusive tech talent pipelines. Tech companies should invest in labs, curriculum development, student fellowships, and cloud infrastructure, especially at HBCUs without R1 status or multi-million-dollar endowments.
A Framework for Action Across HBCUs
To operationalize AI within the HBCU context, a few strategic steps can guide implementation:
1. AI Capacity Building Across Faculty and Staff
Workshops, certification programs, and summer institutes can train faculty to integrate AI into pedagogy, advising, and operations. Staff training can ensure AI tools support, not replace, relational student support.
2. Student Engagement Through Research and Internships
HBCUs can establish AI learning hubs where students gain real-world experience developing or auditing algorithms, especially those designed for educational equity.
3. AI Governance
Every HBCU adopting AI must also build frameworks for data privacy, transparency, and bias prevention. As institutions historically rooted in justice, HBCUs can lead the national conversation on ethical AI.
4. Regional and Consortial Collaboration
HBCUs can pool resources to co-purchase AI tools, share grant writers, and build regional research centers. Joint proposals to federal agencies and tech firms will yield greater impact.
5. AI in Strategic Planning and Accreditation
Institutions should embed AI as a theme in Quality Enhancement Plans (QEPs), Title III initiatives, and enrollment management strategies. AI should not be a novelty; it should be a core driver of sustainability and innovation.
Reclaiming the Future
HBCUs were built to meet an unmet need in American education. They responded to exclusion with excellence. They turned marginalization into momentum. Today, they can do it again, this time with algorithms, neural networks, and digital dashboards.
But this moment calls for bold leadership. We must go beyond curiosity and into strategy. We must demand resources, form coalitions, and prepare our institutions not just to use AI, but to shape it.
Let HBCUs define what culturally competent, mission-driven artificial intelligence looks like in real life, not in theory.
And to the Gates Foundation, Intel, OpenAI, Amazon, and all who believe in the transformative power of education: invest in HBCUs. Not as charity, but as the smartest, most impactful decision you can make for the future of American innovation.
Because when HBCUs lead, communities rise. And with AI in our hands, the next level of excellence is well within reach.
Dr. Emmanuel Lalande currently serves as Vice President for Enrollment and Student Success and Special Assistant to the President at Voorhees University.
In a digitally driven world, artificial intelligence (AI) has become the latest technology that will either save or doom the planet, depending on whom you ask. Remember when telephones (the ones that hung on the wall) were dubbed privacy invaders? Even the radio, television, and VHS tapes were feared at the beginning of their existence. Artificial intelligence is no different, but how can we ease the minds of educators who have trouble embracing the newest innovation in emerging technologies? A shift in the fundamental mindset of educators and learners will be vitally important as AI becomes more and more commonplace. To guide this transformative learning process, critical thinking will become an invaluable commodity.
The critical thinking model developed by Dr. Richard Paul and Dr. Linda Elder is pragmatic and fosters the critical thinking skills needed to navigate AI. Paul and Elder define critical thinking as “the art of analyzing and evaluating thought processes with a view to improving them” (Paul and Elder 2020, 9). The key is to teach your students ways to improve their thinking, and the Paul and Elder model can be an effective tool for doing so.
Navigating the Disorienting Dilemma
As we reason through this innovative technology, we will question truth and reality. Teaching students to critically analyze the information generated by AI chatbots will become necessary for a progressing society. Determining fact from fiction will be a skill that dedicated educators train their students to harness in the work they complete.
Mezirow (1994, 224) contended that a transformative learning experience starts with a disorienting dilemma that causes an individual to question previous assumptions by critically reflecting, validating that reflection with insight, and acting upon the new information. I, like many I assume, believed that artificial intelligence was a far-fetched concept that would only be real in the movies; however, AI is here, and large language models such as ChatGPT and Gemini are only going to get more sophisticated with time. It was only when I was exposed to the Paul and Elder model for critical thinking in grad school that I realized how much I did not know. I had my transformative moment when I realized that critical thinking is more complex than I thought and that I would need to step up my thinking game if I wanted to become an advanced thinker. Artificial intelligence will challenge even the most confident thinkers. Determining fact from fiction will be the disorienting dilemma that leads us on this transformative journey. As educators, three strategies we can use to support this transformation with students are to step up our thinking game, model critical thinking, and use AI for our benefit.
Step Up Your Thinking Game
An advanced thinker not only poses questions to others but focuses within. Understanding the why behind reasoning, acknowledging personal biases and assumptions, and valuing others’ perspectives are key to developing critical thinking skills. The reason that you and your students choose to use AI should be clear and intentional. AI is a tool that produces instantaneous solutions. The resulting details from AI should be analyzed for accuracy, logic, and bias. Results should be compared with multiple sources to ensure that the information, the conclusions, and the implications are precise and complete. Practicing these strategies fosters the development of intellectual virtues, such as intellectual humility, intellectual autonomy, and intellectual integrity.
Model Critical Thinking
As educators, we serve as leaders. Ultimately, our students look up to us and use our guidance in their learning. By modeling critical thinking with students, you are leading the way to fostering intentional questioning, ethical principles, and reflective practices. A start is to change the focus of your teaching from the expectation that students regurgitate information to focusing on more challenging, thought-provoking content that fosters thinking. AI can be a helpful tool for coming up with ideas, helping to shape lesson plans, and designing activities, but the real work will come from designing authentic questions that students can be trained to ask regardless of what the AI generates, such as:
How can I verify the validity and accuracy of this information?
Does the response represent logical and in-depth details?
Is the information precise, significant, and relevant to the knowledge that I am seeking?
Are perspectives that differ from mine represented or can I recognize bias in the information?
What other questions could be asked to dive deeper and more precisely into the information?
Use AI for Your Benefit
Generating activity ideas or lesson plans, creating rubrics, and assisting with basic writing tasks are three ways to easily get started with an AI chatbot. If the output is not what you expected or is incomplete, continue to give the chatbot more information to drive the chatbot to produce the desired outcome. Once you begin practicing with an AI chatbot, achieving your desired outcomes will become second nature.
Using learning outcomes as the basis for an inquiry provides AI with the information needed to generate an activity or lesson plan in seconds with objectives, timed components, suggestions for implementation, a materials list, closing, and follow-up ideas for the activity. Try typing a statement into an AI chatbot, such as ChatGPT or Gemini, and be amazed at the magic. When formulating your inquiry, remember to start with the end goal in mind and describe to AI the output you want. For example, type “Use this learning outcome to create an activity: (add learning outcome)”. AI will generate a comprehensive activity with all the bells and whistles.
Rubric development can be a cumbersome process; therefore, using AI to generate a rubric for a project that you have poured sweat and tears into creating is a very simple and time-saving process. Ask AI to generate a rubric based on the information and directions that you give your students. If the generated rubric is not the right style or in the right format, simply refocus the AI chatbot by being more explicit in your instructions. For example, you may need to be as specific as “Create an analytic rubric with 100, 90, 80, 70, and 0 as the levels of performance using the following expectations for the assignment (paste directions and outcomes).” As always, use your critical thinking skills to evaluate the rubric and edit it to best meet your needs before sharing it with students.
Use AI to assist you with generating clearer and more concise messages. When writing an email, giving student feedback, or writing in general, a quick and easy way to use AI is to give the command “make this sound better” and paste in your message. When teaching your students to use AI, have them question the output that was generated, for example, “Why is this statement clearer and more concise than my original?” or “What can I learn from how AI changed my verbiage?” Focusing on the “why” of the produced information will be the key to fostering critical thinking with your students.
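The same prompt patterns work whether you type them into a chat window or send them from a short script. Below is a minimal, hypothetical sketch assuming the OpenAI Python SDK; the model name, learning outcome, and follow-up instruction are placeholders, and the point is the prompt structure rather than the tooling.

```python
# A hedged sketch: sending the "start with the end goal in mind" prompt
# pattern from a script instead of a chat window. Assumes the OpenAI
# Python SDK is installed and an API key is configured in the environment.
from openai import OpenAI

client = OpenAI()

# Placeholder learning outcome; swap in your own.
learning_outcome = "Students will distinguish fact from opinion in a short news article."
prompt = f"Use this learning outcome to create an activity: {learning_outcome}"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model you have access to
    messages=[{"role": "user", "content": prompt}],
)
activity = response.choices[0].message.content

# Follow-up turns refine the output, just as they would in a chat window.
follow_up = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": activity},
        {"role": "user", "content": "Make this sound better and add a closing reflection question."},
    ],
)
print(follow_up.choices[0].message.content)
```

As with the chat interface, the output still needs the same critical evaluation before it reaches students.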
AI is a useful and impactful, yet imperfect, tool. Fostering critical thinking with your students will help them develop the skills needed to recognize bias, inaccuracies, and AI hallucinations. With practice in writing specific instructions and questioning the outcome, students will learn to trust their own judgment when evaluating AI-generated information.
Dr. Tina Evans earned her Ed.D. in Adult Education from Capella University in 2024. With over 25 years of experience in the education field, she brings deep expertise in higher education curriculum design, technology integration, and evidence-based practices for adult learners. Driven by a passion for critical thinking and a genuine commitment to supporting others, Dr. Evans continues to make a meaningful impact in both her professional and personal spheres.
In today’s competitive higher education landscape, institutions can no longer afford to rely on instinct alone when it comes to academic program planning. The stakes are too high and the margin for error too slim.
Leaders are facing increasing pressure to align their portfolios with market demand, institutional mission, and student expectations — all while navigating constrained resources and shifting demographics.
The good news? You don’t have to guess. Market intelligence offers a smarter, more strategic foundation for building and refining your academic program mix.
Why program optimization matters now more than ever
Most institutions have at least one program that’s no longer pulling its weight — whether due to declining enrollment, outdated relevance, or oversaturated competition. At the same time, there are often untapped opportunities for growth in emerging or underserved fields.
But how do you decide which programs to scale, sustain, or sunset?
Optimizing your portfolio requires more than internal performance metrics. It calls for an external lens — one that brings into view national and regional trends, labor market signals, and consumer behavior. When done effectively, academic portfolio strategy becomes less about trial and error, and more about clarity and confidence.
The first step: Start with the market
The strongest portfolio strategies begin with robust external data. At Collegis Education, we draw from sources like the National Center for Education Statistics (IPEDS), Lightcast labor market analytics, and Google search trends to assess program performance, student demand, and employment outlooks.
National trends give us the big picture and a foundation to start from. But for our partners, we prioritize regional analysis — because institutions ultimately compete and serve in specific geographic contexts, even with fully online programs. Understanding what’s growing in your state or region is often more actionable than knowing what’s growing nationwide.
Our proprietary methodology filters for:
Five-year conferral growth with positive year-over-year trends
Programs offered by a sufficient number of institutions (to avoid anomalies)
Competitive dynamics and saturation thresholds
Job postings and projected employment growth
This data-driven process helps institutions avoid chasing short-term trends and instead focus on sustainable growth areas.
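For readers who want to see what such a screen looks like in practice, here is a hypothetical sketch of the kind of filter described above, applied to a small table of programs. The column names, thresholds, and figures are illustrative assumptions, not Collegis' actual methodology or data.

```python
# A hypothetical program screen: positive five-year conferral growth,
# positive year-over-year trend, enough institutions offering the program
# to avoid anomalies, and positive projected employment growth.
import pandas as pd

programs = pd.DataFrame({
    "program": ["Computer Science", "Health Sciences", "Philosophy", "Data Analytics"],
    "five_yr_conferral_growth": [0.22, 0.18, -0.05, 0.35],  # 5-year % change (illustrative)
    "yoy_growth_positive": [True, True, False, True],
    "institutions_offering": [950, 640, 780, 210],
    "projected_job_growth": [0.15, 0.12, 0.03, 0.21],        # projected % growth (illustrative)
})

screen = (
    (programs["five_yr_conferral_growth"] > 0)
    & programs["yoy_growth_positive"]
    & (programs["institutions_offering"] >= 100)   # assumed anomaly threshold
    & (programs["projected_job_growth"] > 0.05)    # assumed demand threshold
)

print(programs.loc[screen, "program"].tolist())
```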
Data in action: Insights from today’s growth programs
Collegis’ latest program growth analyses — drawing from 2023 conferral data — surface a diverse mix of high-opportunity programs. While we won’t detail every entry here, a few trends stand out:
Technology and healthcare programs remain strong at the undergraduate level, with degrees like Computer Science and Health Sciences showing continued growth.
Graduate credentials in education and nursing reflect both workforce need and strong student interest.
Laddering potential is especially evident in fields like psychology and health sciences, where institutions can design seamless transitions from associate to bachelor’s. In fields such as education, options to ladder from certificate to master’s programs are growing in demand.
What’s most important isn’t the specific programs; it’s what they reveal: external data can confirm intuition, challenge assumptions, and unlock new strategic direction. And when paired with regional insights, these findings become even more powerful.
How to turn insight into strategy
Having market data is just the beginning. The true value lies in how institutions use it. At Collegis, we help our partners translate insight into action through a structured portfolio development process that includes the following:
Market analysis: Analyzing external data to identify growth areas, saturation risks, and demand signals — regionally and nationally.
Gap analysis: Identifying misalignments between current offerings and market opportunity.
Institutional alignment: Layering in internal metrics — enrollment, outcomes, mission fit, modality, and margin.
Strategic decisions: Prioritizing programs to expand, launch, refine, or sunset.
By grounding these decisions in both internal and external intelligence, institutions can future-proof their portfolios — driving enrollment, meeting workforce needs, and staying mission-aligned.
Put data to work for your portfolio
Program portfolio strategy doesn’t have to be a guessing game. With the right data and a trusted partner, institutions can make bold, confident moves that fuel growth and student success.
Whether you’re validating your instincts or exploring new academic directions, Collegis can help. Our market research and portfolio development services are built to support institutions at every step of the process — with national insights and regional specificity to guide your next move.
Artificial Intelligence (AI) has become a tool that is used in the classroom. The integration of technology in education has historically been gradual (Holmes, Bialik, and Fadel 2019). Some educators lack the training to effectively use AI in the classroom, which may limit the ability to design AI-based coursework (Amado-Salvatierra et al. 2024). In addition, the lack of understanding of how AI usage can be applied pedagogically can potentially deter AI integration (Afzaal et al. 2024). The focus of this article is to share examples of AI input prompts to generate case studies as a learning tool to help students learn course topics and learning outcomes.
Case studies provide opportunities for students to learn and/or reinforce what they have already learned. While using AI to create case studies, it is important to use well-constructed prompts to obtain appropriate output. For example, a single prompt asking AI to give five cases on five different topics may not provide sufficient detail. However, prompting AI to generate a case study on one topic will potentially generate a more appropriate case. Including as much detail as possible in the initial prompt helps produce better output. An AI prompt example might be: ‘Assume the role of a professor teaching an introductory accounting course. Generate a case study for students to use to learn the very basic format of a balance sheet. Make sure the case study is real-world relevant.’ The stated role, course, and topic can be tailored to fit an appropriate class. The case study generated should be reviewed to verify alignment with the specific topic(s) and learning outcome(s).
Cases should be generated in a format that allows for measurable assessment of the students’ learning. Specifying this detail in an AI prompt while generating a case study will help ensure the AI output is measurable. An additional AI prompt example to include with the previous example might be: ‘The case study is to include an assignment portion that allows faculty to measure the student’s performance.’ Careful evaluation of the output from this prompt needs to be performed to ensure topic alignment.
Once an appropriate case study has been developed, AI can also generate a grading rubric for it. An AI prompt example for generating a grading rubric is: ‘Provide a grading rubric that aligns with this case study.’ Reviewing the rubric to confirm it measures the appropriate topics and learning is recommended.
At any point in this process, AI can be used to change the output. For example, an appropriate AI prompt to modify something in a case study might be: ‘Update the above-generated case study to include 5 assets, 3 liabilities, and 2 owners’ equity accounts.’ After any revision, previously generated output (for instance, the rubric) may need to be regenerated. It is recommended to note the AI prompts that generate acceptable output so that those prompts can be reused for future AI-generated case studies.
Another output for the case study could be an answer sheet for faculty to use and to share with students afterward to enable self-evaluation of performance. An AI prompt example might be: ‘Provide the answer sheet for this case study. Make sure to include details of any calculations and definitions of key terms.’
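Faculty who want to reuse these prompts across courses can keep them as simple templates. The sketch below is a minimal illustration in Python, assuming plain string formatting; the wording follows the example prompts quoted in this article, and the role, course, and topic are parameters to tailor to a specific class.

```python
# A minimal sketch of the prompt chain described above, kept as reusable
# templates. Wording follows the article's example prompts; role, course,
# and topic are parameters to adapt.

def case_study_prompt(role: str, course: str, topic: str) -> str:
    return (
        f"Assume the role of a {role} teaching {course}. "
        f"Generate a case study for students to use to learn {topic}. "
        "Make sure the case study is real-world relevant. "
        "The case study is to include an assignment portion that allows "
        "faculty to measure the student's performance."
    )

RUBRIC_PROMPT = "Provide a grading rubric that aligns with this case study."
ANSWER_SHEET_PROMPT = (
    "Provide the answer sheet for this case study. Make sure to include "
    "details of any calculations and definitions of key terms."
)

# Example: an introductory accounting case on the basic balance sheet format.
print(case_study_prompt(
    role="professor",
    course="an introductory accounting course",
    topic="the very basic format of a balance sheet",
))
```

Each generated artifact (case, rubric, answer sheet) should still be reviewed for alignment with the course topics and learning outcomes, as noted above.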
To add additional depth to using AI in the classroom, faculty may want to create two case studies on the same topic: one to be performed by the student without the use of AI, and one with the use of AI. This two-case-study method could allow students to learn how to use AI appropriately. A list of appropriate AI input prompts for the students to use would assist in the students’ learning how to engineer appropriate AI prompts. This effort would help students because research indicates they have a diminished sense of preparedness when they have insufficient exposure to AI application (Hsiao and Han 2023).
An example of an AI prompt to produce a two-case study with and without the use of AI is: ‘Assume the role of an Accounting Professor teaching an introductory accounting course. Generate one case study for students to use to learn the very basic format of a balance sheet without the use of AI. In addition, generate a second case study with the same format as the first case study for students to use to learn the very basic format of a balance sheet with the use of AI. Make sure the case study is real-world relevant. The case study is to include an assignment portion that allows faculty to measure the student’s performance.’
By using this two-case-study method, faculty can measure changes in student performance, allowing students and faculty to see how AI use can assist with case study topic comprehension. Providing the answer sheets to students will enable them to compare their performance and critically analyze the AI output.
A valuable measure to assess would be the amount of time the student spent on each case. A line item on the case study can be added by AI by including a prompt such as: ‘Provide an item at the end of the case to enable students to report the amount of time that they spent on each of the case studies.’
In addition to the non-AI case and the AI-usage case, students can also perform conceptual evaluations. Such evaluations can be qualitative, allowing the student to critically evaluate how AI assists in efficiency and accuracy on the topic. Another focus could be qualitative evaluations on how AI can be used in their future careers based on the specified topic in the case study. For example, a conceptual question might address how someone in their profession would benefit from AI to help them perform better in their future careers. If these cases are created to be used throughout a course term, students can gain a clearer picture of how AI might be applied to their classroom experience.
In conclusion, faculty can use AI to create tools, such as case studies focused on specific topics, to expose students to real-world scenarios and reinforce student learning. It is very important, as faculty and students use AI, to acknowledge that current AI outputs may not always be accurate. Faculty and students should evaluate the accuracy of AI-generated output and adjust as necessary. Experimenting with various AI inputs will allow faculty to become more comfortable with using AI. AI’s role in the classroom is not to replace faculty; rather, it can help provide excellent learning opportunities for students and demonstrate how AI can be used effectively.
Rhonda Gilreath is an associate professor of accounting at Tiffin University in Northwest Ohio. She enjoys exploring new opportunities to implement in the classroom to improve pedagogical approaches to prepare her students for career readiness.
References
Afzaal, M., Shanshan, X., Yan, D., and Younas, M. 2024. “Mapping Artificial Intelligence Integration in Education: A Decade of Innovation and Impact (2013–2023)—A Bibliometric Analysis.” IEEE Access 12: 113275–113299. https://doi.org/10.1109/ACCESS.2024.3443313.
Amado-Salvatierra, H. R., Morales-Chan, M., Hernandez-Rizzardini, R., and Rosales, M. 2024. “Exploring Educators’ Perceptions: Artificial Intelligence Integration in Higher Education.” In 2024 IEEE World Engineering Education Conference (EDUNINE), 1–5. https://doi.org/10.1109/EDUNINE60625.2024.10500578.
Holmes, W., Bialik, M., and Fadel, C. 2019. Artificial Intelligence in Education: Promises and Implications for Teaching and Learning. Boston: Center for Curriculum Redesign.
Hsiao, D., and Han, L. 2023. “The Impact of Data Analytics and Artificial Intelligence on the Future Accounting Profession: Perspectives from Accounting Students.” Journal of Theoretical Accounting Research 19 (1): 70–100.
Universities will need to prove to future students why university degrees are worth it in an artificial intelligence (AI) knowledge economy, speakers at Sydney’s latest generative AI meeting said.
ORLANDO, Fla. — Florida Virtual School (FLVS) is partnering with the University of Florida (UF) and the Concord Consortium to introduce a groundbreaking year-long “Artificial Intelligence (AI) in Math” supplemental certification for FLVS middle and high school students enrolled in the school’s Flex option. FLVS instructors who teach Algebra 1 will lead this innovative program, teaching the online courses while supplementing students’ learning with activities that build their understanding of math and AI concepts. FLVS students enrolled in Algebra 1 who elect to earn the certification will begin April 7.
The certification will introduce students to the foundational principles of AI that intersect with core math topics while offering insights into real-world applications, ethical considerations, and career opportunities in AI-related fields. By merging 21st-century technology with education, the program aims to boost students’ math skills, cultivate positive attitudes toward mathematics, and expose them to the rapidly evolving AI landscape.
“As a leader in online education for more than 27 years, Florida Virtual School is committed to being at the forefront of educational innovation,” said Dr. Louis Algaze, president and CEO of Florida Virtual School. “By partnering with the University of Florida and the Concord Consortium, we are equipping our students with essential math skills and the knowledge to navigate and succeed in an AI-enhanced world.”
The certification also includes a collaborative feedback loop between FLVS teachers and UF and Concord Consortium researchers. Teachers will provide critical insights into the online course structure and student outcomes, helping to refine and improve the certification’s effectiveness for future online learners.
“AI is revolutionizing industries worldwide, creating new opportunities,” said Jie Chao, project director at the Concord Consortium. “Our partnership with FLVS allows us to offer robust AI learning opportunities to students with limited access to such resources, bridging the educational gaps and preparing young people for an AI-powered future.”
FLVS teachers will also complete 40 hours of online professional development as part of the program. The training will include learning about specialized learning technologies designed to help visualize abstract math concepts and create interactive AI model explorations to ensure students engage with the AI development process in meaningful and dynamic ways.
FLVS Flex students who are either currently enrolled or are interested in taking Algebra 1 can now sign up for the “AI in Math” certification by filling out this survey. Students who complete the program as part of their FLVS math class will receive enrichment credit and the AI Literacy certificate issued by UF and the Concord Consortium.
About Florida Virtual School (FLVS)
At Florida Virtual School (FLVS), the student is at the center of every decision we make. For 27 years, our certified online teachers have worked one-on-one with students to understand their needs and ensure their success – with FLVS students completing 8.1 million semester courses since the school’s inception. As a fully accredited statewide public school district, Florida students in grades Kindergarten through 12 can enroll tuition-free in full-time and part-time online education options. With more than 200 effective and comprehensive courses, and over 80 fun and exciting clubs, FLVS provides families with a safe, reliable, and flexible education in a supportive environment. As a leading online education provider, FLVS also offers comprehensive digital learning solutions to school districts, from online courses that result in high student performance outcomes, to easy-to-use online platforms, staff training, and support. To learn more, visit our website.
eSchool Media staff cover education technology in all its aspects–from legislation and litigation, to best practices, to lessons learned and new products. First published in March of 1998 as a monthly print and digital newspaper, eSchool Media provides the news and information necessary to help K-20 decision-makers successfully use technology and innovation to transform schools and colleges and achieve their educational goals.