  • We’re testing preschoolers for giftedness. Experts say that doesn’t work

    by Sarah Carr, The Hechinger Report
    October 31, 2025

When I was a kindergartner in the 1980s, the “gifted” programming for my class could be found inside a chest.

    I don’t know what toys and learning materials lived there, since I wasn’t one of the handful of presumably more academically advanced kiddos that my kindergarten teacher invited to open the chest. My distinct impression at the time was that my teacher didn’t think I was worthy of the enrichment because I frequently spilled my chocolate milk at lunch and I had also once forgotten to hang a sheet of paper on the class easel — instead painting an elaborate and detailed picture on the stand itself. The withering look on my teacher’s face after seeing the easel assured me that, gifted, I was not.

    The memory, and the enduring mystery of that chest, resurfaced recently when New York City mayoral front-runner Zohran Mamdani announced that if elected on Nov. 4, he would support ending kindergarten entry to the city’s public school gifted program. While many pundits and parents debated the political fallout of the proposal — the city’s segregated gifted program has for decades been credited with keeping many white and wealthier families in the public school system — I wondered what exactly it means to be a gifted kindergartner. In New York City, the determination is made several months before kindergarten starts, but how good is a screening mechanism for 4-year-olds at predicting academic prowess years down the road? 

New York is not unique for opting to send kids as young as preschool down an accelerated path, no repeat display of giftedness required. It’s common practice at many private schools to try to measure young children’s academic abilities for admissions purposes. Other communities, including Houston and Miami, start gifted or accelerated programs in public schools as early as kindergarten, according to the National Center for Research on Gifted Education. When I reported on schools in New Orleans 15 years ago, the city even had a few gifted prekindergarten programs at highly sought-after public schools, which enrolled 4-year-olds whose seemingly stunning intellectual abilities had been determined at age 3. It’s more common, however, for gifted programs in the public schools to start between grades 2 and 4, according to the center’s surveys.

    There is an assumption embedded in the persistence of gifted programs for the littles that it’s possible to assess a child’s potential, sometimes before they even start school. New York City has followed a long and winding road in its search for the best way to do this. And after more than five decades, the city’s experience offers a case study in how elusive — and, at times, distracting — that quest remains. 

    Three main strategies are used to assign young children to gifted programs, according to the center. The most common path is cognitive testing, which attempts to rate a child’s intelligence in relation to their peer group. Then there is achievement testing, which is supposed to measure how much and how fast a child is learning in school. And the third strategy is teacher evaluations. Some districts use the three measures in combination with each other.

    For nearly four decades, New York prioritized the first strategy, deploying an ever-evolving array of cognitive and IQ tests on its would-be gifted 4-year-olds — tests that families often signed up for in search of competitive advantage as much as anything else.

    Several years ago, a Brooklyn parent named Christine checked out an open house for a citywide gifted elementary school, knowing her child was likely just shy of the test score needed to get in. (Christine did not want her last name used to protect her daughter’s privacy.) 

The school required her to show paperwork at the door confirming that her daughter had a relatively high score, and when Christine flashed the proof, the PTA member stationed there congratulated her. That and the lack of diversity gave the school an exclusive vibe, Christine recalled.

    “The resources were incredible,” she said. “The library was huge, there was a room full of blocks. It definitely made me envious, because I knew she was not getting in.” Yet years later, she feels “icky” about even visiting.

    Eishika Ahmed’s parents had opportunities of all kinds in mind when they had her tested for gifted kindergarten nearly two decades ago. Ahmed, now 23, remembers an administrator in a small white room with fluorescent lights asking her which boat in a series of cartoonish pictures was “wide.” The then 4-year-old had no idea. 

    “She didn’t look very pleased with my answer,” Ahmed recalled. She did not get into the kindergarten program.

    Equity and reliability have been long-running concerns for districts relying on cognitive tests.

In New York, public school parents in some districts were once able to pay private psychologists to evaluate their children — a permissiveness that led to “a series of alleged abuses,” wrote Norm Fruchter, a now-deceased activist, educator and school board leader, in a 2019 article called “The Spoils of Whiteness: New York City’s Gifted and Talented Programs.”

    In New Orleans, there was a similar disparity between the private and public testing of 3-year-olds when I lived and reported on schools there. Families could sit on a waitlist, sometimes for months, to take their children through the free process at the district central office. In 2008, the year I wrote about the issue, only five of the 153 3-year-olds tested by the district met the gifted benchmark. But families could also pay a few hundred dollars and go to a private tester who, over the same time period, identified at least 64 children as gifted. “I don’t know if everybody is paying,” one parent told me at the time, “but it defeats the purpose of a public school if you have to pay $300 to get them in.”

Even after New York City districts outlawed private testers, concerns persisted about parents paying for pricey and extensive test prep to familiarize their children with the common words and concepts featured on the tests. Moreover, some researchers have worried about racial and cultural bias in cognitive tests more generally. Critics, Fruchter wrote, had long considered them at least partly to assess knowledge of the “reigning cultural milieu in which test-makers and applicants alike were immersed.”

    Across the country, these concerns have led some schools and districts, including New York City, to shift to “nonverbal tests,” which try to assess innate capacity more than experience and exposure. 

    But those tests haven’t made cognitive testing more equitable, said Betsy McCoach, a professor of psychometrics and quantitative psychology at Fordham University and co-principal investigator at the National Center for Research on Gifted Education.

    “There is no way to take prior experience out of a test,” she said. “I wish we could.” Children who’ve had more exposure to tests, problem-solving and patterns are still going to have an advantage on a nonverbal test, McCoach added. 

    And no test can overcome the fact that for very young children, scores can change significantly from year to year, or even week to week. In 2024, researchers analyzed more than 200 studies on the stability of cognitive abilities at different ages. They found that for 4-year-olds, cognitive test scores are not very predictive of long-term scores — or even, necessarily, short-term ones. 

    There’s not enough stability “to say that if we assess someone at age 4, 5, 6 or 7 that a child would or wouldn’t be well-served by being in a gifted program” for multiple years, said Moritz Breit, the lead author of the study and a post-doctoral researcher in the psychology department at the University of Trier in Germany.

    Scores don’t start to become very consistent until later in elementary school, with stability peaking in late adolescence.

    But for 4-year-olds? “Stability is too low for high-stakes decisions,” he said.

    Eishika Ahmed is just one example of how early testing may not predict future achievement. Even though she did not enroll in the kindergarten gifted program, by third grade she was selected for an accelerated program at her school called “top class.”

    Years later, still struck by the inequity of the whole process, she wrote a 2023 essay for the think tank The Century Foundation about it. “The elementary school a child attends shouldn’t have such significant influence over the trajectory of their entire life,” she wrote. “But for students in New York City public schools, there is a real pipeline effect that extends from kindergarten to college. Students who do not enter the pipeline by attending G&T programs at an early age might not have the opportunity to try again.”

    Partly because of the concerns about cognitive tests, New York City dropped intelligence testing entirely in 2021 and shifted to declaring kindergartners gifted based on prekindergarten teacher recommendations. A recent article in Chalkbeat noted that after ending the testing for the youngest, diversity in the kindergarten gifted program increased: In 2023-24, 30 percent of the children were Black and Latino, compared to just 12 percent in 2020, Chalkbeat reported. Teachers in the programs also describe enrolling a broader range of students, including more neurodivergent ones. 

    The big problem, according to several experts, is that when hundreds of individual prekindergarten teachers evaluate 4-year-olds for giftedness, any consistency in defining it can get lost, even if the teachers are guided on what to look for. 

    “The word is drained of meaning because teachers are not thinking about the same thing,” said Sam Meisels, the founding executive director of the Buffett Early Childhood Institute at the University of Nebraska.

    Breit said that research has found that teacher evaluations and grades for young children are less stable and predictive than the (already unstable) cognitive testing. 

    “People are very bad at looking at another person and inferring a lot about what’s going on under the hood,” he said. “When you say, ‘Cognitive abilities are not stable, let’s switch to something else,’ the problem is that there is nothing else to switch to when the goal is stability. Young children are changing a lot.”

    No one denies that access to gifted programming has been transformative for countless children. McCoach, the Fordham professor, points out that there should be something more challenging for the children who arrive at kindergarten already reading and doing arithmetic, who can be bored moving at the regular pace.

    In an ideal world, experts say, there would be universal screening for giftedness (which some districts, but not New York, have embraced), using multiple measures in a thoughtful way, and there would be frequent entry — and exit — points for the programs. In the early elementary years, that would look less like separate gifted programming and a lot more like meeting every kid where they are. 

    “The question shouldn’t really be: Are you the ‘Big G’?” said McCoach. “That sounds so permanent and stable. The question should be: Who are the kids who need something more than what we are providing in the curriculum?”

    But in the real world, individualized instruction has frequently proved elusive with underresourced schools, large class sizes and teachers who are tasked with catching up the students who are furthest behind. That persistent struggle has provided advocates of gifted education in the early elementary years with what’s perhaps their most powerful argument in sustaining such programs — but it reminds me of that old adage about treating the symptom rather than the disease. 

    At some point a year or two after kindergarten, I did get the chance to be among the chosen when I was selected for a pull-out program known as BEEP. I have no recollection of how we were picked, how often we met or what we did, apart from a performance the BEEP kids held of St. George and the Dragon. I played St. George and I remember uttering one line, declaring my intent to fight the dragon or die. I also remember vividly how much being in BEEP boosted my confidence in my potential — probably its greatest gift.

    Forty years later, the research is clear that every kid deserves the chance — and not just one — to slay a dragon. “You want to give every child the best opportunity to learn as possible,” said Meisels. But when it comes to separate gifted programming for select early elementary school students, “Is there something out there that says their selection is valid? We don’t have that.” 

    “It seems,” he added, “to be a case of people just fooling themselves with the language.” 

    Contact contributing writer Sarah Carr at [email protected]. 

This story about gifted education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.

This article first appeared on The Hechinger Report (hechingerreport.org) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

  • Relational Communication Theory in Action: Enhancing Learning and Competence – Faculty Focus


  • Beyond the Price Tag: How Cost Shapes Families’ College Choices

Perception of cost has a major impact on college choice.

    Choosing a college is rarely just about academics, location, or prestige. For most families, it comes down to the question of cost. The numbers on a price tag do not just suggest affordability; they shape what feels possible. Sticker shock alone can quietly close a door before a student even fills out an application, while clear, honest information can keep dreams in play. In this moment of rising costs and growing financial anxiety, understanding how families navigate affordability has never mattered more.

    Before examining what RNL’s latest research shows, it helps to step back and see where the broader conversation is heading. Recent studies highlight how affordability, family background, and perceptions of cost steer the college search. Again and again, the evidence points to a simple truth: for families, financial reality and perception are tightly linked (Stabler-Havener, 2024). This context is essential for understanding the RNL findings and considering how colleges can truly meet families where they are.

    What research tells us

    The research is clear: affordability, family income, and perceptions of cost are among the strongest forces shaping college choices. In one recent study, only three in ten students who believed college was unaffordable planned to enroll, showing how perception alone can narrow opportunities (Stabler-Havener, 2024).

    Policy leaders are responding. State priorities now center on boosting affordability and families’ sense of value (Harnisch, Burns, Heckert, Kunkle, & Weeden, 2024). As families weigh cost and worth, the call for reform grows louder.

    Family perspective lies at the heart of these decisions. Financial worries shape the choices parents and students make, often shrinking the list of options for those with fewer resources (Chuong-Nguyen, 2025). Parental guidance and support are deeply shaped by income and stress, sometimes as early as elementary school, when children first start to believe in what is possible (Keeling, 2025). For many out-of-state students, aid and affordability matter more than distance or campus life (Stansell, 2025). While the campus experience may guide the final decision, cost remains the gatekeeper. Together, these studies send a clear message: real and perceived affordability remain central to college access.

    Policy changes with big impact

    Federal policy changes are reshaping the landscape of affordability as well. The One Big Beautiful Bill Act keeps undergraduate loan limits intact but introduces two significant changes: a $65,000 lifetime cap on Parent PLUS loans, and a rule eliminating Pell Grant eligibility if scholarships already cover the full cost of attendance. While these details may sound technical, their impact is deeply personal. Middle- and low-income families, and first-generation students, are most likely to feel squeezed by these new limits (American Council on Education, 2025; National Association of Independent Colleges and Universities, 2025). These changes may become the tipping point for families already sensitive to sticker price.

    What this means for colleges

    The research suggests several practical steps:

    • Make affordability unmistakably clear. Families often overestimate cost and underestimate available aid. Tools like net price calculators and plain-language award letters can help (Chuong-Nguyen, 2025; Stabler-Havener, 2024).
    • Reach parents early. Parents start shaping their child’s college expectations years before high school. Outreach in middle school can expand what families believe is possible (Keeling, 2025).
• Highlight value as well as cost. Families want to know if college is worth the investment. Colleges can tell stories of career outcomes, alumni success, and community, not just numbers (Harnisch et al., 2024; Stansell, 2025).
    • Connect finances to student experience. Students care about campus feel as much as aid. Affordability should be shown alongside housing, safety, clubs, and social life (Stansell, 2025).
    • Prioritize equity. First-generation and lower-income families face more information gaps and greater stress. Targeted advising, financial literacy programs, and direct communication can help bridge that divide (Chuong-Nguyen, 2025; Keeling, 2025).

    What RNL research tells us

    While these studies offer a broad view of how cost and perception shape college decisions, the lived experience of families comes into even sharper focus when we look at recent data from the 2025 Prospective Family Engagement Report. The findings from RNL, Ardeo, and CampusESP provide a window into what families are navigating right now: the confusion, the questions, and sometimes, the sense of being overwhelmed by the college search. Examining this data helps us move from general trends to the specific realities facing families today, and shows where institutions can make the most meaningful difference.

    The bottom line

For families, cost is never just a number. It is tangled up with their hopes, sense of security, and vision for the future: sticker price, net cost, debt, and perception all shape what feels possible. For colleges, the work goes beyond lowering costs. The real challenge is helping families understand those costs, connect them to real outcomes, and expand what each student believes is within reach.

    Families’ need for clear information

The 2025 Prospective Family Engagement Report (RNL, Ardeo, & CampusESP, 2025) found that 99% of nearly 10,000 families surveyed believe clear cost, tuition, and academic information is essential. Yet almost one in four families cannot find it. The gap is even larger for first-generation families (37%) and those earning under $60,000 (43%). These gaps are not just inconvenient; they are real barriers.

    Faces behind the data

    Consider the single parent in rural Ohio, working two jobs and searching late at night for financial aid information. She finds buried calculators and confusing language and assumes the sticker price is final. The dream quietly shrinks.

    Alternatively, think of the middle-income family in suburban Atlanta. They make too much for much-needed aid but still feel stretched thin. They cross colleges off their list without ever seeing the actual net cost.

    Income-level differences in cost perception

    The study shows clear patterns (RNL, Ardeo, & CampusESP, 2025):

    • Families under $60,000 have the lowest awareness of cost tools, face the most difficulty finding aid information, and are most likely to rule out schools early due to sticker price.
    • Those earning $60,000–$149,000 have moderate awareness, but three in four have eliminated colleges based on sticker price alone.
    • Families earning $150,000 or more have the highest awareness and least trouble finding information, but even among them, almost three in four have ruled out colleges due to price.

    Financial aid and scholarships: The deciding factor

Four out of five families list aid and scholarships among their top five decision factors; for almost two in five, they are the most important factor. The urgency is even greater for first-generation families (54%) and low-income households (68%).

    • 38% say aid and scholarships top the list.
    • Another 43% place them in their top five.

    Even among the highest-income families, more than a quarter cite aid as their top factor, and nearly half put it in their top five.

    Sticker shock and final cost

    • 72% of families have ruled out colleges because of sticker price. Middle-income families lead (76%), followed by high-income (74%) and low-income families (66%).
    • 65% say the final cost after aid is the biggest dealbreaker, consistent across first-generation (66%), continuing generation (65%), and especially middle-income families (73%).

    Financing difficulty and loan anxiety

    Paying for college feels “very difficult” for 28% of families, and “difficult” for another 27%. The challenge is sharpest for low-income families (47% “very difficult”) and first-generation families (40%). Even among households earning over $150,000, one in five reports that paying for college will be “very difficult.” Anxiety about borrowing is widespread; 61% of families feel uneasy about loans, regardless of income (RNL, Ardeo, & CampusESP, 2025).

    Implications for colleges

    • Clarity is currency. A trust gap grows when nearly every family values clear cost information, but the most price-sensitive families cannot find it. Make cost information unmistakable: on websites, in print, in portals, and through personal outreach.
    • Lead with your aid story. Aid and scholarships top the list for most families. Burying this information wastes a key point of connection. Use real examples and plain language.
    • Defuse sticker shock early. With nearly three-quarters of families eliminating schools based on sticker price, net price calculators should be prominent, easy to use, and personalized.
    • Do not forget middle-income families. They often miss out on need-based aid but are just as price-sensitive. They deserve targeted outreach and clear explanations of their options.
    • Address financing challenges directly. Offer flexible payment plans, start conversations about the total cost early, and provide tools for first-generation and low-income families. Even high-income families appreciate empathy and honesty.
    • Reframe borrowing. With 61% anxious about loans, transparency about repayment timelines, graduate earnings, and debt-to-income ratios is critical.

    The emotional weight of cost

    Cost is never just a number; it is an emotional flashpoint. Families weigh college prices as figures on a spreadsheet and as symbols of opportunity, security, and trust. Information gaps hit first-generation and low-income families hardest, but financial pressure is universal:

    • Aid matters.
    • Sticker price stings.
    • Financing feels difficult for almost everyone.
    • Borrowing brings real anxiety.

    The colleges that thrive will treat cost not only as a financial challenge but as a moment to build trust and expand possibilities for every family they serve.

    References
    • American Council on Education. (2025, July 29). Summary: One Big Beautiful Bill Act (H.R. 1). Division of Government Relations and National Engagement.
    • Chuong-Nguyen, M. Q. (2025). College application experience: Personal and institutional factors affecting high school seniors’ college-going decision-making process and college choice (Doctoral dissertation, Concordia University Irvine).
    • Harnisch, T., Burns, R., Heckert, K., Kunkle, K., & Weeden, D. (2024). State priorities for higher education in 2024. State Higher Education Executive Officers Association (SHEEO).
    • Keeling, C. (2025). Perceptions of parents regarding their participation in decision-making related to the academic and technical education preparation of their children’s career pathways (Doctoral dissertation, Purdue University).
    • National Association of Independent Colleges and Universities. (2025, July). Frequently asked questions about the One Big Beautiful Bill Act. NAICU.
    • RNL, Ardeo, & CampusESP. (2025). 2025 Prospective family engagement report. Ruffalo Noel Levitz.
    • Stabler-Havener, J. M. (2024). Interactions between quality, affordability, and income groups at private colleges and universities (Doctoral dissertation, Fordham University).
    • Stansell, L. J. (2025). Driving enrollment amidst change: Exploring college choice of out-of-state students (Doctoral dissertation, University of Tennessee, Knoxville). TRACE: Tennessee Research and Creative Exchange. https://trace.tennessee.edu/utk_graddiss/12424

  • The Student Satisfaction Inventory: Data to Capture the Student Experience

Satisfaction data provides insights across the student experience.

The Student Satisfaction Inventory (SSI) is the original instrument in the family of Satisfaction-Priorities Survey instruments. With versions that are appropriate for four-year public/private institutions and two-year community colleges, the Student Satisfaction Inventory provides institutional insight and external national benchmarks to inform decision-making on more than 600 campuses across North America.

With its comprehensive approach, the Student Satisfaction Inventory gathers feedback from current students across all class levels to identify not only how satisfied they are, but also what is most important to them. Highly innovative when it first debuted in the mid-1990s, the approach has now become the standard in understanding institutional strengths (areas of high importance and high satisfaction) and institutional challenges (areas of high importance and low satisfaction).

    With these indicators, college leaders can celebrate what is working on their campus and target resources in areas that have the opportunity for improvement. By administering one survey, on an annual or every-other-year cycle, campuses can gather student feedback across the student experience, including instructional effectiveness, academic advising, registration, recruitment/financial aid, plus campus climate and support services, and track how satisfaction levels increase based on institutional efforts.

    Along with tracking internal benchmarks, the Student Satisfaction Inventory results provide comparisons with a national external norm group of like-type institutions to identify where students are significantly more or less satisfied than students nationally (the national results are published annually). In addition, the provided institutional reporting offers the ability to slice the data by all of the standard and customizable demographic items to provide a clearer approach for targeted initiatives. 

    Like the Adult Student Priorities Survey and the Priorities Survey for Online Learners (the other survey instruments in the Satisfaction-Priorities Surveys family), the data gathered by the Student Satisfaction Inventory can support multiple initiatives on campus, including to inform student success efforts, to provide the student voice for strategic planning, to document priorities for accreditation purposes and to highlight positive messaging for recruitment activities. Student satisfaction has been positively linked with higher individual student retention and higher institutional graduation rates, getting right to the heart of higher education student success. 

    Sandra Hiebert, director of institutional assessment and academic compliance at McPherson College (KS) shares, “We have leveraged what we found in the SSI data to spark adaptive challenge conversations and to facilitate action decisions to directly address student concerns. The process has engaged key components of campus and is helping the student voice to be considered. The data and our subsequent actions were especially helpful for our accreditation process.”

    Learn more about best practices for administering the online Student Satisfaction Inventory at your institution, which can be done any time during the academic year on your institution’s timeline.

  • Using Motivational and Satisfaction Assessments to Elevate Your KPIs

    In my recent conversations with student success leaders on campuses across the country, I have been hearing more focus on Key Performance Indicators (KPIs). Knowing and tracking appropriate KPIs are essential for gauging a college or university’s success in achieving its objectives. Specific KPIs that matter most will vary based on institutional selectivity, mission, and strategic goals. Some critical KPIs that many institutions track include:

    • enrollment yield
    • net tuition revenue
    • first-year fall to spring persistence
    • second-year return (official retention rate)
    • student learning outcomes
    • student engagement
    • overall student satisfaction
    • graduation rates/time-to-degree (four-year, five-year or six-year)
    • career placement rates
    • alumni giving/engagement rates

    Increasingly, institutions are recognizing the power of data-informed decision-making and leveraging student feedback to drive improvements in key areas and to see the results in their targeted KPIs. Critical components of this approach involve regularly assessing student motivation and student satisfaction.

    Proactively addressing challenges to enhance the student experience

    Motivational and satisfaction assessments provide valuable insights into the student journey, allowing institutions to proactively address challenges and enhance the student experience. These assessments, administered at various points throughout a student’s academic career, can reveal areas of strength and opportunities for improvement, directly impacting a range of KPIs.

By regularly collecting and analyzing this student feedback, institutions can move beyond reactive problem-solving and instead cultivate a proactive, student-centered approach to continuous improvement. Beyond traditional data points, incorporating the student voice provides a richer understanding of the factors influencing student success and retention. The data gathered from these assessments are not only about identifying problems; they uncover the nuances of the student experience and reveal what truly drives engagement and success.

    Improving persistence with targeted interventions

    Understanding student motivation levels, particularly during the critical first and second years, allows for targeted interventions to improve persistence. Early identification of at-risk students, coupled with proactive support, can significantly impact first-year and second-year retention rates. Why stop there?

Measuring satisfaction with services like advising, instruction, career services, and access to classes can significantly impact student persistence, graduation rates and, ultimately, career readiness. A positive campus climate, characterized by safety, inclusivity, and a strong sense of belonging, fosters student engagement, satisfaction, and success, which may in turn lead to improved alumni engagement. Furthermore, demonstrating a commitment to student feedback (and acting upon it) can enhance the institution’s reputation and attract prospective students who value a supportive and responsive learning environment.

    Boost student success by tracking the right KPIs

What KPIs are you regularly tracking and how have you incorporated student feedback data into your efforts and your documented indicators? If this is an area where you would like to do more, contact me to discuss how student motivation and satisfaction data can best help support your KPI efforts.

    Source link

  • What Families Tell Us About College Visits, Belonging, and Trust

    What Families Tell Us About College Visits, Belonging, and Trust

    Every family’s college search tells a story, one built on hopes, questions, and the quiet moments when a parent whispers, “This feels right.” Over the past year, I have immersed myself in both research and real voices to understand what drives that feeling.

    This blog brings those insights together. I begin with what the research shows, how campus visits, family engagement, and equity intersect, and then layer in fresh data from the 2025 RNL, Ardeo, and CampusESP Prospective Family Engagement Report.

    Together, they reveal a simple truth that feels anything but small: families want to feel seen, informed, and included in the journey. For me, this work is not just about enrollment; it is about belonging, trust, and designing experiences that make families confident in saying, “Yes, this is our place.”

    The research story: Why families and visits matter

    Across K–12 and higher education, families and campus visits consistently emerge as pivotal mechanisms shaping students’ aspirations, access, and belonging. In A Review of the Effectiveness of College Campus Visits on Higher Education Enrollment, Case (2024) shows that campus visits not only help students assess academic and cultural fit but also allow parents and guardians to evaluate safety, hospitality, and organizational factors that directly influence trust and enrollment decisions.

    Amaro-Jiménez, Pant, Hungerford-Kresser, and den Hartog (2020) reinforce that family-centered outreach, such as Latina/o Parent Leadership Conferences that combine campus tours with financial aid and admissions workshops, increases parents’ College Preparedness Knowledge (CPK) and confidence in guiding their children. These immersive experiences turn visits into learning opportunities that demystify college processes and affirm parental agency.

    From an operational lens, Kornowa and Philopoulos (2023) emphasize that admissions and facilities management share responsibility for the campus visit, describing it as “a quintessential part of the college search process for many students and families” (p. 96). Every detail, from signage to staff warmth, shapes families’ perceptions of authenticity and belonging, making visits both emotional and informational experiences.

    In K–12 contexts, Robertson, Nguyen, and Salehi (2022) find that underserved families, particularly those with limited income, face barriers such as inflexible schedules and unwelcoming environments when attending school tours. They call for trust-based, personalized engagement, often led by parent advocates, to turn visits into equitable opportunities rather than exclusive events.

    Similarly, Byrne and Kibort-Crocker (2022) frame college planning through Family Systems Theory, viewing the college search as a shared family transition. Families’ involvement in campus visits, financial planning, and orientation sessions fosters understanding and belonging, especially when institutions provide multilingual materials and parent panels. Even when parents lack “college knowledge,” their emotional support and presence remain vital assets.

    Finally, Wilson and McGuire (2021) expose how stigma and class-based power dynamics shape family engagement in schools. Working-class parents often feel judged or dismissed in institutional spaces, leading to withdrawal rather than disinterest. The authors urge empathetic, flexible communication to dismantle these barriers and create welcoming, inclusive climates for all families.

    Taken together, these six studies show that family engagement and visits are deeply intertwined acts of trust, access, and belonging. Whether evaluating campus safety, building college knowledge, or navigating inequities, families who feel welcomed, informed, and respected become co-authors in their children’s educational journeys.

    The research paints a clear picture: families want to feel informed, included, and welcomed. Our latest data with RNL, Ardeo, and CampusESP shows exactly where those feelings take root, and which experiences most influence their decision to say, “Yes, this is the right college for my student.”

    What families told us: Insights from the 2025 Prospective Family Engagement Report

    Families are not passive bystanders; they are active partners in the college search, weighing what they see, hear, and feel. Their feedback reveals a clear pattern: human connection and real-world experiences matter far more than abstract or digital tools.

    Campus visits and human touchpoints build trust

    The most powerful influences on family support are on-campus visits (97%) and face-to-face interactions with admissions staff (93%), faculty (92%), and coaches (88%).

For first-generation (98%) and lower-income families (96%), these experiences are even more critical. Seeing the campus, meeting people, and feeling welcomed help them imagine their student thriving there.

    Key insight: Families decide with both heart and head. A warm, well-organized visit remains the single most persuasive factor in earning their support.

    Virtual engagement expands access

    Two-thirds of families (67%) value virtual visits, but that rises to 75% for first-generation and 80% for lower-income families, groups often limited by cost or travel. Virtual experiences can level the playing field when they feel personal and guided, not automated.

    Key insight: Virtual visits are equity tools, not extras. They must be designed with care, warmth, and a human presence.

    Counselors and college fairs still count

    About 73% of families see college fairs and high school counselors as meaningful sources, especially first-generation (81%) and lower-income (84%) families. These trusted guides help families translate options and make sense of complex processes.

    Key insight: Families lean on human interpreters, counselors, fairs, and coaches to navigate choices with confidence.

    AI tools spark curiosity, not confidence

    Fewer than half of families find AI tools, such as chatbots, program matchers, or demos, meaningful (40–43%). Interest is higher among first-generation (53–56%) and lower-income (55%) families, who may see AI as a learning aid. Still, most want human reassurance alongside it.

    Key insight: AI works best as a co-pilot, not a replacement. Pair technology with empathy and guidance.

    Communication quality matters most

    Two experiences top the list:

    • Information about the program or school (97%)
    • Quality of communication with parents and families (96%)

    For first-generation and lower-income families, both climb to 98%, showing that clear, bilingual, and affirming outreach builds trust and inclusion.

    Key insight: Families value how colleges communicate care; clarity and tone matter as much as content.

    Equity lens: More support, more belonging

Across nearly every measure, first-generation and lower-income families rate these experiences as more meaningful. They seek more touchpoints, more guidance, and more invitations into the process.

    Key insight: Equity is about designing belonging, mixing in-person and virtual options, speaking their language, and centering relationships.

    This story does not end with the data; it begins there

Every number and story in this study points to the same truth: families want to feel invited in. They want experiences that inform, people who listen, and moments that confirm their student belongs. Our work is to create those moments, to build trust in the details, warmth in the welcome, and clarity in the journey. Because when families feel it, when they walk the campus, meet the people, and think, “This feels right!”, they do not just choose a college. They choose belonging.

    Ready to reach your enrollment goals? Let’s talk how

    Our enrollment experts are veteran campus enrollment managers who now work with hundreds of colleges and universities each year. Find out how we can help you pinpoint the optimal strategies for creating winning student search campaigns, building your inquiry and applicant pools, and increasing yield.

    Complimentary Consultation

    References
    • Amaro-Jiménez, C., Pant, D., Hungerford-Kresser, H., & den Hartog, S. (2020). Identifying the impact of college access efforts on parents’ college preparedness knowledge. Journal of College Access, 6(2), 7–27.
    • Byrne, R., & Kibort-Crocker, E. (2022). What evidence from research tells us: Family engagement in college pathway decisions. Washington Student Achievement Council.
    • Case, R. D. (2024). A review of the effectiveness of college campus visits on higher education enrollment. International Journal of Science and Research, 13(9), 716–718. https://doi.org/10.21275/SR24911223658
    • Kornowa, L., & Philopoulos, A. (2023). The importance of a strong campus visit: A practice brief outlining collaboration between admissions and facilities management. Strategic Enrollment Management Quarterly, 11(1), 54–74.
    • Robertson, M., Nguyen, T., & Salehi, N. (2022). Not another school resource map: Meeting underserved families’ information needs requires trusting relationships and personalized care. Digital Promise Research Brief.
    • Wilson, S., & McGuire, K. (2021). ‘They had already made their minds up’: Understanding the impact of stigma on parental engagement. British Journal of Sociology of Education, 42(5–6), 775–791. https://doi.org/10.1080/01425692.2021.1908115

    Source link

  • The network effects of a university collapse

    The network effects of a university collapse

    Universities are bound together more tightly than ministers like to admit. They share credit lines, pension schemes, suppliers, and reputations. Contagion, once started, moves faster than policy can catch it.

The question up until recently was the wrong one: could a university fail? The grown-up question is what happens next: to students who haven’t applied yet, to local communities, and to neighbouring universities. We have to begin with the obvious: students come first.

    Failure at one institution produces contagion effects, the magnitude of which depends on regional centrality and clustering. Government should focus on keeping that transmission reproduction number (or R number – remember that from Covid?) below one. This piece maps some of those transmission channels – I’ve modelled small changes to the bottom line of an average neighbouring university, based on student spend, pensions, interest rates and group buying.

Our patient zero – that is, the first to fall – is a provider that is OfS-registered and regionally significant, and I have estimated the price shock for its financially average neighbour. Of course, I have to assume no rescue package from government (they have signalled as much).

    The calculations below are based on the average university using 2023–24 HESA data, excluding FE colleges with HE provision. Each percentage change is illustrative rather than predictive and actual outcomes will depend on local factors.

    Stay at home

    Student demand runs on policy signals and vibes as much as price. In 2024 we saw sponsored study visas fall year-on-year and dependants drop sharply, while PGT overseas remained twitchy.

    Throw a closure into that salad and you start to see conversion eroding. Bursary spend and fee waivers will rise to keep offers attractive. A percentage point here or there looks insignificant but adds up across multiple providers.

    International students will start looking elsewhere: Australia, Canada, Germany. Or they’ll just stay at home. Better to choose a sure thing than risk having your course disrupted halfway through. Home students may be similarly spooked – but have fewer alternatives.

    There are a few antidotes: multiple guaranteed transfer corridors, decent student protection plans, and teach-out clarity. And most importantly, comms that make sense to agents and parents.

    Illustrative hits to an average university elsewhere:

    • Additional bursary spend on international: £80m × 0.5% = £400k
    • Reduced international demand: £80m × 0.5% = £400k

    Protect the USS

    Multi-employer pension schemes, like USS and LGPS, can go very squiffy when a member exits. In that case, the rules force the member to pay a large exit bill called “Section 75”, and the sums can be eye-watering. It’s a standard expectation of a “last man standing” scheme.

    Trinity College, Cambridge wrote a cheque for about £30m to leave USS in 2019. USS has suggested that, for a sample of employers (mainly Oxbridge colleges), a crystallised bill could represent anywhere from 4 to 97 per cent of their cash and long-term investment balances, averaging around 26 per cent.

    In practice, an insolvent provider wouldn’t cough up, so other universities would absorb the orphan liability. But there isn’t a mechanical “spread the S75 bill this year” formula; it would show up, if at all, via the valuation and rate-setting process. The scheme is currently in surplus, so additional contribution costs are uncertain. Of course, not all universities are enrolled in USS, but the vast majority are enrolled in multi-employer schemes.

    Illustrative hit to a USS-enrolled university elsewhere:

    • Salary base: £181m × 72% = £130m
    • USS proportion: £130m × 70% (say) = £91m
    • 1 pp rate bump: £91m × 1% = £910k

    Save livelihoods

    Universities drive jobs, rents, transport and culture. Liverpool estimates £2.2bn GVA and 26,630 jobs supported nationwide, roughly one in fifty locally. Northampton reports £823m GVA and 10,610 jobs. National estimates put the sector above £116bn.

    Remove the local provider and the GVA virtuous circle turns vicious. Cafés lose footfall, landlords lose tenants (poor them), and pubs are no longer full of students. The extent depends on how rooted the provider is in its community.

    Government will find itself paying anyway. Either pre-emptively with small civic grants to keep key services alive, or retrospectively with bigger cheques after the rot sets in. Maybe it will finally put a stop to town and gown tensions.

    Illustrative hit to an average university elsewhere:

    • No direct cost to other universities
    • Material GDP and tax impacts for government
    • Likely need for community grants.

    Flatten the yield curve

    Lenders rarely treat a closure as an isolated blip; being hawkish, they would probably reprice the entire university category.

    Add 50 basis points to a £90m facility and you’ve created a recurring £450k drag until you refinance. All in all, that’s not a huge bite out of your cash flow, but it will certainly make you more cautious.

    To fix this, listen to your finance directors: stagger your maturities and fix your rates well in advance. Or, radical thought – stop yanking at your credit lines and make do with what you have.

    Illustrative hit to an average university elsewhere:

    • Additional interest costs: £90m × 0.50% = £450k

    Herd immunity

    Group buying is one of the few places with cash on the table. In 2023–24, the UK Universities Procurement Consortia (UKUPC) members put about £2.4bn through frameworks and reported roughly £116.1m (4.84%) in cashable savings. The Southern Universities Procurement Consortium (SUPC) talks about £575m of member spend and average levy rebates of around £30,000 per full member.

    If fewer universities use those routes, frameworks lose clout, and with that, discounts and rebates. The more volume that stays in the collective pot, the better the prices – but for critical services, it’s still wise to have a backup supplier in case one fails.

    Another group issue is shared services. Up until recently, they were seen as a poisoned chalice, but are now growing out of necessity. The usual worries are well-rehearsed: loss of control, infighting and VAT jitters. Still, some experiments, like Janet and UCAS, have been tremendously successful, although pricing relies on throughput.

    Shared IT, payroll, procurement or estates often come with joint and several obligations. If one partner hits trouble, you start to see real governance friction.

    The practical fixes are contractual. Ringfence any arrears so they do not spill onto everyone else, and rebalance charges on a published, defensible formula.

    Illustrative hit to an average university elsewhere:

• Frameworked spend: £131m (total non-staff) × 60% (frameworked, say) × 4.84% (cashable savings) × 10% (diminution) = £380k.
• Shared services: impossible to quantify.

    What ministers can do without a podium

    I’ve modelled small changes to the bottom line (again, illustratively) – in this example one university going under could cost others £2.5m, or 50 per cent of the average university’s 2023–24 surplus. This number isn’t rigorous or comprehensive, but serves as an interesting thought experiment.
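The article’s five line items can be tallied in a few lines of Python. This is a minimal sketch using only the illustrative assumptions given above (£80m international fee income, £181m salary base, £90m debt facility, £131m non-staff spend); none of these are real figures for any particular institution.

```python
# Illustrative contagion costs to an average neighbouring university,
# reproducing the article's back-of-envelope sums. Every number here is
# the article's own illustrative assumption, not real institutional data.

M = 1_000_000  # one million pounds

hits = {
    # Student demand: extra bursary spend plus lost international income
    "extra international bursaries": 80 * M * 0.005,            # £80m x 0.5%
    "reduced international demand":  80 * M * 0.005,            # £80m x 0.5%
    # Pensions: 1 pp USS rate bump on the USS-enrolled payroll share
    "USS contribution bump": 181 * M * 0.72 * 0.70 * 0.01,      # ~£910k
    # Borrowing: 50 basis points repricing on a £90m facility
    "repriced debt": 90 * M * 0.0050,                           # £450k
    # Procurement: 10% diminution of framework cashable savings
    "lost framework savings": 131 * M * 0.60 * 0.0484 * 0.10,   # ~£380k
}

total = sum(hits.values())
print(f"Total illustrative hit: £{total / M:.2f}m")
```

Running it gives a total of roughly £2.54m, consistent with the headline figure of about £2.5m, or half of the average university’s 2023–24 surplus.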

    The rational response is a resolution regime that protects students and research, temporary liquidity for solvent neighbours, clear transfer routes when the worst happens, and deployment of short, targeted grants for civic programmes.

    A single collapse could probably be absorbed; a string of them could set off an irreversible domino effect with far-reaching consequences. Ministers need to plan for this now – or else risk a very hefty civic bailout.

    Source link

  • The Experts in My Neighborhood – Teaching in Higher Ed

    The Experts in My Neighborhood – Teaching in Higher Ed

This post is one of many related to my participation in Harold Jarche’s Personal Knowledge Mastery workshop.

The idea that expertise is no longer valued today is often discussed. I realize that I am walking through well-trodden pathways as I bring it up in these reflections on experts today. In The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, Tom Nichols writes:

    These are dangerous times. Never have so many people had access to so much knowledge, and yet been so resistant to learning anything.

    In today’s post, I want to think less about the societal and educational concerns I have about the death of expertise and more about how I might continue to attempt to inculcate habits that can keep me from dying that same death, myself. Part of that practice involves finding and curating many experts to help shape my thinking, over time.

    PKM Roles from Harold Jarche

    For this topic, Jarche invites us to use a map of personal knowledge mastery (PKM) roles to determine where we currently reside and where we would like to go, in terms of our PKM practice. He offers this graphic as part of his Finding Perpetual Beta book:

    On the Y axis, we can sort ourselves into doing high or low amounts of sharing. As I wrote previously, my likelihood of sharing is in direct relation to the topic I’m exploring. However, as Jarche recommended social bookmarking as one way of sharing, perhaps I was selling myself short when I categorized myself as not likely to share anything overly controversial. I have over 35 thousand digital bookmarks on Raindrop.io and add around 10-20 daily. However, I’m more likely to be categorized as highly visible sharing in terms of the Teaching in Higher Ed podcast and the topics I write about on the Teaching in Higher Ed blog.

    On the X axis, our activities are plotted on a continuum more toward high or low sense-making. A prior workshop participant of Jarche’s wrote:

    We must make SENSE of everything we find, and that includes prioritising–recognising what is useful now, what will be useful later, and what may not be useful.

    Given my propensity for saving gazillions of bookmarks and carefully tagging them for future use, combined with my streak of weekly podcast episodes airing since June of 2014, when it comes to teaching and learning, I’m doing a lot of sense-making on the regular.

    These are the (NEW) Experts in My Neighborhood

    Taking inspiration from Sesame Street’s People in Your Neighborhood and from Jarche’s activity related to experts, I offer the following notes on experts. When I searched for people within teaching and learning on Mastodon, I found that I was already following a lot of them. I decided to then look at who people I already follow are following:

    • Ethan Zuckerman – UMass Amherst, Global Voices, Berkman Klein Center. Formerly MIT Media Lab, Geekcorps, Tripod.com
    • Sarah T. Roberts, Ph.D. – Professor, researcher, writer, teacher. I care about content moderation, digital labor, the state of the world. I like animals and synthesizers and games. On the internet since 1993. Mac user since they came out. I like old computers and OSes. I love cooking. Siouxsie is my queen.
      • I was intrigued by her having written a content moderation book called Behind the Screen. I know enough about content moderation to know that I know pretty much nothing about content moderation.
  • She hasn’t posted in a long while, so I’m not sure how many ongoing opportunities I’ll have to see what she’s currently exploring or otherwise working on.

    Other Things I Noticed

    As I was exploring who people I follow are connected with on Mastodon, I noticed that you can have multiple pinned posts, unlike other social media I’ve used. Many people have an introduction post pinned to the top of their posts, yet also have other things they want to have front and center. One big advantage to Bluesky to me has been the prevalence of starter packs. The main Mastodon account mentioned an upcoming feature involving “packs” around twenty days ago, but said that they’re not sure what they’ll call the feature.

    Sometimes, scrolling through social media can be depressing. I decided that the next time I’m getting down on Mastodon, I should just check out what’s happening on the compostodon hashtag. It may be the most hopeful hashtag ever.

    The Biggest Delight From the Experience

    Another person who was new to me as an expert on Mastodon was JA Westenberg. According to JA Westenberg’s bio, Joan is a tech writer, angel investor, CMO, Founder. A succinct goal is also included on the about page of JoanWestenberg.com:

    My goal: to think in public.

    As I was winding down my time doing some sensemaking related to experts, I came across a video from Westenberg that was eerily similar to what Jarche has been stressing about us making PKM a practice. I can’t retrace my steps for how I came across Joan’s video on Mastodon, but a video thumbnail quickly caught my eye. Why You Should Write Every Day (Even if You’re Not a Writer) captured my imagination immediately, as I started watching. In addition to the video, there’s a written article of the same title posted, as well.

As I continue to pursue learning through the PKM workshop, I’m blogging more frequently than I may ever have (at least in the last decade for sure). Reading through Joan’s reactions to the excuses we make when we don’t commit to writing resonates hard. We think we don’t have time. How about realizing we’re not writing War and Peace, Joan teases, gently. Too many of us get the stinking thinking that we don’t have anything good to say or that this comes naturally to people who are more talented and articulate than we are. Joan writes:

    Writing every day is less about becoming someone who writes, and more about becoming someone who thinks.

    Before I conclude this post, I want to be sure to stress the importance I’m gleaning of not thinking of individual experts as the way to practice PKM. Rather, it is through engaging with a community of experts that we will experience the deepest learning. A.J. Jacobs stresses that we should heed his advice:

    Thou shalt pay heed to experts (plural) but be skeptical of any one expert (singular)

Cultivating many experts, whose potential disagreements can sharpen our thinking, helps us build a more nuanced perspective on complex topics. When we seek to learn in the complex domain, intentionality, intellectual humility, and curiosity become even more crucial. Having access to a network of experts helps us navigate complexity more effectively.

    Source link

  • Higher education postcard: Totley Hall Training College

    Higher education postcard: Totley Hall Training College

    Back in 1943 the UK government knew that more school teachers would be needed. The school leaving age was to be raised: this and other planned changes meant that 70,000 extra teachers would be needed over the coming years. The teacher training colleges then in place trained 7,000 a year, so there was a problem.

    The solution? Emergency Training Colleges. A compressed curriculum was piloted at Goldsmiths College, and in five years about 50 such colleges produced about 35,000 teachers. But it was a short-term scheme, and many of the colleges were wound up after 1950 or 1951.

    Nevertheless, there continued to be a need to grow base capacity to train teachers. The emergency colleges had dealt with the immediate shortfall, but with more children attending schools every year, there was still work to be done. Some of the emergency colleges became regular training colleges, and some local authorities established new colleges of their own. And this is where Totley Hall enters the stage.

Not shown on the card is Totley Hall itself, built in 1623 and passed to Sheffield Council in 1949. This was to be the heart of a new training college – the Totley Hall Training College of Housecraft. Its mission: training domestic science teachers.

    There’s a wonderful account of the college’s foundation and development, written by Anna Baldry, who was one of the first lecturers at the college. It’s well worth a read. Highlights include her nerves at interview; problems with electricity blackouts; HMI inspections; the admission of men; its opening by Violet Attlee; and some lovely photographs.

    More prosaically, the college had by 1963 become the plain Totley Hall Training College, focusing on training primary teachers. In 1967 men were admitted; in 1969 the best students could continue to study for a fourth year to gain a Bachelor of Education (BEd) degree from the University of Sheffield, rather than the Certificate in Education. And in 1972 – there being simultaneous vacancies in the principalships – Totley Hall Training College and the nearby Thornbridge Hall Training College were merged, to form the Totley/Thornbridge College of Education.

    In 1976 the College became part of Sheffield Polytechnic, which was renamed Sheffield City Polytechnic – and this in turn became Sheffield Hallam University in 1992, and I’ve written about it here.

    Here’s a jigsaw of the card.

    The card was posted, but I can’t read the postmark, so don’t know when. The 3p stamp shows it was after decimalisation. If it was in 1971 or 1972 it was sent first class; if it was 1973 it was sent second class. Those are the only options for that stamp.

    An engagement? A wedding? A pools win? A baby? What do we think?

    Source link

  • Virginia lawmakers call for audit of UVA’s Justice Department deal

    Virginia lawmakers call for audit of UVA’s Justice Department deal


    Dive Brief:

    • Two Democratic leaders in the Virginia Legislature are questioning the legality of the University of Virginia’s recent deal with the U.S. Department of Justice and calling for an independent review of its constitutionality.
    • In an eight-page letter this week, state Sens. Scott Surovell and L. Louise Lucas said the agreement “directly conflicts with state law, commits the University to eliminate legislatively mandated programs, subjects the University President to personal certification requirements and potentially places UVA in violation of its statutory obligations.”
• The pair requested that UVA Interim President Paul Mahoney and Rachel Sheridan, the head of UVA’s board, formally respond by Nov. 7. UVA did not immediately respond to questions Thursday.

    Dive Insight:

    On Oct. 22, Mahoney signed a four-page agreement with the DOJ to eventually close five investigations into UVA. In exchange, the public university agreed to adhere to the DOJ’s sweeping July guidance against diversity, equity and inclusion efforts and provide the agency with quarterly compliance reports.

    In their letter, Surovell and Lucas lambasted Mahoney and Sheridan for “a fundamental breach of the governance relationship” between the university and the state.

    “This agreement was disturbingly executed with zero consultation with the General Assembly, despite the fact that the General Assembly controls the University and provides the bulk of its government funding,” they said, arguing the lack of legislative involvement could violate state statute.

    When announcing the deal, UVA said Mahoney struck the deal with input from the university’s governing board, whose members were “kept apprised of the negotiations and briefed on the final terms before signature.” Since the agreement doesn’t include a financial penalty, it did not require a formal vote from the board, the university said in an FAQ.

    Along with the board, Mahoney has said he struck the deal with input from the university’s leadership and internal and external legal counsel.

    Surovell and Lucas questioned whether Jason Miyares, Virginia’s Republican attorney general and an ally of President Donald Trump, had counseled the university about the deal.

    Miyares — who fired UVA’s longtime legal counsel upon taking office in 2022 — is up for reelection in November with Trump’s endorsement, a backing Lucas and Surovell cast as an “inherent conflict of interest.” 

    It is unclear, they said, if Virginia’s top lawyer is “competent and capable of providing truly independent legal advice to Virginia’s public universities in this area of the law.”

    Virginia public colleges “need legal counsel who will zealously defend state sovereignty and institutional autonomy — not counsel whose political fortunes are tied to the very administration applying the pressure,” they said.

    The two lawmakers, along with Democratic state Sen. Mamie Locke, previously threatened UVA’s state funding if the university agreed to the Trump administration’s separate higher education compact, which offered preferential access to grant funding in exchange for unprecedented federal oversight. UVA turned it down five days before announcing its deal with the DOJ.

    Lucas and Surovell aren’t the only Virginia legislators to question the integrity of the UVA-DOJ deal. State Del. Katrina Callsen and Sen. R. Creigh Deeds, Democrats who represent UVA’s district, condemned it as subjecting the university “to unprecedented federal control.”

    In an Oct. 23 letter, the pair told Mahoney and the board that their approval of the agreement calls “into grave question your ability to adequately protect the interests and resources entrusted to you by the Virginia General Assembly.”

    “Your actions fail to leave the University free and unafraid to combat that which is untrue or in error,” they said. “By agreeing to these terms, UVA risks betraying the very principles you espouse in your letter: academic freedom, ideological diversity, and free expression.”

    Callsen and Deeds called on UVA leadership to reverse the deal and “reject further federal interference.”

    When asked on Thursday whether Mahoney or the board had responded, Deeds’ office referred to a story published by The Cavalier Daily, the university’s independent student newspaper.

    In a letter shared with The Cavalier Daily, Mahoney and Sheridan said that they “respectfully disagree” with Deeds and Callsen’s assessment, adding that the deal is the “culmination of months of engagement” with the DOJ and other federal agencies over multiple civil rights investigations.

    They also said the institution’s deal with the federal government differs significantly from the “lengthy lists of specific obligations” agreed to by Columbia and Brown universities.

    “Our agreement is different — if the United States believes we are not in compliance, its only remedy is to terminate the agreement,” they said.
