Tag: opinion

  • Ideas for navigating editor-reviewer relationships (opinion)

    An editor or reviewer can have an outsize impact on the career of a scholar, particularly in the early stages. The stakes can be high for an author. A negative review or edit can set back a research plan by months and harm a scholar’s chances for tenure or promotion. This reality creates a power imbalance between an editor or reviewer and an author that can be abused.

    Graduate schools offer few pointers on how to navigate editor and reviewer relationships. Our goal in this essay is to demystify the process and offer suggestions and observations for editors/reviewers and authors on how to approach the task in a more thoughtful and efficient way.

    Understanding the Reviewer and Editor Roles

    First, it is important to note that while reviewers and editors take part in a similar process—assessing the work of an author—the tasks are different. The editor is rarely an expert in the specific subject of an article and necessarily needs to rely on impartial reviewers to place the work in context. Nevertheless, the editor—and, at times, an editorial board—is the decision-maker in this equation. Having a clear and transparent line of communication between the author and the editor is critical.

    The task of the reviewer is to place the work in its scholarly context and to weigh its merit. Is the work breaking new ground? Is it challenging a long-held interpretation within the academy? Are the sources contemporary and the most relevant? Does the work fit the subject area of the journal or press? Can it be revised to make it suitable for publication?

    It is our strong belief that reviewers need to meet the authors where they are—that is, to understand the goal of the author, determine whether the work is suitable for the journal or press in question and, if so, help them reach the promised land of publication. Simply put: The reviewer should weigh the author’s case against the author’s intent.

    Unfortunately, this does not always happen: It is sometimes the case that reviewers stray from this path and insert suggestions that they would like to see addressed but that are not central to the submitted work. The dreaded “reviewer number 2” has become the bane of many an author’s existence. In this sort of review, the reviewer raises so many questions and objections that an author is left to ponder whether the two are reading the same text. And, it must be said, just as on social media, anonymity can at times lead to incivility. Instead of being helpful, sometimes a reviewer is unkind and cruel.

    The role of the editor is to referee between the goals of the author and the desires of the reviewer. Egos and politics often come into play in this process because reviewers in many cases are colleagues of the editor and contributors to the publication in question. Our experience suggests there are two major types of editors. Authors will need to adjust their approach based on which of these two types best describes their editor:

    • Sympathetic editor: This is the ideal. This editor will work with an author to publish a submission if the research is strong and will allow them to keep their own voice. They do not seek to impose their vision on the book or article. They do not allow their personal politics to influence the decision-making process. They are driven by one central question: Does the author accomplish what they set out to do? This type of editor tries to determine whether a reviewer is acting out of hubris by suggesting tangential and substantial changes or whether they are addressing core issues. On the opposite end of the spectrum, they are alert to the two-paragraph, lackadaisical reviewer who read the work over lunch while answering emails.
    • Visionary editor: It may sound counterintuitive, but an editor with their own vision for someone else’s work can mean frustration and ultimately rejection for an author. This type of editor sees someone else’s work as an opportunity to explore an aspect of a topic that interests them. They impose their own vision on someone else’s work rather than determining whether the author has achieved the goal they set for themselves. This typically takes the form of a lengthy response asking an author to fundamentally rethink their piece. The response contains so many critiques that to adhere to the suggestions would amount to writing a completely different piece of scholarship. This editor also tends to drag out, and even impede, the process almost endlessly.

    As an example, upon the death of Fidel Castro in November 2016, the Latin American historian of this writing duo (Argote-Freyre) was asked by a journal editorial board member to author an article comparing the career of Castro with that of the prior dictator of Cuba, Fulgencio Batista. The resulting piece concluded that the two political figures shared more similarities than differences. The editor, although agreeing to the concept, was unhappy with the conclusions reached by the essay. The editor struck out paragraph after paragraph; a lecture on tone and thesis ensued.

    The editor suggested a piece analyzing the revisionist historiography on Batista—a subject outside the contours of the original assignment and one that would take many months to complete. The author made a rookie mistake in assuming that a member of the editorial board was vested with the authority to make assignments. In retrospect, it seems as if the assignment was foisted upon the working editor, who then wanted to steer the piece in a completely different direction. The author withdrew the piece; the only positive was that only a few months were lost in the process.

    The visionary editor is the type who is never satisfied. They forget that the piece is the author’s, not theirs. Yes, the editor is a gatekeeper for the journal or press, but if it is not a good fit, they should say so and move on. This picky editor sends a revision back to a new third (or fourth) reviewer, who is likely to ask for another, different round of revisions. This is nothing other than moving the goalposts. One of us had this occur with an editor who said, “As you know, we often send articles to several rounds of reviewers.” Well, we did not know, because the journal’s website did not say that. Such a process could go on forever and, to our eyes, makes no sense. The editor should decide on his or her own whether the author has revised sufficiently: It is clear from the reader reports what needed to be done, so just check and see. The editor needs to be decisive.

    At the point a work is about to be sent to an additional set of reviewers, an author needs to withdraw the article or book from consideration. Run as fast as you can in search of another editor and publication. Do not let someone waste your time, especially if your clock is ticking for tenure and promotion.

    How to Make Relationships Work—and When to Walk Away

    The author-editor relationship should be a dance, not a duel. As an author, you are not at the mercy of the process; you are a partner. If you are not clicking with the editor, walk away. A bad first date rarely turns into a good second date. This is particularly true when working on a book project, given the many steps and long timeline involved.

    For a revise-and-resubmit, we strongly suggest that you be professionally assertive. Ask how the resubmission will be reviewed before you undertake it. If the editor says it will go to new readers, withdraw the piece. This never goes well. Editors should be transparent about the steps involved. It is our experience that some editors are hesitant to divulge their process. If that is the case, the author needs to reassess the integrity of that process.

    Being fully transparent allows you to ask for transparency in return, whether you are an editor or an author. If, as we have experienced, two peer reviews come in that are quite opposed, the editor should get a third before returning to the author. If there are two or three reviews, the editor should synthesize them with a memo attached to the reports. The summary should go something like: “All reviewers agree chapter four needs to be revised with this material, but there is disagreement about chapter six.” There is also nothing wrong with asking the author to make the tough call on a contested point of interpretation. Once again, it is the author’s scholarship, not the editor’s, the journal’s or the press’s.

    For authors: Have a conversation with the editor. If it’s a call, follow up with a written summary. When responding to reader reports, especially when they disagree, say what you will and will not do. Do not say you will revise when you disagree—but don’t be stubborn. Give a little in order to hold on to what you won’t compromise. If you disagree with a reviewer’s suggestion, say why, and ask the editor for approval not to make a specific change suggested in one of the reader reports. Get that approval. If the editor says the revision will go back to one or both original readers instead of making the final call themselves, politely insist that the written exchange between the author and editor be sent along, too.

    It may not always work. Recently, one of us did just what we described and the editor said the plan sounded good, only to have the journal reject the revision. The editorial board said a specific change was not made even though the editor agreed that change would not be necessary. Poor communication and coordination between an editor and an editorial board should not penalize an author.

    Finally, we’d like to briefly weigh in on the argument that professors should reject peer reviewing because it is an unpaid task. If you do not want to do it, don’t—but there are compelling reasons to write responsible peer reviews. First, unpaid labor is not without merit. Even if your tenure and promotion committees might not value the task, that does not mean it is not worthwhile. You’re not paid to volunteer at your local food pantry, but you still do it. Second, people do this for you; it is time to be generous in return. Third, reviewing provides insights into the process for your own work. Peer reviewing keeps you current on trends in the field. Editing and peer reviewing make you a better writer and produce better scholarship. Isn’t that what we all want?

    Frank Argote-Freyre and Christopher M. Bellitto are professors of history at Kean University in Union, N.J., with extensive experience with peer review on both sides of the process. Argote-Freyre, a scholar of Latin American history, serves as a frequent peer reviewer and content editor on various book and article projects. Bellitto, a medievalist, is the series editor of Brill’s Companions to the Christian Tradition and academic editor at large of Paulist Press.

  • Academic unions should adopt neutrality (opinion)

    Institutional neutrality at universities is having its moment in the aftermath of a year of nationwide campus protests over the Israel-Gaza war. The list of universities that have adopted neutrality has grown over the course of the past 12 months. The concept is necessarily expanding to include conversations around university investments. Yet academic unions have slipped under the radar as purveyors of positions on political issues. They should not be overlooked in the push for neutrality on all stances except those that directly pertain to an institution’s mission. In the case of a union, that mission should be to promote labor interests. Professors from a range of ideologies should be able to find common cause for collective bargaining purposes without being forced into supporting other political positions.

    The lack of neutrality of professors’ unions on non-labor-related issues is a pernicious problem. Federal law and some state laws that pertain to unions work to compel professors’ speech. Under the federal National Labor Relations Act, if a majority of private sector workers voting in a union election choose to unionize, all workers in that bargaining unit must be exclusively represented by that union. New York’s Taylor Law requires the same for public employees. And, if workers want the benefits of membership, like voting for union leadership and contracts, they must pay dues.

    While public employees could choose not to be union members before the Supreme Court’s 2018 Janus v. AFSCME ruling, that case now guarantees their right to not pay agency fees. But even if workers wish to eschew membership and not pay fees, they cannot dissociate entirely. They are required to be represented by a union that speaks via statements at the local, state and national level on many non-labor-related subjects. Therefore, with their veneer of solidarity, unions quash viewpoint diversity and suppress First Amendment rights. They tie one of the only forms of dissent possible (withdrawing dues) to disenfranchisement from the union, the organization that negotiates their wages and labor conditions.

    Professors who do stop paying their dues are often derided as “free riders.” They risk offending union leadership, who have a say in university processes that can impact their employment, like grievances and denial of reappointment. The union is formally required to provide equal advocacy as their exclusive representative. However, even if one believes biases will never prevail against “free riders,” there is still the suppressive impact of professors’ perception that paying dues and keeping quiet is best for their careers.

    And so, professors are forced into a kind of protection racket, paying unions that may endorse positions with which they may disagree. The National Education Association has opined on everything from ending private prisons to climate change, from promoting women-led businesses to helmets for motorcyclists. They have issued statements on the Israel-Gaza conflict, advocated for codifying Roe v. Wade into law and called for Donald Trump’s ouster. They have adopted progressive ideological lenses throughout such statements, arguing for instance that “white supremacy culture” is prevalent in the current U.S., and that “intersectionality must be … addressed … in order to advance the [NEA’s] social justice work.”

    To be clear, I am not arguing that these positions taken by unions are bad. I am not reflecting my own political preferences. I am not highlighting progressive examples to critique only progressive examples: I could find none that can be considered conservative. I am not saying that it’s not possible that a majority of members agree with the statements. I am also not arguing that workers do not have the right to form associations to advocate for political causes.

    What I am arguing is that due to laws making exclusive representation compulsory, unions should adopt neutrality on political issues that do not impact the primary purpose of academic unions: advocating for professors’ interests as workers. This lets ideological diversity exist and prevents coerced speech and dues payments. This neutrality is of paramount importance with public sector unions, where union leadership activities may receive taxpayer-subsidized administrative benefits.

    This neutrality should extend to political endorsements of individual candidates. While there may be some argument to be made that endorsing a pro-union or pro–higher education candidate over their opponent directly pertains to professors’ interests as workers, such an endorsement carries with it implicit support for a wide slate of other policies. A better approach would be for unions to support (or critique) candidates’ specific policy proposals or voting records. It would also reduce antagonism between unions and the candidates they did not endorse, should those candidates be elected.

    Recent examples show the perils of academic unions not having a neutrality standard. In 2018, a University of Maine professor sued his union, noting his opposition to its stances, like endorsing Hillary Clinton for president. More recently, in 2022, six City University of New York professors filed suit against the Professional Staff Congress (PSC), which passed a pro-Palestinian resolution they viewed as antisemitic. They resigned their memberships, along with approximately 263 other professors. But because of the Taylor Law, they are required to be represented by the PSC, which did not give evidence it could be fair in representing them. The PSC called them free riders, claiming their lawsuit was “meritless … funded by the notoriously right-wing National Right to Work Legal Foundation,” and described the “‘Right to Work’ agenda” as “rooted in white supremacy.”

    After lower courts ruled to dismiss their suit, the CUNY professors appealed to the Supreme Court, which just this month declined to hear their case. Yet, while this case could have been a victory for viewpoint diversity and free speech and an impetus for unions to get on the institutional neutrality bandwagon, future such suits will doubtless arise and reach a court favorable to their claims. Academic unions should get ahead of such a court ruling and make union membership attractive to all who may want to participate based on advocacy for improved working conditions, but not for particular solutions to international wars—or for wearing motorcycle helmets.

    Colleen P. Eren is a professor of sociology and criminal justice at William Paterson University and a research fellow at the Segal Center for Academic Pluralism. Her commentaries on higher ed and other topics can be found across a range of publications, including The New York Times, Discourse, Reason, and the Foundation for Economic Education.

  • We need new ways to protect academic freedom (opinion)

    Katherine Franke, formerly a law professor at Columbia University, is just the latest of many academics who have found themselves in hot water because of something they said outside the classroom. Others have been fired or resigned under pressure for what they posted online or said in other off-campus venues.

    In each of those cases, the “offending party” invoked academic freedom or freedom of speech as a defense to pressures brought on them, or procedures initiated against them, by university administrators. The traditional discourse of academic freedom or free speech on campus has focused on threats from inside the academy of the kind that led Franke and others to leave their positions.

    Today, threats to academic freedom and free speech are being mounted from the outside by governments or advocacy groups intent on policing colleges and universities and exposing what they see as a suffocating orthodoxy. As Darrell M. West wrote in 2022, “In recent years, we have seen a number of cases where political leaders upset about criticism have challenged professors and sought to intimidate them into silence.”

    We have seen this act before, and the record of universities is not pretty.

    During the 1940s and 1950s, an anticommunist crusade swept the nation, and universities were prime targets. In that period, “faculty and staff at institutions of higher learning across the country experienced increased scrutiny from college administrators and trustees, as well as Congress and the FBI, for their speech, their academic work, and their political activities.”

    And many universities put up no resistance.

    Today, some believe, as Nina Jankowicz puts it, that we are entering “an era of real censorship the likes of which the United States has never seen.” How will universities respond?

    If academic freedom and freedom of expression are to be meaningful, colleges and universities must not only resist the temptation to punish or purge people whose speech they and others may find offensive; they must provide new protections against external threats, especially when it comes to extramural speech by members of their faculties.

    They must become active protectors and allies of faculty who are targeted.

    As has long been recognized, academic freedom and free speech are not identical. In 2007, Rachel Levinson, then the AAUP senior counsel, wrote, “It can … be difficult to explain the distinction between ‘academic freedom’ and ‘free speech rights under the First Amendment’—two related but analytically distinct legal concepts.”

    Levinson explained, “Academic freedom … addresses rights within the educational contexts of teaching, learning, and research both in and outside the classroom.” Free speech requires that there be no regulation of expression on “all sorts of topics and in all sorts of settings.”

    Ten years after Levinson, Stanley Fish made a splash when he argued, “Freedom of speech is not an academic value.” As Fish explained, “Accuracy of speech is an academic value … [because of] the goal of academic inquiry: getting a matter of fact right.” Free speech, in contrast, means “something like a Hyde Park corner or a town-hall meeting where people take turns offering their opinions on pressing social matters.”

    But as Keith Whittington observes, the boundaries that Levinson and Fish think can be drawn between academic freedom and free speech are not always recognized, even by organizations like the AAUP. “In its foundational 1915 Declaration of Principles on Academic Freedom and Academic Tenure,” Whittington writes, “the AAUP asserted that academic freedom consists of three elements: freedom of research, freedom of teaching, and ‘freedom of extramural utterance and action.’”

    In 1940, Whittington explains, “the organization reemphasized its position that ‘when they speak or write as citizens,’ professors ‘should be free from institutional censorship or discipline.’”

    Like the AAUP, Whittington opposes “institutional censorship” for extramural speech. That is crucially important.

    But in the era in which academics now live and work, is it enough?

    We know that academics report a decrease in their sense of academic freedom. A fall 2024 survey by Inside Higher Ed found that 49 percent of professors experienced a decline over the prior year in their sense of academic freedom as it pertains to extramural speech.

    To foster academic freedom and free speech on campus or in the world beyond the campus, colleges and universities need to move from merely tolerating the expression of unpopular ideas to a more affirmative stance in which they take responsibility for fostering it. It is not enough to tell faculty that the university will respect academic freedom and free expression if they are afraid to exercise those very rights.

    Faculty may be fearful that saying the “wrong” thing will result in being ostracized or shunned. John Stuart Mill, one of the great advocates for free expression, warned about what he called “the tyranny of the prevailing opinion and feeling.” That tyranny could chill the expression of unpopular ideas.

    In 1952, during the McCarthy era, Supreme Court justice Felix Frankfurter also worried about efforts to intimidate academics that had “an unmistakable tendency to chill that free play of the spirit which all teachers ought especially to cultivate and practice.”

    Beyond the campus, faculty may rightly fear that if they say things that offend powerful people or government officials, they will be quickly caught up in an online frenzy or will be targeted. If they think their academic institutions will not have their back, they may choose the safety of silence over the risk of saying what they think.

    Whittington gets it right when he argues that “Colleges and universities should encourage faculty to bring their expertise to bear on matters of public concern and express their informed judgments to public audiences when doing so might be relevant to ongoing public debates.” The public interest is served when we “design institutions and practices that facilitate the diffusion of that knowledge.”

    Those institutions and practices need to be adapted to the political environment in which we live. That is why it is so important that colleges and universities examine their policies and practices and develop new ways of supporting their faculty if extramural speech gets them in trouble. This may mean providing financial resources as well as making public statements in defense of those faculty members.

    Colleges and universities should also consider making their legal counsel available to offer advice and representation and using whatever political influence they wield on behalf of a faculty member who is under attack.

    Without those things, academics may be “free from” the kind of university action that led Franke to leave Columbia but still not be “free to” use their academic freedom and right of free expression for the benefit of their students, their professions and the society at large.

    Austin Sarat is the William Nelson Cromwell Professor of Jurisprudence and Political Science at Amherst College.

  • Thoughts on 20 years of college teaching (opinion)

    I have now been teaching at Duke University for 20 years. I have been through all kinds of teaching fads—active learning, team-based learning, alternative grading, service learning, etc. You might assume that I have become a better teacher over these many years. Yet I am noticing a curious trend in my course evaluations: Some of my students like me and my courses less and less.

    As a teaching faculty member, I know this matters greatly to my own career trajectory, and so I’ve wondered and worried about what to do. Why am I struggling to teach well, and why are my students struggling to learn?

    Looking back on the past two decades of my teaching and reaching further back into my own college experience, I see six clear differences between now and then.

    Difference No. 1: Access to Information

    When I took my first college environmental science class, way back in 1992, I was mesmerized. This was before the days of Advanced Placement Environmental Science, so I came into the class knowing almost nothing about the topic, motivated by my naïve idea to be part of “saving the world.” To learn, I had a textbook (that I still have, all highlighted and marked up) and the lectures (for which I still have my notes). Sure, I could go to the library and find books and articles to learn more, but mostly I stuck to my textbook and my notes. I showed up to the lecture-based class to learn, to listen, to ask questions.

    Today, my students show up in my course often having taken AP Environmental Science, with access to unlimited information about the course topics, and with AI assistants that will help them organize their notes, write their essays and prepare for exams. I have had to shift from expert to curator, spending hours sifting through online articles, podcasts (SO many podcasts) and videos, instead of relying on a single textbook. I look for content that will engage students, knowing that some may also spend their class period fact-checking my lectures, which brings me to …

    Difference No. 2: Attention

    When I lecture, I look out to a sea of stickered laptops, with students shifting their attention between me, my slides and their screens. I remind them that I can tell when they are watching TikTok or texting, because the class material probably isn’t causing their amused facial expressions.

    Honestly, I am finding myself more distracted, too. While lecturing I am not only thinking about the lecture material and what’s on the next slide—I am also wondering how I can get my students’ attention. I often default to telling a personal anecdote, but even as they briefly look up to laugh, they just as quickly return their eyes to their screens.

    The obvious advice would be to have more engaging activities than lecturing but …

    Difference No. 3: More Lectures, Please

    After 2020, one comment showed up over and over on my course evaluations: lecture more. My students seemed not to see the value of small-group activities, gallery walks, interactive data exercises and discussions. They felt that they were not learning as much, and some of them assumed that meant that I didn’t know as much, which leads me to …

    Difference No. 4: Sense of Entitlement

    While I teach at a private elite university, my colleagues across a range of institutions have backed this up: Some students seem to not have much respect for faculty. The most common way this shows up is at the end of the semester, when students send me emails about why my course policies resulted in a grade they think is unfair, or after an exam, when they argue that I did not grade them fairly, which leads me to …

    Difference No. 5: Assessment Confusion

    When I was in college, I took midterms and finals. I rewrote my notes, made flash cards, created potential exam questions, asked friends for old exams and studied a lot. I took multiple-choice exams and essay exams, in-class exams and take-home exams. When I first started teaching my lecture-based class, I assigned two midterms and a final. I took the business of writing exams seriously, often using short-answer and essay exams that took a whole lot of time to grade. I wanted the experience of taking the exam to help students feel like they had learned something, and the experience of studying to actually entice them to learn.

    Then, two things happened. We faculty got all excited about alternative assessments, trying to make our classes more inclusive for more learning styles. And the students started rebelling about their exam grades, nitpicking our grading for a point here and there, angry that, as one student put it, I was “ruthless” in my grading. Students didn’t show up at my office hours eager to understand the concepts—they wanted more points.

    So, I threw out exams in favor of shorter papers, discussions and activities. In fall 2024, I had 74 students and I gave a whopping 67 of them A’s. To do well in my class now, you don’t really have to learn anything. You just need to show up. Except the problem with grading for attendance is …

    Difference No. 6: Our Students Are Struggling

    We all know that our students are struggling with more mental and emotional health issues, perhaps due to COVID-related learning loss, the state of the world and so many other things. Many of us include mental health resources in our syllabus, but we know that’s not enough. Students are much more open about their struggles with us, but we aren’t trained therapists and often don’t know the right thing to say. Who am I to determine whether or not one student’s excuse for missing a class is valid while another’s is not? How can I keep extending the deadlines for a struggling student while keeping the deadline firm for the rest? Sure, there are suggestions for this (e.g., offer everyone a “late assignment” ticket to use), but I still spend a lot of time sifting through student email requests for extensions and understanding. How can we be fair to all of our students while maintaining the rhythm of course expectations?

    Usually, one acknowledges the differences between students now and “back then” at retirement, reflecting on the long arc of a teaching career. But I am not at the end—I have a long way to go (hopefully). I am expected to be good at this in order to get reappointed to my teaching faculty position.

    Teaching requires much more agility now as we attempt to adapt to the ever-expanding information sphere, our students’ needs, and the state of the community and world beyond our classrooms. Instead of jumping to solutions (more active learning!), I think it’s reasonable to step back and acknowledge that there is no one change we need to make to be more effective educators in 2025. We also can acknowledge that some of the strategies we are using to make our classes more engaging and inclusive might backfire, and that there still is a time and place for really good, engaging lectures and really hard, useful exams.

    There are fads in teaching, and over the past 20 years, I have seen and tried plenty of them. We prize teaching innovation, highlighting new techniques as smashing successes. But sometimes we learn that our best-laid plans don’t work out, that what students really want is to hear from an expert, someone who can help them sort through the overwhelming crush of information to find a narrative that is relevant and meaningful.

    The students in our classrooms are not the same students we were, but maybe there is still a way to spark their enthusiasm for our subjects by simply asking them to be present. As debates about the value of higher education swirl around us, maybe caring about our students and their learning means asking them to put away their screens, take out a notebook and be present for our lectures, discussions and occasional gallery walk. For my part, I’m reminding myself that some students aren’t all that different than I was—curious, excited, eager to learn—and that I owe it to them to keep showing up committed to their learning and, maybe, prepared with a few more light-on-text lecture slides.

    Rebecca Vidra is a senior lecturer at the Nicholas School of the Environment at Duke University.

  • Did the Ivy League really break America? (opinion)

    Are many of the ills that plague American society caused by Ivy League admission policies?

    That is the premise of David Brooks’s cover story for the December issue of The Atlantic, “How the Ivy League Broke America.” Brooks blames the Ivies and “meritocracy” for a host of societal problems, including:

    • Overbearing parenting
    • Less time for recess (as well as art and shop) in schools
    • An economy that doesn’t provide opportunities for those without a college degree
    • The death of civic organizations like Elks Lodge and Kiwanis Club
    • The high percentage of Ivy League graduates who choose careers in finance and consulting
    • The rise of populism based on “crude exaggerations, gross generalizations, and bald-faced lies.”

    Brooks somehow left the decline of small-town mom-and-pop businesses and the popularity of reality television off his laundry list.

    You may be wondering how the Ivies contributed to or caused all these problems. The essence of Brooks’s argument is that “every coherent society has a social ideal—an image of what the superior person looks like.” His hypothesis is that America’s social ideals reflect and are determined by the qualities that Ivy League universities value in admission.

    One hundred years ago, the Ivy League social ideal was what Brooks terms the “Well-Bred Man”—white, male, aristocratic and preppy, athletic, good-looking, and personable. What was not part of the ideal was intellectual brilliance or academic prowess, and in fact those who cared about studying were social outcasts. Applying to the Ivies resembled applying for membership to elite social clubs.

    That changed starting in the 1930s when a group of educational leaders, the most prominent being Harvard president James Conant, worried that the United States was not producing leaders capable of dealing with the problems it would face in the future. Their solution was to move to an admission process that rewarded intelligence rather than family lineage. They believed that intelligence was the highest human trait, one that is innate and distributed randomly throughout the population. Conant and his peers believed the change would lead to a nation with greater opportunities for social mobility.

    Brooks seems far from sure that the change was positive for America. He acknowledges that “the amount of bigotry—against women, Black people, the LGBTQ community—has declined” (that might be debatable given the current political climate), but observes that the previous ideal produced the New Deal, victory in World War II, NATO and the postwar world led by America, while the products of the ideal pushed by Conant have produced “quagmires in Vietnam and Afghanistan, needless carnage in Iraq, the 2008 financial crisis, the toxic rise of social media, and our current age of political dysfunction.” Those examples seem cherry-picked.

    In the essay, Brooks cites a number of troubling societal problems and trends, all supported with extensive research, but the weakness of his argument is that he tries to find a single cause to explain all of them. That common denominator is what he calls “meritocracy.”

    Meritocracy, a society with opportunities based on merit, is an appealing concept in theory, but defining merit is where things get sticky. Merit may be similar to Supreme Court justice Potter Stewart’s description of pornography, in that you know it when you see it. Does merit consist of talent alone? Talent combined with work ethic? Talent, work ethic and character?

    Merit is in the eye of the beholder. If I was admitted to an Ivy League university, it was obviously because I had merit. If someone else, especially someone from an underrepresented population, got the acceptance instead of me, factors other than merit must have been at play. If two candidates have identical transcripts but different SAT scores, which one possesses more merit? Complicating the discussion is the fact that many things cited as measures of merit are in fact measures of privilege.

    For Brooks, Ivy League meritocracy involves an overreliance on intelligence and academic achievement, to the detriment of noncognitive skills that are more central to success and happiness in life. He argues that “success in school is not the same thing as success in life,” with success in school primarily being individual while success in life is team-based. He quotes Adam Grant’s argument that academic excellence is not a strong predictor of career excellence.

    Ultimately, he argues that “meritocracy” has spurred the creation of “an American caste system,” one in which “a chasm divides the educated from the less well-educated,” triggering “a populist backlash that is tearing society apart.” Yet Brooks’s beef is not so much with meritocracy as it is with a mindset that he attributes to Conant and his brethren. He equates meritocracy with a belief in rationalism and social engineering that assumes that anything of value can be measured and counted. What he is criticizing is something different from meritocracy, or at least reflects a narrow definition of meritocracy.

    Even if we don’t agree with Brooks’s definitions, or the implication that Ivy League admission policies are responsible for the ills of society, his article raises a number of important questions about the college admission process at elite colleges and universities.

    First, is the worship of standardized testing misplaced? The SAT became prominent in college admission at around the same time that Conant and others were changing the Ivy League admission paradigm. They believed that intelligence could be measured and latched onto the SAT as a “pure,” objective measure of aptitude. Today, of course, we recognize that test scores are correlated with family income and that scores can be manipulated through test preparation. And the “A” in SAT no longer stands for aptitude.

    Do we measure what we value, or do we value what we can measure? Brooks criticizes the Ivies for focusing on academic achievement in school at the expense of “noncognitive skills” that might be more important to success in life after college, things like curiosity, relationship-building skills and work ethic. He’s right, but there are two reasons for the current emphasis. One is that going to college is going to school, so an admission process focused on scholastic achievement is defensible. The other is that we haven’t developed a good mechanism for measuring noncognitive skills.

    That raises a larger question. What do we want the admission process to accomplish? The SAT is intended to predict freshman year college GPA (in conjunction with high school grades). Is that a satisfactory goal? Shouldn’t we have a larger lens, aiming to identify those who will be most successful at the end of college, or after college? Should we admit those with the greatest potential, those who will grow the most from the college experience, or those who will make the greatest contribution to society after college?

    Brooks questions elite colleges’ preferences for “spiky” students over those who are well-rounded. Is a student body full of spiky students really better? An even more important question arises from a distinction Brooks made some years ago between “résumé virtues” and “eulogy virtues.”

    Does the elite college admission process as currently constituted reward and encourage students who are good at building résumés? A former student attending an elite university commented that almost every classmate had done independent academic research and started a nonprofit. Do students aspiring to the Ivies choose activities because they really care about them or because they think they will impress admission officers, and can admission officers tell the difference? What is the consequence of having a student body full of those who are good at playing the résumé-building game?

    There is one other issue raised by Brooks that I find particularly important. He argues that those who are successful in the elite admission process end up possessing greater “hubris,” in that they believe their success is the product of their talent and hard work rather than privilege and luck. Rather than appreciating their good fortune, they may believe they are entitled to it. That misconception may also fuel the populist backlash to elites that has increased the division within our country.

    I don’t buy Brooks’s definition of meritocracy or his contention that the Ivy League “broke” America, but his article nevertheless merits reading and discussion.

    Jim Jump recently retired after 33 years as the academic dean and director of college counseling at St. Christopher’s School in Richmond, Va. He previously served as an admissions officer, philosophy instructor and women’s basketball coach at the college level and is a past president of the National Association for College Admission Counseling. He is the 2024 recipient of NACAC’s John B. Muir Excellence in Media Award.

  • A troubling moment for public higher ed (opinion)

    Earlier this month, my institution, Southern Methodist University, made headlines by hiring President Jay Hartzell away from the University of Texas at Austin, one of the country’s largest and most prestigious public universities. The move surprised many on both campuses and sent shock waves through higher education.

    While I can’t presume to know all the motivations behind President Hartzell’s decision and I don’t speak for SMU, as a faculty member who studies higher education, I believe this moment demands our attention. Many public universities are under serious threat, and private universities need to realize that their future is closely tied to the success of their public counterparts.

    For more than a decade, SMU has been my academic home. The campus boasts smart and curious students, dedicated faculty who care about teaching and research, and strong leadership from the administration and Board of Trustees. We’re in the middle of a successful capital campaign and enjoying both athletic success after our move to the Atlantic Coast Conference and a growing research profile.

    Yet, even as I anticipate the leadership that President Hartzell will bring to SMU, I can’t ignore the broader context that has made such a move more common and deeply troubling.

    Hartzell isn’t the only example of a major public university president leaving for the relative safety of private higher education. His predecessor at UT Austin, Greg Fenves, left for Emory University. Carol Folt resigned from the University of North Carolina at Chapel Hill before getting the University of Southern California presidency. Back in 2011, Biddy Martin famously left the University of Wisconsin at Madison for Amherst College in one of the early examples of this trend. So what is going on, and why are major public university presidencies less attractive than they once were?

    The Struggles of Public Universities

    Being a public university president in a red state is the toughest job in higher education today.

    Public universities in these politically charged environments are under siege. They face relentless ideological attacks from state legislators and are constantly forced to navigate resource challenges from years of underfunding.

    Politicians attacking public higher education are not simply questioning the budgets or management—they are attempting to dismantle these institutions. Efforts to reduce tenure protections, anti-DEI legislation and restrictions on what can be taught are all part of a broader effort to strip public universities of their autonomy.

    The goal of these attacks is clear: to reduce the influence and authority of public universities and their leaders and undermine the critical role they play in shaping a well-informed and educated workforce and citizenry.

    At the same time, some institutions are adopting policies of institutional neutrality, reducing the ability of presidents to speak out on these issues.

    The cumulative effect of these efforts is to make public universities and their leaders less effective in advocating for their missions, students and faculty.

    The Short-Term Advantages for Private Higher Ed

    In the short term, these challenges facing public universities have opened opportunities for private institutions. With public universities bogged down in political and financial crises, private universities can poach top faculty and administrators, offering them better resources and less political interference.

    I don’t fault private universities for capitalizing on these opportunities—they are acting in their own self-interest and in the interests of their own missions, students and faculty.

    But I fear that this approach is shortsighted and ultimately damaging to the broader higher education community. At a time when trust in higher education is declining, when the value of a college degree is being questioned and when the public is increasingly disillusioned with the academy, it is vital that we don’t allow attacks on public institutions to further erode public faith in all of higher education.

    Why Private Universities Must Stand Up for Public Higher Ed

    Private universities are uniquely positioned to advocate for the broader value of higher education and the critical role public institutions play.

    First, private universities can use their platforms to champion the ideals of higher education. With public universities under attack from state legislatures and special interest groups, private institutions can and should speak out against the politicization of higher education. Whether through research, advocacy or public statements, private universities can be powerful allies in the fight to protect the autonomy of public institutions.

    Second, private universities can advocate for increased public investments in higher education. They can use their influence to urge policymakers to restore funding for public universities and reject anti–higher education policies. At a time of declining public support, private universities can push for policies that ensure all students, regardless of background, have access to high-quality postsecondary education to develop the skills to succeed in today’s economy.

    Third, private universities can help bridge the divide between public and private higher education by forming partnerships with public two- and four-year institutions. These partnerships could include joint research initiatives, transfer and reciprocal enrollment programs, or shared resources to expand access and opportunity.

    The Time for Action Is Now

    In this critical moment for higher education, private universities need to demonstrate leadership—not just for their own interest, but for the interests of the entire industry. If we want to safeguard the unique contributions of both public and private higher education, we need to work together to ensure both sectors thrive.

    Now is the time for all those who believe in the transformational power of higher education to stand up and take action. The future of higher education depends on it.

    Michael S. Harris is a professor of higher education in the Simmons School of Education and Human Development at Southern Methodist University.

  • College costs have grown, but so has the return (opinion)

    What’s the biggest problem facing college students today? Cost is a big concern, of course, for good reason. But many would point to something equally troubling—misperceptions about the value of college degrees. That’s no surprise when reasonable questions are raised about whether graduates are job-ready—and if too many jobs unnecessarily require diplomas.

    There has long been a paper ceiling that penalizes applicants who lack degrees. And more companies are now taking a closer look at so-called STARs—people Skilled Through Alternative Routes.

    The group Tear the Paper Ceiling says that 61 percent of Black workers, 55 percent of Hispanic workers, 66 percent of rural workers and 62 percent of veterans are considered STARs. They have learned valuable work skills through military service, certificate programs, on-the-job training and boot camps. But too often, they’ve been shut out unfairly.

    I applaud the work of this national group and their partners. The equity barriers to jobs are real. Only half of working-age people have a quality degree or other credential beyond high school, even as millions of jobs go unfilled in part because applicants lack the required background or credentials. It only makes sense to make sure we’re not leaving behind talented but uncredentialed neighbors.

    But take a deeper look and it becomes clear that this isn’t only about expanding opportunity and filling today’s open jobs; it is also about the jobs that an increasingly tech-driven, interconnected world will demand in coming years. Skills-based hiring is a good idea, but it won’t on its own come close to solving the nation’s human talent crisis. Increasing higher educational attainment by making sure many more people get better credentials—credentials of value—is the key.

    Foundation of Growth

    Higher education has always been about producing graduates who are ready to start careers, not just jobs. This matters because a person who is a good applicant for a position now could face challenges moving to better and higher-paying positions because they lack the foundation for career growth fostered in postsecondary programs.

    The American Association of Colleges and Universities has surveyed executives and hiring managers eight times since 2006. The most recent survey, from 2023, found that 80 percent of employers strongly or somewhat agree that college prepares people for success in the workforce. Getting a degree is certainly worth the time and money, respondents suggested, as the survey “found a strong correlation between the outcomes of a liberal education and the knowledge and skills employers view as essential for success in entry-level jobs and for advancement in their companies.”

    There will always be conflicting data points in times of change. For example, the push for skills-based hiring, including at the federal level, is opening doors to a broader array of good jobs that historically required a college degree. However, research by Harvard Business School and the Burning Glass Institute shows that college graduates still have an advantage when it comes to getting jobs with higher salaries and better benefits.

    It turns out that employers aren’t committing to skills-based hiring at the level that recent headlines might suggest. The Harvard–Burning Glass report tracked more than 11,000 jobs where a bachelor’s degree was no longer required in the job description. It found only a 3.5-percentage-point increase in the share of non-degree-holders hired into those roles—a decidedly underwhelming number suggesting the buzz about skills-based hiring may be more hype than trend.

    The Lifelong Payoff

    This and other signs reinforce the enduring value of degrees: A recent report from Georgetown University’s Center on Education and the Workforce found that 72 percent of jobs in the United States will require post–high school education or training by the year 2031. The center also found:

    • People with bachelor’s degrees earn, on average, $1.2 million more over their lifetime than those with only a high school education.
    • Of the 18.5 million annual job openings we expect in the coming years, more than two-thirds will require at least some college education.
    • Earnings for people without degrees have been growing over the past decade, but so has pay for degree holders. Even as people without degrees earn more, they are still not catching up with those with diplomas.

    Durable Skills Matter

    Employers often say they’re looking for “durable” skills, such as critical thinking, communication and problem-solving.

    Someone looking to hire an entry-level software developer might consider a candidate with skills in Python or other programming languages developed through informal learning. Many gifted techies are self-taught or developed skills through coding boot camps or working at start-ups, for example.

    But a college graduate with similar skills might stand out because of their experience working in groups to complete projects, their communication and presentation skills, analytical thinking, and other traits fostered in college classes.

    The catch: Across the board, we need better definitions of what our credentials mean. What defines a credential of value, exactly, and how do we make sure that the people obtaining credentials can do the work of the future?

    Certainly, our fast-moving, tech-driven economy increasingly rewards nimble problem-solvers. According to the World Economic Forum’s 2023 Future of Jobs report, employers estimate that 44 percent of workers’ skills will be disrupted in the next five years.

    “Cognitive skills are reported to be growing in importance most quickly, reflecting the increasing importance of complex problem-solving in the workplace,” the report said. “Surveyed businesses report creative thinking to be growing in importance slightly more rapidly than analytical thinking.”

    There are many implications to this change. Embedded in the education pay premium is a fairness issue when it comes to who goes to college and how we support them. The Georgetown center has long reported on the value of a college degree and the persistent opportunity gaps for women and people of color.

    The Change-Ready Nation

    Whatever the impact of skills-based hiring on the nation’s labor shortage, we shouldn’t stop there. Addressing the long-standing inequities in higher education and the workforce means ensuring that these skills-based pathways include opportunities for all workers, especially when it comes to pursuing further education and training even after they enter the workforce.

    Skills-based hiring and the push for increasing attainment aren’t countervailing forces. They’re aimed at ensuring that the nation grows and applies the talent it needs to be prepared for the human work of the 21st century, and to achieve the civic and economic benefits that people with good-paying jobs bring to their communities.

    In the end, this is about more than the job readiness of our students. We’re talking about the change readiness of our entire nation in a rapidly evolving economy. It makes sense to revamp job requirements to meet workforce demands, but there’s no denying we’ll need the best-educated country we can build if we’re going to deliver opportunity and economic prosperity fairly for everyone.

  • Why small talk is a skill worth developing (opinion)

    You walk into the conference networking event, feeling alone, aware of the steady chatter throughout the room. You look to find someone you might know, you sense your breath growing faster and you experience that all-too-familiar pit in your stomach. You walk deeper into the room, taking a few grounding breaths, and notice others standing alone. You approach another conference attendee, feeling as if you are stepping outside of your body, and in your friendliest tone you introduce yourself and ask, “Where did you travel in from?”

    You did it! You initiated small talk with a stranger.

    Small talk is a mode of communication that occurs throughout the world, but not every culture engages in small talk to the same degree. In some cultures, it is expected, and in other cultures it can be perceived as inappropriate or rude. In addition to cultural context, one’s perception of small talk and propensity for engaging in it can be influenced by factors including, but not limited to, personality traits, degree of social comfort, mental health and wellness, past experiences, and the setting of the conversation. Small talk can also present specific challenges to language learners, neurodivergent individuals, people who are unaccustomed to talking with strangers and many others.

    Merriam-Webster Dictionary defines small talk as “light or casual conversation: chitchat.” (Seeing the word “chitchat” immediately brought me back to kindergarten, when my teacher, Mrs. Barker, would frequently say, “Kay, stop your chitchat.”) Cambridge Dictionary defines small talk as “conversation about things that are not important, often between people who do not know each other well.” The emphasis on “not important” can give the impression that small talk is useless; however, within the U.S. cultural context, small talk holds great importance in connecting individuals and laying the foundation for more substantial communication. Think of small talk as the gateway to more meaningful conversations.

    When done well, small talk relies on improvisation and adaptability, allowing information to flow and often uncovering unexpected insights and mutual interests. I think of small talk as jazz, with each person riffing off the other to create a connection and make meaning in the moment. Engaging effectively in small talk by establishing commonalities can open the door to a future collaboration, expand your professional network, build rapport that leads to a career or academic opportunity, enhance your confidence and ease tension in an interview.

    Do you wish that small talk felt less awkward and more meaningful? Apply these strategies to reduce your small talk stress and to contribute to your career success:

    • Get curious. Harness your curiosity as you engage in small talk. Take the scenario we began with: Someone might ask, “Where did you travel in from?” because they are generally interested in meeting people from different parts of the country or world. Someone else might ask this question as a gateway to finding a future collaborator from a specific country or academic institution. Don’t just ask questions for the sake of chatting, but rather ask about topics in which you are genuinely interested. This approach will make engaging in small talk more enjoyable and valuable to you, and your interaction will feel authentic to the person with whom you are speaking.
    • Listen actively. As the other person responds to your question, try to refrain from planning what you will ask next; instead, focus on absorbing what they are sharing. Consider reflecting back an aspect of something they mentioned. For example, if in response to “Where did you travel in from?” they say, “I flew in from Greece last night, and this is my first time in the States; I’m a Ph.D. student at the University of Crete,” you might empathize with their journey and ask how long they are visiting. After further discussion, you might feel inclined to offer to host the individual if they plan to travel around. Your one question, the one that initiated the small talk exchange, could even lead to a lifelong professional relationship.
    • Consider the context. The definition of small talk in the Cambridge Dictionary refers to a “conversation about things that are not important.” I would challenge you to not dismiss small talk as trivial but rather leverage it for more meaningful conversation. When thinking about the setting in which you are engaging in small talk, you can guide the conversation toward greater meaning. It would be odd if the individual attending the networking event at the conference opened the conversation with their name and asked, “What do you think about the weather?” This question would seem very disconnected from the event and purpose of the networking session. However, if the individual were waiting outside at an uncovered bus stop, it might be natural to strike up a conversation about the weather. Having an awareness about the context and setting will lead to an authentic conversation.
    • Have go-to questions. While you don’t want to arrive at every occasion with a script of possible questions, it can be a good exercise to reflect on the things about which you are genuinely curious. When attending a conference networking event, you may be interested in hearing about individuals’ career paths, learning about their research, gaining their advice, etc. In developing questions, focus on ones that are open-ended, where the response requires more than a yes or no. You might ask, “Which conference sessions are you most interested in attending?” Maybe that seems unimportant to you or even a bit superficial, but hearing about the other individual’s interest might inspire you to attend a session you would not have initially chosen. As the conversation unfolds, so will the opportunities to guide the conversation toward more meaningful topics, and you might next ask, “What research projects are you currently working on?”
    • Practice. It is likely that you have attended interview preparation and practice sessions but far less likely that you have attended a small talk training. This is not your fault. My plea to my fellow career development practitioners is this: If we know that many individuals approach small talk with feelings of discomfort or dread, and we also recognize that it is an important skill that leads to positive career outcomes, then we need to actively train and create opportunities for our students and postdocs to practice small talk in low-stakes settings. Consider building small talk into your interview preparation offerings, add a small talk learning module to an upcoming campus networking event, collaborate with your campus’s English language learning program to incorporate small talk activities and reinforce the many places and spaces where your students and postdocs are already engaging in small talk. An example would be when a student comes in for an appointment and asks, “How was your weekend?” By asking they might learn, for instance, that you were recently in Miami, a city on the top of their list of places to visit. In this exchange you could draw attention to how the student effectively engaged in small talk, reinforcing that it is a skill they already possess.
    • Know what topics not to lead with. In the U.S. cultural context, it is safe to say that you would not want to lead small talk with questions about politics, religion, finances, health or overly personal topics. Aspects of these topics might be categorized as sensitive or controversial and can create tension and lead to misunderstanding. Through engaging in small talk, you should be building a foundation of connection that can facilitate greater openness toward engaging in more meaningful topics. That said, maybe you are at the American Political Science Association’s annual meeting—in that context, it would be common for the small talk to include politics. The setting and context can serve to guide the topics and direction of the small talk.

    In academia, where emphasis on depth and scope of knowledge is highly valued, small talk can be easily viewed as a burden and overlooked as a necessary competency. But by applying a few small talk communication strategies, you will find that it can open doors and enhance career success. If you have yet to do so, embrace small talk as a skill worth developing, and get out there and chitchat. The effects on your professional life could be both profound and long-lasting.

    Kay Gruder is the associate director of graduate student and postdoc career programs and services at the University of Connecticut. She is a member of the Graduate Career Consortium, an organization providing an international voice for graduate-level career and professional development leaders.


  • Review of Adamson’s “A Century of Tomorrows” (opinion)

    Review of Adamson’s “A Century of Tomorrows” (opinion)

    The name of an ambition more than it is of a body of knowledge, the term “futurology” is attributed by one source on word origins to Aldous Huxley. The author of Brave New World is a plausible candidate, of course; he is credited with coining it in 1946. But a search of JSTOR turns up an article from three years earlier suggesting that Oswald Spengler’s The Decline of the West made him the pioneer of “what one may hope will sometime develop into a real science of ‘Futurology.’”

    The author of that article was a political scientist and émigré from Nazi Germany named Ossip K. Flechtheim, then teaching at the historically Black Atlanta University; the article itself was published in a historically Black scholarly journal, Phylon. He soon decided that his idea’s time had come.

    By 1945, writing in The Journal of Higher Education, Flechtheim advocated for futurology both as an emerging line of interdisciplinary scholarship and as a matter of urgent concern to “the present-day student, whose life-span may well stretch into the twenty-first century.” He was optimistic about futurology’s potential to advance knowledge: Maintaining that “a large number of scholars” concurred on “the major problems which humanity would face” in the coming decades, he announced that “predict[ing] the most probable trends is a task which we have the means to accomplish successfully today.”

    But as Niels Bohr and/or Yogi Berra famously put it, “It is difficult to make predictions, especially about the future.” Flechtheim went on to publish landmark contributions to the incipient field of study, surely expecting that a proper social science of the future would be established by the turn of the millennium. But on this point, as in most cases, subsequent history only confirms the Bohr-Berra conundrum.

    One rough metric of futurology’s public-intellectual salience over time is how often the word appears per year in publications stored in the Google Books database. The resulting graph shows barely any use of the term before about 1960. But with the new decade there is a sudden burst of activity: a period of steep acceleration lasting about two decades, then collapsing dramatically over the final years of the 20th century. The JSTOR search results show much the same pattern.
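
    For readers who want to see how such a frequency curve is built, here is a minimal sketch, assuming Python with matplotlib and a hypothetical CSV export (futurology_ngram.csv, with “year” and “frequency” columns) of the kind the Google Books Ngram Viewer can produce; the file name and column names are illustrative assumptions, not something from the review itself:

        # Sketch: plot the per-year relative frequency of "futurology"
        # from a CSV exported from the Google Books Ngram Viewer.
        # The file name and column layout ("year", "frequency") are assumed for illustration.
        import csv
        import matplotlib.pyplot as plt

        years, freqs = [], []
        with open("futurology_ngram.csv", newline="") as f:
            for row in csv.DictReader(f):
                years.append(int(row["year"]))
                freqs.append(float(row["frequency"]))

        plt.plot(years, freqs)
        plt.xlabel("Year")
        plt.ylabel("Relative frequency in Google Books")
        plt.title('"futurology", per-year frequency')
        plt.show()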

    And so it is that Glenn Adamson’s A Century of Tomorrows: How Imagining the Future Shapes the Present (Bloomsbury Publishing) approaches the subject with not so much skepticism about futurology’s prospects as a certain irony about its very status as a distinct kind of knowledge. The author, a curator and a historian, attaches Flechtheim’s neologism as a label to a kaleidoscopic array of efforts to anticipate the shape of things to come, whether by analyzing statistical trends, through artistic creativity or in experimentation with new ways of life. The book concentrates on the United States and the 20th century, but the larger world and earlier history inevitably shape it, as do some 21st-century pressures.

    Plenty of science fiction novels have done better at imagining life in subsequent decades than think tank projections made in the same era. But comparing prognostications for relative accuracy is not Adamson’s real concern. Whatever means it may employ, the futurological imperative is always to respond to current reality—to its perceived failings or potentials, to the opportunities and terrors looming over the world or lurking just out of sight. Adamson writes that “every story about the future is also a demand to intervene in the present.” The forms of intervention considered include political movements, religious revivals, market research, scenarios for thermonuclear war, hippie communes, the insurance industry and time capsules assembled for future generations to ponder (to give an abbreviated list).

    The future’s uncertainty provides a blank screen for projecting contemporary issues in reimagined form and the opportunity to imagine alternatives. (Or to imagine inevitabilities, whether of the encouraging or despairing kind.)

    The author takes futurology to have emerged in the 19th century as a response to concerns previously the domain of religious traditions. Utopia and dystopia provide fairly obvious secular analogues to heaven and hell. But there is more to it than that. “For those who no longer saw the future as a matter of revealed truth,” Adamson writes, “new forms of authority stepped in to fill the gap. This is where the futurologists would come in. They would not only make claims about what lies ahead but also somehow persuade others of their ability to see it.”

    The grounds for claiming such authority proliferated, as did the visions themselves, in ways resistant to linear narrative. Instead, the author pulls seemingly unconnected developments together into thematic clusters, rather like museum exhibits displayed in partly chronological and partly thematic order.

    For example, the futurological cluster he calls the Machine includes the organization Technocracy, Inc., which in the early 1930s won a hearing for its plan to put the entire economy under the control of engineers who would end the waste, bottlenecks and underperformance that, its advocates claimed, had caused the Depression.

    Enthusiasm for the Technocracy’s social blueprints was short-lived, but it expressed a wider trend. Futurologists of this ilk “set about creating self-correcting, self-regulating systems; conceptually speaking, they became machine builders.” Under this heading Adamson includes enthusiasts for “the Soviet experiment” (as non-Communist admirers liked to call it), but also the market-minded professionals involved in industrial design, especially for automobiles: “The advance planning of annual model changes was a way to humanize technology, while also setting the horizon of consumer expectation.”

    Whereas the Machine-oriented visionaries of the early 20th century had specific goals for the future (and confidence about being able to meet them), a different attitude prevailed after World War II among those Adamson calls the Lab futurologists. The future was for them “something to be studied under laboratory conditions, with multiple scenarios measured and compared against one another.” Some of them had access to the enormous computers of the day, and the attention of people making decisions of the highest consequence.

    “Prediction was becoming a much subtler art,” the author continues, “with one defining exception: the prediction of nuclear annihilation, a zero multiplier for all human hopes.”

    Those who thought life in a Machine world sounded oppressive offered visions of the future as Garden, where a healthier balance between urban and rural life could prevail. A corresponding horror at Lab scenarios spawned what Adamson calls Party futurology. This started in Haight-Ashbury, fought back at the Stonewall and generated the radical feminist movement that still haunts some people’s nightmares.

    Missing from my thumbnail sketch here is all the historical texture of the book (including a diverse group of figures, leading and otherwise) as well as its working out of connections among seemingly unrelated developments.

    As mentioned, the book is centered on 20th-century America. Even so, “Flood,” the final chapter (not counting the conclusion), takes up forces that have continued to accumulate in the new millennium. Flood-era futurology is not defined by either climate change or the digital hypersaturation of attention alone. The main element I’ll point out here is Adamson’s sense that futurology’s own future has been compromised by an excess of noise and meretricious pseudo-insight.

    The floods of dubious information (from too many sources to evaluate) make it harder to establish reality in the present, much less to extrapolate from it. Filling the void is a churn of simulated thought the author calls Big Ideas. “By this,” he writes, “I mean a general prediction about culture at large that initially feels like an important insight, but is actually either so general as to be beyond dispute, or so vague as to be immune to disproof.” Much better, on the whole, is to study the record of futurology itself, with its history as a warning against secular fortune-telling.

    Scott McLemee is Inside Higher Ed’s “Intellectual Affairs” columnist. He was a contributing editor at Lingua Franca magazine and a senior writer at The Chronicle of Higher Education before joining Inside Higher Ed in 2005.


  • Higher ed is not a public good—but it could be (opinion)

    Higher ed is not a public good—but it could be (opinion)


    When 85,000 Cornhuskers all wear red on game day, it’s easy to think of college as something larger than students and professors, classes, research and extracurriculars. Berkeley, Penn State and Michigan each have hundreds of thousands of online followers. Tar Heel nation is, after all, a nation.

    But wearing “college” on our chests does not a polity make. Higher education is not a public good and Americans know it.

    In the plainest sense, public goods aren’t excludable. Think of the air we breathe, interstate freeways and national defense. Everyone is affected by carbon dioxide levels, can travel by open roads and is protected, equally, from foreign threats.

    But when it comes to higher ed, exclusion is the name of the game.

    Admissions offices at selective colleges reject most applicants, while those elsewhere erect barriers of their own. Tuition, even when subsidized, deters those shocked by sticker prices or unable to pay. Courses are controlled by departments, yet some intellectual climates drive students away. Governance, when conducted behind closed doors, excludes parents, students, employers and other stakeholders.

    All told, the labyrinth of exclusionary practices makes higher ed more of a private than a public good. We can interpret low public confidence in higher education as reflecting a belief that college is for someone else. Of those who matriculate, two-thirds of new community college students form the same opinion and drop out or enter a broken transfer system. One-third of new B.A. students will drop out or take more than six years to graduate. Once they’re gone, it’s often for good: Only 2.6 percent of stop-outs re-enrolled in the 2022–23 academic year. Taken together, this has led to a societal “diploma divide”: More people without a college degree voted for Donald Trump’s re-election in 2024 than in 2020.

    Colleges and universities do need to reclaim a place of pride in American society. But instead of ambiguous calls for “reaffirming higher education’s public purpose,” why not simply be more public? And deliver an education that is, well, more good?

    My new book, Publicization: How Public and Private Interests Can Reinvent Education for the Common Good (Teachers College Press), argues that educational institutions of any sort—private nonprofit, state-controlled or proprietary—can be more publicly purposed when they meet two criteria. First, they must prepare each generation to sustain the common goods on which American life rests: a vibrant democracy, a productive economy, a civil society and a healthy planet. These are three long-standing aims and one new existential goal, around which colleges and universities can better organize the student experience.

    Second, institutions must themselves operate in ways that are more public than private. To do so, Publicization offers an “Exclusion Test” applicable to six domains—funding, governance, goals, accountability, equity and an institution’s underlying educational philosophy. Colleges and universities can apply the test to these areas and identify where operations can be less exclusionary and therefore more public.

    For example, do policies assume that some students aren’t “college ready,” or do we meet everyone—particularly those impacted by COVID-19—where they are? To what extent do applications create formal and informal hurdles, or do we offer more streamlined direct admission? Are inequitable proxies like Advanced Placement Calculus blocking talented students from admission, or does coursework in more widely relevant areas like statistics matter equally? Are free college plans riddled with eligibility fine print or open to anyone?

    Are courses gated by size, section, time of day and instructor approval, or are they more accessible? Are we mostly catering to young adults or presenting real options for the almost 37 million Americans with some college but no degree? Is federal funding considered a necessary evil, or is Washington engaged as a key stakeholder? Do boards focus narrowly on institutional issues or see themselves as hinges between school and society, mediating higher ed’s role in a democracy? Do we tolerate every private belief or hold ourselves to an epistemology premised on shared evidence and public scrutiny, what Jonathan Rauch calls the “Constitution of Knowledge”?

    As for an experience that’s good, higher ed’s 15-year-old success agenda focuses on access, affordability and student support. These aren’t enough. Quality must join the list, with a particular focus on our technical core: teaching and learning.

    Ask the nation’s 1.5 million professors, and most will tell you they were not taught how to teach. They are world-class scholars. They serve their institutions. They are committed to students. But hardly any received comprehensive training in effective instruction. This persists despite the fact that most Americans believe the best colleges have the best teaching, and despite evidence that effective instruction leads to more positive mindsets about one’s academic abilities, deeper learning, stronger retention and life readiness.

    As such, it’s no surprise that Richard Arum and Josipa Roksa found, in Academically Adrift (University of Chicago Press), “limited learning on college campuses.” That was in 2010 and not enough has changed, as recent articles in USA Today, The Washington Post, Washington Monthly, Forbes, Deseret News and The Chronicle of Higher Education affirm.

    But change is afoot. The National Academies of Sciences, Engineering and Medicine soon plan to publish STEM teaching standards, a first. Groups like the Equity-Based Teaching Collective have identified policies and practices to promote effective teaching campuswide. Over the past 10 years, the Association of College and University Educators, which I co-founded, has credentialed 42,000 professors in effective teaching at 500 institutions nationwide with proof of positive student impact. Last June’s second National Higher Education Teaching Conference gathered hundreds of higher education leaders and professors to accelerate the teaching excellence movement.

    College as a “public good”? Let’s give the public what it wants and deserves: a good education. In which the “best” colleges aren’t, by definition, the most exclusive. So that at family gatherings, our students tell their voting, poll-taking relatives how much they are learning, how great their professors are and how college is for them.

    Jonathan Gyurko teaches politics and education at Teachers College, Columbia University. His new book, Publicization: How Public and Private Interests Can Reinvent Education for the Common Good, was published by Teachers College Press last March.
