Tag: opinion

  • How to prepare proactively for a postdoc (opinion)

    During my five years working in postdoctoral affairs at two higher education institutions, I have often heard current postdoctoral associates share their frustrations.

    Some feel they aren’t getting the credit they deserve in their research group. Others say they feel pressured to work long hours. And in terms of relationships with their mentors, some sense a lack of feedback and support from their faculty supervisor, while others feel micromanaged and denied autonomy.

    When I hear these things, it strengthens my belief that many of the problems that emerge during the postdoctoral experience could be reduced by more proactive communication before an individual accepts a position. Talking through personality, leadership and communication styles can help both postdocs and mentors better understand the relational dynamics, as well as the expectations and needs each brings to the partnership.

    So, while earlier “Carpe Careers” pieces have focused on the pragmatics of a postdoc job search and on discovering postdoc opportunities, including those outside the traditional academic postdoc, I want to share the thought process late-stage Ph.D. students should work through before and during a postdoc search, as well as advice on navigating the start of a postdoc position. My hope is that by carefully considering their own values and needs, graduate students can better understand whether a postdoc position is the best career path for them and, if so, which postdoc position might be the right fit.

    The Right People and the Right Questions

    The first piece of advice I would give any prospective postdoc is that you must take ownership of your postdoc search. This includes talking to the right people and asking the right questions, which begins with asking yourself the most critical one: Why am I considering a postdoc position?

    People pursue postdocs for a variety of reasons. None are necessarily more appropriate than others, but your motivations for engaging in a postdoc should be clear to you. Some motivations might include:

    • To gain training and increase metrics of scholarly productivity in order to be a more competitive candidate for positions at research-intensive universities.
    • To learn new skills or techniques that will increase marketability, perhaps outside academia.
    • For international trainees, to continue working in the United States while pursuing a green card and citizenship.
    • To increase time to think about career paths.
    • To explore a geographic location that might seem ideal for one’s career prospects.

    There is nothing wrong with any of these reasons, but understanding your reason will help you find the postdoc position that best fits your academic and professional journeys.

    Understanding Expectations

    Even if your goal is not to pursue an academic career and you don’t believe you will be in a postdoc position longer than a year, it is critical to take the postdoc seriously as professional experience and to accept and understand its responsibilities and deliverables.

    I fully acknowledge that the postdoc role can be nuanced and, ideally, it is some hybrid of employment, extended training and apprenticeship under a more senior faculty member. In nearly all cases, however, an individual is hired into a postdoc role to help make progress on a funded research project. This may involve funding from federal agencies such as the National Institutes of Health or National Science Foundation, a nonprofit foundation, or the institution itself. Regardless, a postdoc is hired to help deliver important outputs associated with a project that’s being paid for. From this perspective, the postdoc’s job is to help move the project forward and ultimately produce data and findings for further dissemination. Successful postdocs understand what these deliverables are and their importance to their faculty mentor.

    Of course, this does not mean postdocs should devote 100 percent of their time to producing research products. In fact, many years ago, the Office of Management and Budget made clear to U.S. federal agencies funding graduate students and postdocs that such roles have the dual functions of employee and trainee. The notice specifically states that postdocs “are expected to be actively engaged in their training and career development under their research appointments.” Additionally, the NIH is seeking to explicitly specify the percentage of time a postdoc should devote to career and professional development through recommendations from a Working Group on Re-envisioning NIH-Supported Postdoctoral Training. In a report published in December 2023, the group recommended that postdocs devote a minimum of 10 percent of their effort to career and professional development activities.

    It’s clear that the job of a postdoc is both to deliver on research products and to invest in one’s own training and professional development. Given the need to balance these two activities effectively, it is critical that prospective postdocs seek to understand how the group they might join, or the faculty member they might work with, views the position. Likewise, it is important for the candidate to convey their own expectations to those same parties.

    A proactive conversation can be intimidating for some, but the Institute for Broadening Participation has created a list of questions taken from a National Academies report on enhancing the postdoc experience to get you started.

    Exploring the Landscape

    Potential postdocs should also consider speaking with current or past postdocs who have experience in the groups and with the people they are interested in working with. Past postdocs can often speak more freely about faculty members’ working and communication styles and their willingness to provide support.

    Another important factor prospective postdocs should consider is the support and resources institutions provide. This can range from employee benefits and postdoc compensation to career and professional development opportunities.

    A critical resource to help you understand the current institutional landscape for postdoc support in the United States is the National Postdoctoral Association’s Institutional Policy Report and Database. You can leverage this data by benchmarking the benefits of institutions you are considering for your postdoc. For example, in the most recently published report from 2023, 52 percent of responding U.S. institutions reported offering matching retirement benefits to their employee postdocs.

    Considering the entire package around a postdoc position is yet another important step in evaluating if a potential position aligns with your academic, professional and personal goals.

    Putting Together a Plan

    Once you have decided to accept a postdoc position, I advise communicating proactively with your new faculty supervisor to ensure all expectations are aligned. A great document to help with framing your potential responsibilities is the Compact Between Postdoctoral Appointees and Their Mentors from the Association of American Medical Colleges.

    Finally, I highly encourage any new postdoc to create an individual development plan to outline their project completion, skill development and career advancement goals. This can be shared with the supervisor to ensure both parties’ project completion goals match and the postdoc’s other goals will be supported. If faculty supervisors could benefit from additional resources that stress the importance of IDPs, I suggest this piece published in Molecular Cell and this Inside Higher Ed essay.

    Deciding whether to pursue a postdoc position, and how to pursue one proactively, is important to maximizing your future prospects as a Ph.D. holder. Leveraging this advice, plus that of other online resources—such as the Strategic Postdoc online course from the Science Communication Lab and the Postdoc Academy’s Succeeding as a Postdoc online course and mentoring resources—will help you choose a position with intention and engage in deliberate discussions prior to accepting it. This will increase the likelihood that your postdoc experience aligns with your needs and helps successfully launch the next stage in your career.

    Chris Smith is Virginia Tech’s postdoctoral affairs program administrator. He serves on the National Postdoctoral Association’s Board of Directors and is a member of the Graduate Career Consortium—an organization providing a national voice for graduate-level career and professional development leaders.

  • Essay on the panopticon (opinion)

    Not quite a household word (beyond academia, anyway), “panopticon” nonetheless turns up in news stories with surprising frequency—here and here, for example, and here and here. The Greek roots in its name point to something “all seeing,” and in occasional journalistic usage it almost always functions as a synonym for what’s more routinely called “the surveillance society”: the near ubiquity of video cameras in public (and often private) space, combined with our every click and keystroke online being tracked, stored, analyzed and aggregated by Big Data.

    Originally, though, the panopticon was what the British political philosopher Jeremy Bentham proposed as a new model of prison architecture at the end of the 18th century. The design was ingenious. It also embodied a paranoid’s nightmare. And at some point, it came to seem normal.

    Picture a cylindrical building, each floor consisting of a ring of cells, with a watchtower of sorts at the center. From here, prison staff have an unobstructed view of all the cells, which at night are backlit with lamps. At the same time, inmates are prevented from seeing who is in the tower or what they are watching, thanks to a system of one-way screens.

    Prisoners could never be certain whether or not their actions were under observation. The constant potential for exposure to the authorities’ unblinking gaze would presumably reinforce the prisoner’s conscience—or install one, if need be.

    The panoptic enclosure was also to be a workhouse. Besides building good character, labor would earn prisoners a small income (to be managed in their best interest by the authorities), while generating revenue to cover the expense of food and housing. Bentham expected the enterprise to turn a profit.

    He had similar plans for making productive citizens out of the indigent. The panoptic poorhouse would, in his phrase, “grind rogues honest.” The education of schoolchildren might go better if conducted along panoptic lines; likewise with care for the insane. Bentham’s philanthropic ambitions were nothing if not grand, albeit somewhat ruthless.

    The goal of establishing perfect surveillance sometimes ran up against the technological limitations of Bentham’s era. (I find it hard to picture how the screens would work, for instance.) But he was dogged in promoting the idea, which did elicit interest from various quarters. Elements of the panopticon were incorporated into penitentiaries during Bentham’s lifetime—for one, Eastern State Penitentiary in Pennsylvania, opened in 1829—but never to his full satisfaction. He was constantly tinkering with the blueprints, to make the design more comprehensive and self-contained. He worked out a suitable plumbing system. He thought of everything, or tried.

    Only in the late 20th century did the panopticon elicit discussion outside the ranks of penologists and Bentham scholars. Even the specialists tended to neglect this side of his work, as the American historian Gertrude Himmelfarb complained in a book from 1968. “Not only historians and biographers,” she wrote, “but even legal and penal commentators seem to be unfamiliar with some of the most important features of Bentham’s plan.” They tended to pass it by with a few words of admiration or disdain.

    The leap into wider circulation came in the wake of Michel Foucault’s Discipline and Punish: The Birth of the Prison (1975). Besides acknowledging the panopticon’s significance in the history of prison design, Foucault treated it as prototypical of a new social dynamic: the emergence of institutions and disciplines seeking to accumulate knowledge about (and exercise power over) large populations. Panopticism sought to govern a population as smoothly, productively and efficiently as possible, with the smallest feasible cadre of managers.

    This was, in effect, the technocratic underside of Bentham’s utilitarianism, which defined an optimal social arrangement as one creating the greatest happiness for the greatest number of people. Bentham applied cost-benefit analysis to social institutions and human behavior to determine how they could be reshaped along more rational lines.

    To Foucault, the panopticon offered more than an effort at social reform, however grandiose. Its aim, he writes, “is to strengthen the social forces—to increase production, to develop the economy, spread education, raise the level of public morality; to increase and multiply.”

    If Bentham’s innovation is adaptable to a variety of uses, that is because it promises to impose order on group behavior by reprogramming the individual.

    From a technocrat’s perspective, the most dysfunctional part of society is the raw material from which it’s built. The panopticon is a tool for fashioning humans suitable for modern use.

    The prisoner, beggar or student dropped into the panopticon is, Foucault writes, “securely confined to a cell from which he is seen from the front by the supervisor; but the side walls prevent him from coming into contact with his companions.” Hundreds if not thousands of people surround him in all directions. The population is a crowd (something worrisome to anyone with authority, especially with the French Revolution still vividly in mind), but incapable of acting as one.

    As if to remind himself of his own humanitarian intentions, Bentham proposes that people from the outside world be allowed to visit the observation deck of the panopticon. Foucault explains, with dry irony, that this will preclude any danger “that the increase of power created by the panoptic machine may degenerate into tyranny …” For the panopticon would be under democratic control, of a sort.

    “Any member of society,” Foucault notes, had “the right to come and see with his own eyes how the schools, hospitals, factories, prisons function.” Besides ensuring a degree of public accountability, their very presence would contribute to the panopticon’s operations. Visitors would not meet the prisoners (or students, etc.) but observe them from the control and surveillance center. They would bring that many more eyes to the task of watching the cells for bad behavior.

    As indicated at the beginning of this piece, nonscholarly references to the panopticon in the 21st century typically appear as commentary on the norms of life online. This undoubtedly follows from Discipline and Punish being on the syllabus, in a variety of fields, for two or three generations now.

    Bentham was confident that his work would be appreciated in centuries to come, but he would probably be perplexed by this repurposing of his idea. He designed the panopticon to “grind rogues honest” through anonymous and continuous surveillance, which the digital panopticon exercises as well—but without a deterrent effect, to put it mildly.

    Bentham’s effort to impose inhibition on unwilling subjects seems to have been hacked; the panoptic technology of the present is programmed to generate exhibitionism and voyeurism. A couple of decades ago, the arrival of each new piece of digital technology was hailed as a tool for self-fashioning, self-optimization or some other emancipatory ambition. For all its limitations, the analogy to Bentham’s panopticon fits in one respect: Escape is hard even to imagine.

    Scott McLemee is Inside Higher Ed’s “Intellectual Affairs” columnist. He was a contributing editor at Lingua Franca magazine and a senior writer at The Chronicle of Higher Education before joining Inside Higher Ed in 2005.

  • Ideas for navigating editor-reviewer relationships (opinion)

    An editor or reviewer can have an outsize impact on the career of a scholar, particularly in the early stages. The stakes can be high for an author. A negative review or edit can set back a research plan by months and harm a scholar’s chances for tenure or promotion. This reality creates a power imbalance between an editor or reviewer and an author that can be abused.

    Graduate schools offer few pointers on how to navigate editor and reviewer relationships. Our goal in this essay is to demystify the process and offer suggestions and observations for editors/reviewers and authors on how to approach the task in a more thoughtful and efficient way.

    Understanding the Reviewer and Editor Roles

    First, it is important to note that while reviewers and editors take part in a similar process—assessing the work of an author—their tasks are different. The editor is rarely an expert in the specific subject of an article and must rely on impartial reviewers to place the work in context. Nevertheless, the editor—and, at times, an editorial board—is the decision-maker in this equation. Having a clear and transparent line of communication between the author and the editor is critical.

    The task of the reviewer is to place the work in its scholarly context and to weigh its merit. Is the work breaking new ground? Is it challenging a long-held interpretation within the academy? Are the sources contemporary and the most relevant? Does the work fit the subject area of the journal or press? Can it be revised to make it suitable for publication?

    It is our strong belief that reviewers need to meet the authors where they are—that is, to understand the goal of the author, determine whether the work is suitable for the journal or press in question and, if so, help them reach the promised land of publication. Simply put: The reviewer should weigh the author’s case against the author’s intent.

    Unfortunately, this does not always happen: It is sometimes the case that reviewers stray from this path and insert suggestions that they would like to see addressed but that are not central to the submitted work. The dreaded “reviewer number 2” has become the bane of many an author’s existence. In this sort of review, the reviewer raises so many questions and objections that an author is left to ponder whether the two are reading the same text. And, it must be said, just as on social media, anonymity can at times lead to incivility. Instead of being helpful, sometimes a reviewer is unkind and cruel.

    The role of the editor is to referee between the goals of the author and the desires of the reviewer. Egos and politics often come into play in this process because reviewers in many cases are colleagues of the editor and contributors to the publication in question. Our experience suggests there are two major types of editors. Authors will need to adjust their approach based on which of these two types best describes their editor:

    • Sympathetic editor: This is the ideal. This editor will work with an author to publish a submission if the research is strong and will allow them to keep their own voice. They do not seek to impose their vision on the book or article. They do not allow their personal politics to influence the decision-making process. They are driven by one central question: Does the author accomplish what they set out to do? This type of editor tries to determine whether a reviewer is acting out of hubris by suggesting tangential and substantial changes or whether they are addressing core issues. On the opposite end of the spectrum, they are alert to the two-paragraph, lackadaisical reviewer who read the work over lunch while answering emails.
    • Visionary editor: It may sound counterintuitive, but an editor with their own vision for someone else’s work can mean frustration and ultimately rejection for an author. This type of editor sees someone else’s work as an opportunity to explore an aspect of a topic that interests them. They impose their own vision on someone else’s work rather than determining whether the author has achieved the goal they set for themselves. This typically takes the form of a lengthy response asking an author to fundamentally rethink their piece. The response contains so many critiques that adhering to the suggestions would amount to writing a completely different piece of scholarship. This editor also tends to prolong, and even impede, the process almost endlessly.

    As an example, upon the death of Fidel Castro in November 2016, the Latin American historian of this writing duo (Argote-Freyre) was asked by a journal editorial board member to author an article comparing the career of Castro with that of the prior dictator of Cuba, Fulgencio Batista. The resulting piece concluded that the two political figures shared more similarities than differences. The editor, although agreeing to the concept, was unhappy with the conclusions reached by the essay. The editor struck out paragraph after paragraph; a lecture on tone and thesis ensued.

    The editor suggested a piece analyzing the revisionist historiography on Batista—a subject outside the contours of the original assignment and one that would take many months to complete. The author made a rookie mistake in assuming that a member of the editorial board was vested with the authority to make assignments. In retrospect, it seems as if the assignment was foisted upon the working editor, who then wanted to steer the piece in a completely different direction. The author withdrew the piece; the only positive was that only a few months were lost in the process.

    The visionary editor is the type who is never satisfied. They forget that the piece is the author’s, not theirs. Yes, the editor is a gatekeeper for the journal or press, but if it is not a good fit, they should say so and move on. This picky editor sends a revision back to a new third (or fourth) reviewer, who is likely to ask for another, different round of revisions. This is nothing other than moving the goalposts. One of us had this occur with an editor who said, “As you know, we often send articles to several rounds of reviewers.” Well, we did not know, because the journal’s website did not say that. Such a process could go on forever and, to our eyes, makes no sense. The editor should decide on his or her own whether the author has revised sufficiently: It is clear from the reader reports what needed to be done, so just check and see. The editor needs to be decisive.

    When a work is about to be sent to an additional set of reviewers, an author needs to withdraw the article or book from consideration. Run as fast as you can in search of another editor and publication. Do not let someone waste your time, especially if your clock is ticking for tenure and promotion.

    How to Make Relationships Work—and When to Walk Away

    The author-editor relationship should be a dance, not a duel. An author is not at the mercy of the process; you are a partner. If you are not clicking with the editor, walk away. A bad first date rarely turns into a good second date. This is particularly true when working on a book project, given the many steps and long timeline involved.

    For a revise-and-resubmit, we strongly suggest that you be professionally assertive. Ask how the resubmission will be reviewed before you undertake it. If the editor says it will go to new readers, withdraw the piece. This never goes well. Editors should be transparent about the steps involved. It is our experience that some editors are hesitant to divulge their process. If that is the case, the author needs to reassess the integrity of that process.

    Being fully transparent allows you to ask for transparency in return, whether you are an editor or an author. If, as we have experienced, two peer reviews come in that are quite opposed, the editor should get a third before returning to the author. If there are two or three reviews, the editor should synthesize them with a memo attached to the reports. The summary should go something like: “All reviewers agree chapter four needs to be revised with this material, but there is disagreement about chapter six.” There is also nothing wrong with asking the author to make the tough call on a contested point of interpretation. Once again, it is the author’s scholarship, not the editor’s, the journal’s or the press’s.

    For authors: Have a conversation with the editor. If it’s a call, follow up with a written summary. When responding to reader reports, especially when they disagree, say what you will and will not do. Do not say you will revise when you disagree—but don’t be stubborn, either. Give ground on small points to protect the ones you won’t compromise on. If you disagree with a reviewer’s suggestion, say why, and ask the editor for approval not to make that specific change. Get that approval. If the editor says the revision will go back to one or both original readers instead of making the final call themselves, politely insist that the written exchange between the author and editor be sent along, too.

    It may not always work. Recently, one of us did just what we described and the editor said the plan sounded good, only to have the journal reject the revision. The editorial board said a specific change was not made even though the editor agreed that change would not be necessary. Poor communication and coordination between an editor and an editorial board should not penalize an author.

    Finally, we’d like to briefly weigh in on the argument that professors should reject peer reviewing because it is an unpaid task. If you do not want to do it, don’t—but there are compelling reasons to write responsible peer reviews. First, unpaid labor is not without merit. Even if your tenure and promotion committees might not value the task, that does not mean it is not worthwhile. You’re not paid to volunteer at your local food pantry, but you still do it. Second, people do this for you; it is time to be generous in return. Third, reviewing provides insights into the process for your own work. Peer reviewing keeps you current on trends in the field. Editing and peer reviewing make you a better writer and produce better scholarship. Isn’t that what we all want?

    Frank Argote-Freyre and Christopher M. Bellitto are professors of history at Kean University in Union, N.J., with extensive experience with peer review on both sides of the process. Argote-Freyre, a scholar of Latin American history, serves as a frequent peer reviewer and content editor on various book and article projects. Bellitto, a medievalist, is the series editor of Brill’s Companions to the Christian Tradition and academic editor at large of Paulist Press.

  • Academic unions should adopt neutrality (opinion)

    Institutional neutrality at universities is having its moment in the aftermath of a year of nationwide campus protests over the Israel-Gaza war. The list of universities that have adopted neutrality has grown over the course of the past 12 months, and the concept is expanding to include conversations around university investments. Yet academic unions have slipped under the radar as purveyors of positions on political issues. They should not be neglected in the push for neutrality on all stances except those that directly pertain to an institutional mission. In the case of a union, that mission should be to promote labor interests. Professors from a range of ideologies should be able to find common cause for collective bargaining purposes without being forced into supporting other political positions.

    The lack of neutrality of professors’ unions on non-labor-related issues is a pernicious problem. Federal law and some state laws that pertain to unions work to compel professors’ speech. Under the federal National Labor Relations Act, if a majority of private sector workers voting in a union election choose to unionize, all workers in that bargaining unit must be exclusively represented by that union. New York’s Taylor Law requires the same for public employees. And, if workers want the benefits of membership, like voting for union leadership and contracts, they must pay dues.

    While public employees could choose not to be union members before the Supreme Court’s 2018 Janus v. AFSCME ruling, that case now guarantees their right not to pay agency fees. But even if workers wish to eschew membership and not pay fees, they cannot dissociate entirely. They are required to be represented by a union that speaks via statements at the local, state and national level on many non-labor-related subjects. Therefore, with their veneer of solidarity, unions quash viewpoint diversity and suppress First Amendment rights. They tie one of the only forms of dissent possible (withdrawing dues) to disenfranchisement from the union, the organization that negotiates their wages and labor conditions.

    Professors who do stop paying their dues are often derided as “free riders.” They risk offending union leadership, who have a say in university processes that can impact their employment, like grievances and denial of reappointment. The union is formally required to provide equal advocacy as their exclusive representative. However, even if one believes biases will never prevail against “free riders,” there is still the suppressive impact of professors’ perception that paying dues and keeping quiet is best for their careers.

    And so, professors are forced into a kind of protection racket, paying unions that may endorse positions with which they disagree. The National Education Association has opined on everything from ending private prisons to climate change, from promoting women-led businesses to helmets for motorcyclists. It has issued statements on the Israel-Gaza conflict, advocated for codifying Roe v. Wade into law and called for Donald Trump’s ouster. It has adopted progressive ideological lenses throughout such statements, arguing for instance that “white supremacy culture” is prevalent in the current U.S., and that “intersectionality must be … addressed … in order to advance the [NEA’s] social justice work.”

    To be clear, I am not arguing that these positions taken by unions are bad. I am not reflecting my own political preferences. I am not highlighting progressive examples to critique only progressive examples: I could find none that can be considered conservative. I am not saying that it’s not possible that a majority of members agree with the statements. I am also not arguing that workers do not have the right to form associations to advocate for political causes.

    What I am arguing is that due to laws making exclusive representation compulsory, unions should adopt neutrality on political issues that do not impact the primary purpose of academic unions: advocating for professors’ interests as workers. This lets ideological diversity exist and prevents coerced speech and dues payments. This neutrality is of paramount importance with public sector unions, where union leadership activities may receive taxpayer-subsidized administrative benefits.

    This neutrality should extend to political endorsements of individual candidates. While there may be some argument to be made that endorsing a pro-union or pro–higher education candidate over their opponent directly pertains to professors’ interests as workers, this carries with it implicit endorsement of a wide slate of other policies. A better approach would be for unions to support (or critique) candidates’ specific policy proposals or voting records. It would also reduce antagonism between unions and candidates they did not endorse, should those be elected.

    Recent examples show the perils of academic unions not having a neutrality standard. In 2018, a University of Maine professor sued his union, noting his opposition to its stances, like endorsing Hillary Clinton for president. More recently, in 2022, six City University of New York professors filed suit against the Professional Staff Congress (PSC), which passed a pro-Palestinian resolution they viewed as antisemitic. They resigned their memberships, along with approximately 263 other professors. But because of the Taylor Law, they are required to be represented by the PSC, which did not give evidence it could be fair in representing them. The PSC called them free riders, claiming their lawsuit was “meritless … funded by the notoriously right-wing National Right to Work Legal Foundation,” and described the “‘Right to Work’ agenda” as “rooted in white supremacy.”

    After lower courts dismissed their suit, the CUNY professors appealed to the Supreme Court, which just this month declined to hear their case. While this case could have been a victory for viewpoint diversity and free speech and an impetus for unions to get on the institutional neutrality bandwagon, future such suits will doubtless arise and may eventually reach a court favorable to their claims. Academic unions should get ahead of such a ruling and make union membership attractive to all who may want to participate based on advocacy for improved working conditions, but not for particular solutions to international wars—or for wearing motorcycle helmets.

    Colleen P. Eren is a professor of sociology and criminal justice at William Paterson University and a research fellow at the Segal Center for Academic Pluralism. Her commentaries on higher ed and other topics can be found across a range of publications, including The New York Times, Discourse, Reason, and the Foundation for Economic Education.

  • We need new ways to protect academic freedom (opinion)

    Katherine Franke, formerly a law professor at Columbia University, is just the latest of many academics who have found themselves in hot water because of something they said outside the classroom. Others have been fired or resigned under pressure for what they posted online or said in other off-campus venues.

    In each of those cases, the “offending party” invoked academic freedom or freedom of speech as a defense to pressures brought on them, or procedures initiated against them, by university administrators. The traditional discourse of academic freedom or free speech on campus has focused on threats from inside the academy of the kind that led Franke and others to leave their positions.

    Today, threats to academic freedom and free speech are being mounted from the outside by governments or advocacy groups intent on policing colleges and universities and exposing what they see as a suffocating orthodoxy. As Darrell M. West wrote in 2022, “In recent years, we have seen a number of cases where political leaders upset about criticism have challenged professors and sought to intimidate them into silence.”

    We have seen this act before, and the record of universities is not pretty.

    During the 1940s and 1950s, an anticommunist crusade swept the nation, and universities were prime targets. In that period, “faculty and staff at institutions of higher learning across the country experienced increased scrutiny from college administrators and trustees, as well as Congress and the FBI, for their speech, their academic work, and their political activities.”

    And many universities put up no resistance.

    Today, some believe, as Nina Jankowicz puts it, that we are entering “an era of real censorship the likes of which the United States has never seen. How will universities respond?”

    If academic freedom and freedom of expression are to be meaningful, colleges and universities must not only resist the temptation to punish or purge people whose speech they and others may find offensive; they must provide new protections against external threats, especially when it comes to extramural speech by members of their faculties.

    They must become active protectors and allies of faculty who are targeted.

    As has long been recognized, academic freedom and free speech are not identical. In 2007, Rachel Levinson, then the AAUP senior counsel, wrote, “It can … be difficult to explain the distinction between ‘academic freedom’ and ‘free speech rights under the First Amendment’—two related but analytically distinct legal concepts.”

    Levinson explained, “Academic freedom … addresses rights within the educational contexts of teaching, learning, and research both in and outside the classroom.” Free speech requires that there be no regulation of expression on “all sorts of topics and in all sorts of settings.”

    Ten years after Levinson, Stanley Fish made a splash when he argued, “Freedom of speech is not an academic value.” As Fish explained, “Accuracy of speech is an academic value … [because of] the goal of academic inquiry: getting a matter of fact right.” Free speech, in contrast, means “something like a Hyde Park corner or a town-hall meeting where people take turns offering their opinions on pressing social matters.”

    But as Keith Whittington observes, the boundaries that Levinson and Fish think can be drawn between academic freedom and free speech are not always recognized, even by organizations like the AAUP. “In its foundational 1915 Declaration of Principles on Academic Freedom and Academic Tenure,” Whittington writes, “the AAUP asserted that academic freedom consists of three elements: freedom of research, freedom of teaching, and ‘freedom of extramural utterance and action.’”

    In 1940, Whittington explains, “the organization reemphasized its position that ‘when they speak or write as citizens,’ professors ‘should be free from institutional censorship or discipline.’”

    Like the AAUP, Whittington opposes “institutional censorship” for extramural speech. That is crucially important.

    But in the era in which academics now live and work, is it enough?

    We know that academics report a decrease in their sense of academic freedom. A fall 2024 survey by Inside Higher Ed found that 49 percent of professors experienced a decline over the prior year in their sense of academic freedom as it pertains to extramural speech.

    To foster academic freedom and free speech on campus or in the world beyond the campus, colleges and universities need to move from merely tolerating the expression of unpopular ideas to a more affirmative stance in which they take responsibility for fostering it. It is not enough to tell faculty that the university will respect academic freedom and free expression if they are afraid to exercise those very rights.

    Faculty may be fearful that saying the “wrong” thing will result in being ostracized or shunned. John Stuart Mill, one of the great advocates for free expression, warned about what he called “the tyranny of the prevailing opinion and feeling.” That tyranny could chill the expression of unpopular ideas.

    In 1952, during the McCarthy era, Supreme Court justice Felix Frankfurter also worried about efforts to intimidate academics that had “an unmistakable tendency to chill that free play of the spirit which all teachers ought especially to cultivate and practice.”

    Beyond the campus, faculty may rightly fear that if they say things that offend powerful people or government officials, they will be quickly caught up in an online frenzy or will be targeted. If they think their academic institutions will not have their back, they may choose the safety of silence over the risk of saying what they think.

    Whittington gets it right when he argues that “Colleges and universities should encourage faculty to bring their expertise to bear on matters of public concern and express their informed judgments to public audiences when doing so might be relevant to ongoing public debates.” The public interest is served when we “design institutions and practices that facilitate the diffusion of that knowledge.”

    Those institutions and practices need to be adapted to the political environment in which we live. That is why it is so important that colleges and universities examine their policies and practices and develop new ways of supporting their faculty if extramural speech gets them in trouble. This may mean providing financial resources as well as making public statements in defense of those faculty members.

    Colleges and universities should also consider making their legal counsel available to offer advice and representation and using whatever political influence they wield on behalf of a faculty member who is under attack.

    Without those things, academics may be “free from” the kind of university action that led Franke to leave Columbia but still not be “free to” use their academic freedom and right of free expression for the benefit of their students, their professions and the society at large.

    Austin Sarat is the William Nelson Cromwell Professor of Jurisprudence and Political Science at Amherst College.

  • Thoughts on 20 years of college teaching (opinion)

    I have now been teaching at Duke University for 20 years. I have been through all kinds of teaching fads—active learning, team-based learning, alternative grading, service learning, etc. You might assume that I have become a better teacher over these many years. Yet I am noticing a curious trend in my course evaluations: Some of my students like me and my courses less and less.

    As a teaching faculty member, this matters greatly to my own career trajectory, and so I’ve wondered and worried about what to do. Why am I struggling to teach well and why are my students struggling to learn?

    Looking back on the past two decades of my teaching and reaching further back into my own college experience, I see six clear differences between now and then.

    Difference No. 1: Access to Information

    When I took my first college environmental science class, way back in 1992, I was mesmerized. This was before the days of Advanced Placement Environmental Science, so I came into the class knowing almost nothing about the topic, motivated by my naïve idea to be part of “saving the world.” To learn, I had a textbook (that I still have, all highlighted and marked up) and the lectures (for which I still have my notes). Sure, I could go to the library and find books and articles to learn more, but mostly I stuck to my textbook and my notes. I showed up to the lecture-based class to learn, to listen, to ask questions.

    Today, my students show up in my course often having taken AP Environmental Science, with access to unlimited information about the course topics, and with AI assistants that will help them organize their notes, write their essays and prepare for exams. I have had to shift from expert to curator, spending hours sifting through online articles, podcasts (SO many podcasts) and videos, instead of relying on a single textbook. I look for content that will engage students, knowing that some may also spend their class period fact-checking my lectures, which brings me to …

    Difference No. 2: Attention

    When I lecture, I look out to a sea of stickered laptops, with students shifting their attention between me, my slides and their screens. I remind them that I can tell when they are watching TikTok or texting, because the class material probably isn’t causing their amused facial expressions.

    Honestly, I am finding myself more distracted, too. While lecturing I am not only thinking about the lecture material and what’s on the next slide—I am also wondering how I can get my students’ attention. I often default to telling a personal anecdote, but even as they briefly look up to laugh, they just as quickly return their eyes to their screens.

    The obvious advice would be to have more engaging activities than lecturing but …

    Difference No. 3: More Lectures, Please

    After 2020, one comment showed up over and over on my course evaluations: lecture more. My students seemed not to see the value of small-group activities, gallery walks, interactive data exercises and discussions. They felt that they were not learning as much, and some of them assumed that meant that I didn’t know as much, which leads me to …

    Difference No. 4: Sense of Entitlement

    While I teach at an elite private university, my colleagues across a range of institutions have backed this up: Some students seem not to have much respect for faculty. The most common way this shows up is at the end of the semester, when students send me emails about why my course policies resulted in a grade they think is unfair, or after an exam, when they argue that I did not grade them fairly, which leads me to …

    Difference No. 5: Assessment Confusion

    When I was in college, I took midterms and finals. I rewrote my notes, made flash cards, created potential exam questions, asked friends for old exams and studied a lot. I took multiple-choice exams and essay exams, in-class exams and take-home exams. When I first started teaching my lecture-based class, I assigned two midterms and a final. I took the business of writing exams seriously, often using short-answer and essay exams that took a whole lot of time to grade. I wanted the experience of taking the exam to help students feel like they had learned something, and the experience of studying to actually entice them to learn.

    Then, two things happened. We faculty got all excited about alternative assessments, trying to make our classes more inclusive for more learning styles. And the students started rebelling about their exam grades, nitpicking our grading for a point here and there, angry that, as one student put it, I was “ruthless” in my grading. Students didn’t show up at my office hours eager to understand the concepts—they wanted more points.

    So, I threw out exams in favor of shorter papers, discussions and activities. In fall 2024, I had 74 students and I gave a whopping 67 of them A’s. To do well in my class now, you don’t really have to learn anything. You just need to show up. Except the problem with grading for attendance is …

    Difference No. 6: Our Students Are Struggling

    We all know that our students are struggling with more mental and emotional health issues, perhaps due to COVID-related learning loss, the state of the world and so many other things. Many of us include mental health resources in our syllabus, but we know that’s not enough. Students are much more open about their struggles with us, but we aren’t trained therapists and often don’t know the right thing to say. Who am I to determine that one student’s excuse for missing a class is valid while another’s is not? How can I keep extending the deadlines for a struggling student while keeping the deadline firm for the rest? Sure, there are suggestions for this (e.g., offer everyone a “late assignment” ticket to use), but I still spend a lot of time sifting through student email requests for extensions and understanding. How can we be fair to all of our students while maintaining the rhythm of course expectations?

    Usually, one acknowledges the differences between students now and “back then” at retirement, reflecting on the long arc of a teaching career. But I am not at the end—I have a long way to go (hopefully). I am expected to be good at this in order to get reappointed to my teaching faculty position.

    Teaching requires much more agility now as we attempt to adapt to the ever-expanding information sphere, our students’ needs, and the state of the community and world beyond our classrooms. Instead of jumping to solutions (more active learning!), I think it’s reasonable to step back and acknowledge that there is no one change we need to make to be more effective educators in 2025. We also can acknowledge that some of the strategies we are using to make our classes more engaging and inclusive might backfire, and that there still is a time and place for really good, engaging lectures and really hard, useful exams.

    There are fads in teaching, and over the past 20 years, I have seen and tried plenty of them. We prize teaching innovation, highlighting new techniques as smashing successes. But sometimes we learn that our best-laid plans don’t work out, that what students really want is to hear from an expert, someone who can help them sort through the overwhelming crush of information to find a narrative that is relevant and meaningful.

    The students in our classrooms are not the same students we were, but maybe there is still a way to spark their enthusiasm for our subjects by simply asking them to be present. As debates about the value of higher education swirl around us, maybe caring about our students and their learning means asking them to put away their screens, take out a notebook and be present for our lectures, discussions and occasional gallery walk. For my part, I’m reminding myself that some students aren’t all that different from the student I was—curious, excited, eager to learn—and that I owe it to them to keep showing up committed to their learning and, maybe, prepared with a few more light-on-text lecture slides.

    Rebecca Vidra is a senior lecturer at the Nicholas School of the Environment at Duke University.

  • Did the Ivy League really break America? (opinion)

    Are many of the ills that plague American society caused by Ivy League admission policies?

    That is the premise of David Brooks’s cover story for the December issue of The Atlantic, “How the Ivy League Broke America.” Brooks blames the Ivies and “meritocracy” for a host of societal problems, including:

    • Overbearing parenting
    • Less time for recess (as well as art and shop) in schools
    • An economy that doesn’t provide opportunities for those without a college degree
    • The death of civic organizations like Elks Lodge and Kiwanis Club
    • The high percentage of Ivy League graduates who choose careers in finance and consulting
    • The rise of populism based on “crude exaggerations, gross generalizations, and bald-faced lies.”

    Brooks somehow left the decline of small-town mom-and-pop businesses and the popularity of reality television off his laundry list.

    You may be wondering how the Ivies contributed to or caused all these problems. The essence of Brooks’s argument is that “every coherent society has a social ideal—an image of what the superior person looks like.” His hypothesis is that America’s social ideals reflect and are determined by the qualities that Ivy League universities value in admission.

    One hundred years ago, the Ivy League social ideal was what Brooks terms the “Well-Bred Man”—white, male, aristocratic and preppy, athletic, good-looking, and personable. What was not part of the ideal was intellectual brilliance or academic prowess, and in fact those who cared about studying were social outcasts. Applying to the Ivies resembled applying for membership to elite social clubs.

    That changed starting in the 1930s when a group of educational leaders, the most prominent being Harvard president James Conant, worried that the United States was not producing leaders capable of dealing with the problems it would face in the future. Their solution was to move to an admission process that rewarded intelligence rather than family lineage. They believed that intelligence was the highest human trait, one that is innate and distributed randomly throughout the population. Conant and his peers believed the change would lead to a nation with greater opportunities for social mobility.

    Brooks seems far from sure that the change was positive for America. He acknowledges that “the amount of bigotry—against women, Black people, the LGBTQ community—has declined” (that might be debatable given the current political climate), but observes that the previous ideal produced the New Deal, victory in World War II, NATO and the postwar world led by America, while the products of the ideal pushed by Conant have produced “quagmires in Vietnam and Afghanistan, needless carnage in Iraq, the 2008 financial crisis, the toxic rise of social media, and our current age of political dysfunction.” Those examples seem cherry-picked.

    In the essay, Brooks cites a number of troubling societal problems and trends, all supported with extensive research, but the weakness of his argument is that he tries to find a single cause to explain all of them. That common denominator is what he calls “meritocracy.”

    Meritocracy, a society with opportunities based on merit, is an appealing concept in theory, but defining merit is where things get sticky. Merit may be similar to Supreme Court justice Potter Stewart’s description of pornography, in that you know it when you see it. Does merit consist of talent alone? Talent combined with work ethic? Talent, work ethic and character?

    Merit is in the eye of the beholder. If I was admitted to an Ivy League university, it was obviously because I had merit. If someone else, especially someone from an underrepresented population, got the acceptance instead of me, factors other than merit must have been at play. If two candidates have identical transcripts but different SAT scores, which one possesses more merit? Complicating the discussion is the fact that many things cited as measures of merit are in fact measures of privilege.

    For Brooks, Ivy League meritocracy involves an overreliance on intelligence and academic achievement, to the detriment of noncognitive skills that are more central to success and happiness in life. He argues that “success in school is not the same thing as success in life,” with success in school primarily being individual while success in life is team-based. He quotes Adam Grant’s argument that academic excellence is not a strong predictor of career excellence.

    Ultimately, he argues that “meritocracy” has spurred the creation of “an American caste system,” one in which “a chasm divides the educated from the less well-educated,” triggering “a populist backlash that is tearing society apart.” Yet Brooks’s beef is not so much with meritocracy as it is with a mindset that he attributes to Conant and his brethren. He equates meritocracy with a belief in rationalism and social engineering that assumes that anything of value can be measured and counted. What he is criticizing is something different from meritocracy, or at least reflects a narrow definition of meritocracy.

    Even if we don’t agree with Brooks’s definitions, or the implication that Ivy League admission policies are responsible for the ills of society, his article raises a number of important questions about the college admission process at elite colleges and universities.

    First, is the worship of standardized testing misplaced? The SAT became prominent in college admission at around the same time that Conant and others were changing the Ivy League admission paradigm. They believed that intelligence could be measured and latched onto the SAT as a “pure,” objective measure of aptitude. Today, of course, we recognize that test scores are correlated with family income and that scores can be manipulated through test preparation. And the “A” in SAT no longer stands for aptitude.

    Do we measure what we value or do we value what we can measure? Brooks criticizes the Ivies for focusing on academic achievement in school at the expense of “noncognitive skills” that might be more important to success in life after college, things like curiosity, relationship-building skills and work ethic. He’s right, but there are two reasons for the current emphasis. One is that going to college is going to school, so an admission process focused on scholastic achievement is defensible. The other is that we haven’t developed a good mechanism for measuring noncognitive skills.

    That raises a larger question. What do we want the admission process to accomplish? The SAT is intended to predict freshman year college GPA (in conjunction with high school grades). Is that a satisfactory goal? Shouldn’t we have a larger lens, aiming to identify those who will be most successful at the end of college, or after college? Should we admit those with the greatest potential, those who will grow the most from the college experience, or those who will make the greatest contribution to society after college?

    Brooks questions elite colleges’ preferences for “spiky” students over those who are well-rounded. Is a student body full of spiky students really better? An even more important question arises from a distinction Brooks made some years ago between “résumé virtues” and “eulogy virtues.”

    Does the elite college admission process as currently constituted reward and encourage students who are good at building résumés? A former student attending an elite university commented that almost every classmate had done independent academic research and started a nonprofit. Do students aspiring to the Ivies choose activities because they really care about them or because they think they will impress admission officers, and can admission officers tell the difference? What is the consequence of having a student body full of those who are good at playing the résumé-building game?

    There is one other issue raised by Brooks that I find particularly important. He argues that those who are successful in the elite admission process end up possessing greater “hubris,” in that they believe their success is the product of their talent and hard work rather than privilege and luck. Rather than appreciating their good fortune, they may believe they are entitled to it. That misconception may also fuel the populist backlash to elites that has increased the division within our country.

    I don’t buy Brooks’s definition of meritocracy or his contention that the Ivy League “broke” America, but his article nevertheless merits reading and discussion.

    Jim Jump recently retired after 33 years as the academic dean and director of college counseling at St. Christopher’s School in Richmond, Va. He previously served as an admissions officer, philosophy instructor and women’s basketball coach at the college level and is a past president of the National Association for College Admission Counseling. He is the 2024 recipient of NACAC’s John B. Muir Excellence in Media Award.


  • A troubling moment for public higher ed (opinion)

    A troubling moment for public higher ed (opinion)


    Earlier this month, my institution, Southern Methodist University, made headlines by hiring President Jay Hartzell away from the University of Texas at Austin, one of the country’s largest and most prestigious public universities. The move surprised many on both campuses and sent shock waves through higher education.

    While I can’t presume to know all the motivations behind President Hartzell’s decision and I don’t speak for SMU, as a faculty member who studies higher education, I believe this moment demands our attention. Many public universities are under serious threat, and private universities need to realize that their future is closely tied to the success of their public counterparts.

    For more than a decade, SMU has been my academic home. The campus boasts smart and curious students, dedicated faculty who care about teaching and research, and strong leadership from the administration and Board of Trustees. We’re in the middle of a successful capital campaign and enjoying both athletic success after our move to the Atlantic Coast Conference and a growing research profile.

    Yet, even as I anticipate the leadership that President Hartzell will bring to SMU, I can’t ignore the broader context that has made such a move more common and deeply troubling.

    Hartzell isn’t the only example of a major public university president leaving for the relative safety of private higher education. His predecessor at UT Austin, Greg Fenves, left for Emory University. Carol Folt resigned from the University of North Carolina at Chapel Hill before becoming president of the University of Southern California. Back in 2011, Biddy Martin famously left the University of Wisconsin at Madison for Amherst College in one of the early examples of this trend. So what is going on, and why are major public university presidencies less attractive than they once were?

    The Struggles of Public Universities

    Being a public university president in a red state is the toughest job in higher education today.

    Public universities in these politically charged environments are under siege. They face relentless ideological attacks from state legislators and are constantly forced to navigate resource challenges from years of underfunding.

    Politicians attacking public higher education are not simply questioning budgets or management—they are attempting to dismantle these institutions. Efforts to reduce tenure protections, anti-DEI legislation and restrictions on what can be taught are all part of a broader campaign to strip public universities of their autonomy.

    The goal of these attacks is clear: to reduce the influence and authority of public universities and their leaders and undermine the critical role they play in shaping a well-informed and educated workforce and citizenry.

    At the same time, some institutions are adopting policies of institutional neutrality, reducing the ability of presidents to speak out on these issues.

    The cumulative effect of these efforts is to make public universities and their leaders less effective in advocating for their missions, students and faculty.

    The Short-Term Advantages for Private Higher Ed

    In the short term, these challenges facing public universities have opened opportunities for private institutions. With public universities bogged down in political and financial crises, private universities can poach top faculty and administrators, offering them better resources and less political interference.

    I don’t fault private universities for capitalizing on these opportunities—they are acting in their own self-interest and in the interests of their own missions, students and faculty.

    But I fear that this approach is shortsighted and ultimately damaging to the broader higher education community. At a time when trust in higher education is declining, when the value of a college degree is being questioned and when the public is increasingly disillusioned with the academy, it is vital that we don’t allow attacks on public institutions to further erode public faith in all of higher education.

    Why Private Universities Must Stand Up for Public Higher Ed

    Private universities are uniquely positioned to advocate for the broader value of higher education and the critical role public institutions play.

    First, private universities can use their platforms to champion the ideals of higher education. With public universities under attack from state legislatures and special interest groups, private institutions can and should speak out against the politicization of higher education. Whether through research, advocacy or public statements, private universities can be powerful allies in the fight to protect the autonomy of public institutions.

    Second, private universities can advocate for increased public investments in higher education. They can use their influence to urge policymakers to restore funding for public universities and reject anti–higher education policies. At a time of declining public support, private universities can push for policies that ensure all students, regardless of background, have access to high-quality postsecondary education to develop the skills to succeed in today’s economy.

    Third, private universities can help bridge the divide between public and private higher education by forming partnerships with public two- and four-year institutions. These partnerships could include joint research initiatives, transfer and reciprocal enrollment programs, or shared resources to expand access and opportunity.

    The Time for Action Is Now

    In this critical moment for higher education, private universities need to demonstrate leadership—not just in their own interests, but in those of the entire industry. If we want to safeguard the unique contributions of both public and private higher education, we need to work together to ensure both sectors thrive.

    Now is the time for all those who believe in the transformational power of higher education to stand up and take action. The future of higher education depends on it.

    Michael S. Harris is a professor of higher education in the Simmons School of Education and Human Development at Southern Methodist University.


  • Why small talk is a skill worth developing (opinion)

    Why small talk is a skill worth developing (opinion)

    You walk into the conference networking event, feeling alone, aware of the steady chatter throughout the room. You look around for someone you might know, you feel your breathing quicken and you experience that all-too-familiar pit in your stomach. You walk deeper into the room, taking a few grounding breaths, and notice others standing alone. You approach another conference attendee, feeling as if you are stepping outside of your body, and in your friendliest tone you introduce yourself and ask, “Where did you travel in from?”

    You did it! You initiated small talk with a stranger.

    Small talk is a mode of communication that occurs throughout the world, but not every culture engages in small talk to the same degree. In some cultures, it is expected, and in other cultures it can be perceived as inappropriate or rude. In addition to cultural context, one’s perception of small talk and propensity for engaging in it can be influenced by factors including, but not limited to, personality traits, degree of social comfort, mental health and wellness, past experiences, and the setting of the conversation. Small talk can also present specific challenges to language learners, neurodivergent individuals, people who are unaccustomed to talking with strangers and many others.

    Merriam-Webster Dictionary defines small talk as “light or casual conversation: chitchat.” (Seeing the word “chitchat” immediately brought me back to kindergarten, when my teacher, Mrs. Barker, would frequently say, “Kay, stop your chitchat.”) Cambridge Dictionary defines small talk as “conversation about things that are not important, often between people who do not know each other well.” The emphasis on “not important” can give the impression that small talk is useless. However, within the U.S. cultural context, small talk holds great importance in connecting individuals and laying the foundation for more substantial communication. Think of small talk as the gateway to more meaningful conversations.

    When done well, small talk relies on improvisation and adaptability, allowing for a flow of information and often uncovering unexpected insights and mutual interests. I think of small talk as jazz, with each person riffing off the other to create a connection and to make meaning in the moment. Effectively engaging in small talk by establishing commonalities can open the door to a future collaboration, expand your professional network, build rapport that leads to a career or academic opportunity, enhance confidence and ease tension in an interview.

    Do you wish that small talk felt less awkward and more meaningful? Apply these strategies to reduce your small talk stress and to contribute to your career success:

    • Get curious. Harness your curiosity as you engage in small talk. Take the scenario we began with: Someone might ask, “Where did you travel in from?” because they genuinely enjoy meeting people from different parts of the country or world. Someone else might ask this question as a gateway to finding a future collaborator from a specific country or academic institution. Don’t just ask questions for the sake of chatting; rather, ask about topics in which you are genuinely interested. This approach will make engaging in small talk more enjoyable and valuable to you, and your interaction will feel authentic to the person with whom you are speaking.
    • Listen actively. As the other person responds to your question, try to refrain from planning what you will ask next and instead focus on absorbing what they are sharing. Consider reflecting back an aspect of something they mentioned. For example, if in response to “Where did you travel in from?” they say, “I flew in from Greece last night, and this is my first time in the States; I’m a Ph.D. student at the University of Crete,” you might empathize with their journey and ask how long they are visiting. After further discussion, you might feel inclined to offer to host the individual if they plan to travel around. Your one question, the one that initiated the small talk exchange, could even lead to a lifelong professional relationship.
    • Consider the context. The definition of small talk in the Cambridge Dictionary refers to a “conversation about things that are not important.” I would challenge you not to dismiss small talk as trivial but rather to leverage it for more meaningful conversation. When thinking about the setting in which you are engaging in small talk, you can guide the conversation toward greater meaning. It would be odd if the individual attending the networking event at the conference opened the conversation with their name and asked, “What do you think about the weather?” That question would seem disconnected from the event and the purpose of the networking session. However, if the individual were waiting outside at an uncovered bus stop, it might be natural to strike up a conversation about the weather. Awareness of the context and setting will lead to a more authentic conversation.
    • Have go-to questions. While you don’t want to arrive at every occasion with a script of possible questions, it can be a good exercise to reflect on the things about which you are genuinely curious. When attending a conference networking event, you may be interested in hearing about individuals’ career paths, learning about their research, gaining their advice, etc. In developing questions, focus on ones that are open-ended, where the response requires more than a yes or no. You might ask, “Which conference sessions are you most interested in attending?” Maybe that seems unimportant to you or even a bit superficial, but hearing about the other individual’s interest might inspire you to attend a session you would not have initially chosen. As the conversation unfolds, so will the opportunities to guide the conversation toward more meaningful topics, and you might next ask, “What research projects are you currently working on?”
    • Practice. It is likely that you have attended interview preparation and practice sessions but far less likely that you have attended a small talk training. This is not your fault. My plea to my fellow career development practitioners is this: If we know that many individuals approach small talk with feelings of discomfort or dread, and we also recognize that it is an important skill that leads to positive career outcomes, then we need to actively train and create opportunities for our students and postdocs to practice small talk in low-stakes settings. Consider building small talk into your interview preparation offerings, adding a small talk learning module to an upcoming campus networking event, collaborating with your campus’s English language learning program to incorporate small talk activities, and reinforcing the many places and spaces where your students and postdocs are already engaging in small talk. An example would be when a student comes in for an appointment and asks, “How was your weekend?” By asking, they might learn, for instance, that you were recently in Miami, a city at the top of their list of places to visit. In this exchange you could draw attention to how the student effectively engaged in small talk, reinforcing that it is a skill they already possess.
    • Know what topics not to lead with. In the U.S. cultural context, it is safe to say that you would not want to lead small talk with questions about politics, religion, finances, health or overly personal topics. Aspects of these topics might be categorized as sensitive or controversial and can create tension and lead to misunderstanding. Through engaging in small talk, you should be building a foundation of connection that can facilitate greater openness toward engaging in more meaningful topics. That said, maybe you are at the American Political Science Association’s annual meeting—in that context, it would be common for the small talk to include politics. The setting and context can serve to guide the topics and direction of the small talk.

    In academia, where depth and scope of knowledge are highly valued, small talk can easily be dismissed as a burden and overlooked as a necessary competency. But by applying a few small talk communication strategies, you will find that it can open doors and enhance career success. If you have yet to do so, embrace small talk as a skill worth developing, and get out there and chitchat. The effects on your professional life could be both profound and long-lasting.

    Kay Gruder is the associate director of graduate student and postdoc career programs and services at the University of Connecticut. She is a member of the Graduate Career Consortium, an organization providing an international voice for graduate-level career and professional development leaders.


  • College costs have grown, but so has the return (opinion)

    College costs have grown, but so has the return (opinion)


    What’s the biggest problem facing college students today? Cost is a big concern, of course, and for good reason. But many would point to something equally troubling—misperceptions about the value of college degrees. That’s no surprise when reasonable questions are being raised about whether graduates are job-ready—and whether too many jobs unnecessarily require diplomas.

    There has long been a paper ceiling that penalizes applicants who lack degrees. And more companies are now taking a closer look at so-called STARs—people Skilled Through Alternative Routes.

    The group Tear the Paper Ceiling says that 61 percent of Black workers, 55 percent of Hispanic workers, 66 percent of rural workers and 62 percent of veterans are considered STARs. They have learned valuable work skills through military service, certificate programs, on-the-job training and boot camps. But too often, they’ve been shut out unfairly.

    I applaud the work of this national group and its partners. The equity barriers to jobs are real. Only half of working-age people have a quality degree or other credential beyond high school, even as millions of jobs go unfilled in part because applicants lack the required background or credentials. It only makes sense to make sure we’re not leaving behind talented but uncredentialed neighbors.

    But a deeper look reveals that this isn’t only about expanding opportunity and filling today’s open jobs; it’s also about preparing for the jobs that an increasingly tech-driven, interconnected world will demand in coming years. Skills-based hiring is a good idea, but it won’t on its own come close to solving the nation’s human talent crisis. Increasing higher educational attainment by making sure many more people get better credentials—credentials of value—is the key.

    Foundation of Growth

    Higher education has always been about producing graduates who are ready to start careers, not just jobs. This matters because a person who is a good applicant for a position now could face challenges moving to better and higher-paying positions because they lack the foundation for career growth fostered in postsecondary programs.

    The American Association of Colleges and Universities has surveyed executives and hiring managers eight times since 2006. The most recent survey, from 2023, found that 80 percent of employers strongly or somewhat agree that college prepares people for success in the workforce. Getting a degree is certainly worth the time and money, respondents suggested, as the survey “found a strong correlation between the outcomes of a liberal education and the knowledge and skills employers view as essential for success in entry-level jobs and for advancement in their companies.”

    There will always be conflicting data points in times of change. For example, the push for skills-based hiring, including at the federal level, is opening doors to a broader array of good jobs that historically required a college degree. However, research by Harvard Business School and the Burning Glass Institute shows that college graduates still have an advantage when it comes to getting jobs with higher salaries and better benefits.

    It turns out that employers aren’t committing to skills-based hiring at the level that recent headlines might suggest. The Harvard–Burning Glass report tracked more than 11,000 roles whose job descriptions no longer required a bachelor’s degree. It found only a 3.5-percentage-point increase in the share of non-degree-holders hired into those roles—a decidedly underwhelming number, suggesting the buzz about skills-based hiring may be more hype than trend.

    The Lifelong Payoff

    This and other signs reinforce the enduring value of degrees: A recent report from Georgetown University’s Center on Education and the Workforce found that 72 percent of jobs in the United States will require post–high school education or training by the year 2031. The center also found:

    • People with bachelor’s degrees earn, on average, $1.2 million more over their lifetime than those with only a high school education.
    • Of the 18.5 million annual job openings we expect in the coming years, more than two-thirds will require at least some college education.
    • Earnings for people without degrees have been growing over the past decade, but so has pay for degree holders. Even as people without degrees earn more, they are still not catching up with those with diplomas.

    Durable Skills Matter

    Employers often say they’re looking for “durable” skills, such as critical thinking, communication and problem-solving.

    Someone looking to hire an entry-level software developer might consider a candidate with skills in Python or other programming languages developed through informal learning. Many gifted techies are self-taught or have developed their skills through coding boot camps or work at start-ups, for example.

    But a college graduate with similar skills might stand out because of their experience working in groups to complete projects, their communication and presentation skills, analytical thinking, and other traits fostered in college classes.

    The catch: Across the board, we need better definitions of what our credentials mean. What defines a credential of value, exactly, and how do we make sure that the people obtaining credentials can do the work of the future?

    Certainly, our fast-moving, tech-driven economy increasingly rewards nimble problem-solvers. According to the World Economic Forum’s 2023 Future of Jobs report, employers estimate that 44 percent of workers’ skills will be disrupted in the next five years.

    “Cognitive skills are reported to be growing in importance most quickly, reflecting the increasing importance of complex problem-solving in the workplace,” the report said. “Surveyed businesses report creative thinking to be growing in importance slightly more rapidly than analytical thinking.”

    There are many implications of this change. Embedded in the education pay premium is a fairness issue concerning who goes to college and how we support them. The Georgetown center has long reported on the value of a college degree and the persistent opportunity gaps for women and people of color.

    The Change-Ready Nation

    Whatever the impact of skills-based hiring on the nation’s labor shortage, we shouldn’t stop there. Addressing the long-standing inequities in higher education and the workforce means ensuring that these skills-based pathways include opportunities for all workers, especially when it comes to pursuing further education and training even after they enter the workforce.

    Skills-based hiring and the push for increasing attainment aren’t countervailing forces. They’re aimed at ensuring that the nation grows and applies the talent it needs to be prepared for the human work of the 21st century, and to achieve the civic and economic benefits that people with good-paying jobs bring to their communities.

    In the end, this is about more than the job readiness of our students. We’re talking about the change readiness of our entire nation in a rapidly evolving economy. It makes sense to revamp job requirements to meet workforce demands, but there’s no denying we’ll need the best-educated country we can build if we’re going to deliver opportunity and economic prosperity fairly for everyone.
