Should Higher Education Be More Challenging?


About twenty years ago, when I was a graduate student in English literature, I taught a class in a special observation room at my university’s teaching center. My students and I sat around a long oval table while cameras recorded our interactions. I don’t remember which novel we were discussing, but I vividly recall what I learned afterward, watching the tape with a teaching consultant. She pointed out that, when I asked students for their thoughts, I tended to look to my right, missing the raised hands on my left. I didn’t let silences stretch long enough, jumping in just as a student was working up the courage to speak. On the plus side, she noted, I was using a technique she liked, one I’d picked up from a former professor of mine: it was like cold-calling, except that, after springing a hard question on a student, you told them that you’d circle back to them in a little while, giving them time to gather their thoughts. She called this “warm-calling.”

Teaching was my favorite part of graduate school, and I signed up for as much training as I could. When I was teaching, or otherwise putting students first, my part in the enterprise of higher education made sense to me: I had spent years studying literature so that I could pass it on to students who wanted to better themselves. Outside the classroom, though, the mission grew blurrier. I knew that my professional success depended on scholarly research; my abilities as a teacher counted for almost nothing. In fact, I’d been warned that teaching was a distraction from the “essential task” of writing articles for my peers.

Sometimes it seemed that academic work was a distraction for my students, too. They were earnest and hardworking, but often so consumed by extracurriculars (volunteering, music, sports, startups) that they struggled to find time for schoolwork. I’d worked on a startup as an undergraduate myself, and knew the logic behind extracurricular overload: grade inflation, which allowed mediocre students to coast, also made it harder for excellent students to distinguish themselves academically. Every incentive, for teachers and students alike, pointed toward doing less in the classroom and more outside it.

These contradictions weren’t surprising; they reflected the complicated nature of the modern university, in which undergraduate teaching is just one of many competing priorities. The implicit theory was that students would learn what they could from the university’s best researchers, some of whom happened to be good teachers, while others didn’t. Some classes would be rigorous, others laughably easy; grades would be uniformly high; and, in any case, there would be plenty to do outside of class. Simply being around so many brilliant minds would be good for you. What wasn’t learned through teaching would be picked up by osmosis.

Was this theory persuasive? Two decades ago, it seemed to be; today, the pieces may no longer fit together. Student debt has become a generational burden, with millions of people taking out federal loans to pay for their degrees. Meanwhile, college appears to have become dramatically easier, in ways that suggest an erosion of its core functions. In “The Amateur Hour: A History of College Teaching in America,” the education scholar Jonathan Zimmerman notes that, in 2011, around forty-three per cent of all college grades awarded were A’s; in 1988, the figure was thirty-one per cent, and, in 1960, it was fifteen. (In The Atlantic, Rose Horowitch reports that, in 2024, the average G.P.A. of Harvard’s graduating class was 3.8.) Over roughly the same period, “the average time spent studying by college students decreased by almost 50 percent, from 25 to 13 hours per week.” Zimmerman cites a survey in which, during a single semester, half the respondents, drawn from a range of institutions, hadn’t taken a single course requiring more than twenty total pages of writing.
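Taken together, the quoted figures describe a steep slide. A minimal back-of-the-envelope sketch, in Python, using only the numbers cited above (none independently sourced), makes the magnitudes concrete:

```python
# A quick check on the figures quoted from Zimmerman above.
# Every number here comes from the text; none is independently sourced.

a_share = {1960: 0.15, 1988: 0.31, 2011: 0.43}  # share of all grades that were A's
hours_then, hours_now = 25, 13                   # weekly study hours

# A's nearly tripled as a share of all college grades over five decades.
print(f"A's: {a_share[2011] / a_share[1960]:.1f}x their 1960 share")  # 2.9x

# Study time fell by 48 per cent, matching the quoted "almost 50 percent."
print(f"Study time: down {1 - hours_now / hours_then:.0%}")           # down 48%
```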

It’s still true that college graduates tend to earn more. But the latest data suggest that people with four-year degrees are having trouble finding jobs. Meanwhile, artificial intelligence may soon transform work in many fields; many popular college majors, such as marketing, may turn out to be worth less than they used to be. When students use A.I., it also risks turning the classroom into a charade, in which learning is performed rather than pursued. Students can have chatbots do their assignments, teachers can give inflated grades for that work, and everyone can feel good while learning almost nothing. “There’s a mutually agreed upon mediocrity between the students and the teachers and administrative faculty,” the folk singer Jesse Welles explains, in his song “College.” “You pretend to try, they’ll pretend you earned the grade.” If you want to be a doctor or an engineer, Welles sings, college might be worth it; otherwise, you might “skip the Adderall prescription” and opt for “a YouTube subscription.”

Since the mid-twentieth century, the number of Americans attending college has grown enormously; in absolute terms, roughly fivefold. This growth has felt inevitable, propelled by the rise of knowledge work and the opening of higher education to previously excluded groups. And yet, over the past decade, enrollment has begun to fall, and the decline is expected to continue. Demographics are one likely cause: a drop in the birth rate, which began around 2007, is expected to shrink the pool of high-school seniors. But it also seems plausible that more people are concluding that college has changed, and is no longer worth the cost. Universities strive to seem eternal, but higher education is an industry like any other, with booms and busts. If college is a bubble, could it be about to burst?

“Academics were traditionally the focal point of college, accompanied by various secondary attractions,” Zimmerman told me in a recent conversation. “Currently, the secondary attractions might hold greater prominence.” Even well-meaning, well-funded universities have struggled to halt this trend, and Zimmerman locates the roots of the problem in the history of American college teaching. He begins with Mark Hopkins, a professor of philosophy who served as president of Williams College from 1836 to 1872. If Socrates invented the seminar, Hopkins was his American ambassador: at a time when education often proceeded through long lectures and rote memorization, he led students in conversations about the meaning of life. “The ideal college is Mark Hopkins on one end of a log and a student on the other,” James A. Garfield, a former student of his, later said. The idea became a lodestar for educators, Zimmerman writes, who came to see college teaching as a “charismatic” endeavor, dependent largely on the personal energy of individual professors.

There are good arguments for this view. A great teacher can change your life; a syllabus written by an administrative committee probably won’t. As K-12 educators know, the bureaucratic control of curricula is fraught with procedural and political hazards. Zimmerman shows that colleges have steered around this terrain by holding fast to Garfield’s vision. Bit by bit, they grew larger and more institutionally complex, with org charts full of provosts. And yet, as “an increasing proportion of American higher education fell under bureaucratic control, instruction generally remained exempt.” Today, administrators micromanage nearly every aspect of college life, but the design and delivery of coursework remains, for the most part, a private matter for individual professors to decide for themselves.

There have been many attempts to reform teaching in American higher education, but they have mostly pushed against bureaucracy, Zimmerman notes, with reformers seeking to make instruction “more personal” through decentralization and individuality. In the years after the First World War, for instance, leading universities introduced discussion sections, or “preceptorials,” so that students could spend more time one on one with teachers who might inspire them. In the nineteen-seventies, a progressive movement in higher education took the idea further, urging professors to favor free-flowing discussion over top-down instruction. Many students loved the conversational approach. Others noticed that it was possible to have lively classroom discussions without learning very much. Zimmerman quotes one student complaining about “groovy” professors; another points out that he can have emotionally charged conversations outside of class, in his dorm.

Some smaller initiatives have tried to exert more direct control over what happens in college classrooms. “Centers for teaching and learning,” like the one I visited, were established to train professors in best practices. (One obstacle, Zimmerman notes, is the condescension that other professors often direct at education specialists, whom they regard as “the least scholarly members of the academic guild.”) In the nineteen-eighties, “portfolio assessment” policies required professors to submit their syllabi to committees of their peers; from time to time, state legislatures and other bodies have asked universities to justify themselves through charters and metrics (and, occasionally, forced them to comply with political standards). These efforts, and others like them, have variously succeeded, failed, or backfired. But none has changed the basic fact that college teaching is, in many ways, “an amateur enterprise” for professors, who practice it however they see fit while being judged mainly on their research.

Given all of this, it’s remarkable that college teaching is as good as it is. That’s a testament to the passion and care that many professors bring to their work as educators (and, likewise, to the genuine curiosity and ambition of their students). To a meaningful degree, Zimmerman argues, professors resist outside interference precisely because they take teaching so seriously. “By the 1980s, across every kind of four-year institution, the amount of time faculty spent teaching was inversely related to their salaries,” he finds. (This holds, he maintains, even at many smaller colleges that aim to put students first.) And yet, across academia, “faculty members at every level threw themselves into teaching despite—or even because of—its lack of material reward.” He quotes Michael Sherry, a historian at Northwestern University: “What devalues teaching in professional terms might also be just what makes it valuable to us as individuals,” Sherry wrote. “It is ours, not the profession’s.” The purity of the classroom experience is one of the main reasons people become professors in the first place. Hopkins wanted to sit on that log, too.

As an undergraduate and a graduate student, it never occurred to me to question the near-total pedagogical autonomy that my teachers enjoyed. I simply steered around the ineffective ones and sought out those who used their freedom well. Learning from people who embodied intellectual independence was transformative. On the whole, Zimmerman argues, this autonomy has been a force for good. It has allowed professors to innovate, to connect personally with students, to communicate nuanced views, to fold their research into their teaching, and to channel and share their intellectual energy.

But it has also created problems, some of which now seem to be converging into a crisis. One of the most visible is a growing and dysfunctional reliance on students’ evaluations of their professors. In the nineteen-sixties, Zimmerman writes, evaluations appeared mostly in student-run publications; by the eighties and nineties, deans who had tacitly agreed not to judge scholars as teachers had widely embraced them as “official administrative mechanisms.” Today, students are often the only people who observe professors in the act of teaching, and their evaluations carry real weight, both in promotion decisions and in an institution’s assessment of its own effectiveness. And yet it’s clear that they are deeply flawed measures of teaching ability. “Nobody would think of judging a faculty member’s research by polling students about it,” Zimmerman observes. The evidence shows that students don’t, in fact, learn more from the professors they rate highly. What it shows instead is that “highly rated professors are more likely to be male, white, good-looking, and easy.”

Putting students in charge of educational standards would probably be a bad idea under any circumstances. But it has proved especially damaging at a time when the average student is increasingly underprepared. The “definitive story” of recent decades, Zimmerman told me, has been a decline in students’ attention spans, driven, it seems, by smartphones, social media, and other information technologies. In this environment, student evaluations have created a perverse incentive for classes to get easier and for grades to inflate. If evaluations and grades stay high, and if schools have no independent way of knowing whether learning has taken place, then who can say that anything has changed? With so little centralized control over instruction, a college has few quick or forceful ways to intervene.

The same dynamic can obscure the changes brought by artificial intelligence. And efforts to combat widespread chatbot-enabled cheating may create problems of their own. Recently, in the Times, the writer, professor, and technology analyst Clay Shirky described some of the ways in which professors are responding to A.I. The basic strategy, Shirky argues, is a shift away from writing and “a return to an older, more relational model of higher education.” That might mean replacing long essays with shorter, in-class assignments, completed by hand, perhaps in exam booklets; students might also be assessed in mandatory office hours and through oral exams that force them “to demonstrate knowledge in real time.” Shirky maintains that “our current practices around student writing are not part of some ancient tradition”; he notes that “at times, writing was discouraged”: at the University of Paris, in 1355, for instance. Freshman composition courses as we know them didn’t become widespread until after the Second World War.

In an era of declining literacy, does it make sense for colleges to respond to A.I. by reviving oral traditions of education? That depends on how much you think literacy matters. As someone who went to graduate school in English, I’m inclined to believe that modern civilization was built on the written word, and to dread its decline. But Shirky, a scholar of new media, seems open to the idea that, since “the production of ordinary writing now requires much less skill,” colleges should be less attached to writing and reading. Perhaps literacy was a concern for an earlier era, and today’s universities ought to focus on other challenges. “Contrary to much popular opinion, college is not in the information transfer business; we are in the identity formation business,” he writes.

A lot depends, though, on how well “relational” education works. In graduate school, I took two nerve-racking oral exams, each time studying for a year, under an adviser’s supervision, before facing a panel of three professors. The arrangement was workable because my cohort was small, about a dozen students. Last summer, three Canadian professors described how they had administered oral exams to more than six hundred students at once. In so-called ConVOEs (Concurrent Video-Based Oral Exams), students receive questions from a computer and then speak their answers into a camera. “We advised students that only the first 60 seconds of each of their responses would be graded, which encouraged concise and direct answers,” the professors wrote. Human beings graded the responses, using a rubric and software that let them watch the clips at double speed, “significantly increasing grading throughput.” So, yes, it’s an oral exam, but not exactly Mark Hopkins on a log.
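The appeal of such a system is mostly logistical. Here is a minimal sketch, in Python, of the grading arithmetic; the sixty-second window and the double-speed playback come from the professors’ description, while the class size and question count below are hypothetical.

```python
# Rough grading-time estimate for a ConVOE-style exam. The 60-second
# graded window and 2x playback are from the professors' description;
# the number of students and questions below are hypothetical.

def grading_hours(students: int, questions: int,
                  graded_seconds: int = 60, playback_speed: float = 2.0) -> float:
    """Total human viewing time, in hours, if graders watch only the
    graded window of each response at the given playback speed."""
    total_seconds = students * questions * graded_seconds / playback_speed
    return total_seconds / 3600

# Six hundred students answering five questions each: 25 hours of viewing
# at double speed, versus 50 at normal speed.
print(f"{grading_hours(600, 5):.0f} hours at 2x")
print(f"{grading_hours(600, 5, playback_speed=1.0):.0f} hours at 1x")
```

Split among a handful of graders, that is a manageable workload, which is presumably the point.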

Of course, college could remain a fun and absorbing experience even if students read less, write less, skip class more often, and are graded on elevator pitches delivered to computers. There could still be plenty of opportunities to learn, to grow, and to work on interesting and fulfilling projects. Less learning might happen, through diminished “information transfer.” But maybe that wouldn’t matter; after all, you wouldn’t know what you were missing.

If college keeps getting easier, it’s worth asking whether it will remain indispensable for so many people. What will employers do if college comes to be seen less as a program of rigorous intellectual training and more as a kind of personal-development retreat for young people? If I apply for a job as a fitness model, listing “member of the gym” from 2021 to 2025 on my résumé, modelling agencies will presumably want proof that I actually worked out; they’ll want to see my muscles. (Maybe I just strolled on the elliptical while watching “Love Island” and enjoying the company of fit people.) Many employers already put applicants through elaborate interviews and assessments, sometimes asking them to perform tasks or solve problems while recruiters watch. We might expect such practices to spread into fields where they don’t yet exist, and to intensify where they do. (At The New Yorker and many other publications, for instance, applicants for editorial jobs are asked to complete an “edit test.” For now, they can take the test at home; that might change.)

If employers grow more skeptical of college, and test graduates more often, a new equilibrium could emerge, with the job market revealing, or claiming to reveal, which college educations actually sharpen the mind, and which don’t. Does majoring in history really make you a better thinker, in ways that employers value? Historically, that proposition has been taken on faith. If the faith erodes, and decision-makers outside academia start intervening more explicitly, established fields will come under new pressures.

Colleges themselves may diversify. Some may carry on as they are, or experiment with new (or old) approaches to teaching, like the ones Shirky describes, that make room for A.I. Others may double down on tradition. It’s possible to imagine that, at certain schools, assertive administrators could take charge, pushing professors to unite around practices that protect established norms of reading and writing. At these schools, you might have to surrender your devices before entering the library, and prepare to write essays by taking notes on paper, or on dedicated, A.I.-free machines. You might spend your evenings doing homework in proctored computer labs where only printed books or approved websites are allowed. Outside the computer-science department, A.I. will be banned, or strictly policed. Today’s colleges tout “the great books”; these colleges could simply tout “books.” In the sci-fi epic “Dune,” a rebellion known as the Butlerian Jihad leads to the outlawing of “thinking machines.” Some young people will want to join that rebellion; perhaps degrees from the universities that cater to them will be seen as more valuable, or at least as more indicative of a certain kind of meaningful human effort.

In all sorts of fields, A.I. is pushing us to ask what, exactly, we do, and why, and whether we want to keep doing it ourselves when a computer can approximate it. Higher education is no exception. Even before artificial intelligence, the easy information and easy distraction of the internet were leading students to wonder whether the labor of learning was drudgery, and possibly pointless. In Zimmerman’s account, the loosening of college standards flowed, ultimately, from a centuries-old, lovingly tended tradition of organizational disarray; still, it may have looked like a collective professorial shrug, a concession that academic work wasn’t worth doing anyway. “We weren’t actually persuading the kids about the validity of what we do,” Zimmerman told me. Now colleges may have no choice but to persuade. They’ll also have to demonstrate their value, in real time. ♦

