In Fahrenheit 451, Ray Bradbury imagined a world that was, back in the 1950s, considered “futuristic.” Unfortunately, nowadays there is nothing far-fetched or sci-fi about Bradbury’s bleak vision. For, just as in Fahrenheit 451, our purposes have become inverted. We may not have firefighters who are tasked with starting fires rather than extinguishing them (yet). But we do have the metaphorical equivalent: Specialists in many fields are now expected to do the opposite of what was originally intended, to destroy that which we were once expected to nurture.
Librarians, for instance, were once hired for the purpose of developing their collections. Now, many of them are being tasked with culling those collections and deciding which databases to eliminate. Educators once helped learners to expand their worldviews. Now, we are often required to truncate our course offerings (“students hate choices,” some have been told). Those at the educational helm once served as the guardians and champions of the liberal arts. Now, many of them seem fixated on destroying the very foundation of the institution that makes their positions possible. Meanwhile, books are disappearing. So far, the bonfires may still be more metaphorical than literal, but that doesn’t mean they’re not equally destructive.
And, just as in Fahrenheit 451, a huge swath of the population numbs itself to the growing dehumanization. In a world that has lost its way and reduced everything to what can be bought and sold, most of us are being treated as commodities rather than as human beings: manipulated, commodified, and dehumanized. More of us ought to be outraged, and perhaps, if so many weren’t numb, more of us would be. But too many people stick buds in their ears and stare in a daze at the giant screens on their walls. Like Montag’s sad wife Mildred, they numb themselves to the point where thawing out might prove too painful.
Lest you think I exaggerate: Take the article (if you can call it that) that appeared on Time.com this week: “Why Ph.D.’s shouldn’t teach college students,” by Marty Nemko, who is described as a “life coach.” (Au revoir to Time--I remember when you were halfway respectable.) Nemko begins by citing the usual alarmist memes: almost half of college freshmen don’t graduate within six years, some studies show students learn little in college, one-quarter of graduates were found to be “living at home” two years after finishing college, almost half said their lives “lacked direction,” and twenty percent made less than $30,000.
I’ve certainly noticed this dearth of well-compensated jobs and clear career paths, and I know people who have had to move home after college. But whose fault might all that be? The fault of college professors? Are we the ones who supported the systemic destruction of unions, the outsourcing of jobs, a stagnant minimum wage, and the erosion of pensions, medical care, and other benefits? Are we the ones who decided that “corporations are people,” that elections should be buyable, and that we should turn a blind eye to white-collar crime? If the economy is becoming more difficult to navigate, the blame for that rests on those who are in charge of the economy. (At a future date, I’ll address issues such as time to graduation and whether students are learning.)
Nemko goes on to state that “college hasn’t changed much in centuries”—a preposterous claim, as if education today were still delivered in Latin to Anglo-Saxon Protestant males only, and as if we had stuck with the trivium and quadrivium rather than adding any new fields of study. “There’s still a research-oriented Ph.D. sage on the stage lecturing on the liberal arts to a student body too often ill-prepared and uninterested in that,” says Nemko--as if the liberal arts were central to today’s university experience (don’t I wish!), as if student “ill preparation” were our fault, as if the point were to cater to student “interests,” and as if the “sage on the stage” model were used exclusively. (And for that matter, as if most undergraduates were being taught by full-time, “research-oriented” professors instead of by underpaid adjuncts.)
The longer Nemko argues, the more illogical his statements become, until he asserts that Ph.D.’s shouldn’t even teach because “the gap between [Ph.D.’s] and their students’ intellectual capabilities and interests is too great.” I’m slightly amused by his backhanded acknowledgment that we “snobbish” Ph.D.’s might actually know some stuff (so much stuff, apparently, that our intellectual prowess has rendered us incapable of communicating with our fellow human beings). I’m less amused by the fact that he’s calling today’s students stupid.
So who should be teaching students, according to Nemko? “Bachelor’s-level graduates who themselves had to work hard to get an A.” What an excellent idea: find recent graduates who struggled with the course material themselves and have them do the teaching, without ever having to grapple with the more complex material conveyed in graduate school. Show of hands time: How many of you would like to go under the knife of a surgeon who, as an undergraduate, was taught biology by someone with a B.A. who “had to work hard” to get it? (What’s that you say? No, thank you?) Nemko also suggests that prospective teachers “complete a pedagogy boot camp, a one-weekend to one-semester intensive.”
Oh, so that’s how long it should take to train teachers: one weekend! (To think of all those years I wasted...) Or maybe a semester, says Nemko (I suppose that's if you’re learning to teach something really hard, like logical argument). Then, as if Nemko hasn’t tied himself up in enough conceptual knots already, he claims that such training is required of teaching assistants "but not of professors”—a curious assertion, since teaching assistants are the ones who become professors.
So what is Nemko’s solution to the higher education dilemma? He suggests that most courses should be “taught online by interactive video.” Why? Because “the online format allows for . . . exciting simulations impossible to provide in a nation’s worth of live classes.”
Well, now we really are back in the inverted world of Fahrenheit 451—where firefighters start fires, and live, face-to-face interactions between human beings matter less than simulations of them. The virtual is somehow more valuable than the real--just as Mildred grows more attached to the imaginary television “family” that appears on her wall screens than to the actual husband standing in front of her. News flash for Mr. Nemko and all those who think like him: Education is not about “exciting simulations” but about real relationships between real people. Yes, even now.
***
Despite my visceral and, I admit, angry reaction to this screed, I’ll concede one point: Nemko is right to suggest that the U.S. emulate Europe by expanding its apprenticeship programs for skilled labor. Guess what? We used to have more apprenticeships in the U.S.—thanks to unions. (I know whereof I speak; I married a man who completed a rigorous, formalized apprenticeship with the carpenters’ local.) But guess what’s happened to unions?
Back before the greed of the 1980s created policies that have steadily eroded the benefits enjoyed by working people, we all had more choices. Colleges were expanding access, tuition was reasonably affordable, and for those otherwise inclined, apprenticeships were available. And we still need those apprenticeships. I’ll fight as vigorously for the dignity and fair treatment of those who engage in physical labor as I do for those who engage in the life of the mind. All of us are needed. For starters, those of us who are fortunate enough to work indoors need people to build, maintain, and clean the buildings that house us. Those who perform necessary physical work need those of us who are trained in other areas—law, medicine, pharmacy, law enforcement, education, literacy, the arts, and much more (and let’s not forget the old line, “No farmers, no food”--isn’t that where it all starts?). Bottom line: we all need each other. There is dignity in, as well as the need for, all types of work. No human being has the right to consider himself or herself more “valuable” than someone who works in a different capacity.
I also believe that the arts and humanities, and all the advantages that they confer on us, should be available to everybody regardless of how we make our living, rather than confined to educational institutions. There’s no reason a carpenter shouldn’t enjoy studying history, or a janitor shouldn’t write poetry, or an ironworker shouldn’t play the string bass. But it’s also vital that the arts and humanities continue to flourish within the university system—as the place where knowledge can be nourished, expanded, disseminated, and perpetuated.
If that is to happen, we need to change our priorities.
Too often, those of us who teach in the humanities are threatened with extinction because of our relatively small numbers. When we ask for help with recruiting students, we’re told there is no point; the decline of the humanities is said to be a “nationwide” problem, and we are told students don’t “want” to study “useless” subjects anymore (“use” being defined here as “directly leading to the making of money, along a predictable straight-line path”). It’s not surprising that during tough economic times, many people—especially the less well-off—prioritize job training over deeper fulfillment (insert the basic principles of Maslow’s Hierarchy here). It’s also not surprising that the humanities are marginalized in a culture like America’s, which tends to be anti-intellectual, materialistic, impatient, hyper-individualistic, and in other respects antithetical to all that the humanities stand for.
But America has always tended toward all of the above. Yet for several decades following World War II, America offered (arguably) some of the highest-quality liberal arts education in the world, all the while expanding access. What was different?
When I read institutional histories nowadays, it appears there was a time when many college administrators themselves believed in, promoted, and protected the liberal arts. Many of those at the helm had decided that some things are worthwhile even when a direct, immediate financial benefit is difficult to calculate. Institutions had also decided that those with expertise in an area should be the ones teaching it, and that curricular decisions should be made by those with expertise, not by those new to the field, like students.
Now, anyone who knows me knows that I’m almost fanatically student-centered. I’ll take a bullet for the younger generation, whom I see not as instigators of mischief but as victims of my own generation’s bad choices. However, as a teacher I’m also aware that student preferences about what they think they “want” should not always be the deciding factor when it comes to curricular planning or student advising. Can students know they want something when they may not yet know it exists? How can they know what will be “relevant” to their futures when nobody can see the future?
We old geezers can’t see the future either, of course. But with age, many of us have gained the advantage of retrospect. We can look back and recognize how things we didn’t think we “needed to know” at the time have turned out to be invaluable. (I often wince when I remember my own teenage self, so “sure” about where my life was headed and what I “wanted,” when it turns out I knew almost nothing.) Today I’m grateful for my teachers and parents and elders, older and wiser than me, who sometimes forced me to abandon what I thought I wanted in favor of what they knew I needed. As much as I respect my own students, I also understand that there are times when, as an educator (not to mention as a parent), I’m obligated to steer them away from what they think they “want.” University administrations should do the same and advocate, publicly and vigorously, for the values that have sustained the liberal arts for centuries, through countless changes (Nemko’s assertion to the contrary notwithstanding).
I don’t think I’m being foolishly nostalgic if I recall a time when our society seemed to have collectively decided there are some things worth keeping, some things that matter besides the "bottom line." We could decide that again. We could decide to treat people as human beings rather than as commodities. We could decide to promote human dignity, the quest for meaning, and the creation of a world in which all human lives have intrinsic value and all people have an opportunity to find a sense of purpose. Institutions of higher education could (and should) play an important role in fostering all of that.
We could decide that institutions of higher education will serve as the guardians of intangible things of value: our histories, literatures, religions, mythologies, artistic expressions, and ideas, across cultures and centuries. We could decide that some things matter more than squeezing every last dime of profit out of each other, and more than treating other human beings as commodities to be bought and sold or as “others” deserving of contempt. We could choose real relationships over “exciting simulations.” We could choose to value expertise and intelligence rather than expediency and popularity. We could choose to create institutions whose members are encouraged to nurture rather than destroy. We could choose to acknowledge, and work to change, the viciousness of the current economic system--the underlying reason there is “something rotten” in the state of higher education today.
We not only can choose all of this; we need to. In order to do so, we need to find educational leaders who are committed to doing all of the above. We need people in charge who are committed to creating rather than destroying, and to building institutions rather than tearing them down.