I did know. We shared a grim chuckle and moved away from small talk. We had plenty to discuss; at the moment, so many things seem confusing. Semester start-up, always chaotic, has been especially so this year, for multiple reasons--among them, a massive office move that now renders my nom de plume, "Underground Professor," metaphorical rather than literal. (I'm still going to keep the name, though.) And it's not just the move; it's been a year of intense upheaval, not just for me but, it seems, for nearly everyone I know. Everyone is tired.
Some of this, of course, stems from the life phase that my cohort and I find ourselves in. When you hit your fifties, you're well advised to begin stockpiling sympathy cards. When I prepare my taxes, I notice that most of my charitable donations are now in memory of somebody, and my subgenre of specialty is rapidly becoming the eulogy. At this age, much of that can't be helped. But some of our current instability is less inevitable than death and taxes. Much of it doesn't need to be this way.
Take the management buzz term of the moment: "disruptive innovation." (Read Jill Lepore's provocative New Yorker critique HERE.) If you've ever been ticked off by a technology "upgrade" that rendered obsolete something you found perfectly serviceable; if you're tired of constantly being retrained because "they" seem to change how things are done as soon as you've learned to do them; if your budget is constantly screwed up because items you'd already checked off your wish list have broken down sooner than they should have--you know all about disruptive innovation. And, as Lepore points out, it's spread from the tech world to everything else--education included--and those with the power to make changes ought to rein it in. Disruption may have its place, but it doesn't belong everywhere, all the time. In education particularly, more disruption is the last thing we need.
My teaching philosophy holds that learners of all ages benefit from a stable environment. The most effective schools, at all levels, offer faculty who stay for the long haul, who are professionally and emotionally invested in and committed to a particular place. Quality teaching, as David Kirp points out HERE, demands the building of relationships--and quality relationships require commitment, patience, and time. There are no shortcuts. Learning isn't something to be disrupted, and technology cannot replace human connection. The relationships we build--students with teacher, students with other students--enable a conversation about our course content, guided by me in my role as teacher, and it is through that dialogical process that information is transformed into knowledge, perhaps even wisdom. Technology can facilitate some of this, but if all the gadgets disappeared tomorrow, learning could still happen, because the foundations--relationships and content--would still exist. Keep the technology and remove one of those other factors? That's not education; it's just people playing with fancy toys.
I'm no Luddite (if I were, I couldn't have a blog), and I've found many technological innovations professionally useful. I can post links to articles on our course web page, keep my grade records electronically (eliminating my fear of accidentally leaving them at Starbucks), show relevant YouTube videos and TED Talks in class, and use group chat to plan activities with our student creative writing club--a motley crew whose assorted schedules make it nearly impossible to meet in person. All good stuff, and that's just for starters. Believe it or not, I've even taught online, and from time to time I've done it well (though doing it well poses challenges).
So it's not innovation that disturbs me. In fact, I like to think of myself as fairly cutting-edge on most fronts. But disruption? That's just one of the many problems inherent in applying a corporate business model to a field like education, where profit was never the point. The pushback against that model, along with soaring college costs and the consequent student debt, has led many to revisit the question of what higher education should be for. One of those at the forefront, William Deresiewicz, fans the firestorm in his new book, Excellent Sheep. (Read his provocative and controversial critique of Ivy League education HERE, and one of the kinder counter-arguments--Nathan Heller's "Poison Ivy"--HERE.)
Deresiewicz claims, "College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance." Heller points out the class bias inherent in assuming that someone has that luxury: "When an impoverished student at Stanford, the first in his family to go to college, opts for a six-figure salary in finance after graduation, a very different but equally compelling kind of 'moral imagination' may be at play." Yet even Heller, though he takes an ostensibly more populist view, can't seem to avoid elitist assumptions of his own: What if an impoverished first-generation student at a no-name state university wants to study English or history and is fine with five figures? Must we always focus on the biggest names, the most prestigious places? And must we assume that those with less economic privilege care only about earning more money?
Meanwhile, The Economist couldn't care less about Deresiewicz's "mind-expanding, soul-enriching" paradigm: "For most students the 'graduate premium' of better-paid jobs still repays the costs of getting a degree. But not all courses pay for themselves," complain the "numbers guys"--as if the pointlessness of meaning-seeking and reflection were already established, and as if we'd all concluded that university education and vocational training are really the same thing. If a course doesn't "pay for itself," The Economist suggests, it's not worth taking. And yet I can't totally bash the "numbers guys" either. You can't do the college thing, however you conceptualize it, without money. (And the college debt crisis is a subject unto itself.)
This I know from experience. I also know that Heller's point is valid; when I was eighteen, I didn't have the luxury of "standing outside the world" for a few years, or even one year, or even a week. (Maybe a day, if I stayed home with a bad enough cold.) To anyone who knew me as a child, I appeared to be on the college-bound trajectory: I earned high grades, I played piano and violin, my dad worked in the aerospace industry--well, there's your first glitch. The 1971 recession nearly killed that industry and nearly bankrupted our family. Add in my father's serious health challenges, in a country whose for-profit health care cares nothing for unemployed people with "pre-existing conditions" (one dire consequence of prioritizing numbers over human needs), and without scholarships I wasn't going anywhere. I should have been able to get some--but then my dad, uninsured, was seriously injured in a freak accident a week after I graduated from high school, and I was distraught. People in Stryker frames resemble pigs being roasted on a spit, turned every two hours. The screws in their heads create rivulets of dried blood that can't be washed away without risking further spinal cord injury. Stand outside the world? I wouldn't know what that's like.
When the time came, I chose a vocational path, studying at night while working at my full-time secretarial job. I married my boyfriend, who was serving his apprenticeship in the local carpenters' union: hard physical labor all day, apprenticeship classes at night. (Yes, construction workers go to school--at least the properly trained ones do.) We were young. We worked. We worked hard. And I don't regret it. The Ivy League world described by Deresiewicz was (and, though I am now an academic, remains) as foreign to my daily reality as a distant solar system. We needed to survive. That's why, as much as I love what I do, I don't knock students who choose clear career paths.
And yet much of what Deresiewicz says still resonates with me: "It is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become a unique individual--a soul. The job of college is to assist you to begin to do that." I know just what he means: College did help me make those connections. The experience was invaluable in ways that are not always easy to articulate.
Yet I find myself resisting some of what he says. College is certainly not the only way to become "unique," and I would hope he's not suggesting that less educated people don't have souls. (I don't think that's what he means, but if one headed down a slippery enough slope, one could draw that conclusion.)
What about those who don't go to college--either because they can't, or because they don't want to? What about those who financially need college for job training, who might have loved to "stand outside the world" but will never have that luxury? What about older people who didn't go to college at the usual time but want to do so now? (In all my web surfing of "crisis in the humanities" articles, I have yet to find one that considers the perspective of non-traditional students.) And if we recognize all that, how do we hold on to our idealism? How do those who want the kind of life-expanding educational experience that Deresiewicz promotes pay the ever-soaring price if they're not already well situated?
If answers were easy, we'd probably have them already. But maybe we can start by trying to unravel that nagging, confusing issue of class. Deresiewicz points out the elitism of the Ivy League and the sometimes single-minded achievement orientation among those who aspire to it--yet the preferable alternatives he suggests are places like Reed and Kenyon, still out of most people's reach. And The Economist, for all its touting of MOOCs and technology as the answer to everything, seems to assume that we've already abandoned as economically unfeasible any notion of a more personalized, meaningful education. If you want a job, apparently, you must abandon the search for meaning. Many who argue for the humanities from an idealized point of view assume that anyone seeking practical skills is somehow lacking in "moral imagination," while many who argue from a job-training perspective neglect the "soul" altogether, as if--as Adam Gopnik put it--our sole purpose in life is to "produce goods and services as efficiently as we can, sell them to each other as cheaply as possible, and die."
Amidst all this chaos, perhaps what we need to question are the binaries as well as the hierarchies--the whole notion of "class," with its implicit suggestion that some lives are more valuable and/or meaningful than others. There are more things in heaven and earth than the "excellent sheep" Deresiewicz criticizes or the cash-strapped "social climbers" for whom Heller claims to advocate. There are more places to disseminate and foster all that we find valuable in the humanities than the usual elite academic enclaves. And there are more purposes to formal higher education than future employability alone. Yet most of us--however idealistic we may be--also need to make economic ends meet.
It is high time for innovation in higher education. The goal of innovation, however, should not be to disrupt. The language and strategies of the corporate world should not be applied where they do not belong. What we need to do instead is build a new kind of institution--one that is accessible to all, and that is structured to meet both our material and our non-material needs.