Notes from the underground professor

kids these days! (are they really more narcissistic than we are?)

8/27/2014


 
Blue hair—or purple or hot pink. Multiple piercings, sometimes in weird places that you don’t necessarily want to know about. Tattoos, also in weird places that you don’t want to know about. Smoking electronic cigarettes. Liberal use of the “f” word, as noun/verb/adjective/adverb, as if it’s not even a bad word anymore (we used it too, of course, but at least we had the good sense to know we were being naughty). Texting, texting, texting. Interacting with machines instead of people. Abbreviating the daylights out of everything, ppl (totes)! And what is that crap they are listening to? When I was that age we listened to real music. Kids these days! (I mean, they don't even take the same drugs as our generation did!) Instant gratification. Sound bites. Endless posting of self on Instagram, Facebook, Twitter--all of which kills literacy and sustained concentration and detaches us from one another and makes us lesser human beings. Oh yeah, and let's not forget narcissistic. Jean Twenge proved that kids these days are more narcissistic in The Narcissism Epidemic, right? I mean, they take selfies! (and my computer program doesn't even think "selfies" is a word). 

Well, it's that time of year again--fall semester start-up. (For me, as for most anyone I know who works in education, the "new year" does not start in January--it starts now. Hence the slightly slower pace of my blog posts--I always find these semester start-up periods to be more insane than the actual semesters, at least until finals time.)  This afternoon I'll greet incoming freshmen at our dean's welcome, and next Tuesday I'll face a new group of "deer in the headlights," only a few weeks removed from high school. I'll try to help give them tools to navigate an increasingly crazy world (or so it seems to this old geezer, anyway). It's a funny thing about teaching traditional-age freshmen (though I do teach many nontrads as well): Each year I find myself a year older, but most in the incoming class are still eighteen, another year removed from my own reality. Given all the factors listed above, how in God’s name is a self-respecting AARP member supposed to teach this generation? 

I begin by silencing myself. Shutting myself up. Dropping all my preconceptions about the supposed degradation of the current culture we live in, who my students are, what they are capable of achieving. I know, I'm supposed to teach to a predetermined set of "student learning outcomes," but who really knows what any of us "should" be learning to survive in our current crazy world anyway?

What if I begin by listening? What if I ask them to write about the world they are living in, the world they have grown up (and are still growing up) in? What if I recognize that today's young people have never held power--so if we find them somehow lacking, it's ourselves we should be pointing the finger at? What if we remembered our own youths a little more accurately, rather than through the psychedelic glasses of nostalgia (with a Beatles soundtrack playing in the background)? After all, for every "The Long and Winding Road" that came along, there was a song like "Seasons in the Sun" (or worse, "Billy, Don't Be a Hero"). We might not have torn our jeans or dyed our hair purple, but we did feathering and curly perms, and we wore flared gauchos with midriff tops fashioned from bandanas. (Nowadays when I look at pictures of my teenage self, I have a bit more sympathy for my poor late dad and his periodic, "Young lady, do you really want to leave the house looking like that?")

Point: The world we know is gradually disappearing, piece by recognizable piece. (I know I keep harping on celebrity deaths, but they're a stark reminder of that.)  The older we get, the less we recognize our surroundings. We lose our bearings. This has happened to every generation, and it will also happen--someday--to this dyed, pierced, tattooed, vaping, effing, texting, abbreviating, selfie-taking generation we call the Millennials. Of course we probably won't be around by the time they're able to tell us, "I finally understand"--just as too many of our own parents and grandparents are not around for us to tell the same thing. But it will happen, whether or not we're here to see it.

Change is the way of things. What if, rather than bemoaning the inevitable, we instead decided to learn about our current world from the perspective of our students? In Pedagogy of the Oppressed, Paulo Freire posited a teacher who "is no longer merely the-one-who-teaches, but one who is himself [and herself] taught in dialogue with the students, who in turn while being taught teach.” A lot of educators I know like to quote passages like this. We call it dialogical education. Touting ourselves as "dialogical educators" makes us sound cool at conferences and in our reappointment and promotion dossiers. But do we mean it? 

One of my assignments calls for students to write an ethnographic fieldwork essay about a subculture--either one with which they identify, or one that they wish to learn more about. Sometimes when I see their topic choices, I feel ill-prepared to assess them--I say things like "What is cosplay?", which makes the whole class laugh when I didn't even think I was being funny.  As I read the drafts, though, I am always struck by a common theme that emerges collectively: the search for, and the need for, a sense of community, human connection and meaning in a world engineered to isolate us from one another. I remember sitting around campfires singing "If I had a hammer" while people played guitars. I remember my parents and grandparents talking about the silver linings of great depressions and world wars--"at least everyone pulled together."  I remember squeezing my teenage self and all my girlfriends into photo booths, or asking random fellow tourists to snap pictures of me in front of various landmarks with my Kodak Instamatic, and I remember when Polaroids were a popular party accessory, and then I wonder why my generation is so freaked out about "selfies." You won't ever get me to say "totes" (except in quotes to indicate I'm not really saying it), but I read my high school yearbook where every other person told me to have an "outrageous" summer, and I'm frankly not sure which is more stupid.

And then--despite justifiable media outrage about cyber-bullying--I see young people gathering together on Facebook to prevent a suicide (who talks about this?). I see young people encouraging each other. I see young people who accept differences and dream of creating a better world. Sometimes, in them, I even see a version of my own younger self--when I look past the blue hair and the nose ring and stop to listen.


"JOHN KEATING" VERSUS THE ZOMBIE APOCALYPSE

8/21/2014


 
So far this blog doesn't show many comments, but my "Stats" feature shows a lot of hits--people are reading. This makes me happy. And I know, from corresponding privately with some readers, that this audience is an interesting mix of people who work inside academia and people who do not. This makes me happy too.

But it also makes me feel slightly uncomfortable, because the traditional advice to writers is "Know your audience," and deliberately addressing a mixed audience is something that serious writers are advised to avoid. I am not supposed to have it both ways here; unlike the fictional narrator of Salman Rushdie's Midnight's Children who declares, "I refuse to choose," I am supposed to make up my mind. Who am I talking to?

To some extent, I understand why it's important to target a specific audience. Here, for instance, readers outside academia might grow bored with an analytical piece like the one below that parses my reaction to Adam Gopnik's defense of the humanities, while those who "profess English" for a living might cringe at my not-so-scholarly response to Dead Poets Society. For years I've pondered writing about the mirror-relationship between the micro-community I was raised in (fundamentalist religion) and the one I joined (academia)--the fact that my once-home community resists "intellectualism" while my professional community resists "sentimentality." A couple of years ago when I told a friend I might write about this, he told me, "Wow--you're in the unique position of being able to offend everybody you know, and all at the same time."

The point isn't to offend anyone, but academic readers who expect this blog to be about the plight of the humanities in the bureaucratic age may sometimes wonder exactly what I'm trying to do here. (To paraphrase Spamalot's Lady of the Lake, it may seem I've "really lost the plot.") What's the relationship between celebrity deaths and what linguist Noam Chomsky calls the "Walmart model of education"?  (As an aside: In some ways Walmart does work as an analogy for the current direction of higher education: as Chomsky points out, a small, well-paid management class holds power over a cadre of lower-paid workers who are treated as interchangeable and dispensable. But there is one flaw in the analogy: at least Walmart does keep its prices low, however ethically problematic its tactics might be. With higher education embracing a similar model, wouldn't one expect the cost of higher ed to go down, rather than reaching unprecedented highs?)   

Pondering this question led me to think, once again, about Robin Williams--this time, the persona that he often played in his starring roles. The "Woody Allen persona" is privileged yet neurotic, unlucky in love, and prone to witty but whiny existentialist observations; the "Harrison Ford persona" is arrogant and emotionally distant, yet ultimately charming and loveable despite himself. The "Robin Williams persona" is typically a creative, high-energy maverick who is forced to operate in a highly traditional and repressive environment--a New England prep school, a hospital, a psychiatric office, his estranged wife's house, and so forth.

Not every film follows an identical plot trajectory, but in many of them, Williams' protagonist energizes those who have been stymied by rigidity; everyone falls in love with him, except for the tradition-bound bureaucratic individual who views him as a threat to the status quo. When one of his "followers" (or, as in Mrs. Doubtfire, Williams' character himself) makes a serious mistake resulting in some kind of disaster, the traditionalists blame the Williams character, who--depending on which effect the screenwriter is going for--either triumphs or departs in shame, his spirits nevertheless buoyed by the praise of those followers who truly understood what he was trying to do. (Actually, this is also the plot trajectory of just about every "creative teacher fighting against the system" movie out there.)  Cliche alert!

But I prefer to explore concepts like "cliche" (as with "sentimental") in depth, rather than using negative labels as a way to shut down discussion. I'm not one of those who believes that anything "popular" must therefore be artistically inane. If a cliche resonates with broad audiences, I'd argue that there are reasons, and we'd do well to consider what those reasons might be rather than assume something is drivel just because a lot of people like it. I'd suggest that audiences respond to the Williams persona--and to the "creative maverick fighting the repressive system" plot, hackneyed though it may be--because these are not just fictional gimmicks. They dramatize a question many of us live on a daily basis: Will we act as humane people, serving fellow human beings through systems designed to meet our various needs? Or will we serve the system itself, sacrificing the humanity of others--and, ultimately, our own humanity as well--to the machine? 

How we experience this issue depends on where we are positioned in society; power is unevenly distributed, and some people have more opportunity than others to make decisions that affect the lives of others. All of us, however--albeit to widely varying degrees--know how it feels to be treated as something less than fully human. (Try calling any company and finding an actual person on the end of the line; watch your computer as it decides it's time to "install updates" without your consent. Now magnify this feeling of "not mattering" several zillion times if you are young, old, poor, disabled, under-employed, a member of a minority group, and so forth--and apply it to more earth-shaking situations, like whether you have clean water, health care, or enough to eat.) 

As power and wealth grow increasingly concentrated in the hands of a few, and as such dehumanizing beliefs as racism and sexism appear to grow more socially acceptable, the system dehumanizes more and more people. Consequently, as Naomi Alderman points out in a 2011 Granta article, it's no coincidence that the monster du jour is the zombie. To quote Alderman: "Zombies are the horrifying crowd of the urban poor, the grasping hands reaching out for something which, if you gave it to them, would destroy you. They’re the interchangeable anonymous people we encounter on our daily commute, those whose humanity we cannot acknowledge."

Joseph Campbell posited in The Power of Myth that Darth Vader is evil precisely because he is "a bureaucrat, living not in terms of himself but in terms of an imposed system." Campbell posits this as the dilemma of contemporary humankind: "Is the system going to flatten you out and deny you your humanity, or are you going to be able to make use of the system to the attainment of human purposes?" The Robin Williams movie persona is Darth's inverse; his characters typically attempt to humanize a system that is "flattening people out" (much like the societies imagined in dystopian novels).  The humanizing effects of interpersonal connection, dignity, creativity, humor, and other qualities found in the "Williams persona" are the antidote to the zombie apocalypse.

De-humanization takes many forms. It happens when human beings are treated as "human resources" (we used to be called "personnel," with the root word "person," but nowadays we're just "resources"). It happens when we're conceptualized merely as "consumers", or as "producers," or as "budgetary line items." It happens when we write off the humanity of anyone we perceive to be different from "us" (for whatever reason) and therefore "inferior." Dehumanization lies at the heart of just about every social ill, both now and historically. When one society goes to war with another, the propaganda typically constructs the "enemy" as an animal or a monster--something other than a human being. Slavery was justified for centuries on the basis that those of other races were "partial" human beings who "don't feel pain like 'we' do," or who "don't love their children the same way 'we' do." Genocide is justified on the same principle, that "we" matter and "they" do not. (Such attitudes have hardly disappeared.) When multinational corporations justify exploitation, low pay and appalling working conditions, the machine is at play. The apocalypse is here; the apocalypse is now. As Alderman puts it, the zombies "won't just kill us, they'll turn us into one of them."

The question of whether we serve the system or vice-versa also lies at the heart of today's educational dilemmas. When the real challenges faced by K-12 students--poverty, illness, learning disabilities, chaotic home lives--are disregarded in a never-ending quest for higher test scores that neglects context and measures only one narrow aspect of learning, the system is winning. When "accountability" leads to assessment practices that stifle creativity and openness in the classroom, the system is winning. When a bottom-line-above-all-else mindset gives short shrift to the arts and humanities because they are "insufficiently profitable," the system is winning. When teachers at any level lack a living wage, appropriate benefits, job security, and the kind of autonomy and trust normally accorded to well-trained professionals, the system is winning. (The latter is, of course, true for those working in other sectors as well.) 

When the system wins, all of us--not just teachers and students, but the families and communities in which we all live--become zombies, walking around undead,  as the Darth Vaders of the world control the ship. (Yes, I know I'm mixing my metaphors here.) Those of us with stakes in the educational enterprise are not alone. The process of dehumanization--the zombie apocalypse--is something we all face if we do not begin to work, collectively, toward a more humanized approach to education as well as to all other realms of life.

That is why I'm consciously committing several "sins" in this blog:  addressing a mixed audience, boomeranging between scholarly and not-so-scholarly sources, mixing metaphors, coming at this "why save the humanities" question from multiple angles. For it's not just the corporate model of education that threatens our collective humanity; those threats are everywhere. We will do better to address them in solidarity rather than in isolation.

If we wish to make progress in wrestling education back from the bureaucratic Darth Vaders, then those of us who are educators--at all levels, from preschool through PhD--should be as comfortable discussing what we do with construction workers, plumbers, janitors, doctors, lawyers, hairdressers and accountants as we are with each other. We will only be able to stem the current march toward zombie education when those from all walks of life understand that all of us have a stake in creating an educational system that humanizes everyone, not only those who formally received a particular kind of education from a particular kind of place. If we care about these issues, we need to know how to speak to a broad audience about why they matter.  

Of course there is a time and place for more specialized academic discourse, and I still believe in it and participate in it. But when it comes to re-humanizing our current approach to education, educators can't do this alone. Everyone, regardless of occupation, needs to understand what's at stake. We don't have to be as outrageously iconoclastic as the Robin Williams characters that collectively defined his cinematic persona, nor will the plots of our own lives be as predictable or as simple as the Hollywood formula. But if we dismiss as "hackneyed" or "cliche" the response of audiences to those characters, we will fail to grasp the crucial point: the need for humanity in the face of a mechanistic system.

Now that I think about it, perhaps I'm following the time-worn advice after all: "Know your audience." It may not fit traditional definitions, but this audience is one I know: it includes both academic and non-academic readers, and I'm fine with that. So long as none of them have already turned into zombies. I want to believe it is not too late.


Sentimental journeys: A return to "Dead Poets Society"

8/16/2014


 
I wrote "Blight," the piece posted below, five years ago in response to the same-day deaths of Michael Jackson and Farrah Fawcett. Of course it's outdated; Robin Williams was still alive then, as was Casey Kasem, whose "American Top 40" weekly radio show characterizes my memory of later childhood as much as anything else: the sappy yet addictive long-distance dedications, that distinctive voice, the weekly mystery of the countdown order (especially the Top 10), and the final phrase that I both anticipated and dreaded: "Keep your feet on the ground, and keep reaching for the stars!" Anticipated, because when I was between ages ten and seventeen it sounded inspiring, in the vague and unformed way that appeals to those at a vague and unformed stage of life. Dreaded, because it meant the weekend was truly over; time for Monday morning, ugh.

Celebrity deaths, as I explored in "Blight," often affect us, but not because we're mindless celebrity stalkers who don't have lives. They affect us because we do have lives, through which we often embark on retrospective sentimental journeys when we learn of a celebrity's death. Discographies that appear as dry, meaningless lists on a Wikipedia entry might have been the soundtracks of our childhood; similarly dry filmographies might symbolize key turning points in our own thought processes. Celebrity deaths resuscitate memories of our earlier lives, our prior selves. 

Casey Kasem brings back my tweendom and teendom, while Robin Williams evokes the next stage, coming of age. Mork & Mindy hit the airwaves when I was in my late teens, followed by Williams' version of that '70s cultural artifact: the stand-up comedy LP. A group of us frequently went camping in those days, and in addition to the usual paraphernalia, we always brought along a boom box and our bootlegged cassette tape of Reality ... What a Concept. We listened to it often enough to have it memorized (still). Who else but Robin Williams could singlehandedly voice every one of the characters he invented for his Shakespearean version of Three Mile Island, "A Meltdowner's Night Mare"? ("The stream that cometh from the plant, what news? A three-headed fish? No big deal ...") Then came his movies, including one that played a role in changing my life: Dead Poets Society.

Okay; at this point you're either hooked, or you're thinking, "She's got to be kidding." An English professor trumpeting Dead Poets Society? How cliche. How self-serving. But back when I first saw the movie, within the first few days of its 1989 release, I wasn't yet an English teacher. In fact, I wasn't even--nor had I ever been--a college student. I had a full-time, decently paying job, it would be two years before I finally set foot in an undergraduate classroom, and I had no idea I was ever going to teach English. Absolutely none.

The original reviews were mostly good, and so was my reaction--mostly.  I enjoyed the film, but even back then, I didn't love absolutely everything about it. Certain aspects concerned me--too many guys, for one thing. Too magical, for another--can someone really be completely transformed in a few weeks, just by reading a few words? Is it really that easy? What's more, Neil's suicide felt unnecessarily melodramatic, and I wondered if the film might unwittingly romanticize acts of self-destruction (a thought that is now, in light of recent events, tragically ironic).  

And yet for all that, the positive elements outweighed the others. I still remember sitting in the theater--too close to the front because we'd arrived on the late side--while a little voice whispered, "You should be doing something other than what you're doing." I'd heard that voice while watching the fantastic 1983 British film Educating Rita, and I was starting to hear it more and more, not just at the movies.  "I hear voices."  Perhaps I should have been concerned. Instead, two years later I took a major risk (or at least it felt like one): I became a thirty-year-old college freshman. 

After exploring the required first-year courses, I decided my major would be English. My very first day in a college English class, I felt a new and unusual sensation: belonging. I knew this was the right place for me. That knowledge--still with me--provides the energy source that has fueled me for more than two decades, keeping me going through all of it--B.A., grad school, dissertation, jobs both part-time and full-time, workplace challenges. It gets me through every dispute, every setback, every near-quitting experience, both as a student and as a professor. It gets me through committee meetings, administrative duties, even assessment projects--not to mention the unanticipated requirement to work with spreadsheets. (Note to students: Never assume that something you're learning isn't going to be "necessary." You may be surprised.)

The actual work of "professing English" is more challenging than I ever imagined--there's far more to it than any outsider can see, as with any career.  (Scratch any ideas of the lazy, tweedy professor who kicks back and smokes a pipe while pondering the meaning of Moby Dick and works ten hours a week; that stereotype, if it was ever true, no longer holds.)  The "getting here" was also more challenging than I anticipated. (Good thing, too; I might not have done it had I foreseen all the potholes.)  Yet through it all, I've never lost that sense of finally belonging, finally finding the right place. Dead Poets Society wasn't the only factor, but it proved to be a timely catalyst.  The fictional John Keating had done his number on me, whether I liked it or not.  Carpe diem. 

Yuck, what a sentimental story. I know many scholarly-type people who will grasp what I'm talking about and agree. But I also know many who will roll their eyes and say that I don't even sound like a "real academic." How you feel about literature shouldn't matter; there's no such thing as magic; talking like this, some argue, is not going to help us make the case to our administrations or our constituencies for sustaining the teaching of literature or creative writing. 

For many in our profession, the worst insult you can hurl at a fellow academic is "sentimental." It's often a too-easy way of discounting things the rational mind can't account for (like hearing voices while watching Dead Poets Society). An easy way to shrug off whatever makes us uncomfortable. An easy way to deal with whatever our shared belief system doesn't cover by slapping on a label that warns us to avoid it. A way to define who is "one of us" and who is not--"they" do sentimentality, "we" don't.

Funny, isn't it; I grew up in a church in which the worst insult you could hurl at a fellow member was "intellectual." A too-easy way of discounting things the official church narrative couldn't account for.  An easy way to shrug off whatever made us uncomfortable. An easy way to deal with whatever our shared belief system didn't cover by slapping on a label that warned us to avoid it. A way to define who was "one of us" and who was not--"they" did intellectualism, "we" didn't.

Mirror images? Micro-communities, built around shared identities that necessitate splitting off a key element of what makes us the complex, often contradictory beings that we are. (Consistency is overrated.) I prefer not to split off but instead to try to live by the Martin Luther King quote posted in my office:  "Only through the bringing together of head and heart--intelligence and goodness--shall mankind rise to a fulfillment of his nature." (I'll edit that to say "humankind," and "our nature.") 

Nowadays, I still have one quibble with Dead Poets Society: I worry about its potential glamorization of suicide. I sometimes notice English-major types who are a little too caught up in the romantic fantasy of the fantastically talented yet tortured genius--the Virginia Woolfs and Sylvia Plaths and Ernest Hemingways and Vincent Van Goghs, committing the ultimate act of defiance against a world that was never meant for one as beautiful as them. Drugs, too much drinking--aren't they the price we creative geniuses must pay for being so profound, so inspired? I say no; suicide is never glamorous. It's devastating. The horrendously sad loss of the multi-talented Robin Williams this week reminds us of that.

So I still believe Dead Poets Society could have made its points without quite so much melodrama. But I'm not going to join the chorus of cynics and write off the film as merely "sentimental"--just as I did not heed the childhood warnings against all that is "intellectual" and avoid the supposedly demon-infested college campus (well, at least not forever). The Tin Man didn't give up his brain to receive his heart, and the Scarecrow didn't need to trade in his heart for his brain. The point of their journey was to gain all of it--the ability to think and the ability to feel, without running away in fear at the first point of discomfort. And depending on what group(s) you identify with, knowing how to "bring together head and heart" might sometimes make you feel like the resident weirdo, if you're in a micro-community where the motto (explicit or implicit) is, "We don't do that around here."  

"Real" literary scholars might scoff at Dead Poets Society, saying it's too simplistic. But I think there's a time and place for simplistic. For instance, I can't argue at all when John Keating tells his students, "No matter what anybody tells you, words and ideas can change the world." 

Yes, they can.

Rest in peace, Robin Williams, and thank you for bringing those words and ideas to life, for me and so many others.


WHY WE MOURN CELEBRITY DEATHS

8/11/2014


 
The day that Michael Jackson and Farrah Fawcett died, I penned an essay entitled "Blight," spinning off Gerard Manley Hopkins' poem "Spring and Fall."  (I went on to submit it to our city's biannual National League of American Pen Women competition, and it won.)  While discussing Robin Williams' death with many friends today--and encountering the inevitable People Who Don't Understand Why People Mourn Celebrity Deaths--I remembered this essay and thought it pertinent. I haven't updated it at all to reflect the latest events, but I post it here because I think it's still relevant. For those of you who are not familiar with Hopkins' poem, I have posted it at the end. (It's nineteenth century and therefore out of copyright.)
--------------------------------------------------------------------------------
BLIGHT - a meditation

Waves of grief always begin with frequency and intensity.  Both diminish over time, though smaller, more subtle waves recur, indefinitely.

When a celebrity dies it isn’t quite the same, at least not for those of us outside that celebrity’s inner circle (as someone not close to anyone famous, I can only imagine the surrealism of mixing private with public grief). Still, many who never knew the deceased react strongly, while others observe with detached puzzlement: Why are you upset about the death of someone you didn’t know?  Illogical; no sensible explanation for why hordes of people wave candles in the air outside the Dakota Apartments or Graceland or the Apollo Theatre, heap flowers outside Buckingham Palace, glue themselves to television footage, post blips on social networking sites mourning the passing of strangers. 

Amidst all the madness, there are always those who post something (or, back before we posted things, those who said something) to remind us that the deceased individual wasn’t perfect, in some cases was deeply flawed, perhaps so flawed that the current mass grieving is unwarranted; that the media coverage is excessive and ridiculous and should focus on something more significant, like impending nuclear warfare; that the celebrity’s family should be left alone. Someone will note the irony of paparazzi and press, considering that the death itself may have been at least partially triggered by the stress of fame, paparazzi, and our toxic cult of celebrity. We build them up to tear them down. In the course of achieving fortune and fame, many celebrities pay a price, often the ultimate price, sacrificial lambs to a society that pays lip service to equality yet demands that someone play the part of royalty.

I never disagree with anyone who points all this out.  Mourning the deaths of celebrities we’ve never met is illogical, and celebrities, like all humans, are rarely perfect; if anything, limitless wealth, power, and face recognition tend to foster an excessiveness that surges beyond the realm of eccentric and into the realm of downright scary. 

Whenever a celebrity dies, especially one with a bizarre life story, I become slightly eccentric myself, morphing into two (or more) people.  My inner intellectual, left-brained and well versed in the analysis of cultural discourse, grouses about the media, the shallow values of pop culture, the dark side of fame, the cloud of denial in which so many celebrities seem to be enshrouded.  Yet another self feels unable to pull away from the television coverage.  My inner intellectual may deplore the cheap exploitation of celebrity death, but she does so while flipping between cable news channels (no patience for commercials), or while scanning radio stations in search of one that’s playing all Michael Jackson all the time, or while downloading a copy of the Farrah Fawcett swimsuit poster as a favor to my husband (knowing he is nostalgic for his own decades-old fantasy life).  Isn’t this a sorry spectacle, I muse, knowing full well that I myself am part of that spectacle and that really, this is not the same as losing too many dear friends, not to mention my own mother and father and along with them, the innocence of pre-orphanhood.

Or maybe it is not so different.  It is not the celebrity I have lost—someone I never had to begin with.  What I have lost is one of my prior selves. That prior self is already long gone, but the celebrity death reminds me of what I had not wanted to admit.  The poet Gerard Manley Hopkins knew this way back in the nineteenth century when he gave voice to the sadness of little Margaret, distraught as she watches the leaves falling yet feels unable to put into words the “why” of her sorrow; and the adult beside her, who observes and understands. 

The self so easily sucked into the media hype also understands. For all the excess and clichés, the crassness of pop culture, the dark side of fame—all of which seemed to reach their apex in Michael Jackson—what I remembered on the day he died (Farrah Fawcett’s death echoing like a grace note in the background) was music that catapulted me on a backward journey, not through Michael’s life nor through Farrah’s but through my own.

As a child, I woke on Saturday mornings and rushed into the living room to watch the Jackson Five cartoon on ABC, fighting with my little brother while inhaling the aroma of Swedish pancakes and bacon that wafted from our kitchen. I remember lying in bed at night during early puberty, listening to a young Michael Jackson (only a couple years older than me) through the transistor radio I’d gotten for Christmas the year I was ten, crooning “You and I must make a pact, we must bring salvation back,” yearning for something that I couldn’t put into words, sensing that childhood would soon be past, that everything was about to change.  I remember buying my first curling iron at Fred Meyer and insisting that my mother purchase a hand-held blow dryer so that I could try, valiantly, to get my hair (parted in the middle, of course) to flip back like Farrah’s. 

Mom, on the other hand, dried her hair while working on crossword puzzles, sitting at the kitchen table wearing an air-filled bonnet connected by corrugated hose to a gigantic machine ensconced in its own suitcase. My mother’s generation of women never dreamed of flipping back their hair at the sides; they just needed to dry out the beehive. I remember turning nineteen and crossing the border three hours north to spend weekends in Vancouver, British Columbia, where I could legally drink and dance to “Off the Wall” and “Wanna Be Startin’ Somethin’,” even though back home—despite being already married—I was still underage.

In 1982 I finally turned twenty-one, and my husband and I began frequenting the bars and nightclubs of Seattle. That year the Thriller album was released—everywhere. Everybody we knew had the album, and it was an album, a vinyl disc that we dropped onto turntables and recorded onto cassette tapes. We thought we were on the cutting edge of technology when we figured out how to record bootleg cassette tapes off our own records, and we made multiple copies so we’d have backups when the first one jammed, since every so often ejecting a cassette would leave you entangled in thin strands of tape.

Thriller wasn’t the first time “everyone” had an album; the first album I bought because “everybody” had it was Elton John’s Goodbye, Yellow Brick Road, followed by Supertramp’s Breakfast in America, Billy Joel’s The Stranger, Pink Floyd’s The Wall, and Fleetwood Mac’s Rumours. (Older friends tell me it was the same way with Sergeant Pepper.) Still, with Thriller, something seemed different. Average albums contained a couple of big hits and a bunch of filler; phenomenal albums contained more than two hits and a series of excellent songs that didn’t make airplay but generated cult followings. With Thriller, nearly every song played on Casey Kasem’s Top Ten and at clubs and discos. It wasn’t just that every song was good—that had always been what distinguished excellent from average—but that every song was popular; that was new.

My husband and I were on the edge in those days, the first couple in our high school clique to get married, first to live in our own place, first to get cable TV and movies in our own living room sans commercials, which made ours the best party house. Then came MTV, and at first no one knew what to make of it. The Thriller video, with its unsurpassed choreography, made MTV seem like art, at least occasionally. A couple of years later we recorded that video on our first VCR, a contraption that cost eight hundred bucks and spanned the same area as our coffee table. Once again we made backup copies because the VHS tape would jam, with strips of film either flying through our living room or becoming fatally entwined in VCR heads. The march of technology has not always been in perfect step.

Still, I was amazed to realize that the wild predictions of my father, an avionics engineer and devotee of Popular Electronics and Isaac Asimov, had come true. During childhood my parents, devout Christians, had insisted that we attend church not only on Sunday mornings but on Sunday nights as well, and my brother and I always pouted about missing The Wonderful World of Disney.  “Someday,” Dad predicted, “we will all be able to buy machines that attach to our televisions, and you’ll be able to record TV shows while you are out and watch them at home later, on your own schedule.” 

“No way,” my brother and I would say, rolling our eyes.  After all, this was the same man who sought to solve Seattle’s increasing traffic problems by attempting to invent an affordable personal helicopter, an idea my mother airily dismissed: “That just moves the traffic problem into the air, and if the personal helicopters crash, what happens to people then?”  Still, Dad insisted on his flights of fancy:  Someday, he said, people will no longer talk on telephones but through computers; someday the postal service will become obsolete as people send each other computer messages instead of letters; someday we will be able to see each other’s faces as we do all this; someday everyone will have a telephone in their car; someday cassette tapes will disappear, as scientists learn how to store information on tiny magnetic chips.  Our friends (and Mom) used to say Dad was crazy.

Of course Dad was vindicated, though he didn’t live long enough to see it. Dad died of brain cancer in 1993, just months after we moved to another city, where I noticed more and more people carrying cell phones. It seemed silly then to think one should need—or want—a telephone available at all times (“What if you don’t want to be found all the time?” I often mused aloud). By then most everybody I knew owned a CD player and had phased out vinyl and cassettes, though we still recorded our TV shows on VHS—Betamax having already lived the same short lifespan as the eight-track tape.

Dad had lived long enough to see that particular prediction come true, but not most of the rest.  The year after he died, I got my first email account on Compuserve.  A somewhat early adopter, I’d had a PC since 1990, with a 500-megabyte hard drive and a full megabyte of RAM, outfitted with an external modem attached by nine-pin cables.  Back in the 1980s, I thought we’d reached the pinnacle of communication technology when I used a steamer-trunk sized modem to transmit data from a remote location, and I’d assumed things couldn’t get any smaller when floppy disks shrunk to three and a half inches, much easier to manage than the eighteen-inches-around, three-inches-thick Frisbees that my boss had rolled around the office in 1984.  People who knew Dad tell me that it’s a shame he didn’t live long enough to see the Internet.  An immigrant to America from India via England and Canada, he hailed from a scattered and shattered diasporic post-colonial family.  Keeping in touch had been difficult, and Dad left this earth just as it was about to become easier.

As my father lay dying, my world gradually contracted, becoming small, smaller, smallest, until all that mattered was contained within the tiny space of four walls.  Time slowed to an agonizingly languid pace; the night it finally happened, time froze.  Dad died at a hospice ward at six in the evening, and when we returned to my parents’ house, we discovered that the cuckoo clock I’d brought home from Germany as a gift for my parents had also stopped ticking at six. 

Soon Mom also stopped ticking, degenerating into dementia, then paralysis.  Mom followed Dad a couple of years later, breathing her last on a June afternoon in 1995.  Again time compressed; again my world contracted.  This time I was angry: My parents were in their early sixties, I was still in my early thirties.  This wasn’t supposed to happen yet. 

Five years later I would discover that the slowing of time and contraction of space accompany life at both ends of the spectrum.  As do waves.  The waves of labor move in an opposing trajectory to the waves of grief, beginning as tiny flutters, well-spaced, gradually increasing in intensity and frequency until they become nearly unbearable, just at the moment new life comes forth.  Eight years after the physical labor accompanying our son’s birth, we adopted our daughter and discovered that the waves of labor are not purely physical. Once again, waves surged as time slowed, space compressed, and it felt jarring when I finally returned to the wide open space of the world.  

Our own mortality, life’s fragility, the pain of creation and transmutation, time’s mutability—who wants to go through life aware of all that?  Nowadays, we don’t have to.  TV, movies, music, technology—all of this allows us to shut out memories of painful waves, distract ourselves, pretend we’re not really going there, pretend we’re not really all in this life together. 

But then, like the double edge of fame that haunts those who provide us with distractions, technology and media fold back in on themselves.  The entertainment we use to distract, to forget that time must pass, that everything must change, now becomes the very thing that most reminds us of all that.  Endless memories are evoked by one song, one picture.  Walter Cronkite dies and once again my parents are watching him deliver the news, while they play Scrabble in our oversized living room with its avocado-green wall-to-wall carpeting.  Someone says “Elvis” and I’m once again careening up and down the steep hills of Seattle on a hot summer afternoon in my father’s mammoth 1977 Buick, thanks to my two-week old first driver’s license, when the announcement comes over the radio.  Someone mentions John Lennon and I remember the breaking news that night he was shot, sitting in the living room of our newlywed rental house with its orange shag carpet.  (It was a cheap rental situated next to a dope dealer in a densely wooded area near Lake Washington; we decided to move after the dealer and two disgruntled customers exchanged gunfire in our front yard at two in the morning.)  In that same rental house I awoke at four a.m. to watch Charles and Diana’s royal wedding, doomed, though we didn’t know it at the time and, perhaps, neither did they.

No matter how bizarre the celebrity or the fan reaction, no matter how overblown and ridiculous the media coverage, how strong and even perhaps justified the backlash, Gerard Manley Hopkins knew all along, back in Victorian times with technology in its infancy, what we try to use the noise of the twenty-first century to forget: “It is the blight we are all born for.” 

It is always ourselves we mourn for.

Spring and Fall (Gerard Manley Hopkins)

Margaret, are you grieving
Over Goldengrove unleaving?
Leaves, like the things of man, you
With your fresh thoughts care for, can you?
Ah! as the heart grows older
It will come to such sights colder
By & by, nor spare a sigh
Though worlds of wanwood leafmeal lie;
And yet you wíll weep & know why.
Now no matter, child, the name:
Sorrow's springs are the same.
Nor mouth had, no nor mind, expressed
What heart heard of, ghost guessed:
It is the blight man was born for,
It is Margaret you mourn for.


"WHY TEACH ENGLISH?": A RESPONSE TO THE NEW YORKER

8/8/2014


 
Adam Gopnik’s article in The New Yorker, “Why Teach English?”, is now almost a year old, and thanks to a former student (who now teaches high school English), I finally discovered it in my newsfeed today. It’s beautifully written—Gopnik is a prose-meister—and I found myself nodding enthusiastically at many of his points. “No sane person proposes or has ever proposed an entirely utilitarian, production-oriented view of human purpose. We cannot merely produce goods and services as efficiently as we can, sell them to each other as cheaply as possible, and die.” I wish I’d written that last sentence!

Gopnik continues: “Some idea of symbolic purpose, of pleasure-seeking rather than rent seeking, of Doing Something Else, is essential to human existence.”  Perfectly expressed.  And I’m particularly fond of his conclusion, which ends on the same note of perfectly harmonized resolution as Beethoven’s Ninth Symphony: “The reason we need the humanities is because we’re human. That’s enough.”

And yet—when I finished reading the piece, I felt that Gopnik had pulled up somewhat short.

For one thing, Gopnik’s understanding of the present-day English discipline appears somewhat limited; his argument considers English departments solely as sites for literary criticism and interpretation, without considering some of the broader possibilities they offer today such as cultural analysis, instruction in/analysis of writing and rhetoric, and production of new literature. 

Beyond that, he’s excessively dismissive of one argument often mounted in favor of the humanities—what he calls the “alternative better-people defense.”  Popular belief to the contrary, says Gopnik, English majors (or humanities specialists in general) do not “seem to be particularly humane or thoughtful or open-minded people.”  As support, he states that “no one was better read than the English upper classes who, a hundred years ago, blundered into the catastrophe of the Great War.” Similarly, says Gopnik, “Victorian factory owners read Dickens, but it didn’t make Victorian factories nicer.”

Up to a point, I’d agree; as anyone who has been through a graduate program in English knows all too well, advanced study in English—whether we focus on literary criticism, creative writing, cultural studies, composition/rhetoric, or some mix—doesn’t necessarily produce nicer human beings. Nor does it always produce people who are pleasurable to work with. The academic politics of English departments are legendary, and goodness knows our profession has its share of narcissists, outsized egos, and petty grudge-holders.

So, however, do departments of political science, history, psychology, social work, business, law, economics, biology, astronomy, and just about everything else.  I think about many (not all) of my own students and colleagues, and many (not all) of my friends—some who majored in English formally, some who just love reading and/or writing, some of whom have four-year degrees and some of whom do not. In fact, many of these people are particularly humane, thoughtful, and open-minded. They’re just not necessarily in the upper echelons of academic English departments (though, to be fair, along with the narcissists and grudge-holders, I have met some wonderfully humane English professors as well).  All this makes it difficult for me to toss out the “alternative better-people” argument altogether.

Of course, engaging in all that turns the crank of English geeks doesn’t automatically make us “better” people, and there are also plenty of decent people from other walks of life. But for many people, having a regular practice of placing oneself in the subject-position of another, consciously attempting to view the world through different eyes, does have the effect of making them more empathetic, more socially aware, and less egotistical.  (See my prior post on dramatic irony.)  Of course it’s true that there are no guarantees; it’s also true that reading and/or writing are not the only paths to such ends. Yet they are viable paths, and proven, even if many who take those paths stumble off the trail. 

Yes, the “alternative better-people defense” has its limitations, as Gopnik notes. If you’ll pardon my crudeness, we have all met a few well-read, erudite assholes. But we’ve probably also met many well-read, erudite people who are not. And outside the humanities, I’ve met plenty more people who subscribe to attitudes that are utilitarian, or “bottom-line,” or “me-first,” or “my in-group alone.” I’ve met plenty of people who find it shockingly easy to diminish the humanity of those they consider different—and, therefore, less worthy—than themselves. In short, I’ve met many people who could stand to benefit from what studying the humanities and literature has to offer. There may be no guarantees that everyone who does so will grow more humane. But imagine a world lacking even this possibility.

I wonder, then, if the limitations that Gopnik points out are not so much the fault of the English major as the fault of academia at large—its rigid hierarchies, its cutthroat competition, its tendency toward arrogance.  Perhaps we should take into consideration a broader swath of people, not just those at the top of the intellectual pyramid who study or produce literature professionally in an institution that, I would argue, is “always-already” somewhat dysfunctional by its nature.

And beyond academia, perhaps an even larger problem lies in elitism more generally.  Gopnik points out that the erudition and literary prowess of the well-read English upper classes did nothing to halt the Great War; could the problem not be literacy per se so much as socioeconomic class, and the hierarchies, dehumanization and blindness it breeds?  A similar argument might apply to Gopnik’s point about Victorian factory owners: perhaps Dickens failed to persuade these tycoons not because of the limits of literature, but because they were factory owners, operating from a utilitarian paradigm of profitability at all costs—a paradigm so deeply ingrained that another way of thinking simply wasn’t imaginable to them.

On this point Gopnik seems to undermine his own claim when he adds, “What made them [Victorian factories] nicer was people who read Dickens and Mill and then petitioned Parliament.” Hey, wait—you mean reading and literature can make a difference after all?  Maybe not to people who have been raised all their lives to assume they are somehow innately superior to others, and maybe not to those conditioned to view others not as “human” but as “resources” and therefore to be exploited. But that isn’t everybody, and maybe we would do well to think more about why some of those who read Dickens and Mill might have been moved to take action that mattered. 

I agree wholeheartedly when Gopnik states that “English departments democratize the practice of reading,” and that this alone is “a simple but potent act.”  Gopnik cites the example of his own father, “the son of a Jewish immigrant butcher and grocer” who became an English professor, as an example of why literacy should be democratized.  The pleasures of studying literature, says Gopnik, should not be limited to “those rich enough to have the time to do it.”  I’d go a step further and say it’s exactly those of us who are “not rich enough” who deserve the time to do it.  If we want the humanities to play a role in making our world more humane, they must be available to people from all walks of life, not just those with economic and/or intellectual privilege.

Particularly problematic, then, is the turn Gopnik takes toward the end of his piece when he draws conclusions about why the humanities are beneficial: “No civilization we think worth studying, or whose relics we think worth visiting, existed without what amounts to an English department—texts that mattered, people who argued about them as if they mattered, and a sense of shame among the wealthy if they couldn’t talk about them, at least a little, too.” (Emphasis mine.) Is that really all that we’re here for? To amuse future anthropologists? To give the wealthy a cocktail-party topic that might allow them to shift focus momentarily from discussing the stock market? There’s nothing necessarily wrong with either. But can’t we aim for a little bit more?

For me, then—beautifully written as Gopnik’s final sentences may be—his final paragraph ultimately pulls up short of its potential. Gopnik quickly dismisses the possibility that the humanities might “produce shrewder entrepreneurs or kinder C.E.O.’s.”  Yet those of us teaching and administrating in English departments know the futility of telling administrators that our programs are justified “because they help us enjoy life more and endure it better.”  Administrators, boards of regents, and stakeholders want balanced budgets, unconditional accreditation, students who land decent jobs after graduating, and future alumni who can donate.  Try writing a program review that rationalizes spending money on your program with “we’re human [and] that’s enough,” and you will be told, “Sorry, not enough.”  It also won’t be enough for students who are careening toward a lifetime of indebtedness, especially as tuition fees and textbook prices spiral out of control.  Of course this statement is true; it’s just that it is not enough, especially when it comes to making the arguments we must make if we are to keep our programs financially viable.

Perhaps, then, we should be thinking about how studying the humanities might “produce shrewder entrepreneurs” or “kinder C.E.O.’s.”  (Right now the world could certainly use more of both.)  Perhaps we should be thinking about the larger implications of “democratiz[ing] the practice of reading" (and writing)--rather than only considering how it might benefit society’s elite. I don’t disagree with Gopnik about the inherent value of the English major, or about the emptiness (and impossibility) of an entirely utilitarian society, or about the need for symbolic purpose— for “Doing Something Else.”  On those points, as I said, I not only agree with Gopnik, I love the way he expresses them.  But by themselves, these are not “enough.”  

If we’re going to keep English and other humanities majors alive in the academy, we must position ourselves to meet the sometimes onerous demands that characterize education in the twenty-first century.  We must help our students find viable career paths, and we must learn how to articulate some of the practical ways in which students from all walks of life can benefit by studying what we teach. We must do this in addition to (not “instead of”) the more idealized vision that Gopnik expresses.  And we must embrace, rather than resist, the “alternative better-people defense”—not in a way that diminishes other majors or pursuits, but in a way that emphasizes how and why studying the humanities holds the possibility of making us more humane. 

Not least of all, we need to reject elitism in all its forms. Rather than pointing to those at the top of the pyramid (whether financial or intellectual) as examples of how the humanities have failed to make people “better,” we need to work toward a world in which a majority of people believe that all human lives have inherent value. I completely agree that majoring in English can “help us enjoy life more and endure it better." But that's not necessarily at odds with producing employable English majors, or being able to articulate the value of studying English not only as it pertains to the world of ideas but also as it pertains to the marketplace.

Finally, we must embrace rather than resist the belief that literature and the humanities can make us more humane.  For that will always be what makes the humanities most powerful.  While there will always be those who “miss the memo,” that does not mean the memo shouldn’t be written.

0 Comments

sin and pleasure (a meditation on where to store the crayons)

8/3/2014

0 Comments

 
In preparation for what promises to be a crazy fall semester, I’ve been organizing the house—my theory being that the coming chaos will be easier to manage if it explodes against the backdrop of an orderly domestic life. This sounds good on paper. But we’re a book-addicted family with a teenager and a grade schooler, and everybody has too many hobbies, so organizational tasks always make me feel like Sisyphus, cursed by the gods to spend eternity rolling a huge boulder up a steep hill only to see it roll back down again. (Sisyphus caused a lot of mischief and I think he deserved his fate, but “What have I done?” I mutter to the gods as, for the 270th time, I rearrange my second-grader’s pile of art supplies.)

I always seem to get stuck on the art supplies—especially when it comes to deciding what is for “Art” and what is for “School,” because when you’re in second grade there isn’t much difference.  Crayons are used for both. Crayola markers—of the size, tip shape, and color specified on the second-grade school supply sheet—are used for both. Scissors, colored pencils, regular pencils, rulers—all are used for both.  I often worry about what is happening to our children’s creativity thanks to “initiatives” like the Common Core and our increasing obsession with multiple-choice standardized tests, so I’m glad to see that, in reality, teachers in the lower elementary grades still see educational value in creative projects requiring scissors and crayons.

Contrary to widespread perception, all my children’s elementary teachers (public schools, by the way) have been remarkably gifted and dedicated educators who understand what kids need and know how to recognize and tap into each child’s strengths. Legislators (most of whom have never taught a day in their lives), software-rich financiers, and for-profit educational corporations might foist their narrow visions of curriculum on the system, but those who spend their days with actual children know better, and they are becoming more innovative than ever as they find ways to incorporate creativity into curricula geared toward Multiple Choice Madness. At least in the public elementary school that my kids have been fortunate enough to attend, “learning” and “having fun” aren’t automatically assumed to be opposites.

But for too many of us, the time comes when all that changes. Enjoyment, creativity, expression—things we think of as “fun”—become the opposite of the so-called "real" world of serious work, meaning “how we earn our money.” Enter “reality” and you must put away your crayons. Or, if you insist on keeping them--literally or metaphorically--you must separate them from your “real” (i.e., money-earning) pursuits. Real life, adult life, money-earning life, is not for having fun, expressing yourself, enjoying yourself. We’re a weirdly confused society in that on the one hand we promote utter superficiality and the mindless pursuit of escapism, yet on the other hand we still harbor a deep distrust of excessive pleasure.  If you are paid to do something, you are morally obligated not to enjoy it too much.

I’ve noticed this fear of pleasure in more than one cultural pocket. I was raised in a somewhat culturally separatist religious community that frowned upon dancing, rock ‘n roll music, playing cards, movies, and bowling (smoking and drinking were, of course, not even thinkable). One paradoxical advantage of such an upbringing: You can rebel without truly endangering yourself. My classmates might have to do something outrageous like drop acid to make a statement against The Man, but all I had to do was... go to the prom. By the time I reached adolescence, I had the distinct impression that “sin” and pleasure were synonyms, and whenever I was in a good mood, I worried that I might be doing something evil. (As a teen I left the church I was raised in, and I spent the better part of my twenties letting go of the baggage.  I considered it a moral victory when I spent five guilt-free hours in the Swiss Alps, drinking pear schnapps.)

Having been warned throughout childhood of the evils of “secular humanism,” when I entered the university in my thirties I expected to meet people who were highly intelligent. Since the church had taught me that these people worshiped Satan, I also expected them to be a lot of fun—certainly more hedonistic than some of the dour Puritans around whom I was raised.  (I refer here not so much to my own immediate family; within the context of our community, my parents were minor rebels themselves.) Thus, I was surprised to discover that an insidious neo-Puritanism had infected many in the academic crowd--people who ostensibly don’t believe in Puritanism.

For instance, I was taught early on that I must never say in an academic paper that I “enjoyed” or “liked” what I was writing about; we are not here to enjoy ourselves but to analyze, to think critically, to interpret.  I learned that including words like “passion” in your grad school application is a surefire way to land it in the reject pile.  (Not everyone who taught me this necessarily subscribed to such beliefs themselves, but they were warning me because they knew—correctly—that my application would be read by people who did.)  If you want to be taken seriously as a scholar, it’s important to demonstrate that you are suffering. Too much enthusiasm and you'll be booted out, banished to whatever passes for hell in academia.

To some extent, this makes sense. After all, education isn’t just about what you “like” and “don’t like,” and the whole point of learning is to stretch beyond our initial reactions and gut instincts. I often speak to my students about the difference between “appreciating” a work of art and “liking” it; I tell them I don’t particularly “like” Quentin Tarantino films because violence bothers me, but as someone who studies literature, I can “appreciate” the way he breaks new storytelling ground. I'm a respectable enough teacher to feel frustrated when my students’ responses to a piece of literature get stuck on “I don't like this”—or even “I loved this!”  Go a little beyond your first reaction, I tell them; that’s what an educated, thinking person does.

Yet isn’t some level of “enjoyment” the point of artistic expression?  Shouldn’t the artist (whether literary or otherwise) experience some pleasure in creating the work?  Shouldn’t the audience experience some pleasure in encountering and interpreting it? Analysis should remain at the core of what we do as scholars and educators. But are we sometimes too afraid that if we experience pleasure in the process, we might become “academic sinners”?

Could our deep cultural suspicion of pleasure be part of what fuels some of the animosity toward the arts and humanities? Business classes are “serious”—you learn about money, how to maximize profits, how to extract it from people. Serious, important stuff.  Science and math classes are “serious,” because they are difficult. Those who succeed in them must have spent many hours in misery, and now they are inflicting misery on others; clearly, they deserve their relatively higher salaries.  

But literature professors? Come on! They are just reading stories, and that’s for little kids, and it’s easy, and it’s fun. They don’t deserve money. Hell, maybe they shouldn’t even be teaching classes; didn’t we all learn to read a long time ago? Art professors? Drawing pictures, making little sculptures, are you kidding--isn't kindergarten supposed to be over?  Music professors? That flash mob looks like they’re having fun, and people who are having fun shouldn't get to make money. (But wait; some studies show that studying music raises standardized test scores and boosts your performance in math, so we might want to re-think music--but not because it's worthy on its own terms.) History professors? They’re just telling stories, about stuff that happened a long time ago; what’s real and what matters is the bottom line, right now.  Philosophy? Comparative religion? You’re just sitting around talking about other people’s harebrained ideas, and while sitting around talking can be fun, you certainly shouldn’t expect to make money from that (after all, ideas don't do anything to shape our society, do they?).

Money should go to pursuits that are important (i.e., geared toward making more money).  Money should go to people who are suffering.  (Not "suffering" in the sense of being an underpaid, over-controlled minion in some fast food franchise or big box chain store—we don’t mean that kind of suffering, because anyone lowly enough to have such a job probably deserves their fate anyway.)  Money should go to people who have set aside childish things. People who want to make a living doing something they might actually enjoy?—That's preposterous. We don't have enough money for that. (Let's not talk about some of the things we do have enough money for--weapons, for instance...)

I wonder to what extent such attitudes--largely unexpressed, yet pervasive--might contribute to the institutional diminishment of the performing arts, fine arts, liberal arts, and humanities. I also wonder whether a strange fear of pleasure factors into our own complicity in making our professional lives more miserable. We ensure our courses are “rigorous” enough that our students won’t state in teacher evaluations that a course was “fun” (that's a death knell if we come up for reappointment or promotion).  We devise, or subscribe to, theories so cognitively elaborate and jargon-filled that anyone who studies them will experience the deep level of misery appropriate to any serious intellectual pursuit.  We create labyrinthine requirements, parse the definitions and connotations of words ad nauseam, develop endless committees, stir up internecine fights, and basically do whatever we can to make sure we are miserable enough to be considered worthy.
 
(And to think all this was prompted by my dilemma regarding whether to put the crayons in the “Art” container or the “School” container. No wonder my attempt at home organization is moving so slowly.)

0 Comments

    ABOUT THE AUTHOR

    The Underground Professor teaches English at a small private university. This blog explores how we might revitalize the humanities in the twenty-first century--and why it's important that we do so, in light of our culture's current over-emphasis on profitability, quantitative measurement, and corporate control.

