Notes from the underground professor

"Can't you English teachers stop analyzing everything and just enjoy the damn movie?"

7/30/2014

0 Comments

 
If you teach English for a living, you've probably gotten that question more than once, whether from students, friends, or random people you meet at parties who freak out when you tell them what you do for a living. (The most common freak-out reaction is usually, "Oh, I can't be friends with you because I'm terrible at grammar," but this one is a close second.) Many people outside our profession tell me that they think English teachers "over-analyze," and that too much analysis "ruins" whatever they are reading or viewing, and that that's why they hated their English class, and that's why they hate all English teachers, and that's why what I do for a living is stupid, etc., etc.

I always tell such people that yeah, I tend to analyze a lot (I wouldn't say over-analyze, because to me the amount of analysis I engage in feels just right, like Goldilocks discovering a perfectly heated bowl of porridge). But my penchant for analysis doesn't "ruin" anything. To the contrary, thinking more deeply about what I'm reading or watching leads me to enjoy whatever I'm doing even more. Here I'm not defining "enjoyment" as laughing my head off, though sometimes it might mean that; I mean it more in the sense of feeling deeply gratified. And for all the various theoretical and critical schools I was exposed to in grad school, sometimes I enjoy going back to the old basic literary devices, to think about how they function and why they should matter.

Just the other day, for instance, I was contemplating dramatic irony--what happens when we, the audience, know more than the characters in the story do. (Yes, I really do sit around thinking about things like that.) Sometimes the results are hilarious (“No, no, he’s not having an affair with the chambermaid—you just happened to drop in at the wrong time”), sometimes scary (“She’s really going to go back outside alone in the dark after she’s heard that noise?--DON’T DO IT!”), sometimes maddening ("Romeo, you idiot, she's not really dead!" "No, Oedipus, NOT HER!")--and sometimes mildly amusing, such as when we knew all along that Harry and Sally were never destined to be “best friends.” 

Whatever the effect, dramatic irony reminds us that if we could view situations on a larger canvas—from the bird’s eye perspective, complete with back-story, character motivations, and other elements that those within a situation can't see—we'd probably draw different conclusions and make different decisions than we do from inside a situation. On one hand, those going through something "know" it best; they're the ones experiencing the physical, psychological, and emotional sensations and consequences.  Yet on the other hand, they cannot grasp—or even see—the broader canvas on which their situation is unfolding. This isn't because they are particularly flawed, selfish or evil people (though that sometimes may be true). It's simply because they are people. 

And as people, we are all limited—sometimes even trapped—by the same subjective limitations that, paradoxically, make our lives and perspectives possible.  It is our individuality that both makes us “us,” and prevents us from seeing the grand sweep of things that, if it were possible for us to see, might well lead us to make different decisions. And yet we are, all of us, stuck here, inside our small, limited, individual selves.  We can’t escape our own skins long enough to see, and we can't see broadly enough to understand. We're all operating in the blind.

But we can read, or watch plays, or view movies.  And through the literary device of dramatic irony, we can watch other people—sometimes invented, sometimes not—live out the comical, horrific, tragic, or romantic implications of the decisions that they, like all of us, are forced to make with constricted vision and limited knowledge (because foreshadowing can only be written in retrospect).  And if we think deeply about what we encounter in stories, we still can't transcend our limitations or crawl out of our own skins--but we can imagine what it might be like to try. 

Through imagination, we might begin to realize that we exist as part of a more expansive cinescape that isn't really "all about us" after all. We might become aware that we’re not writing our stories entirely, that much of what happens in our lives is "writing us" (though sometimes we'd rather not know that). We might grasp that ours is only one perspective among many--even if we do not fully understand those perspectives, let alone agree with them. We might realize that we don't ever see the full picture because nobody ever does, and we might grow more cognizant of the folly of passing quick judgments on situations that we are neither qualified nor required to judge.  

And in the process of recognizing complexity, imagining possibilities of multiple viewpoints, hidden information, obscured motives and unpredictable futures, we might begin imagining our way out of the subjective trap.  Best case scenario: we become capable of asking more compelling questions, withholding or suspending judgment, staying open to new perspectives and the possibility of changing our minds.

As we consider additional possibilities beyond our narrow individual limitations, we may be able to grow larger—less self-absorbed. We might come to realize that the others in our lives don't exist only as minor characters in our personal dramas, but are the central protagonists in their own stories. We may begin to view those who are “other than ourselves” with a small rather than a large “O”—fellow human beings, in some ways different from us yet in other ways, maybe not so much.  We might begin to realize that other people are human beings with intrinsic value, not objects to be manipulated or resources to be commodified. That realization might change the way we treat people.

Maybe this is what can happen when we study the humanities. In one stock definition, the humanities are explained as "the study of what it means to be human," and lots of people laugh snidely at that. Contemplate our human-ness?--why would we need to do that? Let's just get on with the business of making money, and when we're done with that we'll find ways to escape the drudgery, and who needs anything more? Do cats sit around contemplating their feline-ness? (Actually, I suspect my cats probably do.) Stop over-analyzing and ruining my favorite movie, you stupid English teachers. 

Perhaps the more compelling definition is that the humanities explore what it means to be humane. And at this moment in history, when too many people fail to recognize others as fellow humans, when too many of those with wealth and power view people as human "resources" rather than as human beings, when too many see violence as some kind of answer, when so much of the human touch has been replaced by the cold, mechanistic logic of the unfeeling machine--and, not least of all, when the formal study of the humanities is under threat--right now we need the humanities more than ever.

Storytelling Won't Disappear

7/27/2014

0 Comments

 
Recently while on vacation, I experienced a bibliophile's dream: the opportunity to lose myself at Powell's Books in Portland, that destination "temple of reading" for book lovers everywhere. Imagine a full city block, four stories high, with books, books, books, and a few book-related gifts--and a staff who understands what's on those shelves. For a lit-geek like me, this was heaven on earth! The combination of tight-ish budget and traveling by plane forced me to control myself--sort of--but I did pick up a few things, including Jonathan Gottschall's The Storytelling Animal: How Stories Make Us Human.  

The jacket blurb sounded promising, with its assertion that "humans live in landscapes of make-believe" (I agree), and its claim that this is "the first unified theory of storytelling." (On that point, I thought the book fell short--Gottschall touched on many intriguing ideas but often trailed off before fully developing them.) Of course I'm reading from the perspective of someone who studies and teaches English for a living, and this book is too thinly researched to be called "literary scholarship." Other types of readers, however, may have a different take.  I did have the impression that Gottschall--though he is an English professor--is primarily targeting a non-academic readership to make his case for the centrality of storytelling to human life.  

That aspect of the book, I especially like: I'd love to see more academic writers aim toward a broader readership, and I strongly believe we have done much to undermine our own cause by talking primarily amongst ourselves, using overly specialized language, and distancing ourselves (often arrogantly) from other sectors of society.  Perhaps one of the most effective ways in which we can push back against our own marginalization is to disseminate what we teach beyond the classroom walls, and Gottschall deserves props for attempting to broaden the discussion in that way. The Storytelling Animal succeeds in pointing out that (a) storytelling is not going away any time soon, and (b) despite rhetoric to the contrary, we don't function as completely utilitarian/economic beings. Our psyches are too complex for that. (For one thing, we'll never be able to stop ourselves from dreaming, and dreams are structured as narratives, however surreal they may be.)

I did have problems with several aspects of this book--Gottschall's reductive gender politics, some superficial analyses, an over-emphasis on fantasy, and some strange inclusions, such as a photo of a porno film set that sheds light on ... absolutely nothing. (I'm not a prude and I wouldn't object if that picture did something to further his argument, but determining its relevance was a stretch.) I wish I could write a more enthusiastic review, since the underlying premise is so compelling. The centrality of story in human life is indeed something we would all do well to understand more deeply and discuss more widely, not just in English departments. 

After all, even people who don't see themselves as in any way "literary" are entrenched in storytelling.  Take sports, for instance. You've got all the elements of compelling narrative--heroes, villains, obstacles, a specific context (rules of whatever game is being played), back-stories, rivalries, complications, interpersonal drama, standard expectations, surprise twists. Sometimes you get predictable endings; other times you get underdog victories, expectation-shattering injuries, erupting fights, last-minute reversals of fortune. We not only get the "story" that unfolds during the game itself, but the story leading up to the game ("what do we expect to happen?"), the story after the game ("what actually happened?"), the endless pre- and post-game analyses.  Story depends on characters and conflict, and sports offer all that in droves--plus the added twist of true surprise.  They're one of the few forms of cultural storytelling in which we don't already know the ending in advance.  Seahawks winning the Super Bowl? (Yes!) Oh, the drama of it all! Narrative doesn't get any more compelling than that.

(Usually every post-game interview concludes with some version of the following dialogue:  "Q. What happened out there today, Jack?"  "A. Well, Frank, the other team scored more points than we did, and whenever that happens, you lose."  So yeah, much sports analysis is neither insightful nor particularly necessary--yet even those predictable pre- and post-game dialogues are part of the expected narrative structure.  And what would be the point of the "big game" if you didn't have anyone to talk to about it?) 

Where Gottschall does succeed, despite my quibbles, is in emphasizing the centrality of storytelling to human life. From the fantasy play and superhero scenarios we engage in as children, to sports, to TV (both reality and scripted), genre fiction, dreams, role-playing games, politics, conspiracy theories, cautionary morality tales, family lore, even the stories we tell ourselves about ourselves (which, paradoxically, often can have the effect of limiting us)--story can't be escaped. And despite our present cultural over-emphasis on all that is quantifiable, measurable, utilitarian and "practical," Gottschall is correct when he points out that storytelling today is more ubiquitous than ever, provided you look past the academic walls and beyond the typical expected places.

In one of his strongest passages, Gottschall argues against those who blame virtual worlds and technology for increasing social isolation: "Virtual worlds," says Gottschall, "are less a cause of that isolation than a response to it." Role-playing games may be a reaction to the "repellent force" of "real life" with its "bleak concrete landscape of big-box stores and fast-food joints," low-paying and meaningless work, and detached neighborhoods devoid of connection and community (195).  If "real life" continues to dehumanize us, Gottschall suggests, then "the real threat isn't that story will fade out of human life in the future; it's that story will take it over completely" (198).

He has a point; humans don't just passively accept a dehumanized world. In some quarters it's popular to pan "escapist" stories as substandard, but when too many of us are trapped in circumstances where fantasy escape is our only real option, escapism will flourish. After all, there's only so much crap we can put up with in "real life," and many people are already putting up with way too much of it; why not turn to something that allows you, for a change, to play the vanquishing superhero?

I appreciate Gottschall's broadening of the concept of narrative, and the way he reaches beyond "classical" literature (or even film, which has finally garnered respect in academic circles), to recognize the omnipresence of storytelling. I've long felt that given its centrality, it is odd that English is so often relegated to the metaphorical basement: Isn't the study of language and storytelling really where the study of everything else begins? And in terms of the employability of English majors, shouldn't the skills that we have to offer be highly valued in multiple economic sectors? (And no, I'm not just talking about proofreading, though we certainly can be useful there as well.)  For that reason alone, I appreciate what this book is trying to do.

Yet for all my griping, The Storytelling Animal left me with some crucial questions: How can those of us who understand the centrality of narrative persuade those in power to make the study of English as central to education as it should be?  (Here I'm talking about education at all levels.)  How can we devise and successfully institute qualitative assessment methods that recognize the limitations of numerical measurement?  How can those of us with English (or related) degrees successfully market ourselves in the work force? 


Given the current grim situation, with both money and influence concentrated in the hands of a powerful few who have the means to push their agenda on billions of others, how can we take back education for liberatory rather than utilitarian purposes? (See, for example, the role of the Gates Foundation in pushing the Common Core Curriculum onto America's public schools--an approach that conceptualizes job training as the primary purpose of education and, consequently, over-emphasizes the study and production of nonfiction at the expense of fiction and poetry.)  With multiple, powerful, uber-wealthy forces amassed against those of us who value that which is literary, creative, and qualitative, what can we possibly do?

The Storytelling Animal, despite its limitations, makes the crucial point that storytelling will never disappear. To me, this suggests that an educational approach too dismissive of story's centrality will ultimately fail to engage learners. I agree with Gottschall that our dehumanized environment is likely to drive human beings further into fantasy worlds rather than away from them. We're human, after all (theories to the contrary notwithstanding), and we will do what humans do: make narratives, engage in a quest for meaning. I don't believe we'll stop doing those things. I think the real question is whether all that will happen inside formal educational settings, or primarily elsewhere. 


Questions, then, so often lead to other questions: For what reasons should we continue to study the production and analysis of narrative in educational settings?  If storytelling will proliferate regardless of whether or not English departments continue to exist, how can we argue successfully for keeping literature alive within the academy? And as teachers, what kind of approach should we use if we want to keep the study and production of storytelling relevant, compelling, and alive?


RESPECTING THE AMATEUR (a long meditation)

7/20/2014

0 Comments

 
http://www.theguardian.com/stage/theatreblog/2012/jul/18/am-dram-professionals-acting

The link above leads to an article by Lyn Gardner in the theatre blog of the British newspaper The Guardian: “Hurrah for am-dram: why it’s time to applaud amateur actors.”  (Love that term, “am-dram”—I’d never heard that one before.)  This piece was timely since just last weekend, my family, two close friends and I attended a community theatre production of Sondheim’s “Into the Woods” while vacationing together on the Oregon Coast.

The production, we all agreed, was compelling. Sure, a couple of the voices were weaker than others, but that’s to be expected in a community production. Yet all of the voices were pleasant, and a couple of them were as strong as any professional. All the actors displayed a high level of skill, the costumes were appealing, and the set, though minimalist, was evocative. Overall, the director clearly knew what he was doing.  It made for an enjoyable evening out, and for a fraction of the cost of attending a professional version of the same show down the road a ways, in Ashland.

So what is the difference between “professional” and “amateur”? Is it only about money? And why does “amateur” have such a bad rap, especially when it comes to the performing arts? 

One point made by a friend who attended with us: “In community theatre you see regular people, with regular looking bodies and faces.” Not the Hollywood or Broadway ideal, so there is that. Look like an ordinary person, and in this society you’re more than likely to end up leading a so-called “ordinary life”—though I personally don’t believe there is such a thing as “ordinary.” I think each individual life is amazing. But that’s another subject altogether.

Granted, sometimes the performances are just bad—as Gardner admits when she confesses that she recently made an excuse to depart early from an “execrably acted” production. Hey, we’ve all been there. (Sometimes you can’t get away with leaving early—because you have a family member or best friend in the show. Worst case scenario: the person you know is the one doing the execrable job.)  I remember wincing frequently while watching a community theatre production of My Fair Lady in which an alto had been cast as Eliza Doolittle, and she never could hit the high notes. I was relieved when she decided she really could not dance all night and gave up trying. (I found myself wanting to sing along with the maid as she pleaded, "I understand, dear/It's all been grand, dear/But now it's time to sleep!") Sometimes, “amateur” can mean excruciating.

But to be fair, I’ve also experienced various kinds of “execrable” at Broadway shows, and off-Broadway shows, and London shows both West End and fringe, and professional productions, both resident and touring, in several cities. (And for that matter, not even Audrey Hepburn hit the high notes as Eliza; her part was sung by Marni Nixon, who also voiced Deborah Kerr’s Anna in The King and I and Natalie Wood’s Maria in West Side Story. If you want to see what Marni looks like, she’s Sister Sophia in The Sound of Music.)  So professional does not always equate to “good,” and unlike amateurs, professionals often get help, and a lot of it.

I agree with Gardner when she points out, “Who says amateur has to mean amateurish?” In Waiting for Guffman, Christopher Guest presents a hilarious send-up of overly self-impressed, delusional community theatre types whom we probably all recognize—people with outsized egos and a zillion stories about why they’re not rich and famous when they really should be. I do love Guest’s mockumentaries. But I also wondered as I watched Guffman—as I do whenever I see a particularly compelling community theatre production, such as the stunning rendition of the controversial Spring Awakening that played in my town last year: Are these stereotypes entirely fair? Do we give amateur artists and performers too little respect? Do we buy too easily into elitism? And if we do, what does that say about the fate of the arts--and the humanities? If we want to keep the arts alive, shouldn’t we be encouraging them to be disseminated and engaged in as widely as possible, rather than making them ever more rarefied—not to mention ever more financially inaccessible to the average working person? 

Like so much of the good stuff I find online today, I found this article on Facebook, posted by a friend with whom I share several mutual interests.  Like me, she’s an amateur musician--though her current status as a “mature student” at a British conservatory marks her as a higher caliber of “amateur” than me. (I did, however, spend the better part of my Sunday morning today trying to improve my current work-in-progress, a piano solo rendition of Queen’s “Bohemian Rhapsody.” Everybody needs a hobby.)  As my friend pointed out in her post, “One could make many of the points in this article about music too.” 

True, I think: Gardner points out that “the popular notion of am-dram is still one of pensioners with too much eyeliner,” and similarly, when we hear “amateur musician,” we might conjure a middle-aged woman who was the star of her high school vocal ensemble but now only feels truly alive once a year when she solos on “O Holy Night” with the church choir on Christmas Eve. Or a forty-ish bearded dude in the midst of a midlife crisis, who gave up college in hopes of “making it” with his band, gave it all up to get married and raise kids, and has recently resumed playing his guitar at coffee shops on open mike night (for a while he tried to write his own stuff, but the unenlightened patrons just wanted him to sing covers of James Taylor, so he complies even though he actually conceptualizes himself as the second coming of Mick Jagger).  The amateurs among us often get little respect.

And if you're one of those who “professes English” for a living, I don’t even need to tell you how the academy conceptualizes the “amateur writer.” Nowadays, we all know that “anybody” can get a book into print through various self-publication platforms, whether print, online, or both. For an English professor, it's the professional kiss of death if we go that route; we may teach writing and dedicate ourselves to promoting literacy, but "anybody” is, apparently, not the person we want writing books. (The already-famous, of course, can pretend to have written books even when someone else did the actual writing, but that's a whole different subject.) 

Amateur artists in all fields, it seems, get little respect. When someone majors in an arts-related field yet goes on to make a living doing something else, it’s often perceived that the person isn't “doing anything” with their degree—meaning they don't earn their living by doing it, even though they may well still engage in their creative pursuit--painting, singing, writing, acting.  Yet all too often, amateur pursuits are seen not as admirable, but as slightly pathetic—delusional, even.  Doesn’t Joe, who tries out for every lead role at the local Thespian Society, realize he’s never gonna make it in Hollywood? Who does that soprano think she is, Beverly Sills? Great-aunt Millie says she’s just published her memoirs, but she didn’t write a real book, did she? How dare people play at the arts? Can’t they leave it to the pros? Don’t they recognize that they are losers?

This attitude may not always be expressed explicitly, but I’d argue that it does pervade our culture. It also seems to me that this attitude is peculiar to the arts. Nobody argues, for instance, that we should all give up exercising unless we are capable of becoming Olympic athletes. Nobody goes up to the guy tossing a football in the park with his kids on Saturday afternoon and tells him to leave the passing to Russell Wilson. Nor do most folks assume that we should all resort to microwaving frozen entrees just because we’re not Iron Chefs.

In other endeavors, we assume that being an amateur is okay. Expected, even.  Yet when it comes to the arts, those of us who engage in “amateur” pursuits are often reluctant to admit to it—and certainly, if we're not getting paid for our creative expressions, we don't dare call ourselves “artists.” That would be pretentious--and we'll leave pretentiousness to the "real" artists, the professionals (who have, at least, earned the right to be pretentious).

A lot of us complain, rightly so, that the arts are presently under attack.  Why? There may be lots of answers here. Could an insidious, pervasive elitism that too many of us have internalized possibly be a factor?  After all, creative expression flourishes all around us in spite of it all, because that is what human beings do—not always well, but maybe that’s okay.  Maybe the arts that we perceive as “lost” are not really lost--but maybe we need to develop a less narrow way of understanding them.

And maybe we need to quit thinking that the only people who are “using” their degrees in arts-related fields are the ones who are making money by doing so. Maybe, like exercise and food, creative expression should be seen as everybody's right--something expected, even--whether we’re particularly good at it or not.

How to stop schools, from K through 12 and on into college, from cutting arts programs in this age of No Child Left Behind, Common Core nightmares, technophilia, and austerity measures? How to argue for their survival in formal educational venues? How to keep them alive outside formal settings when some of those attempts fail? Stay tuned.

Now, back to that amateur piano rendition of “Bohemian Rhapsody” . . .


"The map is not the territory"--the limits of the machine

7/18/2014

1 Comment

 
"The map is not the territory." That's an old military saying, I've been told by some old military folks, and it popped into mind the other day as I was writing my previous post on information, knowledge and wisdom.  Pondering this saying also brought to mind a memory.

Back in 2002, for a complex set of reasons, we took a road trip from Reno to Yosemite. In those days we were slightly pre-GPS but post-Internet, and we had Yahoo Maps. In what I believed then to be a masterful technological feat, I plugged in our starting address and our destination motel in Yosemite, pressed a button, and voila! Step-by-step directions, and a map. I printed everything out and we navigated like pros, south on the 395. Not too complicated, but once we left the highway, the map suggested that the roads were going to get windy-er. And narrower. And less well paved.  

Turn left here, turn right there, turn again--here? Really? Though we were growing increasingly dubious, we knew we'd been following the directions precisely--even though with each turn the breadth of the road shrank by a foot, or the quality of the paving surface reduced by one grade, or both. Pretty soon we found ourselves driving down a narrow twisting gravel lane, heading steeply downhill toward a tiny log cabin with a carport.  The carport was also a dead end.

A predictable driver-navigator marital spat might have ensued here, but we were spared by the timely intervention of a weary-looking bearded man, who exited the cabin's front door carrying a little stack of stapled papers. "Oh God," he sighed, shaking his head, "another damn Yahoo Maps user." The papers he handed us provided handwritten directions to the highway from his carport. 

"You get a lot of these around here, I take it."

"At least three a day.  Damn Yahoo Maps.  I keep a bunch of these things ready for the likes of you. Ya try to live off the grid and look what happens..."

The map is not the territory. The machine has its limits. (If you've ever heard a GPS trying to pronounce Hawaiian street names, you know.)  I suppose some techno-geek could weigh in here and tell me that I'm not up to date on the latest in artificial intelligence and that the machines are becoming smarter, look at all the technological improvements we've had even since 2002, etc. All true. 

But I strongly believe that in almost all endeavors, we are always going to be better off if we employ human judgment and engage in person-to-person communication. To some extent, "artificial intelligence" will always remain oxymoronic. 

If you want to learn more about why, you might want to talk to a court reporter. That was my job for two decades--which meant that when I disclosed my occupation in social situations, I was typically subjected to a round of tedious and predictable questions. One of the most recurrent was, "Why can't they just use tape recorders?" 

The short answer: "Because somebody still has to transcribe it." And when 100% accuracy is expected, you can't rely solely on a machine. A real human being knows if a statement has been obscured by outside noises and can ask for it to be repeated on the spot. A real human being can be attentive to various accents. A real human being can tune in to the vocal inflections that help determine whether the speaker implied the use of a full stop or a question mark, which in some legal situations might crucially matter. A real human being can put a stop to overlapping speech before it gets out of hand (or can at least try--if any court reporters are reading this, they're probably laughing. Let's just say people don't always listen to us.) A real human being can read back stenographic notes on the spot when a dispute arises over what was actually said. Those are just a few examples of why we need real human beings to transcribe important legal proceedings; there are more.  

The court reporting profession has been battling this dehumanizing incursion of the machine for many decades--far longer than academia has been. Yet the profession still exists, thanks to the well-informed activism of many dedicated court reporters and, not least of all, their unflagging solidarity on this issue. They have not won every single battle (especially in smaller venues), but in terms of the larger war, court reporters are still with us. If those of us in academia want to fight back effectively against the incursion of the machine into realms where real human beings are still necessary, we would do well to learn from them.

The other day I was speaking with a close friend who dispatches delivery trucks, along routes that he used to drive himself. A young newbie driver had recently griped about my friend's dispatches: "Why have you got me going here and then here and then taking the long way around here? Don't you have me going way out of my way?"  

My friend's response: "Because for decades I drove those routes myself. I know where the gullies are, I know where the ravines are, I know where you're likely to get in a traffic jam at lunchtime and how to avoid it, and at each place you stop, I know which direction you want the truck to be facing to get your hose where it's needed."

The incursion of the machine, the difference between map/information and territory/knowledge--these aren't only issues in the field of education. These are issues for everybody.

Many of the powers-that-be would like nothing more than to eliminate human beings from the workplace altogether. We're such a nuisance, after all: we insist on things like bathroom breaks and meals and time to sleep (the nerve!). We get sick, we have babies, we ask for time off to attend funerals. Some of us even have the gall to expect occasional time off for leisure--how crazy is that? All this costs money, and line item expenditures in your budget are always a bad thing.  Just look at how much money we could save if we did away with people. 

It might sound like a genius idea, if you're relying solely on the logic of the machine. But the machines aren't always right, and without a human being to make the necessary judgments and adjustments, our whole society might easily end up in a metaphorical carport, at the dead end of a narrow, steep and twisting gravel lane. 

1 Comment

Information, knowledge, wisdom: On the limits of canned curriculum

7/15/2014

3 Comments

 
A trend I find especially disturbing right now:  canned curriculum. It’s provided by educational vendors--who, of course, stand to make huge sums of money by selling their cans.  Here is the information you will convey; here are the lesson plans you will use to “deliver” that information; here are the “self-assessment tools” that students will use to figure out what they’re doing right and wrong (not through human-to-human communication, but through computer algorithm); here are the desired “student learning outcomes" (e.g., "Upon completion of this course, students will be able to . . ."); here are the tests that will allow you to “assess” whether students have met those learning outcomes. 

Deliver this package and your task is done—content “delivered.” Information that had not yet penetrated the brains of the target population has now been successfully deposited into those brains. We know this because multiple choice tests have delivered the required results: An appropriate percentage of students has “exceeded,” “met,” or “approached” the outcomes, as measured by percentage of correct answers/guesses. 

Meanwhile, the companies that have stuffed curricula into “cans,” and “marketed” them to (or foisted them upon) entire educational systems, pat themselves on the back for increasing company profits and shareholder value.  More money to plow back into more "marketing”—wining and dining those who hold curriculum decision-making authority, hosting elaborate cocktail parties at academic conferences (each for-profit educational enterprise trying to outdo the next), paying commissions to traveling salespeople, lobbying Congress to mandate standardized tests (also for-profit) issued by these same companies, which will necessitate more canned curricula and tests (and test preparation materials) that only they can provide. All for a price. Genius system.

The Brazilian educator Paulo Freire referred to the "information deposit" approach as the "banking model" of education, and there are more valid arguments against that approach than I can possibly fit into one post. Thus, this is a topic to which I will return repeatedly. For now I'll just say that in the most extreme version of for-profit, corporate, and dehumanized canned curriculum, a human teacher is eliminated. Make no mistake: Even as we speak, for-profit corporations are working their butts off to create and market electronic college courses that can be delivered without a professor at all.  At best, such "courses" will be "facilitated"--and guess what? "Facilitators" will work on a contract basis, for less than a living wage, without benefits. Sound familiar?

Dehumanized education is a profiteer's dream. And to make it work, one needs to subscribe to an educational paradigm like the one expressed by Charles Dickens' fictional school headmaster Gradgrind in the 19th-century novel Hard Times: "Now, what I want is, Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life. Plant nothing else, and root out everything else." (Side note: On shmoop.com, someone points out that even as Gradgrind extols the virtues of "facts," he can't avoid using a metaphor. Ha.) If your vision of education varies from the banking model, it can't be delivered by machine. 

Some may ask: So what? Isn’t the transfer of information the definition of learning?  Isn’t the purpose of education to convey a body of information into an individual brain? And if corporations happen to make a few bucks off the process, so what? Everybody's gotta make a living, right?

But information is not the same thing as knowledge.  Knowledge is not the same thing as wisdom. 

It’s now beyond mundane to note that we are awash in information, that information is at everyone’s fingertips, blah blah blah. Some of the anti-college blogs I’ve skimmed lately go so far as to posit that college is now unnecessary because “anyone can Google anything.” Personally, I hope the people I interact with professionally--whether they are doctors, lawyers, dentists, accountants, school teachers, music teachers, cooks, carpenters, nurses, pharmacists, auto mechanics, or landscape gardeners--have not relied solely on Google. I also hope they haven't been taught primarily through disembodied online modules and assessed primarily through multiple-choice online quizzes (though I do believe quality online education is possible, provided there is an expert human being designing the course and interacting with the learners).  I also hope the people I deal with haven't been conditioned to look for the one “right” answer lurking among four possibilities (one or two of them probably absurd), as though all of life were an endless episode of Who Wants to Be a Millionaire. Life isn't like that and education shouldn't be either.

There’s a big difference between the great glob of disorganized facts that we call “information,” and knowledge--a deep, cognitively automated understanding of an organized, purposeful body of information that the person-who-knows can use to achieve a particular goal. Cognitive psychologists further point out the difference between declarative knowledge, or knowing "about" something--and procedural knowledge, or knowing how to do something. (The stadiums of America are filled during football season with thousands of people who think they "know" how to throw a touchdown pass, and two guys down on the field who actually do.)

How does information become knowledge? Through human-to-human interaction, and through hands-on practice. (I'm happy to provide references from cognitive psychology to anyone who asks.) People who "know" things know how to do things--whether you're talking about building a house, analyzing a historical event, preparing a tax return, explicating a novel, or fixing your overflowing toilet. Knowledge can't be captured in a can, and not all learners respond to--or need--the same can. Knowledge develops in conversation between human beings. Knowledge also requires the learner to practice doing the thing being learned, under the guidance and with the corrective feedback of the experienced mentor.

Then there is wisdom (bringing to mind the old joke about being "knowledgeable" when you know the tomato is a fruit, and "wise" when you realize it doesn't belong in a fruit salad).  Wisdom can't be canned, it can't be automated, and it certainly can't be sold. It also can't be rushed. Nor can wisdom be separated from deeper questions: of morality, of life's purpose, of values, of meaning. 

A true education leads not just to knowledge, but to wisdom.  While that can happen in school settings (though often it doesn't), it can also happen outside of school (and often does). And while an education leading toward wisdom can happen in concert with one's occupational training, that is not always the case. Not everyone with a formal education grows wise, and I've known many wise people whose formal education ceased early. (One of my personal gurus was a self-educated uncle, famous among our family for his book obsessions.) 

That being said, I still believe it's crucial to preserve the ideal of the university as a place that encourages the pursuit of both knowledge and wisdom, and the perpetuation of both for future generations. Are formal institutions of learning the only place where that should happen? No.  Should those of us who attended college feel superior to those who did not? Never. (Arrogance is obnoxious, and arrogance on the part of academic folks does much to alienate those following other life paths--something we can ill afford right now.)  Does this mean job training shouldn't matter in college? Of course not.  But if we hold to the ideal of the university as one crucial site for the pursuit of knowledge and wisdom--as well as certain kinds of job training--all of society will benefit. 

And if we don't?
 
Do we want our citizenry to be educated by machines, assessed by machines, and certified by machines? Do we want to have the whole process managed by impersonal, profit-minded corporations who pour their resources into luring educators with high-end, can-you-top-this cocktail parties (even as their ultimate goal is to render some of their own customers obsolete)?  Do we want to perpetuate the specter of Dickens' misguided Gradgrind, by arguing for an education that teaches "facts, nothing but facts"? As human beings, aren't we meant for more than that?

Have we stopped and thought, truly thought, about what kind of world we are wishing for if we turn a blind eye to the long-term implications of abandoning the pursuit of knowledge and wisdom in favor of a for-profit, utilitarian-minded, canned curriculum?

3 Comments

WHAT IS COLLEGE FOR?

7/7/2014

3 Comments

 
When I was growing up in the seventies, many things seemed like societal "givens": that colleges serve a useful purpose, that higher education is desirable for those so inclined, and that while private colleges are understandably expensive, state institutions should remain affordable and accessible to anyone with the desire and ability to succeed. Nowadays some of these once-basic assumptions seem to have been thrown into question, and like a great many of us (both inside and outside the profession), I often find myself asking, Why? 

(Not to idealize the seventies too much, tempting though that may sometimes be--those of us who lived through that era could no doubt come up with a litany of travails that plagued us at the time, from bad fashion choices to serious social ills. But for now I'm going to set aside the fact that memory skews toward nostalgia and talk about the bigger issue of what's changed with regard to general assumptions about college, and why.) 

The easy answer seems to be cost--a no-brainer. All of us know that when there's a line item in the budget that seems potentially optional, we're more likely to decide to live without it if its price spikes (especially when our income doesn't).  In 2012, Bloomberg.com (here) reported that in the prior 35 years, the cost of college (tuition, room and board) increased a staggering 1,120 percent, far outpacing inflation. Thus, the nearly daily dire warnings about student indebtedness today seem like a logical consequence: Of course we're asking whether college is worth it. Time ran a 2012 story asking that very question, and its graphs based on data from Pew (here) explore the issue in purely financial terms: cost, payoff, and so forth.

What if we're asking the wrong question?  

Buried in that same graph--yet barely addressed by the Time article--is this finding: while a "slim majority" of surveyed college graduates (55%) believed that college "prepared them for a job," far more (74%) believed they grew intellectually, and 69% believed that college helped them to mature. In other words, a far greater percentage believed they benefited from college in less tangible ways that can't be expressed in economic terms--yet the Time article hardly touches on that data, as though intellectual growth and maturity matter far less than earning power and don't even factor into the question of whether college is "worth it."

The graduates' responses certainly ring true with my own experience: I began college as a thirty-year-old freshman, having entered a career-oriented training program after high school. In my twenties I was pulling down a pretty fat salary (in Wordsworth's terms, I was "getting and spending," and I was beginning to "lay waste [my] powers").  I returned to college by conscious choice because I felt, as strongly as I've ever felt anything, that something in my life was missing. I wasn't looking for more money, I wasn't looking for status, and I wasn't looking for a new job, though in the long run I ended up with one. (My timing was fortunate; I doubt the meandering path I took into academia is even available to somebody in such a position nowadays.)

Was it worth it? Absolutely, one hundred percent, without a shred of doubt or a moment of pause, yes, yes, yes and yes. (The only things in my life that have been more "worth it" are my relationships with spouse, children, family and close friends; travel and education are tied.) What was I looking for? Though it was hard for me to articulate back then--and in some ways still is--I believe now that I wanted a more profound understanding of life, a more expansive sense of the world, and a clearer understanding of my place in it. I wanted my own world to grow larger; I wanted my life to mean more. 

In all that, I succeeded--not in answering all the questions, but in understanding more deeply what the big questions are, why they matter, and my own relationship to them. Education expanded my world, and it never shrank back to its original smallness. Paradoxically, as that happened, my own ego became less important. A quality humanities education can teach you that while your life does matter, you're also one of several billion people presently on the planet (not to mention the countless trillions who came before and will come after), all of equal worth, and whatever triumphs or travails you may have experienced, you haven't been singled out; stuff happens to all of us. Your story matters--and it's also part of a much larger tapestry of stories, which matter as much as your own.

The result of a strong humanities-based education is paradoxical: it humbles you at the same time as it empowers you. Done right, it can enrich not only the individual who experiences it, but society as a whole (imagine what might happen if more people came to see the big picture and worked together to create a world in which all human lives have value). Try putting a price tag on that.

You can't. Frankly, I would have been far better off financially had I stayed in my prior career. This I know, though I've never run the numbers to figure out exactly how much I "lost" by choosing education over earnings.  I don't intend on doing so--because it doesn't matter. In Oscar Wilde's play Lady Windermere's Fan, the character of Lord Darlington famously defines the cynic as "a man who knows the price of everything and the value of nothing."  I'm not cynical.

Far less famous is Cecil Graham's rejoinder: "And a sentimentalist, my dear Darlington, is a man who sees an absurd value in everything, and doesn't know the market price of any single thing." Graham has a point too, and I'm also not sentimental. Even if everyone were to agree  tomorrow that the real purpose of college extends beyond money, some nagging problems would remain, the most obvious being:  Who pays? Can society really afford to grant a huge swath of its citizens the opportunity to lead more "meaningful" lives? Is it easy to blather on about "meaning" when you've already got enough money to put food on the table? How do we meet people's material needs?

I fall somewhere between Darlington and Graham: yes, there is such a thing as value beyond price, and right now our society has gone way past the tipping point, accepting the fallacy that only quantifiable things matter. And yes, money matters, and only a fool would claim otherwise. Frankly, it was largely because I'd had job training before I went to college that I was able to go at all. (The story behind why my parents couldn't send me is far too long to share now, though I may get into that later.)  Somebody's got to pay, and when it's over, even the most idealistic college graduate is going to need to find a job. 

Then there's the question of why college is outpacing inflation so egregiously--a vexed and complex issue that I will address in the future. When I started college in 1991, it wasn't cheap, but at the state universities I attended, in-state tuition was at least manageable. Easy for me to say it was worth it; it was a stretch, and we made a few sacrifices, but ultimately I could afford it. Is there a "price point" (a marketing term I hate, but it's a buzzword) at which my analysis no longer applies?

Another problem is less obvious: How do we express the non-material value of college in a way that does not insult those who didn't attend, whether by circumstance or by choice? Something I remember from my own fourteen-year gap between high school and college: When college-educated folks discuss how college has enriched their lives, those who didn't go often hear something like this: Your life is not as enriched/meaningful/valuable as mine. Tricky, that one. (And it's a real thing--ask anyone who didn't go.) I realize now, three degrees later, that college did enrich my life immensely. I also refuse to believe I was a "loser" before I went--or that anyone's quality of life or value as a human being should be defined by their educational level.

(Now that I'm one of them, I also realize today that there are probably a good many college grads who never intend to come across as snobs. However, there are such people among the educated, and right now elitism isn't helping us any when it comes to persuading people of the non-monetary value of college. Just because a humanities-based education can have a humbling, ennobling effect, it doesn't automatically follow that it will--and hypocrisy is as rampant among educated idealists as among any other group of people. Hypocrisy, in fact, seems to be a rather common human failing--one reason I never tire of teaching Moliere's Tartuffe.) 

So how do we talk about the non-tangible purposes of college without sounding elitist and exclusionary? How do we design college education so that it can do two necessary things simultaneously: enrich our lives and prepare us for jobs? How do we reconfigure higher education so that it more often achieves all that it is capable of achieving--while still valuing those who do not go to college at all, whether by choice or necessity, as well as those who choose fields other than the humanities? (Because, as much as I love the humanities, other choices have value too.)

Not least of all, how do we make higher education accessible, financially and otherwise, to a wider range of people? Meaning, enrichment, value--none of those things may be tangible, yet all of them are real. Still, most of us need to pay for the roofs over our heads, the food on our tables, the clothes on our backs. When the price of "meaning" increases 1,120 percent in 35 years, a lot of us are going to say "to heck with that" and accept reality television as "good enough." Hey, we're entertained at the end of a hard day's work, and if life is supposed to be for more than that, well, guess we'll have to take a pass this time, a lot of people will say. Read the comment stream on the Bloomberg article, in which more than one commenter refers to college courses unrelated to job training as "fluff."

A lot of work lies ahead. What to do? I'll be exploring that further in the days and weeks ahead. A first thought, however, is that if we want to keep alive the humanities and all they have to offer, they can't belong only to the realm of the university. They need to be out in the world, accessible to everybody, not only those with the financial and circumstantial means to obtain formal education. At the same time, we need to work toward expanded access (especially financial) to formal education, as well as an understanding of "what college is for" that both includes and reaches beyond the idea that higher education is simply training for a job. It is that.

It's also more than that.

3 Comments

Long answer to an annoying question

7/3/2014

3 Comments

 
When I was an undergraduate, I was often asked that annoying question that every English major is tired of hearing: "Why did you major in that?"  (Actually, that question didn't stop when I got my B.A.)  Apart from the fact that it's pretty rude to question someone's personal choices, at least getting pummeled with this one forced me to think: So why am I majoring in "that"?  

The answer I formulated for myself back then still holds, twenty-something years later: I'm one of those people who is interested in everything, and studying English allows me to study everything. No, we don't spend all our time studying grammar (though actually I wouldn't have minded doing more of that, but then I'm pretty geeky even by English major standards).

If English isn't just about grammar, what is it about?

English* is about language, ideas, stories, and what it all means.  (*Caveat: This is true of all language, not just English, and of expression in all cultures, not just English-speaking ones. But I'm in an English-speaking culture and English is my field, so I use the term "English" here in the broad disciplinary sense, not in the sense of "English only." If you're a specialist in Spanish or French or Mandarin or any other language--or if you work in a humanities-related field other than a language or its literature--please read this as applying to you also.)

Language. Try studying any supposedly more "relevant" subject without using language (duh); it's central to almost every aspect of human existence. (For now, I'll sidestep academic debates regarding the extent to which language constructs our very reality and assume we can all just agree that it's central.)  What could be more educationally relevant than the in-depth study of a culture's language and all that it expresses?  

Ideas shape the world; doesn't the idea of something have to exist before that thing can be brought into material reality? (Last month I viewed the Wright Brothers' plane at the Smithsonian, and the exhibit made it clear that the discovery of how to fly had to exist first as an idea.) 

Stories: How else do we make sense of our lives? How else do we teach our children? How else do we disseminate knowledge from one generation to the next, build social bonds within and across our communities, make sense of our personal and collective tragedies and triumphs? 

What it all means: Isn't that the most relevant question of all?  (And no, English teachers do not go around looking for "hidden meanings" in things--the fact that meanings are not actually "hidden" merits its own post and is something I'll get to later.)

So considering that English is about things that form the very center of human endeavor, why is it English that ends up in the institutional basement? (In some cases that location is merely metaphorical; in some cases it's also literal.)  Considering that we study and perpetuate what lies at the core of most other creative and intellectual pursuits, why are we the ones being treated like weirdos? Maybe it isn't the English majors who should be grilled at social gatherings about why we majored in "that."  Maybe it's not the English teachers who should be mocked (and underpaid). Maybe it's not those in the humanities and liberal arts who deserve to become the butt of national jokes regarding the addition of French fries to a fast-food order (though we also ought to cease mocking those who serve fast food for a living--snobbery is always ugly).

If the humanities cease to matter in this world, what kind of world will it be? Suggested reading list: The Giver, Nineteen Eighty-Four, The Hunger Games, Fahrenheit 451, and just about every other dystopian novel ever written. If you want to know what a humanities-free world might look like, countless authors have already done the work of imagining it for us. Hint: It isn't pretty.

It may be that right now, not enough people understand why we're necessary. But we are.

3 Comments

WHAT TO EXPECT HERE

7/3/2014

0 Comments

 
Q. So what does an "underground professor" of English blog about, anyway?

A.  Everything!  English is about ideas, and ideas are welcome here.

Though this blog is a conceptual work-in-progress, I anticipate writing a wide range of posts, from short meditations on what I'm teaching or researching, to book and film reviews, to discussions of the crisis in the humanities and the challenges facing our educational system. I'll riff on memes, cultural trends, and cliches. I'll share stories and insights gleaned from nearly two decades of teaching (preceded by fourteen years as a post-secondary student), in hopes of demonstrating why the humanities matter. The possibilities are endless--because that's what studying English is all about, endless possibilities. In fact, I feel a post on that subject coming on right now . . . stay tuned.

A couple of quick notes on what this blog isn't:

It's not strictly--or even primarily--for academic audiences, although I do hope that many academics will find something here of value. I won't be using a lot of discipline-specific jargon, and the analyses I make and the connections I draw are likely to be more wide-ranging than what one finds in a typical peer-reviewed scholarly journal article. In short, I won't be limiting my discussion to my academic specialty. (Gasp! The professor's gone rogue!)

My reasoning here: We academics "talk amongst ourselves" plenty, thank you.  With the current "crisis in the humanities" very much on everyone's radar screen, we're doing a fine (and justified) job of bitching and moaning, and numerous scholars have written cogent analyses of the crisis from a cultural studies perspective.  All of this is to the good: the academic analyses shed helpful light on the broader picture and catalyze more of us into action, while the griping helps us let off steam so that we can get on with our work even in the face of discouraging opposition. 

But we also need to move beyond the analysis, internal discussion, and frustration. I don't want to go down in history as little more than the second violinist in the dance band on the Titanic; I want our culture to change. To bring that about, we need "buy-in" and active involvement from a whole lot of people, from numerous walks of life--not just those of us who teach college for a living. Let's face it, we're a numerical minority. We'll never win this one alone.

(I'll have more to say on the relationship between academia and the so-called "real world" in future posts. As someone with bluish-collar roots who first entered college in my thirties, I've viewed it from both perspectives. For now, I'll just say that neither world is either more or less "real" than the other--and it's important that we all stop pretending that such a division exists. Academia is a real place, and those of us who work there have real jobs and real lives. End of statement.)

Finally: This isn't a place where academics of various persuasions can tear each other's heads off. (For that, we have conferences.) :) Antagonistic argumentation can be a default setting for some of us who have been trained to be professional fault-finders, but with our whole profession in crisis, our long-term survival depends on our ability to focus on commonalities and find solutions. That's not to say we can't disagree--dialogue is essential to solutions, and disagreement is essential to dialogue--but this is a space where civility will rule. Ad hominem (personal) attacks and red herring fallacies will be called out and/or deleted as necessary, and repeat offenders will be blocked. If you're looking for a fight, there are plenty of other places where you can find one. And if you want to watch one, turn on MMA--or cable news. That's not what we're going to do here.

And why would anyone want to fight anyway, when civil yet intelligent, meaningful conversation is way more fun? In the long run it works better too.

0 Comments

WHO AM I?

7/2/2014

0 Comments

 
I’m not using my name on this blog, nor am I using the name of my institution or city/state. I will never name specific individuals nor will I disclose information that would make others easily identifiable.  

That being said, I know that a lot of you who are reading this--perhaps even most of you--do know who I am, where I live, and where I teach.  Keeping my identity absolutely "secret" from all parties is not really necessary (or even possible).  However, I am publishing this blog anonymously, primarily in order to prevent my name and institution from being blasted all over the Internet and on search engines.  

There are a couple of reasons for this. First, I need to respect the privacy of my students, as well as my colleagues and family. If you know me, you probably also know some of those people--but if you don't, it isn't my place to reveal information about others. Second, I'd like this blog to be focused on issues, not on personalities or particular institutions.  The precarious status of English as a discipline, and of the arts and humanities in general, is (unfortunately) not confined to any specific place; whatever is happening in particular locations is echoing elsewhere.  For those of us who care about the survival of the humanities, the challenge belongs to all of us, regardless of where we live or where we teach (if we do).  Therefore, it's important for us to discuss these issues with the broader picture in mind. Whenever I  use specific situations for discussion or exemplification, I will use information that is already available to the public rather than disclosing anything that is confidential or proprietary.

If you write in the comment stream, please refer to me using my blog name ("Underground Professor") even if you know who I am.  At the same time, if you pass along this link to someone who knows me, there's no need to keep my identity shrouded in secrecy--just don't put it here.  :) My identity doesn't need to be a total mystery--I will stand behind anything I say here--but I do want to use discretion, for reasons elucidated above.  

Finally, please bear in mind that whatever I say here represents my individual views and beliefs, not those of my institution (whether you are aware of my institutional affiliation or not).  Although I "profess English" for a living and my professional experiences have obviously done a great deal to shape my thinking, in this blog I am expressing myself as a private citizen, not as a spokesperson for any institution or professional organization.

0 Comments

Welcome to the underground: BURNING BRIGHT

7/1/2014

0 Comments

 
In this age of bottom-line-minded administrative austerity, soaring educational costs, mounting student indebtedness, and a growing cultural obsession with earning "marketable" degrees and viewing higher education primarily as job training, it often feels as though those of us who persist in teaching a "useless" subject like English--not to mention those "foolish" college students who insist on majoring in it--have been relegated to the margins, like the fictional "book people" who appear at the end of Ray Bradbury's Fahrenheit 451. Cast off from a society filled with seemingly anesthetized conformists, we wander aimlessly along the (metaphorical) riverbanks, quoting random literary stuff from memory. We are disheveled, shoeless, irrelevant, and cast aside.

(At least the book people get to do their wandering in an aromatic forest alongside a scenic riverbank; right now my office is deep in an interior building, devoid of natural light. Though I live in a stunningly beautiful place, while at work I don't get to see it, not even through a window. Such is the current status of the English professor in 2014.)

While the "book people" could at least see the light of day, we have one advantage over them: the ability to connect through technology. I know, I know—it's popular for people like us (i.e., people who read) to bash technology as evil, to view the computer and/or the TV and/or the video game console and/or the smart phone as contributors to the dumbing-down of humanity, the fall of literacy, and perhaps the end of civilization as we know it. Here again I think of Fahrenheit 451, when the once-professor Faber opines: "It's not books you need, it's some of the things that once were in books. . . . The same infinite detail and awareness could be projected through the radios, and televisors, but are not. No, no, it's not books at all you're looking for! Take it where you can find it . . . Books were only one type of receptacle where we stored a lot of things we were afraid we might forget. There is nothing magical in them at all. The magic is only in what books say, how they stitched the patches of the universe together into one garment for us."

Faber makes a good point--after all, Mein Kampf was a book. Nor is there anything inherently evil in pictures that move and speak (for instance, I never grow tired of hearing Martin Luther King orating on the National Mall, speaking aloud his dream).  We all know people who use technology poorly—hey, at times we may even be those people—but we can also use it productively.  In the spirit of Faber's advice to take "the magic" where we can find it, I hope this blog will help more of us to "store" a lot of non-tangible things: literature, of course (in many forms and genres, from many places), as well as music, film, theatre, art, history, comparative religion, philosophy, and all forms of human expression and their study—the stuff we learn how to do and/or to appreciate not because we hope to make big bucks by doing so, but because the process of pursuing these endeavors adds value and meaning to our lives in ways that go far beyond making more money and acquiring more "stuff."  

I’d also like to invite positive discussion and sharing of ideas--to take the humanities out of the so-called "ivory tower" (which in my case is more like a dungeon, lacking ivory). The humanities are crucial for everybody, not just those who teach, study, and/or practice them formally.  The humanities have much to offer people from all walks of life, regardless of occupation, educational level, income, social status, religion, sexual orientation, gender, or any number of the many artificial barriers that fool us into thinking we don't share a common humanity.  It is my hope that if we join together, those of us who care about keeping the arts and humanities alive--both within and outside traditional venues such as the academy--can convince more people to care and to work toward that end. 

My goal here is ridiculously ambitious: I want nothing less than to change our cultural conversations. I want to broaden our sense of "what education is for," to challenge unfair stereotypes of the hapless English major (or any arts/humanities student), and to persuade people that the humanities matter. I want to challenge the present cultural biases against anything that cannot be quantified in numerical terms. Like Bradbury's "book people," we aim to keep cultural memory alive as we journey toward an as-yet-unknown place, anticipating a future in which a broader spectrum of society cares, as we do, about storing "a lot of things we [are] afraid we might forget." Maybe that is too much to expect, but we all owe it to ourselves to try. As Bradbury put it so well in his freakishly prescient novel: "Do your own bit of saving, and if you drown, at least die knowing you were headed for shore."



    ABOUT THE AUTHOR

The Underground Professor teaches English at a small private university. This blog explores how we might revitalize the humanities in the twenty-first century--and why it's important that we do so, in light of our culture's current over-emphasis on profitability, quantitative measurement, and corporate control.

