Notes from the underground professor

KNOWING WHAT TO KEEP

11/15/2014

I'm fascinated by the human brain--the way it works, and the way it doesn't. For instance, when I try to memorize something--a favorite Yeats poem, the Schubert piano piece I'm currently working on--my brain seems to fritz out on me. Then a song that I never even liked will play on the golden oldies station and I'll realize, sometimes to my horror, that I still know all the words to it, despite never having tried (or even wanted) to learn them. Next, those annoying lyrics, which I apparently memorized by accident, get stuck in my head as unwanted earworms. This week, it was a line from Kenny Rogers:  "Every gambler knows that the secret to surviving/Is knowing what to throw away, and knowing what to keep."

Well, maybe there's a reason that line repeats itself in the eternal cassette player of my mind. (Yes, I'm dating myself here. That's okay.)  When I think about the subject matter I teach--English--the question of what to throw away and what to keep asserts itself constantly. What belongs in our curriculum; what should be removed? Which approaches to understanding literature are helpful and necessary; which are stodgy and outdated? And, if the majority of society no longer seems to want what we're offering (to the extent society ever did want it), how far should we go in capitulating to current tastes and trends in order to remain "relevant"?  If we go too far and cave in to the shallow values of our current society, do we dilute what we do until it's no longer valuable?  Yet if we refuse to adapt with the times, do we risk devaluation by becoming irrelevant? 

This week I read an Inside Higher Ed article reporting on a recent conference on "the future of the liberal arts" (HERE). While the article itself is somewhat more nuanced than the link's title--"liberal arts must return to a 'purer form' in order to survive" (quotes mine)--the conference it refers to rehashes a longstanding meme: that those of us in the liberal arts today are responsible for our own demise because we've allowed our disciplines to become "impure," tainted with nasty stuff like critical theory, postmodernism, jargon, and various contemporary concerns. 

John Agresto, former president of St. John's College, posits that the decline of the humanities is "less a murder than a suicide" and suggests that our survival depends on a "return" to a "Great Books" curriculum (such as the one taught at St. John's). What's "ruined" the humanities, according to Agresto, is "hyperspecialization" and, especially, critical theory, which, he claims, "limit[s] students' free inquiry" by committing such travesties as "portray[ing] the founding fathers as mere 'white racists.'"  Next comes the usual "let's go back" argument, calling for us to return "to an older mode of instruction, and instilling critical thinking skills." The article posits that there was a time when the liberal arts were a "gift given to everyone," evoking an imagined golden age of "classic liberal arts instruction . . . in which a cat may look at a king." 

I'm still trying to make sense of bashing "critical" in one breath (Critical Theory=Bad) while praising it in the next (Critical Thinking=Good).  To be sure, it is overly simplistic to portray the founding fathers as "mere white racists," when historical contexts, and those who function within them, are highly complex--always have been, always will be. There's always more to the story than just one facet.  (I would add that, despite having been taught by a good many critical theorists myself, I was never taught to interpret history in the simplistic manner suggested here--my learning was far more nuanced than that, and thus this characterization of critical theory as a "limit on free thinking" reads to me like a caricature.)

Yet it's equally if not more simplistic--not to mention un-critical--to ignore the huge blind spots at the core of America's founding: the bold visionary declaration that "all men are created equal," made by a group of men, many of whom owned slaves, at a time when indigenous people were being forced off their own land, while half the population--women--went unmentioned. Is it really accurate to characterize a superficial and triumphalist interpretation of history or literature as a "gift" to everyone, when during that supposedly golden age of the liberal arts, most people were actually left out of the picture being presented? My own critical thinking process tells me "No."  (Thank you to my critical theory professors.)

And when exactly was this time when "a cat may look at a king"?  When I think backward, what I recall is a time when most people didn't even have a shot at college education; and if those from society's less elite sectors somehow did manage to enter the university, they rarely found their own experiences or perspectives reflected in what they were required to study.  (Furthermore, while a few "cats" here and there may have gotten occasional glimpses of "kings," education has never been so radical that the "kings" were in turn taught to look at--let alone think about--the "cats.")

Critical theory reminds us, "Hey, there are a few other lenses through which you can understand this, rather than the single lens you've been using."  Looking through a different lens often results in a picture more complicated, and admittedly often uglier, than the one previously seen.  We're simply killing the messenger if we blame the observations made by critical theorists for the demise of an "old-school" liberal arts education that was, given its inherent exclusivity, doomed to fail in an increasingly pluralistic society.  Critical theory calls for a deeper recognition of the complexity and inherent contradictions in much of what we study, whether it be our history, literature, art, political systems, religious traditions, or anything else. To "return" to an "older mode" of understanding--a more simplistic mode--means moving away from "critical thinking" rather than toward it.

(If you've been reading this and thinking "just another indoctrinated, theory-addled contemporary academic," you might like the next paragraph.  If you've been nodding along with me in agreement so far, here comes the part you might not like.)

But I do have criticisms of critical theory, too--especially when it comes to its difficult language, its jargon, its intellectual elitism. What bothers me here is the hypocrisy: It's one thing to act elitist without pretending to be anything else. Obnoxious, maybe, but at least elitists who own up to their own elitism are consistent. But when you're calling for a more inclusive framework--an approach to literature or history or art appreciation that advocates for the perspectives of the less privileged--it's ethically problematic to do so in language inaccessible to many of those on whose behalf you claim to speak. Too often, critical theory is couched in language more exclusionary and difficult than Shakespeare ever was. Too often, well-known theorists and critics advocate for exploding the "literary canon" when it's their own theories that are now canonized (through such mechanisms as curricula and syllabi).

I actually like the idea of liberal arts/humanities as a "gift given to everyone," and if we're going to keep their study alive and vital, I'd argue that taking such an inclusive and generous approach is exactly what we need to do. We need to include everyone, engaging in ever-widening and deepening conversations that pose and explore the big questions about life's meaning, offering a challenge to the purely materialistic, mechanistic, individualistic way of understanding the world.  

But I disagree when people call for us to "go back," thereby implying that there was a time when this gift was given to everyone. Such a time never existed. Critical theory--rather than "tainting" the liberal arts, as the call for a "purer" approach would suggest--has served a vital function by pointing  out the many voices and perspectives that have long been excluded, both in terms of what will be studied and who gets to do the studying.   

When it comes to "knowing what to throw away and knowing what to keep," I'm in favor of keeping a great deal: an in-depth understanding of our history; major philosophical questions about the purpose of life; the art, music, and literature that throughout the centuries have attempted to address those questions; the notion that there's more to life than just profit-and-loss statements; the potential of the liberal arts for training us to think through vantage points beyond what our own narrow, individualistic focus would allow. Maybe my belief that all this is possible makes me rather quaint. 

Yet there is also much I'm in favor of throwing away: racism and ethnocentricity, sexism, rigid gender roles, exploitation, homophobia, elitism, class-based hierarchies, snobbery, anything that dehumanizes anyone, and approaches to knowledge that fail to address--or in some cases even acknowledge--the implications of all of the above. Maybe my belief that the "good old days" weren't so good, that critical theory has substantial merits, and that dehumanization in all its forms needs to be exposed and challenged makes me rather "strident," as some critical academics have been disdainfully called.

If understood and applied in the spirit of inclusion, I'd argue that critical theory opens up more rather than fewer possibilities for making the liberal arts relevant, meaningful, and sustainable. Yet if critical theory is approached in an exclusionary manner and couched in language inaccessible to most, then perhaps we have not come so far after all. 

I'd hate to think that we are throwing away the wrong things--curiosity, a spirit of community, humanity, the quest for meaning.  I'd also hate to think that we are keeping the wrong things--the exclusionary and dehumanizing attitudes that have historically constrained the liberal arts from being the widely disseminated "gift given to everyone" that they should be. 

Daily affirmations (without Stuart Smalley)

11/13/2014

There aren't many things I miss about the early nineties, but Al Franken's Saturday Night Live character Stuart Smalley is one of them: "I'm good enough, I'm smart enough, and doggone it, people like me!" That skit always cracked me up.  Poor, delusional Stuart and his shallow sound-bite solutions to complex problems. It's so easy to laugh at him, and at anyone who thinks they need "daily affirmations." 

That is, until you find yourself operating in an environment where you don't get affirmed very often. 

If you're living in an environment devoid of encouragement, you might not immediately notice its ill effects. When you do, you might not recognize them as such. It's hard to get going in the morning...well, that's normal, right? Energy levels sag, we need more naps...well, we are getting older.  We're less motivated to do things that used to turn our cranks...should we get our hormones checked? We hear ourselves snapping at the people we claim to love most...well, kids are irritating, aren't they? Or maybe we're just rotten people? When our health seems to be diminishing and our attitudes need adjusting, we tend to do one of two things: accept it as normal, or blame our flawed selves.

Curiously, we don't do this with plants. If a plant fails to thrive, we don't call it normal, and we don't blame the plant. We recognize, rightly so, that something's gone wrong in the environment--that the soil lacks nutrients, or the plant was attacked by a disease or pest or severe weather event, or the amount of light or water received was too little or too much based on the needs of that particular plant. We don't douse ferns in ethyl alcohol rather than water and expect them to thrive (or even survive); we don't plant bulbs in sand and expect them to bloom; we don't put our herb gardens in a windowless basement and act surprised when they wilt. We know that living things require the right mix of soil and water and light. 

The fact that human beings are also living things--well, sometimes we seem to forget that. But we need the same stuff that plants need.

A couple of days ago, a journal accepted one of my articles for publication. Of course hearing "yes" is always a good feeling, especially in an endeavor where "no" is more the norm. But this time, "yes" felt especially energizing--so much so that it stunned me to realize how deprived of professional encouragement I've been lately.  (As a crazy but weirdly wise relative once told me, "You drink water every day and don't think about it, until you find yourself parched in a desert. If someone gives you a drink then, that's the one you'll remember.") It made me realize just how serious the consequences of long-term demoralization can be, not just on the individual level but on the wider scale as well.

Affirmation, encouragement...what am I, some kind of new age guru who wants everyone to hold hands and sing "We Are the World" while wearing dorky matching outfits like the silly folk group in A Mighty Wind? (Great movie, by the way.)  Nope. Am I advocating empty praise, just telling people "You're great" whether they are or not? Not at all. (In fact, quite a few social science studies suggest that empty praise is quite damaging.)  

What I'm advocating is actually that sacred currency of the academic realm--"critical thinking." Yes, critical. Not in the sense of Random House definition number one--"inclined to find fault or to judge with severity, often too readily."  The world has plenty of that already. I'm talking about "critical" in the sense of Random House's third definition-- "involving skillful judgment."  (In academic circles that's what we mean--or claim to mean--when we talk about being "critical," though at times we all lapse into operating under the other, more widespread definition.)  

Skillful judgment is exercised by good leaders who know how to bring out the best in people--not by pretending flaws don't exist, but by noticing where improvements need to be made and where strengths lie. True leaders know how to offer specific guidance, and when growth isn't happening, they figure out what needs to be done differently. When it comes to skillful judgment, part of the "skill" includes knowing how to say things in a manner more likely to encourage than discourage. Effective leaders know that "fault-finding and severity" is like pouring ethyl alcohol on your plants instead of water.  They understand--through critical thinking--that living things thrive in environments designed to meet their needs, that we all need the right amount of light, soil, and water, and that individual needs will vary.

Academics are no exception. Despite having undergone the doctoral process, we're still human (presumably). We are living things, and we need what all living things need.  But those things can be tough to find here.  We tend to be highly trained professional fault-finders.  Peer review can be brutal; expectations can feel impossible.  It's easy to feel that whatever we do, it won't be enough.  There are always more books or articles we should have read, more potential counter-arguments we should have anticipated, more nuances we should have addressed, more artful ways we could have articulated our points, more pedagogical tricks we could have employed.  And in this new academic climate that somehow sneaked up on us, with its emphasis on perpetual assessment, we often sense the implication that we're doing our jobs badly, and that when students don't do so well, it's presumed to be our fault (even if, say, said students skipped fifty percent of our classes or tried to skim by without reading any textbooks). 

It's also easy to carry this fault-finding ethos into the rest of our work. How often do we read student papers not with an eye toward noticing what is promising--"exercising skillful judgment"--but through the lens of looking for mistakes, as though we are searching for evidence that the students really are as lousy as we suspect they are?  "Wrong," we write in the margins; "vague," or my personal favorite, "awk." (What's more awkward than the abbreviation "awk"?) 

So what, you may say? Serious intellectual pursuits aren't for the soft or faint of heart. If you can't stand the heat, get out of the conference room or the classroom. 

That's true. Serious intellectual pursuits demand a lot of us. All the more reason we need to think more critically about, and strive to create, the kind of environment where such pursuits can thrive.  I know that I'm far more effective, as a scholar, educator, parent, spouse and friend, when I feel encouraged rather than diminished.  Poor Stuart Smalley, though funny, is actually quite sad when you think about it: He's reduced to looking in the mirror and telling himself he's "good enough" precisely because he's been told far too often that he's not.  Perhaps if we lived in an environment more conducive to healthy growth, there would be far less need for people to stare at themselves in the mirror.

Some say the world is a tough and cruel place and that our job is to help people prepare for that harsh reality.  But there's no brilliance in telling students (or anybody else), "Someday, somebody is going to be mean to you--so I'm going to help you out by being mean to you now."  There's not much critical thinking in that approach.

Yes, it can be a tough and cruel world.  That's the whole problem.  But the job of teachers, and of leaders, shouldn't be to add to the cruelty.  Instead, the task should be to help those under our influence to grow hardy enough to thrive in that cruel world, perhaps in the process even growing strong enough to make a start at changing it.  We do that by creating an environment that is conducive to thriving--an environment rich in encouragement and honest affirmation.


WELCOME TO DYSTOPIA (A TRUE HORROR STORY)

11/10/2014

This Halloween week, I was made all too aware that when it comes to horror, it’s not the prospect of supernatural monsters that we need to fear. The horror is right here, right now.  

In Fahrenheit 451, Ray Bradbury imagined a world that was, back in the 1950s, considered “futuristic.”  Unfortunately, nowadays there is nothing farfetched or sci-fi about Bradbury’s bleak vision. For, just as in Fahrenheit 451, our purposes have become inverted. We may not have firefighters who are tasked with starting fires rather than extinguishing them (yet). But we do have the metaphorical equivalent: Specialists in many fields are now expected to do the opposite of what was originally intended, to destroy that which we were once expected to nurture.

Librarians, for instance, were once hired for the purpose of developing their collections. Now, many of them are being tasked with culling those collections and deciding which databases to eliminate. Educators once helped learners to expand their worldviews. Now, we are often required to truncate our course offerings (“students hate choices,” some have been told).  Those at the educational helm once served as the guardians and champions of the liberal arts. Now, many of them seem fixated on destroying the very foundation of the institution that makes their positions possible.  Meanwhile, books are disappearing.  So far, the bonfires may still be more metaphorical than literal, but that doesn’t mean they’re not equally destructive.

And, just as in Fahrenheit 451, a huge swath of the population numbs itself to the growing dehumanization.  Today most people are being treated as commodities rather than human beings, in a world that has lost its way and reduced everything to that which can be bought and sold. Most of us are manipulated, commodified, and dehumanized. More of us ought to be outraged, and perhaps if so many weren't numb, more would be. But too many people stick buds in their ears and stare in a daze at the giant screens on their walls. Like Montag's sad wife Mildred, too many numb themselves to the point where thawing out might prove too painful.

Lest you think I exaggerate: Take the article (if you can call it that) that appeared on Time.com this week: “Why Ph.D.’s shouldn't teach college students,” by Marty Nemko, who is described as a “life coach.”  (Au revoir to Time--I remember when you were halfway respectable.)  Nemko begins by citing the usual alarmist memes: almost half of college freshmen don’t graduate within six years, some studies show students learn little in college, one-quarter of graduates were found to be “living at home” two years after finishing college, almost half said their lives “lacked direction,” and twenty percent made less than $30,000. 

I’ve certainly noticed this dearth of well-compensated jobs and clear career paths, and I know people who have had to move home after college.  But whose fault might all that be? The fault of college professors? Are we the ones who supported the systemic destruction of unions, outsourcing, a stagnant minimum wage, and the erosion of pensions, medical care and other benefits? Are we the ones who decided that “corporations are people,” that elections should be buyable, and that we should turn a blind eye to white-collar crime? If the economy is becoming more difficult to navigate, the blame for that rests on those who are in charge of the economy. (I'll take up issues such as time to graduation and whether students are learning in a future post.)

Nemko goes on to state that “college hasn’t changed much in centuries”—a preposterous claim, as if education today is still delivered in Latin to Anglo-Saxon Protestant males only, and as if we have stuck with the trivium and quadrivium rather than adding any new fields of study. “There’s still a research-oriented Ph.D. sage on the stage lecturing on the liberal arts to a student body too often ill-prepared and uninterested in that,” says Nemko--as if the liberal arts are central to today’s university experience (don’t I wish!), as if student “ill preparation” is our fault, as if the point is to cater to student “interests," and as if the “sage on the stage” model is used exclusively. (And for that matter, as if most undergraduates are being taught by full-time, “research-oriented” professors instead of by underpaid adjuncts.)

The longer Nemko argues, the more illogical his statements become, until he asserts that Ph.D.’s shouldn’t even teach because “the gap between [Ph.D.’s] and their students’ intellectual capabilities and interests is too great.” I’m slightly amused at his backhanded acknowledgement that we “snobbish” Ph.D.’s might actually know some stuff (so much stuff, apparently, that our intellectual prowess has rendered us incapable of communicating with our fellow human beings). I'm less amused by the fact that he's calling today’s students stupid.  

So who should be teaching students, according to Nemko?  “Bachelor’s-level graduates who themselves had to work hard to get an A.”  What an excellent idea: find recent graduates who struggled with the course material themselves and have them do the teaching, without striving to understand the more complex material conveyed in graduate school. Show of hands time: How many of you would like to go under the knife of a surgeon who, as an undergraduate, was taught biology by someone with a B.A. who “had to work hard” to get it? (What’s that you say? No, thank you?)  Nemko also suggests that prospective teachers “complete a pedagogy boot camp, a one-weekend to one-semester intensive.”
 
Oh, so that’s how long it should take to train teachers: one weekend! (To think of all those years I wasted...)  Or maybe a semester, says Nemko (I suppose that’s if you’re learning to teach something really hard, like logical argument).  Then, as if Nemko hasn’t tied himself up in enough conceptual knots already, he claims that such training is required of teaching assistants “but not of professors”—a curious assertion, since teaching assistants are the ones who become professors.

So what is Nemko’s solution to the higher education dilemma? He suggests that most courses should be “taught online by interactive video.” Why? Because “the online format allows for . . . exciting simulations impossible to provide in a nation’s worth of live classes.”

Well, now we really are back in the inverted world of Fahrenheit 451—where firefighters start fires, and face-to-face, live interactions between human beings are less important than simulations of same.  The virtual is somehow more valuable than the real--just  as Mildred grows more attached to the imaginary television “family” that appears on her wall screens than to the actual husband standing in front of her. News flash for Mr. Nemko and all those who think like him: Education is not about “exciting simulations,” but about real relationships, between real people.  Yes, even now.

***
Despite my visceral and, I admit, angry reaction to this screed, I'll concede Nemko's point that the U.S. should emulate Europe by expanding its apprenticeship programs for skilled labor. Guess what?  We used to have more apprenticeships in the U.S.—thanks to unions.  (I know whereof I speak; I married a man who completed a rigorous formalized apprenticeship with the carpenters’ local.)  But guess what’s happened to unions? 

Back before the greed of the 1980s created policies that have steadily eroded the benefits enjoyed by working people, we all had more choices. Colleges were expanding access, tuition was reasonably affordable, and for those otherwise inclined, apprenticeships were available. And we still need those apprenticeships. I’ll fight as vigorously for the dignity and fair treatment of those who engage in physical labor as I do for those who engage in the life of the mind.  All of us are needed.  For starters, those of us who are fortunate enough to work indoors need people to build, maintain, and clean the buildings that house us. Those who perform necessary physical work need those of us who are trained in other areas—law, medicine, pharmacy, law enforcement, education, literacy, the arts, and much more (and let’s not forget the old line, “No farmers, no food”--isn't that where it all starts?)  Bottom line: we all need each other.  There is dignity in, as well as the need for, all types of work.  No human being has the right to consider himself or herself more “valuable” than someone who works in a different capacity.  

I also believe that the arts and humanities, and all the advantages that they confer on us, should be available to everybody regardless of how we make our living, rather than confined to educational institutions. There’s no reason a carpenter shouldn’t enjoy studying history, or a janitor shouldn’t write poetry, or an ironworker shouldn’t play the string bass.  But it’s also vital that the arts and humanities continue to flourish within the university system—as the place where knowledge can be nourished, expanded, disseminated, and perpetuated.  

If that is to happen, we need to change our priorities.

Too often, those of us who teach in the humanities are threatened with extinction because of our relatively small numbers. When we ask for help with recruiting students, we’re told there is no point; the decline of the humanities is said to be a “nationwide” problem, and we are told students don’t “want” to study “useless” subjects anymore (“use” being defined here as “directly leading to the making of money, along a predictable straight-line path”).  It’s not surprising that during tough economic times, many people—especially the less well-off—prioritize job training over deeper fulfillment (insert the basic principles of Maslow’s Hierarchy here). It’s also not surprising that the humanities are marginalized in a culture like America’s, which tends to be anti-intellectual, materialistic, impatient, hyper-individualistic, and in other respects antithetical to all that the humanities stand for. 

America has always tended toward all of the above. Yet for several decades following World War II, America offered (arguably) some of the highest-quality liberal arts education in the world, all the while expanding access. What was different?

When I read institutional histories nowadays, it appears there was a time when many college administrators themselves believed in, promoted, and protected the liberal arts. Many of those at the helm had decided that some things are worthwhile even when a direct, immediate financial benefit is difficult to calculate. Institutions had also decided that those with expertise in an area should be the ones teaching it, and that curricular decisions should be made by those with expertise, not by those new to the field, like students.

Now, anyone who knows me knows that I’m almost fanatically student-centered. I’ll take a bullet for the younger generation, whom I see not as instigators of mischief but as victims of my own generation’s bad choices. However, as a teacher I’m also aware that student preferences about what they think they “want” should not always be the deciding factor when it comes to curricular planning or student advising. Can students know they want something when they may not yet know it exists? How can they know what will be “relevant” to their futures when nobody can see the future?  

We old geezers can’t see the future either, of course.  But with age, many of us have gained the advantage of retrospect. We can look back and recognize how things we didn’t think we “needed to know” at the time have turned out to be invaluable. (I often wince when I remember my own teenage self, so “sure” about where my life was headed and what I “wanted,” when it turns out I knew almost nothing.)  Today I’m grateful for my teachers and parents and elders, older and wiser than me, who sometimes forced me to abandon what I thought I wanted in favor of what they knew I needed.  As much as I respect my own students, I also understand that there are times when, as an educator (not to mention as a parent), I’m obligated to steer them away from what they think they “want.”  University administrations should do the same and advocate, publicly and vigorously, for the values that have sustained the liberal arts for centuries, despite countless changes (Nemko’s assertion to the contrary notwithstanding).

I don’t think I’m being foolishly nostalgic if I recall a time when our society seemed to have collectively decided there are some things worth keeping, some things that matter besides the "bottom line." We could decide that again. We could decide to treat people as human beings rather than as commodities. We could decide to promote human dignity, the quest for meaning, and the creation of a world in which all human lives have intrinsic value and all people have an opportunity to find a sense of purpose. Institutions of higher education could (and should) play an important role in fostering all of that.

We could decide that institutions of higher education will serve as the guardians of intangible things of value: our histories, literatures, religions, mythologies, artistic expressions, and ideas, across cultures and centuries. We could decide that some things matter more than squeezing every last dime of profit out of each other, and more than treating other human beings as commodities to be bought and sold or as “others” deserving of contempt. We could choose real relationships over “exciting simulations.” We could choose to value expertise and intelligence rather than expediency and popularity. We could choose to create institutions whose members are encouraged to nurture rather than destroy. We could choose to acknowledge, and work to change, the viciousness of the current economic system--the underlying reason there is “something rotten” in the state of higher education today.

We not only can choose all of this; we need to.  In order to do so, we need to find educational leaders who are committed to doing all of the above. We need people in charge who are committed to creating rather than destroying, and to building institutions rather than tearing them down.  
 

RUMINATING ON RUBRICS--or, WHAT ARE YOU LOOKING FOR?

10/27/2014

Last week my life seemed to be all about rubrics--a helpful tool, or so it is believed, for conveying to students what the teacher is "looking for." Last week I designed them, assessed with them, questioned them, wrote about them, and considered what happens when I decide not to use them. 

First, I finished revising and submitting a journal article about the limits of outcomes-based assessment.  I'm not arguing against outcomes-based assessment per se--I'm simply arguing that it has its limits. I'm pointing out that we shouldn't take a rubric-based analysis of a course's "effectiveness" as the final or definitive word on student learning.  I'm trying to remind readers that (a) much of what happens through education cannot be quantitatively measured, and (b) some of education's most significant manifestations do not reveal themselves until years, even decades, later. Sometimes, what we're "looking for" is as invisible as a tiny seed germinating underground. But the fact that we can't see something with our physical eyes doesn't mean it doesn't exist. And sometimes, things appear that we might not have realized we were looking for. 

Next, I used a rubric to participate in yet another required outcomes-based assessment project. Most of our students appeared to "exceed" our expectations--raising the question of whether this is even possible, or whether we are falling prey to the "Lake Wobegon" trap of believing "all the children are above average." Or could it be--God forbid--that our program actually did a decent job of teaching this concept, and the students did a decent job of learning it? Could it possibly be that students who choose to major in English actually are better at writing than students who choose less writing-intensive majors? Hmmm.... Sometimes, even when we find what we're "looking for," we start to question whether we're really seeing it.  If not enough students meet outcomes, we suspect we've failed. If too many students exceed outcomes, we also suspect we've failed. 

Then, I sat through parent-teacher conferences for both of my own children. The younger one, her teacher tells me, is performing well on most measures--but the rubric shows she's "developing proficiency" in a couple of areas. I'm concerned, until her teacher reminds me that she's the youngest member of her class, she's still little, and "developing proficiency" is exactly what she's supposed to be doing at this age. Her teacher isn't concerned. Thank goodness she hasn't yet hit the Age of Ubiquitous Standardized Testing, he tells me; that's the time to "panic." I'm grateful that she has a teacher wise enough to understand that what we should be "looking for" in younger elementary-age kids is progress, not mastery. Let kids be kids.  (That is, until it's time to panic.)

The conference for my older child is a little bit more complex. His test scores are not a concern--he's exceeding outcomes all over the place, thereby helping his school make Adequate Yearly Progress under the dictates of No Child Left Behind. His results place him at the top of his class academically, demonstrating that he is probably paying more attention than we sometimes think. 

His first-quarter grades, however, don't place him at the top--even though he clearly grasps all the course content. Why? Because some of his assignments "don't correspond" with the criteria on the rubric. He thinks a little differently. His teachers, all of them excellent, recognize this. But the system no longer gives them the discretion to issue grades based on content mastery. They're constrained by the rubrics, limited to assessing what they should be "looking for" (dictated by others), not anything else they see.  

We're working to help him improve his rubric-targeting abilities. Whatever I might think of this system or of rubrics in general, my job as a parent is to teach my children how to succeed in whatever context they find themselves.  If their work needs to be tweaked to better "match" the rubric, that's what I will help them learn how to do. Doing otherwise would be a form of negligence.  We all know that whether we like it or not, much of adult life involves figuring out what people are looking for, and complying.

And yet?

Where would the world be if nobody had ever dared to bust out of the rubric and give the world something that nobody knew they were "looking for"? Where would progress and innovation come from? Where would growth originate? How would change come about? If we've quantified everything, created rubrics to cover every potential contingency, devalued everything we haven't pre-determined to be necessary, and removed any potential for surprise (whether as learners or as teachers), what has education become? What has life become? Nothing more than those in authority telling underlings what they are "looking for," demanding they produce it, checking off the appropriate boxes, and ranking the underlings according to their level of compliance?

Another of the week's tasks was issuing an assignment for my creative writing students--without a rubric. I can't honestly say there's nothing I'm "looking for"--I issued an assignment sheet, because nobody likes to be in the dark. But I kept some things open-ended.  After all, we're trying to make art, and what is more antithetical to the creation of art than a rubric? The whole point of creativity is to break new ground, to go beyond the expected and the familiar, to make us "see" in a different way--and in order to see something differently, we have to not see it coming. Good art, whether literary, visual, musical, or whatever, surprises us, provoking a "Wow" reaction--"I never thought about it that way before." Ergo, if I've never thought about "it" that way before (whatever "it" may be), I won't have been able to design a rubric to measure "it." 

If I wanted to have some fun while making a point, I could easily design rubrics that would cause William Shakespeare, Ernest Hemingway, Virginia Woolf, Salman Rushdie and Toni Morrison to flunk college English.  (We could probably do that for just about every renowned writer in history.) I'd tell Hemingway his sentences are too short, I'd tell Woolf and Rushdie their sentences are too long, I'd bash the Bard for using unnecessarily complex vocabulary and inverting his sentence structure, and I'd accuse all of them of choosing inappropriate subject matter.  (Hey, maybe that would be a great project to engage in. You know, in my spare time.) 

So, no rubrics for creative writing. You'd think the students would be thrilled.  But sometimes a funny thing happens when, instead of specifying what you are "looking for," you open up possibilities: Some students panic. If I haven't told them what I'm "looking for," they aren't sure what they are supposed to be doing. I tell them, what you're supposed to be doing is making art. You tell me what matters to you. You decide what form suits your subject matter; I've issued a few guidelines, but it's your voice I want to hear in the piece, not mine.

I did the best I could to soothe their nerves. I provided model pieces from past assignments and from professionals. I further elucidated my philosophies of teaching and writing. I reminded them that (a) it's a draft, not a final, and (b) when it comes to creative writing, my grading methods are non-traditional. I hope I was successful in quelling at least some anxiety. I hope they did some writing this weekend, and I hope they didn't panic. 

Rubrics have their place. In certain situations, I find them helpful as a teacher. For my children, they can be useful tools for understanding what needs to be prioritized in their assignments. This is all to the good.

Yet it's also possible to lose our sense of balance--to forget that rubrics should be designed to serve our needs, rather than dominating educational culture so much that instead of rubrics serving our needs, we must submit to the demands of the rubric. Certainly there are times when it's helpful to know what others are "looking for." But there are also times when it's vital to do, say, or create something that perhaps no one was "looking for." How else does the world move forward?

Rubrics seem to have become the "training wheels" of K-12 education, and training wheels--like rubrics--have their place. But leave those training wheels on too long and the cyclist might never learn how it feels to achieve the sense of balance necessary to ride independently. When our children are behaviorally conditioned throughout their formative years to believe all that "counts" is what someone in authority is "looking for," even the most creative among us may become prone to paralysis, unsure of what to say when nobody has told us ahead of time what it is that we're "supposed" to be saying.    

LEARNING to WAIT

10/19/2014

A couple of weeks ago we held a debate in our freshman argumentative writing course: "Is the millennial generation more narcissistic than previous generations?" To prepare, I had my students read several articles, some of which argued "yes," some of which argued "no," and some of which argued "it's complicated." 

While looking around for articles, I happened upon one (of many) that harped upon the millennials' impaired ability to delay gratification. I ended up not assigning it, since the goal of this particular project was not to exemplify the overgeneralization fallacy. I am not convinced that my own generation is morally superior to the current one just because we used to wait three days for the film to develop.

On the other hand, I can't totally disagree that delaying gratification is a challenge for young people today. I'm a parent as well as a professor, and I know what a constant battle it is to persuade my children to wait. It would be very easy to lay the blame on them--or at least on the hyper-technological, innovatively disrupted, constantly changing world in which they are being raised. (Which of course raises the question of who created this hyper-technological, perennially disrupted world in the first place. It wasn't the kids.)

As I was revising an article yesterday in response to a "Revise & Resubmit" request, I noticed a recurring thread in my scholarly writing: a plea for more patience. And no, I'm not addressing "the millennials"; I'm talking to people who are at minimum over thirty, usually older.  So many of the educational trends that I perceive as misguided are grounded in wanting results and wanting them NOW. What could be a better case study in collective cultural impatience than No Child Left Behind? Once again, it isn't the kids.  It's adults who want to see measurable results and want to see them now. It's adults who question the value of the liberal arts and humanities because they can't see a direct, immediate link between that kind of education and the making of money. It's adults who make pedagogical and institutional decisions that don't allow for the factor of time.

Take the teaching of writing, for instance. A couple of years ago I heard a presentation by a noteworthy cognitive psychologist, Ronald Kellogg, whose research demonstrates that the development of expert writing skills "takes many years of deliberate practice." One of my recent articles built upon this and other findings from cog-psych research. Yet all of us who teach writing have probably had some conversation along the lines of, "You teach writing? So how come our students can't write?" Or, "Why are they such crappy writers? Didn't they take freshman comp?" 

The implication here is that we writing teachers must have done a bad job--not that we are trying to achieve in 14 weeks something which takes more like 14 years. Unfortunately, if you give kids a bunch of K-12 writing instruction that teaches writing in a narrowly prescriptive way, rarely veering beyond the five-paragraph essay or the aspects of literacy that can be assessed on a standardized bubble test, most students are unlikely to morph into expert writers in even the most effective 14-week course.  What does work? Patience, practice, and process. 

The concept of patience also reared its head in the article I was revising yesterday, about the long-term effects of a liberal arts education. In this piece I quote extensively from the retirement/graduation speech given by my high school drama instructor--one of the most truly gifted and dedicated educators I have ever known, who was both insanely creative and tough as nails (much like my son's current middle-school music teacher). He summarized the lofty goals he held for all of his students with the preface, "This is a review for a very long take-home quiz"--and, he said, this "quiz" will last for the rest of our lives. 

And when I read comments on our alumni Facebook page about this man and other effective teachers, what stands out to me is the length of that "take-home quiz." Some of the people posting comments there now, by their own admission, were hardly paying attention thirty years ago. Many comments were variations on the refrain: "It took me years to understand."  

My drama teacher's speech took place 25 years ago, before the "millennial" generation was even born, and in it he harked back to his earliest teaching days, in the fifties. What he aimed for--what he always aimed for--was for his students to learn how to back up their words with actions, to conquer fear and prejudice, to achieve self-confidence through self-discipline, to "trade in tunnel vision for a wide-angle lens," and to "learn the value of patience." Apparently he perceived that those were qualities we needed more of, even in the supposed "good old days" of camera film and vinyl. Teenagers have always been impatient. Patience has always had to be taught. 

Of course, doing that requires adults who are willing to model patience themselves.

Digital photography, downloaded music, instant access to full-text articles in databases--actually, I don't have problems with any of that. If I'm visiting with people whom I see maybe once every five or ten years, I'd prefer to know if all our group poses are crappy before the photographic moment passes. If I hear a symphony on Pandora that sends my spirits soaring, I'd rather download it now than try to remember later who the composer was (or where I put the piece of paper on which I wrote it down). And when it comes to research, no one will ever convince me that I might achieve some kind of moral benefit by browsing in library stacks for journal articles to photocopy (a task that required driving time and the burning of fossil fuels), rather than clicking on "Full Text" and downloading the article onto my computer, where I can locate and reference it forever. Take that convenience away from me and, given my heavy teaching and admin load, I probably wouldn't be publishing at all.  

In short, I think there are many situations in which "instant" (or at least rapid) gratification isn't necessarily a problem. When it comes to the little everyday matters, efficiency does make it possible for us to do more. Yet there are many bigger-picture situations in which patience is not optional because there are no shortcuts: long-term relationships, gardening, pregnancy, raising children, healing (whether physical or emotional), mastery of complex skills--i.e., education. The key lies in knowing which is which--when to go for the "now," and when to understand that "now" is too soon to expect results. Not all seeds sprout on the same day they are watered.

Our class debate concluded with a somewhat mixed verdict; the discussion began with narcissism but soon veered into many other realms. My students collectively acknowledged that they are a little too technology-dependent as well as a little too impatient (though also far more self-reflective on these issues than most older folks give them credit for). They also recognized--correctly--that they are still teenagers, and they anticipate that patience will be a virtue they continue to develop as they mature. 

I like to think that will happen, and thinking back to my own adolescence gives me hope. Sometimes when I remember some of my more remarkable teachers from thirty-plus years ago, I realize that in some ways I'm only just now beginning to grasp all that they were really trying to teach me. My own growth, both intellectual and otherwise, has taken time--something I must remind myself of on those days when it feels like I'm talking to the walls and I'm not sure whether anyone is listening to me.

I also think about legislation like No Child Left Behind, quick two-year studies like Academically Adrift, and optimistically worded 14-week student learning outcomes, and I wonder if some adults have ever grown out of their adolescent impatience. I also wonder, if the millennial generation truly is less inclined toward delayed gratification than ever, which generation might really be responsible for that.

I don't believe we have to learn how to wait for everything in order to be worthy human beings. But we do need to understand that even as our world becomes more technologically efficient, some things remain that are not only worth waiting for, but are impossible to achieve without waiting.

WHEN "MEAN GIRLS" (AND BOYS) GO TO COLLEGE, KEEP GOING, AND END UP TEACHING IT...

10/13/2014

A few months ago one of our graduating seniors, in a farewell moment, posted on Facebook her belief that her fellow English majors are "the greatest people you will ever meet." Comment stream discussion echoed this sentiment until, through a convoluted series of exchanges, the students had resurrected our once-defunct creative writing club, adopted this saying as our official slogan, and enlisted a colleague and me to be faculty advisors. Dinosaur that I am, it was only after the fact that I learned the origins of this slogan. It's from the movie Mean Girls, which I missed during its theatre run (probably because it was released while I was raising a toddler). Thus, Mean Girls has also become our official club movie. 

If a student club that I'm going to co-advise has claimed an official movie, I'd better see it. So I did. And apart from the fact that even now I hate being mentally transported back to the social jungle of high school, I found the film surprisingly worthwhile, if only for the way it dramatizes power dynamics. The aptly named "queen of the universe," Regina George, terrorizes everyone throughout the film and when she is confronted in the climactic scene--a school assembly in the gym--she insists she's never victimized anyone. When the teacher Ms. Norbury asks, "How many of you have felt personally victimized by Regina George?", the students begin raising their hands until everyone is included--and then they are joined by the faculty, followed by the principal. Regina, of course, still doesn't get it.

I love this scene because it dramatizes a truth about power differentials: those with relative power and privilege are too often oblivious to the fact that they have it. Once in grad school, a professor asked us to consider the power differentials within our own classroom, a mix of MA and PhD students. After we discussed the usual--differences in class, gender, ethnicity and so forth--an MA student said, "You know, there's also that MA versus PhD student thing." All of us MA students (that was me then) nodded knowingly--"Oh, yeah, that thing"--while the PhD students looked at us blankly until one of them asked, "What MA/PhD thing?" These weren't clueless people by any means. It's just that when you're the one at the center of things, it can be hard to know there's a center at all, let alone that you're in it.

For the last few days I've been at a conference, and afterwards I thought about all this. One of my co-panelists analyzed the advice given to grad students in public venues such as blogs and books. Our culture is so hyper-individualistic, he argued, that we have difficulty conceptualizing "success" or "failure" as anything other than individual triumphs and disasters. (This point seems closely related to the "Just World Hypothesis" I posted about below.) If someone with a PhD fails to land a full-time position and must work as an adjunct, many would prefer to believe it's because that person is either an inferior scholar/teacher, or doesn't "want it" badly enough. Similarly, those who do have such positions often prefer to view themselves as more worthy, rather than recognizing that certain structural advantages (such as timing) may have factored into their success. Combine that with an individualistic blindness to increasing structural inequities, and people in relatively privileged positions often end up projecting their own, necessarily idiosyncratic experiences onto others--people whom, frankly, they probably barely even know.

The result? Conversations like the one a friend of mine--a colleague from another institution, and an adjunct--had to overhear yesterday on the airport shuttle from our conference hotel. (She shared this with me.) The professors on the shuttle began by complaining about their teaching loads, wondered "how adjuncts do it," and then one of them insisted that "they" "like it that way," "they" don't want tenure track because "they" actually "prefer the freedom," "they" don't want to "have to" go to conferences or publish because "it's exhausting" ... and, one of them added, "We're mostly adjunct now."  (Side note: If you're using the vague pronoun "they," you just might be committing a fallacy of some type.)

My friend interjected that this is a "nice story, but complete fiction--most of us would love something stable, and we want time to publish."  The professors stared at her briefly, then continued talking, ignoring her. 

Rudeness aside, my friend was disturbed by their lack of structural awareness. If their institutions are now "mostly adjunct" (i.e., the majority of their classes are now taught by woefully underpaid part-timers who receive no benefits and have no job security), this is an administrative cost-cutting decision. It didn't happen in spontaneous response to the sudden emergence of a lazy crop of doctorates who prefer a poverty-stricken itinerant life of "freedom" to the "exhausting" work of publishing and attending conferences. No pack of new graduates descended upon admin en masse, demanding that they be given the opportunity to work for chump change and zero benefits so they can enjoy their "freedom," to which administrators replied, "Oh--thanks for the heads up. We were about to create a bunch of new tenure-track permanent positions with benefits and opportunity for professional growth, retirement, and stuff like that--but since it sounds like today's doctorates would really prefer to be 'free' from exhausting demands, maybe we should take a second look, and while we're at it, we can realize some cost savings."

But I don't want to single out these particular "mean people," because the overuse of adjunct professors is part of a larger cultural and economic landscape--as is the too-frequent blindness of those with relatively more privilege. Most of those who work in the service economy are subjected to similar economic exploitation, as well as to the same hyper-individualistic culture that blames lower-paid workers for their own situation (while, by implication, valorizing the economically "successful"). And this is starting to happen across the board, not just in service  industries. 

Somebody close to me has been working part-time now for several years. This person doesn't want much--just a full-time job with benefits that pays the costs of an extremely modest lifestyle, with which this person is content. Twelve bucks an hour would do it. But despite being a fine employee, this person has been cobbling together part-time work for nearly ten years now. What's worse: it's difficult to get a predictable enough schedule from one part-time job to incorporate a financially necessary second part-time job. "We need our part-time workers to be flexible," insist the bosses. Final insult: This person and most of the part-time coworkers have specifically requested that they be moved to full-time positions when they become available. But when companies need to add more staff-hours, rather than moving their existing part-time workers into full-time jobs, they hire more part-time workers. Why? Because when the feds come to collect their data, they can look like "job creators."

We've all seen people who have blown certain opportunities, whether in academia or elsewhere, and I'm not going to argue that lack of success is always a structural problem. Sometimes people do make mistakes, and sometimes they pay the price. But it's equally blind to argue that lack of success is always an individual problem--and given the way our culture works, we skew toward that second form of blindness.

"Mean people" (disentangling the gender assumptions of the movie title) exist everywhere, not just in academia. It's terrible what institutionalized greed is doing to our society. Things are made more terrible by the fact that our hyper-individualistic society often shifts our focus away from institutionalized greed and toward victim-blaming, leading to a kind of cultural blindness that we can find anywhere, not just on the airport shuttle bus as my friend experienced yesterday. These two professors aren't the only mean ones, and frankly, in the larger scheme of things they are probably pretty minor players.

What bothers me, though, is that as academics, we are the ones who are supposed to have studied power differentials and systemic structures. We are the ones who are supposed to know better. We should be the ones on the front lines arguing for economic justice for all--including, but not limited to, our underpaid adjunct colleagues. Because they are our colleagues, and we need to remember that.

4 Comments

"such stuff as dreams are made on..."

10/5/2014

0 Comments

 
In more ways than one, this wasn't an easy week. Nor did it end in an easy way. 

On Saturday afternoon, about 25 faculty, students, and former students from our small department gathered for a literary reading in memory of one of our English majors, who died a few weeks ago of complications from a hereditary illness. In her too-short life, this remarkable young woman had endured more pain than any human being deserves--and yet she remained bubbly, enthusiastic, loving, and generous, despite having every valid reason not to be any of the above. Amidst all that, she produced some truly beautiful writing, some of which we were able to hear today.

As is so often the case with the best memorial events, this occasion was uplifting despite the sadness.  One of the most heartening things for me, as I expressed in my opening remarks, was to see so many of our former students return to their old campus to remember their classmate. Seeing that the friendships forged in class are still intact some four years later, realizing that the interactions we shared inside and outside the classroom might actually have been meaningful to those involved--what could be more satisfying for a teacher? To think that what we did might actually have mattered in people's lives--what else is there?

Unfortunately, there is a great deal else. Enrollment numbers. Student learning outcomes. Aggregated assessment data. Observable, measurable learning results. Development of marketable workplace skills.  Et cetera. 

I'm not arguing against any of those things per se (beyond pointing out that not all meaningful learning, especially in the humanities, is equally amenable to "observable, measurable results"). I'm not saying that numbers don't matter or that achieving desired learning outcomes doesn't matter, and I'm especially not saying that developing marketable skills doesn't matter. 

I'm arguing against the implication that nothing else matters. I'm arguing for the significance of qualities that certain powers-that-be neglect to consider because they are not as easily measurable or marketable--healing, relationships, meaning, community. Today we used our knowledge of language and literature to help sustain one another after a loss that feels cruelly random. We shared writing that we--and she--had done, along with excerpts from the literature we had studied together.

I do wish somewhere out there, we could find a handful of administrations or accreditation agencies or stakeholders or, God help us, even employers, who care about human experience in its full dimension--people who, had they attended this afternoon, would have recognized some remarkable "outcomes" beyond the SLOs.  I hope such people are out there, for we need such people in positions of influence if we are to develop appropriate parameters for funding, assessing and sustaining education in the humanities.  Our disciplines may not be as easily "measurable" or as narrowly marketable. They may not yield the largest numbers on the spreadsheet. But are those things all that matter?

A couple of weeks ago I led an educational discussion forum on Dead Poets Society, and knowing that it isn't universally loved by English professors, I decided I'd better prepare. In digging around for articles to see what academic authors had said about the film, I came across Kevin J.H. Dettmar's scathing review in The Atlantic last February.  Dettmar has his reasons--quite a few reasons--for calling the film a "terrible defense of the humanities." One note that resounds throughout his piece is that he finds the film "anti-intellectual." If the humanities are to be taken seriously, Dettmar argues, we must get away from emotional and sentimental responses to poetry and instead emphasize serious intellectual analysis.

Well, of course I'm in favor of serious intellectual analysis. That's what I'm doing with my life. And in the current cultural climate, I find few things more frightening than true anti-intellectualism. So how do I make sense of my intellectual commitments, and the fact that I still see incredible value in the kind of communal and yes, emotional experience that I shared with students and colleagues this afternoon? 

As I thought of Dettmar's critique, it occurred to me: Maybe the problem is that we incorrectly view "intellectual" and "emotional" as opposites. But the opposite of "intelligent" is not "emotional"--it's more like "stupid." The opposite of "emotional" is not "intellectual"--it's more like "cold." Today's reading included works that, for deeper understanding, require serious intellectual analysis--Shakespeare, Emily Dickinson--along with pieces written by students and work by the young woman we were remembering. The writings selected were both intelligent and moving; the ideas we grappled with were intellectually challenging and emotive. We were not stupid, and we were not cold. We weren't anti-intellectual. But we weren't only intellectual.

What we do in the humanities might not be easily measurable in quantifiable terms, but that doesn't mean it's not intellectually significant--even if it also provokes powerful emotions in a way that, say, calculus does not. The sense of community I note among our remarkable group of students may not be an institutional priority in any part of the country right now. Yet for them, it may be one of the aspects of their education that they value most.  

Right now, few institutions are taking account of anything beyond the immediately measurable. And the humanities--the study of what it means to be human--would appear, if one is merely looking at a spreadsheet, to be expendable. 

But I fear for a future in which humanities education is expendable--or in which it is only for the elite. I'd also hate to think the day is coming when we memorialize those we've lost by gathering together to analyze their spreadsheets.

0 Comments

why? (a meditation on the "just world" hypothesis)

10/1/2014

0 Comments

 
Last spring, one of my freshman research writing students wrote her semester project on the issue of rape and why it is so often the case that people blame the victim. In addition to exploring some of the expected reasons--misogyny, patriarchy, entrenched and unquestioned beliefs about gender roles--my student investigated the cultural pervasiveness of the "Just World Hypothesis" put forward by Melvin J. Lerner in his 1980 book, The Belief in a Just World: A Fundamental Delusion. An online article published by Claire Andre and Manuel Velasquez (Santa Clara University) summarizes the hypothesis as follows: "People have a strong desire or need to believe that the world is an orderly, predictable, and just place, where people get what they deserve."  

Everyone gets what they deserve. Everything happens for a reason. The unfortunate must have done something to deserve their fate; conversely, those who experience good fortune must deserve it.  Couch it in the reward-and-punishment terms of various monotheistic theologies, or in the concept of karma espoused by eastern religions, or in the "we create our own reality through our thoughts and energy" bromides one finds in New Age-y memes--or keep it purely secular and posit that the poor, unhealthy and powerless have brought it on themselves while the wealthy, healthy and powerful have undoubtedly earned their success. Each of these is another iteration of the Just World Hypothesis.

And today I was reminded, yet again, repeatedly and painfully, of why I don't believe in a Just World; why I never will be able to believe in it; why I believe the world would be much better off if everybody stopped believing in it. 

I know--and have known--too many people who are suffering in ways they don't deserve. One of the many things I've learned from studying literature is to question the assumption that suffering must somehow be deserved. I've also learned that compassion is usually a better path than judgment.

* * * 
Last fall as part of an early world lit course, I taught--as I have many times--"The Book of Job." I'm no theologian, so I apologize in advance to anyone who has a more theologically sophisticated understanding of this text than I do. I also apologize to those of you whose religious views differ from my own; I'm not asking you to understand it the same way I do. I'm an English professor, and I approached it as literature--as a story, in which one guy suffers way more loss and torture than any human being should have to endure.

Skirting the whole question of why an omniscient God would agree to a bet with Satan in the first place, as a class we homed in on what happened when Job's so-called friends came to visit. Here's Job cursing the day he was born, and Eliphaz pops in to tell him, "Who, being innocent, has ever perished? Where were the upright ever destroyed?" (Job 4:7-8)  Just what you always want when something's going terribly wrong:  "It must be your fault. It's gotta be something you did."

And on and on it goes; Job wonders what he did to deserve this, he questions God, and he complains that there is no justice in the world. (If you've actually read this book, you know that the phrase "the patience of Job" makes about as much sense as calling Romeo and Juliet "a romantic story." Job isn't patient, and Romeo and Juliet end up dead.) While Job complains more and more vehemently, Eliphaz is joined by Bildad and Zophar and eventually Elihu, all of whom insist that the world, run as it is by a just God, is a fair and sensible place, and that Job's suffering must be his fault. To be suffering this much, he had to have done something wrong.

Our class turned this into a Twitter feed, complete with hashtags. "Job: I wish I'd never been born. #pityparty." "Eliphaz: You need to get over yourself. #godisjust." And so forth. Every single one of my students gave Job a hashtag that suggested less than full sympathy.  In addition to #pityparty, we had #whiner, #firstworldproblems, #whyme. The "friends" got hashtags like #trustgod, #godisgood, and the inevitable #everythinghappensforareason. 

I typed the students' "Twitter feed" into the computer, projected it, then asked the students to "play God," giving "thumbs up" to the posts they thought God would "like." Job's "comforters" fared well; Job, not so much. Whiner, they called him. (Never mind that he'd just lost all his children, his home, his livelihood and his health...who wants to listen to anyone whine?)  After we explored the poetic yet puzzling Speech out of the Whirlwind--which I won't get into here--we read the epilogue, where God tells Eliphaz, "I am angry with you and your two friends, because you have not spoken the truth about me, as my servant Job has."

Whoa, I said. Who just got the thumbs up from God? Guess it wasn't Eliphaz or his pals.

So I went back to the computer and began reversing all the "thumbs up" my students had given to the friends' tweets. Turns out it was Job the Whiner that God liked after all. Scratch all those friends' speeches--they weren't "speaking the truth" about God. Scratch "everything happens for a reason." My students looked crestfallen, and confused.

We spent the rest of the class period talking about compassion. 

And no, compassion is not on the student learning outcomes, and our discussion that night may not specifically have prepared our students for the 21st century work force. As if work is all that we're here for. As if an education that touches on things that matter can be contained by a list of bullet-points. 

* * *
That night as I drove home, I remembered a long weekend five years earlier, when I flew to another town to spend a few days with a dear friend who had just been diagnosed with Stage IV cancer that had metastasized everywhere. I knew what this meant; I'd been through it with my father. It was past the time for a miracle. I didn't try to make sense of anything and I didn't try to ask "why." I already knew, from enduring too many losses already, that "why" is the wrong question to ask; at this stage the question becomes "how." How are we going to get through this? 

That weekend, "how" included a lot of talking, sometimes about important things and sometimes not. It included listening to Rachmaninoff piano concertos, powerful pieces that express everything that reaches beyond words. "How" included watching Jane Austen movies and comparing Austen's characters to people we knew. For her, "how" included sleeping a lot, and when she did, I went for long walks or wrote in my journal. I didn't try to ask why. If there is a "why," we humans don't get to know.  (That much, I did grasp from the Speech out of the Whirlwind.)

As our time together drew shorter, she thanked me for being the only person to have avoided asking "why." Her other visitors, she told me, thought they knew. Her traditionally religious friends told her the illness was due to her lack of faith and suggested she pray: there would be a miracle; it was all part of God's plan. Her non-religious "hippie" friends wondered whether she'd eaten wrong or held on to "negative thought patterns"; they wanted her to give up meat, try some kind of herbal supplement, meditate--whatever else.  She was that open-minded kind of person who accepted people as they are and consequently had friends from many walks of life, so her friends' suggested "solutions" varied. Yet they all touched on the same theme:  You must have done something. And, Maybe there's still time to fix it, if you do the right things now.

Same thing I got when my dad was dying. I lost track of the number of people who asked me, "So your dad got brain cancer? What did he do?"  

"Nothing," I'd say, and then I'd get the grilling, "Well, he must have eaten something..." "He didn't work out..." "His faith was weak..."  Eliphaz. Bildad. Zophar. God gives you all a thumbs down. 

The last night, we said goodnight and hugged, and my friend went to bed. My taxi for the airport arrived at four in the morning, and I was grateful for the darkness. We spoke by phone every day for the next two weeks, and once she told me, "Thank you for not trying to make any sense out of all this." Her last words to me, a few hours before she died, were: "I have to go now."

* * *
Andre and Velasquez say, "If the belief in a just world simply resulted in humans feeling more comfortable with the universe and its capriciousness, it would not be a matter of great concern for ethicists or social scientists. But Lerner's Just World Hypothesis, if correct, has significant social implications. The belief in a just world may undermine a commitment to justice."  (Emphasis mine--read the article HERE.)  What are the consequences if we continue to believe that everyone gets what they deserve, and no one gets what they don't deserve? Of course it's not very nice to suggest to dying people that they brought misfortune on themselves through bad living. But are there potential consequences to the Just World Hypothesis that are even more serious?

Today, as I reeled from still more evidence that the Just World Hypothesis is wishful thinking, I walked down the street outside my office, heavily populated by the down-and-out among us: homeless people, poor people, folks struggling with physical disabilities and apparent mental illnesses. So many of the forces around us say, "They must have done something to deserve it." 

Maybe they didn't.

But if we start questioning that, we might start questioning a whole lot of things. We might have to change a lot of things about the way we've organized the world. Asking why people suffer might be an unanswerable question, but there are other questions we might ask. Why do we so often want to blame victims? Why do we so desperately want to believe "everyone gets what they deserve"? What would happen if we stopped doing that?

Perhaps it's no coincidence that some of the powers-that-be would prefer to limit education to what can be contained in bullet-point "learning outcomes." 
0 Comments

so EVERYone gets a trophy? (get over it)

9/21/2014

1 Comment

 
This Week in Freshman Composition: My students are preparing to write argumentative essays by reading a series of articles on the subject of "The Millennial Generation." I've assigned articles from a variety of perspectives. Some of them claim that this is the most narcissistic, "entitled" generation ever to walk the face of this doomed planet, while others state that this is not necessarily so. 

The "no-need-to-panic" folks argue from a variety of angles: Haven't teenagers always been more self-absorbed than adults? Isn't that just the nature of being young and inexperienced?  Is some of the "evidence" of self-absorption really a reflection of attitudes found among the most economically and racially privileged Millennials, rather than in a broader cross-section of society? Haven't older folks asserted the "inferiority" of the up-and-coming younger generation for as long as humans have walked on planet earth? Can our generation really claim to be "better" people on the basis that we didn't take "selfies" or use social media--considering the only reason we didn't is that those things hadn't been invented yet?  

"Evidence" of Millennial self-absorption likewise draws on multiple factors, many of which relate to technology--social media, cell phones, reality TV. But some discussion focuses on other issues, such as "helicopter parenting" or changes in the way we conduct youth sports. One of the most prevalent mantras I notice whenever I read articles on this topic?  "Nowadays, everyone gets a trophy," sigh the adults of my generation, wringing their hands and shaking their heads. And then someone adds, "Yeah, and just for showing up!"    

Quel désastre! Here endeth civilization as we know it!  Thank God that, as a result of clever detective work on the part of us intrepid Boomers/Gen X-ers, we have finally located the source of all evil in our degenerate society: Some of the kids whose bedroom shelves contain an array of cheap plastic trophies are not actual winners!  Some of those trophies don't point to any accomplishments! Those dorky little statues acknowledge nothing more than showing up!

Let's take another look at this, shall we?

My son, now a teenager, has a bunch of trophies in his room for playing soccer and basketball in elementary and middle school. And no, he wasn't what you'd call a "winner." Many of the teams he played on had winning seasons, but he wasn't one of the superstars. He had his moments: he once drained a bucket at the buzzer to lead his team to basketball victory, he scored a few soccer goals, and as goalie, he made a lot of saves (and broke a couple of wrists). But unlike a few of his teammates, he didn't stand out in either sport--though he wasn't "bottom of the pack" either.

What were those trophies for, then? What could he possibly have been rewarded for if he wasn't a standout superstar? I mean besides showing up (although "showing up" isn't necessarily the worst thing to commend someone for doing--sometimes that turns out to be the hardest part). If our son wasn't the one-and-only superstar, what did he learn that could possibly be worth acknowledging?

Well, for starters, there were these minor little character traits like sportsmanship, teamwork, cooperation, persistence, regular practice, commitment--all kinds of stupid little qualities that we certainly wouldn't want to foster in our children by "rewarding" them.  

He learned to accept defeat graciously, high-fiving the opposing team after the game and congratulating them on the win. Because we all have to learn how to do that, and sometimes it's not easy, even now.  

He also learned to accept winning graciously, high-fiving the opposing team and telling them, "Great game--you'll get it next time, dudes." Because, despite the antics of some of our highly paid professional athletes--"winners" who apparently "deserve" their accolades, even though some of them are also known for beating people up--kindness, especially when you're the one holding the power, might actually be more beneficial to society than a winning touchdown. (Blasphemy!)

He learned how to get back in the game after a disappointment like getting scored on--rather than wallowing in momentary disappointment.  (His parents learned that too.  On our team we had a joke: The worst position to play? Goalie. Second worst? Goalie's mother.)  

He learned to encourage rather than demean teammates when they made mistakes. Because, apocalyptic discourse about the dangers of "niceness" aside, people actually do learn more effectively from encouragement than they do from being disheartened. (News flash.)

He learned that collaborative endeavors require commitment (i.e., showing up), and that he owes it to his teammates to be there for games and practice. He also learned that the only way to improve any skill is to practice regularly.

He learned to see things through to the end, even when things don't go as planned. In one memorable season, a Series of Unfortunate Events culminated in our son wearing casts on both arms, at the same time. Pulled by his doctor from playing for the rest of the season, he still ran on the field during practice, resembling the Black Knight from Monty Python and the Holy Grail.  He still came to games, cheering on his teammates in victory and encouraging them in defeat, joining in the communal celebrations afterward, and high-fiving the opposing teams with his casted arms. 

Surely we shouldn't have rewarded any of that.

Oh, yeah, and he was getting physical exercise. And since his inclusive soccer and basketball leagues promoted encouragement over verbal abuse (and didn't allow the kind of bullying to which some of my own P.E. teachers turned a blind eye back in the day), perhaps when he hits adulthood he won't harbor the kind of antipathy toward physical activity that often plagues those of us who took heaps of verbal and even physical abuse on the field, back when it was believed that only "winners" deserved accolades. For all the hand-wringing about video games, childhood obesity and lack of physical activity, you'd think more people would support giving incentives for all kids, not just athletic superstars, to participate in sports. But I digress.

Guess what? My son--and the many teammates I met during the sports years--did not emerge from childhood inclusive sports leagues with delusions of athletic, or any other, grandeur. Amazingly, despite their supposedly video-game-addled brains, kids today can tell the difference between being acknowledged for qualities like teamwork, sportsmanship and persistence, and being acknowledged as a star athlete. They know the difference, and they are cool with it. 

Most kids are not superstar athletes, and most of them don't aspire to be (even if their parents are clinging to hare-brained fantasies). The kids who do want that, and who might have it in them to be that, are usually tapped by scouts from the more competitive leagues. Those kinds of higher-performance opportunities are also out there, for the kids who want that--as they should be. We need both inclusive and competitive leagues, to meet the needs of many different kinds of children. And guess what? We've got them. If your kid wants, and can handle, a sports experience where trophies only go to the champs, he or she can still find that. 

To the superstar athletes among us? Power to you! You can do something most of us can't, and I salute you. (But please do the rest of us a favor: If you do make it big, please try to role-model some of the other qualities that we're trying to instill in our own kids, not just "winning" at all costs. Because some costs are too high.)

As for the rest of us? Everyone needs physical activity, and everyone needs a place to belong. Everyone needs an opportunity to achieve and accomplish in some area--but let's face it, no one can excel in everything, and everyone also needs opportunities just to participate, whether we are super-achievers or not. Participation helps us to develop qualities beyond just "achieving"--qualities that are too often in short supply. When we participate in a team endeavor, when we grow and develop stronger character as a result, what's wrong with acknowledging it?

Nowadays my son is more interested in music than in sports. The trophies are still in his room, and for him, they are not inappropriate ego-boosters but nostalgic reminders of his childhood. Sometimes when we look at the trophies and team pictures, we get to talking: about time spent with friends, Friday nights on the basketball court and Saturday mornings on the soccer field, snack time afterwards with his buddies and their parents and bratty little siblings, horseplay after the game to expend pent-up energy, camaraderie on team picture day, and yeah, those occasional great days where he did something memorable, even if he won't go on to play in the World Cup.   

Though I haven't noticed my own son doing this, I suppose I should allow for the possibility that somewhere out there, there are kids who wake up each morning and spend several minutes contemplating the array of trophies on their shelves while thinking, "Look! I'm all that!"--without realizing they are not. Perhaps these miniature egotists then strut off to school thinking they should rule the world, all because they got trophies that they mistakenly thought they deserved when really, the trophies were "just for showing up." Perhaps these kids emerge as entitled brats with the capacity to destroy society.  (If so, imagine the magical superpowers they must have--still in their teens, lacking any kind of real-world power, and somehow they have managed to destroy our social order.)

Or today's many problems could, I suppose, actually be the fault of the people who have tons of power and tons of money, who are spending billions to hold governments captive to their interests and weasel their way out of environmental and workplace regulations, who are going through elaborate machinations to avoid contributing to the tax coffers while slashing the safety net, generating a bunch of "jobs" that pay poorly and have no benefits and are pretty much guaranteed to keep the poor in poverty and ...

Nah, that couldn't be it.  Let's blame teenagers. Especially the ones who've gotten a trophy "just for showing up."




1 Comment

for the love of books

9/14/2014

2 Comments

 
My writing students do a lot of in-class focused freewriting, and I always write along with them. I've got a number of rationales for doing so. It fits well with my teaching philosophy: I'm modeling the writing process, saying "do as I do" rather than "do as I say." It makes me a better teacher: when I put myself through the same process as my students, I understand the blocks and pitfalls they might encounter because I experience them too. (Funnily enough, finishing dissertations, publishing articles, and writing books don't eradicate those challenges.) Last but not least is the purely practical rationale: Some weeks, my schedule is so tight that in-class writing time is the only writing time I get.

Such was the case last week, when I felt like I was in one of those circus juggling acts where some dude keeps throwing the juggler an extra item to add into the mix--and then lights one of them on fire. My students are in the process of writing a piece that uses reflection on place as a starting point, so during those ten sane minutes of freewriting, I joined them in responding to the prompt, "X is a place that only I could appreciate, because..." 

As I wrote, I found myself--mentally, anyway--thousands of miles away, on the south coast of England, in a walk-up flat on the second level of a boxy brick building on one of southeast England's largest postwar council housing "estates."  This flat was once occupied by my father's bibliophile brother, Uncle R, who died nine years ago, and I still can't believe it's been that long. (When we think about those we love who have died, it's always hard to believe "it's been that long," no matter how long "that long" is. My hypothesis is that in our minds--the place where most of us actually live--they have never left us, and therefore it really hasn't been "that long.")  

Though small and plain, Uncle R's flat was also occupied by no less than several thousand flatmates: fictional characters, who lived in his countless books. 

Now these were not orderly stacks of books, catalogued and placed alphabetically upon attractive designer shelves. These books, almost all of them used before Uncle R even owned them, overflowed the shelves, crawling over couches and chairs, sneaking into nooks and crannies, trembling in precarious towers that climbed the walls. I never went there without taking meds for my dust allergies, and whenever I arrived, my first order of business--assuming I wanted to sit down--was excavation. Dig deep enough and I could sometimes find furniture buried beneath the books.

Uncle R would always apologize, blaming our predicament on the books themselves.  "Those ridiculous books," he'd complain, "they're getting completely out of hand," as if they were reproducing themselves in furtive after-dark encounters, unassisted by him. As we drank the requisite "nice cup of tea," Uncle R would shake his head and say, "Ah, well. Something must be done about these silly books."  Occasionally something was done about the silly books--usually, moving them around in order to make room for . . . more books. He never stopped shopping for them. If you were out with Uncle R, there was always a moment when he would disappear. One minute you'd be walking down the street with him, the next minute he had vanished. I learned not to worry that I was in some kind of sci-fi novel and instead, to scan my surroundings for the used bookshop that must be nearby. There always was one, and he was always there.

If my uncle were alive today, he might have ended up on reality TV. But I didn't think of him as a hoarder, because he clung to nothing else: only his books. And he didn't collect just to have things; he read them, and he remembered them. To him, books were not objects but doors to other worlds, occupied by characters who were in some respects real.

For two decades my husband and I traveled frequently to England, even living there briefly. Our experiences were always far different from the usual American tourist itinerary, not only because we slept in book-infested council flats but because we enjoyed customized tours that can't be sold or purchased.  A pub would look like any other until Uncle R pointed out, "Dickens mentions that pub in Great Expectations, doesn't he." Endless green hills dotted with sheep would blur together until Uncle R announced, "Ah, this hill--it plays a key role in an Austen plot."

One of R's younger brothers, Uncle M, taught O- and A-level literature and died tragically young (fittingly, while in the library). Back when both uncles and my father were still with us, they would discuss literary characters and their authors so casually that in my pre-university days, I'd eavesdrop on their analysis of a character's motives and mistakenly think they were gossiping about a distant relative. The same phenomenon worked in reverse, with literary quotes often brought in while discussing family members. "You know what old T.S. Eliot would say about [so-and-so]," one of them would say, and back before I learned otherwise, I thought T.S. must be an old friend of the family.

Some might say this isn't the greatest introduction to literature--too colloquial, insufficiently analytical, even slightly juvenile. (And that's before we analyze the underlying psychology of the messy flat.)  Some might criticize the canonical bent of the literature I was introduced to. (Here I'd add that in later years, Uncle R happily added contemporary authors from a variety of cultures to his never-ending collection--female authors too--but since England was where he lived, the English literary locations were the ones I got to see.)

Quibbling aside, nowadays I consider my less-than-conventional introduction to the British literary canon to be a remarkable gift. Think of the world we might live in if more people had been fortunate enough to receive a similar gift.

Of course, personal experience is not the place for literary study to stop. If you want to pursue its study seriously, you need to stretch beyond having an uncle who knows where to find a genuine Dickensian pub and do more than revel in the eccentric chaos of a bibliophile's flat. But feeling a personal connection to some of the most resonant stories ever committed to writing; deepening your understanding of life through the prism of fictional characters who have been rendered real through the skillful use of language; learning a love for literature through osmosis from an enthusiastic mentor--all this is a great place to start, whether you end up pursuing the study of English professionally as Uncle M and I did, or keeping it personal as Uncle R did.

Too many at the helm of education today have imbibed the same ethos as the censorious characters in Rushdie's Haroun and the Sea of Stories who ask, "What is the use of stories that aren't even true?" Unable to devise a quantitative answer to a question calling for a qualitative response, such people assume there must be no use; that literary study is superfluous; and that if it doesn't lead to directly measurable "outcomes" or a straight line to lucrative employment, literary study is a disposable budgetary line item.  What if such people had had their own Uncle R, or a book-infested flat somewhere in their psychic pasts? Would they already understand the "use of stories that aren't even true"? Would such an understanding foster a different set of priorities and decisions, even for those who pursue other lines of work? (In his working days, by the way, Uncle R was an accountant.)

I miss my uncles, as well as my father--also a book lover, though he worked as an engineer and, accordingly, kept his own books in an orderly fashion.  (My husband and I used to refer to the two of them as "Felix and Oscar.")  I wish I could clone those who influenced my own affinity for books and stories, sending them out as supernatural emissaries to the technocrats and bureaucrats of the world to whisper like miniature angels in their ears, counter-voices to the miniature devils who whisper in the opposite ear saying that storytelling is frivolous. 

Perhaps it's too easy for those of us who "profess English" to dismiss the love of books that started it all. I like to believe that if more of us had been fortunate enough to grow up loving books and stories, we might not find ourselves in as many of the collective messes we're in today.

2 Comments

    ABOUT THE AUTHOR

    The Underground Professor teaches English at a small private university. This blog explores how we might revitalize the humanities in the twenty-first century--and why it's important that we do so, in light of our culture's current over-emphasis on profitability, quantitative measurement, and corporate control.

