Notes from the underground professor

KNOWING WHAT TO KEEP

11/15/2014

I'm fascinated by the human brain--the way it works, and the way it doesn't. For instance, when I try to memorize something--a favorite Yeats poem, the Schubert piano piece I'm currently working on--my brain seems to fritz out on me. Then a song I never even liked will play on the golden oldies station and I'll realize, sometimes to my horror, that I still know all the words, despite never having tried (or even wanted) to learn them. Those annoying lyrics, which I apparently memorized by accident, then get stuck in my head as unwanted earworms. This week, it was a line from Kenny Rogers: "Every gambler knows that the secret to surviving/Is knowing what to throw away, and knowing what to keep."

Well, maybe there's a reason that line repeats itself in the eternal cassette player of my mind. (Yes, I'm dating myself here. That's okay.) When I think about the subject matter I teach--English--the question of what to throw away and what to keep asserts itself constantly. What belongs in our curriculum; what should be removed? Which approaches to understanding literature are helpful and necessary; which are stodgy and outdated? And if the majority of society no longer seems to want what we're offering (to the extent society ever did), how far should we go in capitulating to current tastes and trends in order to remain "relevant"? If we go too far and cave in to the shallow values of our current society, do we dilute what we offer until it's of no value at all? Yet if we refuse to adapt with the times, do we risk devaluation by becoming irrelevant?

This week I read an Inside Higher Ed article reporting on a recent conference on "the future of the liberal arts" (HERE). While the article itself is somewhat more nuanced than the link's title--"liberal arts must return to a 'purer form' in order to survive" (quotes mine)--the conference it refers to rehashes a longstanding meme: that those of us in the liberal arts today are responsible for our own demise because we've allowed our disciplines to become "impure," tainted with nasty stuff like critical theory, postmodernism, jargon, and various contemporary concerns. 

John Agresto, former president of St. John's College, posits that the decline of the humanities is "less a murder than a suicide" and suggests that our survival depends on a "return" to a "Great Books" curriculum (such as that taught at St. John's). What has "ruined" the humanities, according to Agresto, is "hyperspecialization" and, especially, critical theory, which, he claims, "limit[s] students' free inquiry" by committing such travesties as "portray[ing] the founding fathers as mere 'white racists.'" Next comes the usual "let's go back" argument, calling for a return "to an older mode of instruction" and the instilling of "critical thinking skills." The article posits that there was a time when the liberal arts were "a 'gift' given to everyone," evoking an imagined golden age of "classic liberal arts instruction . . . in which a cat may look at a king."

I'm still trying to make sense of bashing "critical" in one breath (Critical Theory = Bad) while praising it in the next (Critical Thinking = Good). To be sure, it is overly simplistic to portray the founding fathers as "mere white racists," when historical contexts, and those who function within them, are highly complex--always have been, always will be. There's always more to the story than just one facet. (I would add that, despite having been taught by a good many critical theorists myself, I was never taught to interpret history in the simplistic manner suggested here--my education was far more nuanced than that, and thus this characterization of critical theory as a limit on "free inquiry" reads to me like a caricature.)

Yet it's equally if not more simplistic--not to mention un-critical--to ignore the huge blind spots at the core of America's founding: the bold, visionary declaration that "all men are created equal," made by a group of men, many of whom owned slaves, at a time when indigenous people were being forced off their own land, while half the population--women--went unmentioned. Is it really accurate to characterize a superficial, triumphalist interpretation of history or literature as a "gift" to everyone when, during that supposedly golden age of the liberal arts, most people were left out of the picture being presented? My own critical thinking process tells me "No." (Thank you to my critical theory professors.)

And when exactly was this time when "a cat may look at a king"? When I think backward, what I recall is a time when most people didn't even have a shot at a college education; and if those from society's less elite sectors somehow did manage to enter the university, they rarely found their own experiences or perspectives reflected in what they were required to study. (Furthermore, while a few "cats" here and there may have gotten occasional glimpses of "kings," education has never been so radical that the "kings" were in turn taught to look at--let alone think about--the "cats.")

Critical theory reminds us, "Hey, there are a few other lenses through which you can understand this, rather than the single lens you've been using." Looking through a different lens often yields a picture more complicated, and admittedly often uglier, than the one previously seen. We're simply killing the messenger if we blame the observations of critical theorists for the demise of an "old-school" liberal arts education that was, by the nature of its inherent exclusivity, doomed to fail in an increasingly pluralistic society. Critical theory calls for a deeper recognition of the complexity and inherent contradictions in much of what we study, whether our history, literature, art, political systems, religious traditions, or anything else. To "return" to an "older mode" of understanding--a more simplistic mode--means moving away from "critical thinking," not toward it.

(If you've been reading this and thinking "just another indoctrinated, theory-addled contemporary academic," you might like the next paragraph.  If you've been nodding along with me in agreement so far, here comes the part you might not like.)

But I do have criticisms of critical theory, too--especially of its difficult language, its jargon, its intellectual elitism. What bothers me here is the hypocrisy: it's one thing to act elitist without pretending to be anything else. Obnoxious, maybe, but at least elitists who own up to their elitism are consistent. But when you're calling for a more inclusive framework--an approach to literature or history or art appreciation that advocates for the perspectives of the less privileged--it's ethically problematic to do so in language inaccessible to many of those on whose behalf you claim to speak. Too often, critical theory is couched in language more exclusionary and difficult than Shakespeare ever was. Too often, well-known theorists and critics advocate exploding the "literary canon" even as their own theories are canonized (through such mechanisms as curricula and syllabi).

I actually like the idea of liberal arts/humanities as a "gift given to everyone," and if we're going to keep their study alive and vital, I'd argue that taking such an inclusive and generous approach is exactly what we need to do. We need to include everyone, engaging in ever-widening and deepening conversations that pose and explore the big questions about life's meaning, offering a challenge to the purely materialistic, mechanistic, individualistic way of understanding the world.  

But I disagree when people call for us to "go back," thereby implying that there was a time when this gift was given to everyone. Such a time never existed. Critical theory--rather than "tainting" the liberal arts, as the call for a "purer" approach would suggest--has served a vital function by pointing out the many voices and perspectives that have long been excluded, both in terms of what will be studied and who gets to do the studying.

When it comes to "knowing what to throw away and knowing what to keep," I'm in favor of keeping a great deal: an in-depth understanding of our history; major philosophical questions about the purpose of life; the art, music, and literature that throughout the centuries have attempted to address those questions; the notion that there's more to life than profit-and-loss statements; the potential of the liberal arts for training us to think through vantage points beyond what our own narrow, individualistic focus would allow. Maybe my belief that all this is possible makes me rather quaint.

Yet there is also much I'm in favor of throwing away: racism and ethnocentricity, sexism, rigid gender roles, exploitation, homophobia, elitism, class-based hierarchies, snobbery, anything that dehumanizes anyone, and approaches to knowledge that fail to address--or in some cases even acknowledge--the implications of all of the above. Maybe my belief that the "good old days" weren't so good, that critical theory has substantial merits, and that dehumanization in all its forms needs to be exposed and challenged makes me rather "strident," as some critical academics have been disdainfully called.

If understood and applied in the spirit of inclusion, I'd argue that critical theory opens up more rather than fewer possibilities for making the liberal arts relevant, meaningful, and sustainable. Yet if critical theory is approached in an exclusionary manner and couched in language inaccessible to most, then perhaps we have not come so far after all. 

I'd hate to think that we are throwing away the wrong things--curiosity, a spirit of community, humanity, the quest for meaning.  I'd also hate to think that we are keeping the wrong things--the exclusionary and dehumanizing attitudes that have historically constrained the liberal arts from being the widely disseminated "gift given to everyone" that they should be. 

DAILY AFFIRMATIONS (WITHOUT STUART SMALLEY)

11/13/2014

There aren't many things I miss about the early nineties, but Al Franken's Saturday Night Live character Stuart Smalley is one of them: "I'm good enough, I'm smart enough, and doggone it, people like me!" That skit always cracked me up. Poor, delusional Stuart and his shallow sound-bite solutions to complex problems. It's so easy to laugh at him, and at anyone who thinks they need "daily affirmations."

That is, until you find yourself operating in an environment where you don't get affirmed very often. 

If you're living in an environment devoid of encouragement, you might not immediately notice its ill effects. When you do, you might not recognize them as such. It's hard to get going in the morning...well, that's normal, right? Energy levels sag, we need more naps...well, we are getting older. We're less motivated to do things that used to turn our cranks...should we get our hormones checked? We hear ourselves snapping at the people we claim to love most...well, kids are irritating, aren't they? Or maybe we're just rotten people? When our health seems to be diminishing and our attitudes need adjusting, we tend to do one of two things: accept it as normal, or blame our flawed selves.

Curiously, we don't do this with plants. If a plant fails to thrive, we don't call it normal, and we don't blame the plant. We recognize, rightly, that something has gone wrong in the environment--that the soil lacks nutrients, or the plant was attacked by a disease or pest or severe weather, or the amount of light or water it received was too little or too much for that particular plant's needs. We don't douse ferns in ethyl alcohol rather than water and expect them to thrive (or even survive); we don't plant bulbs in sand and expect them to bloom; we don't put our herb gardens in a windowless basement and act surprised when they wilt. We know that living things require the right mix of soil and water and light.

The fact that human beings are also living things--well, sometimes we seem to forget that. But we need the same stuff that plants need.

A couple of days ago, a journal accepted one of my articles for publication. Of course hearing "yes" is always a good feeling, especially in an endeavor where "no" is more the norm. But this time, "yes" felt especially energizing--so much so that it stunned me to realize how deprived of professional encouragement I've been lately.  (As a crazy but weirdly wise relative once told me, "You drink water every day and don't think about it, until you find yourself parched in a desert. If someone gives you a drink then, that's the one you'll remember.") It made me realize just how serious the consequences of long-term demoralization can be, not just on the individual level but on the wider scale as well.

Affirmation, encouragement...what am I, some kind of new age guru who wants everyone to hold hands and sing "We Are the World" while wearing dorky matching outfits like the silly folk group in A Mighty Wind? (Great movie, by the way.) Nope. Am I advocating empty praise, just telling people "You're great" whether they are or not? Not at all. (In fact, quite a few social science studies suggest that empty praise is actively damaging.)

What I'm advocating is actually that sacred currency of the academic realm--"critical thinking." Yes, critical. Not in the sense of Random House definition number one--"inclined to find fault or to judge with severity, often too readily."  The world has plenty of that already. I'm talking about "critical" in the sense of Random House's third definition-- "involving skillful judgment."  (In academic circles that's what we mean--or claim to mean--when we talk about being "critical," though at times we all lapse into operating under the other, more widespread definition.)  

Skillful judgment is exercised by good leaders who know how to bring out the best in people--not by pretending flaws don't exist, but by noticing where improvements need to be made and where strengths lie. True leaders know how to offer specific guidance, and when growth isn't happening, they figure out what needs to be done differently. When it comes to skillful judgment, part of the "skill" includes knowing how to say things in a manner more likely to encourage than discourage. Effective leaders know that "fault-finding and severity" are like pouring ethyl alcohol on your plants instead of water. They understand--through critical thinking--that living things thrive in environments designed to meet their needs, that we all need the right amount of light, soil, and water, and that individual needs will vary.

Academics are no exception. Despite having undergone the doctoral process, we're still human (presumably). We are living things, and we need what all living things need. But those things can be tough to find here. We tend to be highly trained professional fault-finders. Peer review can be brutal; expectations can feel impossible. It's easy to feel that whatever we do, it won't be enough. There are always more books or articles we should have read, more potential counter-arguments we should have anticipated, more nuances we should have addressed, more artful ways we could have made our points, more pedagogical tricks we could have employed. And in this new academic climate that somehow sneaked up on us, with its emphasis on perpetual assessment, we often sense the implication that we're doing our jobs badly, and that when students don't do well, it's presumed to be our fault (even if, say, those students skipped fifty percent of our classes or tried to skate by without reading any textbooks).

It's also easy to project this fault-finding ethos into the rest of our work. How often do we read student papers not with an eye toward noticing what is promising--"exercising skillful judgment"--but through the lens of looking for mistakes, as though we are searching for evidence that the students really are as lousy as we suspect they are?  "Wrong," we write in the margins; "vague," or my personal favorite, "awk." (What's more awkward than the abbreviation "awk"?) 

So what, you may say? Serious intellectual pursuits aren't for the soft or faint of heart. If you can't stand the heat, get out of the conference room or the classroom. 

That's true. Serious intellectual pursuits demand a lot of us. All the more reason we need to think more critically about, and strive to create, the kind of environment where such pursuits can thrive.  I know that I'm far more effective, as a scholar, educator, parent, spouse and friend, when I feel encouraged rather than diminished.  Poor Stuart Smalley, though funny, is actually quite sad when you think about it: He's reduced to looking in the mirror and telling himself he's "good enough" precisely because he's been told far too often that he's not.  Perhaps if we lived in an environment more conducive to healthy growth, there would be far less need for people to stare at themselves in the mirror.

Some say the world is a tough and cruel place; our job is to help people prepare for that harsh reality. But there's no brilliance in telling students (or anybody else), "Someday, somebody is going to be mean to you--so I'm going to help you out by being mean to you now." There's not much critical thinking in that approach.

Yes, it can be a tough and cruel world.  That's the whole problem.  But the job of teachers, and of leaders, shouldn't be to add to the cruelty.  Instead, the task should be to help those under our influence to grow hardy enough to thrive in that cruel world, perhaps in the process even growing strong enough to make a start at changing it.  We do that by creating an environment that is conducive to thriving--an environment rich in encouragement and honest affirmation.


WELCOME TO DYSTOPIA (A TRUE HORROR STORY)

11/10/2014

This Halloween week, I was made all too aware that when it comes to horror, it is not the prospect of supernatural monsters that we need to fear. The horror is right here, right now.

In Fahrenheit 451, Ray Bradbury imagined a world that was, back in the 1950s, considered “futuristic.” Unfortunately, nowadays there is nothing farfetched or sci-fi about Bradbury’s bleak vision. For, just as in Fahrenheit 451, our purposes have become inverted. We may not have firefighters who are tasked with starting fires rather than extinguishing them (yet). But we do have the metaphorical equivalent: specialists in many fields are now expected to do the opposite of what was originally intended, to destroy what they were once expected to nurture.

Librarians, for instance, were once hired for the purpose of developing their collections. Now, many of them are being tasked with culling those collections and deciding which databases to eliminate. Educators once helped learners to expand their worldviews. Now, we are often required to truncate our course offerings (“students hate choices,” some have been told). Those at the educational helm once served as the guardians and champions of the liberal arts. Now, many of them seem fixated on destroying the very foundation of the institution that makes their positions possible. Meanwhile, books are disappearing. So far, the bonfires may still be more metaphorical than literal, but that doesn’t mean they’re not equally destructive.

And, just as in Fahrenheit 451, a huge swath of the population numbs itself to the growing dehumanization. In a world that has lost its way and reduced everything to what can be bought and sold, most people are treated as commodities rather than as human beings--manipulated, commodified, and dehumanized. More of us ought to be outraged, and perhaps if so many weren't numb, more would be. But too many people stick buds in their ears and stare in a daze at the giant screens on their walls. Like Montag's sad wife Mildred, too many numb themselves to the point where thawing out might prove too painful.

Lest you think I exaggerate: Take the article (if you can call it that) that appeared on Time.com this week, “Why Ph.D.’s shouldn't teach college students,” by Marty Nemko, who is described as a “life coach.” (Au revoir to Time--I remember when you were halfway respectable.) Nemko begins by citing the usual alarmist memes: almost half of college freshmen don’t graduate within six years, some studies show students learn little in college, one-quarter of graduates were found to be “living at home” two years after finishing college, almost half said their lives “lacked direction,” and twenty percent made less than $30,000.

I’ve certainly noticed this dearth of well-compensated jobs and clear career paths, and I know people who have had to move home after college. But whose fault might all that be? The fault of college professors? Are we the ones who supported the systemic destruction of unions, outsourcing, a stagnant minimum wage, and the erosion of pensions, medical care, and other benefits? Are we the ones who decided that “corporations are people,” that elections should be buyable, and that we should turn a blind eye to white-collar crime? If the economy is becoming more difficult to navigate, the blame rests on those who are in charge of the economy. (I'll address issues such as time to graduation and whether students are learning in a future post.)

Nemko goes on to state that “college hasn’t changed much in centuries”—a preposterous claim, as if education today is still delivered in Latin to Anglo-Saxon Protestant males only, and as if we have stuck with the trivium and quadrivium rather than adding any new fields of study. “There’s still a research-oriented Ph.D. sage on the stage lecturing on the liberal arts to a student body too often ill-prepared and uninterested in that,” says Nemko--as if the liberal arts are central to today’s university experience (don’t I wish!), as if student “ill preparation” is our fault, as if the point is to cater to student “interests," and as if the “sage on the stage” model is used exclusively. (And for that matter, as if most undergraduates are being taught by full-time, “research-oriented” professors instead of by underpaid adjuncts.)

The longer Nemko argues, the more illogical his statements become, until he asserts that Ph.D.’s shouldn’t even teach because “the gap between [Ph.D.’s] and their students’ intellectual capabilities and interests is too great.” I’m slightly amused at his backhanded acknowledgement that we “snobbish” Ph.D.’s might actually know some stuff (so much stuff, apparently, that our intellectual prowess has rendered us incapable of communicating with our fellow human beings). I'm less amused by the fact that he's calling today’s students stupid.  

So who should be teaching students, according to Nemko? “Bachelor’s-level graduates who themselves had to work hard to get an A.” What an excellent idea: find recent graduates who struggled with the course material themselves and have them do the teaching, without striving to understand the more complex material conveyed in graduate school. Show of hands time: How many of you would like to go under the knife of a surgeon who, as an undergraduate, was taught biology by someone with a B.A. who “had to work hard” to get it? (What’s that you say? No, thank you?) Nemko also suggests that prospective teachers “complete a pedagogy boot camp, a one-weekend to one-semester intensive.”
 
Oh, so that’s how long it should take to train teachers: one weekend! (To think of all those years I wasted...)  Or maybe a semester, says Nemko (I suppose that's if you’re learning to teach something really hard, like logical argument).  Then, as if Nemko hasn’t tied himself up in enough conceptual knots already, he claims that such training is required of teaching assistants "but not of professors”—a curious assertion, since teaching assistants are the ones who become professors.

So what is Nemko’s solution to the higher education dilemma? He suggests that most courses should be “taught online by interactive video." Why? Because “the online format allows for . . . exciting simulations impossible to provide in a nation’s worth of live classes.”

Well, now we really are back in the inverted world of Fahrenheit 451—where firefighters start fires, and face-to-face, live interactions between human beings are less important than simulations of same. The virtual is somehow more valuable than the real--just as Mildred grows more attached to the imaginary television “family” that appears on her wall screens than to the actual husband standing in front of her. News flash for Mr. Nemko and all those who think like him: Education is not about “exciting simulations,” but about real relationships, between real people. Yes, even now.

***
Despite my visceral and, I admit, angry reaction to this screed, I'll concede one point: Nemko suggests that the U.S. emulate Europe by expanding its apprenticeship programs for skilled labor, and on that he's right. Guess what? We used to have more apprenticeships in the U.S.—thanks to unions. (I know whereof I speak; I married a man who completed a rigorous, formalized apprenticeship with the carpenters’ local.) But guess what’s happened to unions?

Back before the greed of the 1980s created policies that have steadily eroded the benefits enjoyed by working people, we all had more choices. Colleges were expanding access, tuition was reasonably affordable, and for those otherwise inclined, apprenticeships were available. And we still need those apprenticeships. I’ll fight as vigorously for the dignity and fair treatment of those who engage in physical labor as I do for those who engage in the life of the mind. All of us are needed. For starters, those of us who are fortunate enough to work indoors need people to build, maintain, and clean the buildings that house us. Those who perform necessary physical work need those of us who are trained in other areas—law, medicine, pharmacy, law enforcement, education, literacy, the arts, and much more. (And let’s not forget the old line, “No farmers, no food”--isn't that where it all starts?) Bottom line: we all need each other. There is dignity in, as well as need for, all types of work. No human being has the right to consider himself or herself more “valuable” than someone who works in a different capacity.

I also believe that the arts and humanities, and all the advantages that they confer on us, should be available to everybody regardless of how we make our living, rather than confined to educational institutions. There’s no reason a carpenter shouldn’t enjoy studying history, or a janitor shouldn’t write poetry, or an ironworker shouldn’t play the string bass.  But it’s also vital that the arts and humanities continue to flourish within the university system—as the place where knowledge can be nourished, expanded, disseminated, and perpetuated.  

If that is to happen, we need to change our priorities.

Too often, those of us who teach in the humanities are threatened with extinction because of our relatively small numbers. When we ask for help with recruiting students, we’re told there is no point; the decline of the humanities is said to be a “nationwide” problem, and we are told students no longer “want” to study “useless” subjects (“use” being defined here as “directly leading to the making of money, along a predictable straight-line path”). It’s not surprising that during tough economic times, many people—especially the less well-off—prioritize job training over deeper fulfillment (insert the basic principles of Maslow’s hierarchy here). It’s also not surprising that the humanities are marginalized in a culture like America’s, which tends to be anti-intellectual, materialistic, impatient, hyper-individualistic, and in other respects antithetical to all that the humanities stand for.

But America has always tended toward all of the above. And yet, for several decades following World War II, America offered (arguably) some of the highest-quality liberal arts education in the world, all the while expanding access. What was different?

When I read institutional histories nowadays, it appears there was a time when many college administrators themselves believed in, promoted, and protected the liberal arts. Many of those at the helm had decided that some things are worthwhile even when a direct, immediate financial benefit is difficult to calculate. Institutions had also decided that those with expertise in an area should be the ones teaching it, and that curricular decisions should be made by those with expertise, not by those new to the field, like students.

Now, anyone who knows me knows that I’m almost fanatically student-centered. I’ll take a bullet for the younger generation, whom I see not as instigators of mischief but as victims of my own generation’s bad choices. However, as a teacher I’m also aware that student preferences about what they think they “want” should not always be the deciding factor when it comes to curricular planning or student advising. How can students know they want something when they may not yet know it exists? How can they know what will be “relevant” to their futures when nobody can see the future?

We old geezers can’t see the future either, of course. But with age, many of us have gained the advantage of retrospect. We can look back and recognize how things we didn’t think we “needed to know” at the time have turned out to be invaluable. (I often wince when I remember my own teenage self, so “sure” about where my life was headed and what I “wanted,” when it turns out I knew almost nothing.) Today I’m grateful for my teachers and parents and elders, older and wiser than me, who sometimes forced me to abandon what I thought I wanted in favor of what they knew I needed. As much as I respect my own students, I also understand that there are times when, as an educator (not to mention as a parent), I’m obligated to steer them away from what they think they “want.” University administrations should do the same and advocate, publicly and vigorously, for the values that have sustained the liberal arts for centuries, despite countless changes (Nemko’s assertion to the contrary notwithstanding).

I don’t think I’m being foolishly nostalgic if I recall a time when our society seemed to have collectively decided there are some things worth keeping, some things that matter besides the "bottom line." We could decide that again. We could decide to treat people as human beings rather than as commodities. We could decide to promote human dignity, the quest for meaning, and the creation of a world in which all human lives have intrinsic value and all people have an opportunity to find a sense of purpose. Institutions of higher education could (and should) play an important role in fostering all of that.

We could decide that institutions of higher education will serve as the guardians of intangible things of value: our histories, literatures, religions, mythologies, artistic expressions, and ideas, across cultures and centuries. We could decide that some things matter more than squeezing every last dime of profit out of each other, and more than treating other human beings as commodities to be bought and sold or as “others” deserving of contempt. We could choose real relationships over “exciting simulations.” We could choose to value expertise and intelligence rather than expediency and popularity. We could choose to create institutions whose members are encouraged to nurture rather than destroy. We could choose to acknowledge, and work to change, the viciousness of the current economic system--the underlying reason there is “something rotten” in the state of higher education today.

We not only can choose all of this; we need to. In order to do so, we need to find educational leaders who are committed to doing all of the above. We need people in charge who are committed to creating rather than destroying, and to building institutions rather than tearing them down.
 

    ABOUT THE AUTHOR

    The Underground Professor teaches English at a small private university. This blog explores how we might revitalize the humanities in the twenty-first century--and why it's important that we do so, in light of our culture's current over-emphasis on profitability, quantitative measurement, and corporate control.

