Tuesday, March 31, 2009

The Highest "F"

A religion prof who is a colleague of mine recently used a marvelous method to teach the concept of Grace to his class on the Reformation. Just after the first major assignment, he entered the room wearing a dour expression and announced that only one student had received an A. The rest earned grades in a range from F to B. Then he turned to the student who earned the A and asked, "Would you be willing to take an F if everyone else in the class could have an A?"

He knew the student well. She was a young woman named Amy. For a moment or so, Amy thought it over and then said, "Yes, I would." Immediately a debate broke out about whether this was fair. Those who had earned Bs were slightly unnerved that their effort was now regarded on equal terms with that of students who had earned lesser grades.

"But it's not really about your effort, is it?" my colleague pointed out. "You didn't earn an A. Amy gave it to you." The ensuing debate was a great lead-in to the theological debates of the Reformation.

I heard about this from the student, Amy, who told me it was a powerful lesson for her as well. She said she understood the idea of grace like never before. I asked, "So did you really get an F?"

"Yes," she said. "I really did."

Friday, March 27, 2009

What about God?

The following is a letter I wrote to a student who was upset about having to read Darwin in my course. This occurs every year and I always feel the need to respond with as much compassion and understanding as possible. It's tough, though, because passions on this subject are often quite strong and many fine students come from backgrounds where the ideas of evolution and natural selection are tantamount to Satanism.

I really sympathize with these students and always try to imagine how it must look from their perspective. Something they deeply value seems imperiled and they quite naturally want to defend it. I have to be careful not to polarize the situation when pointing out the weakness of their case against Darwin. Some just shut down, though, and there's nothing you can say. Others may admit they are confused. The letter I have excerpted below was written in response to a student who had actually written me a long letter filled with recycled arguments from Creationist and Intelligent Design websites. I have changed her name and omitted my rather tedious response to her specific arguments.


Thank you for your thoughtful response. I am delighted that my comments directed you into more study and reflection. That is exactly what higher education at its best should create: debate over issues of real importance for students and professors. You may be surprised to learn that I think Intelligent Design should be a part of high school and college science curricula. It’s a wonderful way to engage students in the question of what science is and is not. So I am happy to respond to your comments. These are issues that are passionately important to you and many of my students; consequently they are the very issues we ought not to avoid or dismiss in education...

Generally speaking, I am seldom upset by the fact that people argue against evolution. I do confess, however, to some frustration with the weakness, inconsistency, and lack of basic scientific understanding that often typifies their attempts. The scientific process is in many ways a brutal one. Something doesn’t get to be science because we want it to be. And thus far, I’ve yet to encounter a really good anti-evolution argument. Some are a bit more sophisticated than others, but every one I’ve looked at so far crumbles upon close examination. In fact, most of the new arguments (including ID) are really old arguments in new clothes that were refuted long ago.

I think what fuels the Intelligent Design movement is ultimately a misplaced fear. There is among some people a fear that evolution and natural selection are a threat to religion. They do not wish their children to be exposed to these ideas and often preemptively try to warn their children away from them. They see these ideas as some kind of undermining of a God-centered worldview and an orderly and moral society. As I mentioned, I believe this fear to be entirely misplaced.

Why is it misplaced? I think you touch on the reason in your concluding paragraphs. Here you move away from discussing the scientific evidence and into the question of purpose in the universe. Why are we here? How are we to live? What is our purpose? These are some of the most profound questions a human being can ask. They are questions to which people will forever need answers. The theory of evolution may be modified and perhaps even someday abandoned (though I wouldn’t bet on it), but the questions and the answers that have emerged in our religious traditions will be part of the human experience for as long as we exist.

I personally do not think we can be well educated without seriously wrestling with these questions, and I would never dismiss them as irrelevant because they cannot be answered by the processes of science. Believe me, Diane, I would argue passionately against anyone who thought religion could be dismissed because it lacks a scientific foundation. I would fight harder against that argument than I ever would against someone seeking to dismiss evolution (and I have). In my view, dismissing the religious dimension of life is a form of arrogant and perhaps dangerous ignorance.

This is because the issues you raise in your concluding paragraphs are what the theologian Paul Tillich called “ultimate concerns.” They go to the very heart of what human existence means. As a person of faith, I struggle with them all the time. How am I to make sense of suffering in this world? How am I called to respond to others? What is sin? What is the meaning of my life and, just as importantly, my eventual death? Our faith traditions (including my own) are beautiful articulations of answers to these questions, and I look to these articulations in my life.

I suppose I frame the issue this way: is it possible that we are creatures living in a brutal and Godless universe with animal savagery and endless strife for our fate? Is there a possibility that there is no transcendent hierarchy in the universe, no ennobling goal for humanity, no home for us other than this one? If I am to look for the answer in science alone, that answer is always going to be “maybe” or at the very least “I cannot say.”

If, on the other hand, I look in my faith tradition with a humble and open heart, the answer is “maybe not and this I must say.” And that’s enough to fill me with an animation of hope for this sad, weary, sinful world. So when I say have faith in faith, I mean that we stop asking religion to justify itself by the standards of science (and stop asking science to justify itself by the standards of religion). These two kinds of human understanding are what the late Stephen Jay Gould called “non-overlapping magisteria.” When we conflate them, as I believe happens in the Intelligent Design thesis, we do a kind of unconscionable violence to their distinctive value, beauty, and coherence.

You ask how I can get up and go to work each day carrying around in my mind the possibility of a Godless and random universe. Well, I do it every day as an act of faith that this is not so, that I am called to love as Christ loved, and to be humble in the presence of this amazing mystery. To my mind, it isn’t Nature’s laws that are a mystery; it’s that there is a Nature at all. The theologian is perfectly right to ask the scientist why there is something instead of nothing. And on this question, the scientist must at last remain humbly silent.

I realize, of course, that my view of faith is quite different from yours. Mine must appear to you a bit abstract. There is no “personal” relationship with Christ, no doctrinal acceptance of Christ’s exclusive gateway to salvation, no fallback to special revelation and the authority of scripture. Mine must appear an attenuated or bloodless faith. I accept that. What you may want to consider is the narrow folly of drawing such doctrinal lines. More than once Christ showed himself to live beyond that narrowness of definition: supping with tax collectors, speaking with the woman at the well… Mine is a view that attempts as much as is humanly possible to open rather than close. In my own tradition, we recall the words of the Congregationalist minister John Robinson, who said to the Pilgrims as they set forth to the New World: “There is still more light to break forth from God’s holy word.”

And thank you for reading and considering what I have written. That two people who see things differently can talk and wrestle together with big issues is a hopeful thing. The very act of trying to make our deepest beliefs and principles understood to others can allow us to see one another as thoughtful and serious people worthy of mutual respect, even and especially when we disagree. So I honor you for being such a thoughtful and wonderful student, Diane, one who cares deeply enough to wrestle with these matters alongside me.

Thursday, March 26, 2009

Inward Larches

We start Charles Darwin's The Descent of Man in the freshman honors seminar next week. First published in 1871, it was Darwin's second big book, and this time he didn't dance around the implications of his theory. It made a lot of people in his day nervous and it will unsettle a few of my students in the weeks ahead as well.

Indeed, Darwin fundamentally changes the debate about human nature. Since at least the time of the Greeks, human beings had taken for granted that they were essentially different from animals, if not in fact partially divine. For Plato and Aristotle we were distinct because of reason, for early Christians because we were formed in the image of God. Darwin collapses the distinction between humans and animals and argues that the difference is one of degree, not kind. Furthermore, he contends that human nature, like the nature of all other forms of life on this planet, is the result of a long process of change in response to natural selection.

Of course it's common to hear people speak of "Darwin's Theory of Evolution," but this is technically a mistake because the idea of evolution was in the scientific air for many decades before Darwin published his works. What Darwin contributed was an explanation for how evolution occurs, not that it occurs. Evolution is merely the idea that species evolve from other species and change over time (descent with modification). The logic that supports this idea is based upon three facts:

First, every living creature ever observed came from a living parent. No evidence has ever existed that life appears spontaneously. Thus, every living animal, plant, or bacterium has come from another living form.

Second, species differ. This point is self-evident. Some have backbones (vertebrates); some do not (invertebrates). Even those people who choose not to believe in evolution agree that species differ.

Third, every piece of evidence ever collected agrees that relatively simple animal and plant species existed before more complex ones. The entire fossil record consistently shows that invertebrates preceded vertebrates. Keep in mind that all it would take to disprove the theory of evolution is a single vertebrate fossil older than the earliest invertebrates, but one has never appeared.

When you put these three facts together, you can only reach one conclusion. Life forms on earth today evolved from ancestors that were physically very different. Thus species change over time. So arguing against evolution is a bit like arguing against gravity. You are still free to disbelieve evolution, just as you are free to think that gravity won't pull you down when you jump, but you can't call your belief scientifically sound. It just doesn't account for the facts.

What Darwin contributed to our understanding of evolution was his theory of natural selection. Essentially, he argued that species change in response to their environment. They are, in fact, constantly struggling against the environment and one another in order to survive. This struggle eliminates those unable to compete or adapt. Consequently, given enough time, a species will be physically modified as it takes on the hereditary features of those of its members that have successfully survived and reproduced in the prevailing conditions.

It's a simple idea and backed up by a host of observable evidence, yet many people remain troubled by his theory, and not simply because it seems to contradict the creation story in Genesis (most cultures, by the way, have a creation story; even if evolution is wrong, Genesis is not without competition). No, Darwin's ideas and evolution continue to disturb because they suggest that we aren't as special as we once believed. In this sense his work can be seen as a blow to our species' egocentrism.

Moreover, the process of natural selection is essentially random. There is no overarching goal we're heading toward, just continual adaptation or extinction. This means that who and what we are as a species is contingent upon random chance, a not altogether heartening notion. Lastly, some of the implications of Darwin's theory lead us to the view that to a significant degree our biology controls our destiny, which means our will is not ultimately free.

On the other hand, some people are enormously excited by Darwin's revolutionary view of humanity and nature. They take comfort in the idea that all life is intimately connected and interdependent. For them, it is a perspective as awe-inspiring as any creation story, a view filled with wonderment and surprise. Darwin himself expressed this sentiment in the final lines of his first great work, The Origin of Species:

There is a grandeur in this view of life, with its several powers having been originally breathed into a few forms or into one; and that whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful have been, and are being, evolved.

Wednesday, March 25, 2009

Through a Glass Dimly

Just recently I rewatched a videotape that had been sent to me by a childhood friend. About five or six years ago he had transferred some old films we had made as kids onto VHS and he wanted me to have a copy. These were silly, amateur films from the 1970s when we were in junior high and high school. I sat and watched them again, nearly two hours' worth of material. I’m not sure why, but there was something ineffably sad about watching these old Super 8 films.

It wasn't just looking back on who we were all those years ago, although that’s certainly a part of it. No, it’s something to do with the technology and its limitations. The films are silent, of course, which distances you from the images. Faces soundlessly mug for the camera as if behind heavy glass. And the images are of low quality, often out of focus and poorly lit. This causes you to struggle to see what’s there, so that you not only watch through a glass but, as they say, through a glass dimly. My friend had dubbed in some music to accompany the films. This occasionally added poignancy, but not as much, I think, as the whirring clatter of an old projector in a darkened suburban basement half a lifetime ago.

Tuesday, March 24, 2009

The Pseudos of Due Dates

The word pseudos in ancient Greek meant an untrue belief and was used to characterize a wide array of falsehoods: lies, delusions, biases, ignorance but also, interestingly enough, works of art like epic poetry and drama. Indeed, a philosopher like Socrates made it his life’s mission to extract pseudos from thinking and hold it up to the light of reason. Plato went so far as to suggest the exile of the poets in The Republic. These Greeks argued that any belief failing to meet reason’s demand for consistency, logic and evidence ought to be scrapped. And just recently I have scrapped a bit of false belief about the pedagogical benefit of firm deadlines.

A colleague of mine, who is a bit of a Socrates-type, occasionally drops by my office. Inevitably we end up kvetching about the more onerous parts of the job. So a while back I was complaining about students who don’t meet deadlines, and he responded by asking, “Why do you need deadlines?”

“What do you mean? You must have a deadline or students will never learn time management.”

“But they already know how to manage their time,” he said. “You aren’t letting them do it. You’re managing it for them.”

It instantly struck me that he was right. What logic or evidence supported my merciless adherence to deadlines? Did I know for a fact that they taught time management? I hadn't a clue, yet I had believed for years that deadlines were essential. It took only 10 seconds of reflection to see how wrong this idea is. My students are enrolled for anywhere from 14 to 18 credits with instructors who never coordinate the scheduling of their assignments. Moreover, nearly all the students work and, in many cases, have family obligations that are equally demanding. So what time is left for them to manage? About the only thing they can control is which obligation gets to suffer this week.

So I have been experimenting with new ways for students to hand in work that offer greater flexibility. I tried the idea of "due weeks" for a while, but over the past year I've moved to a system with even more flexibility. Here are its main features. First, anything that comes in on time can be revised as many times as the students like. In other words, if they hand in anything at all on the due date--a single page, even a paper with just their name on it--they have until the end of the course to revise it. Think of it as a place marker that signals to me that they intend to do the work. This allows them to manage their workload by shifting my assignment to a less hectic time. If they hand in nothing at all, it's a zero. There's no going back or make-up work.

I have also tried to build in more opportunities to score points than are needed to pass the course. So, for example, the completion of daily response questions is not a daily requirement in my freshman honors seminar. They are free to turn in as many or as few as they want. There is a set number they need to complete for an A, B, C, etc., but it won’t necessarily imperil their chances of passing if they decide to skip a day or even a week. They do have to plan ahead and decide which responses they intend to answer given what's happening with the rest of their schedule. In other words, they get to manage their time, their workload and even their final grade.

You might think that students would take advantage of this system by place marking everything and dumping it all on me on the last day of class, but that hasn’t happened. Most who put in place markers turn in the work within 10 days of the original deadline, or they wait until they have a free weekend and revise three or four place-marked assignments. More than once, too, they’ve told me how much they appreciate the flexibility.

It has also somewhat alleviated the chore of grading. I used to get hit with periodic waves of essays that I felt pressured to have back in a week. My office would contain oppressive-looking, three-foot-high stalagmites of ungraded work. The papers don't come in waves now; they are spread more evenly throughout the semester. To be sure, my stack is at any moment an inch or so deep, but my office is happily free of stalagmites these days.

This system may not work for everyone, but so far I like it.

Monday, March 23, 2009

Little Anthropomorphic Birdhouse in Your Soul

Birds will be building nests over the next two weeks in this part of the world. Indeed, a pair of house wrens has moved into the birdhouse affixed beneath the eaves of our home. It’s fun to sit on the front porch and watch them get everything ready for the big event. I was out on the porch the other day reading as they flitted in and out of the birdhouse with bits of dried grass and straw. Fittingly, I was reading Keith Thomas’ Man and the Natural World, a masterful overview of the changing attitude toward nature in England from 1500-1800.

Thomas is one of that vanishing breed of scholars who have personally and exhaustively read everything written on a particular subject. He also eschews computers in favor of index cards and a filing system! His book traces human understanding of the natural world from its ancient and theological foundations in the late Middle Ages through the growth of the natural sciences in the Renaissance and Enlightenment. What strikes me is how anthropomorphic the view of nature has always been.

In one section on taxonomy, Thomas describes the ways animals and plants were classified by early modern naturalists. Some classified them by their degree of usefulness to humanity, with the major categories being tame or wild; other naturalists classified animals by their physical beauty or personality traits (bravery, loyalty or gregariousness, etc.). Songbirds, like my two little house wrens, were sometimes classified by the sound quality of their calls: melodious, melancholy, vivacious, etc. The framework always began with human traits or the animal's relationship to humanity.

Thomas also mentions the degree to which theological dogma resisted any close comparison of humans to animals. The study of anatomy through the dissection of human corpses was particularly frowned upon by the church because it uncomfortably revealed the similarity between the internal organs of humanity and other mammals. The obvious similarity of bodily functions also made early modern thinkers uneasy. Thomas recounts a passage from Cotton Mather’s diary in 1700:
I was once emptying the cistern of nature, and making water at the wall. At the same time there came a dog, who did so too, before me. Thought I; what mean and vile things are the children of men… How much do our natural necessities abase us, and place us… on the very level of dogs!
Mather resolved in the future to fix his mind on “thoughts of piety" during his toilet to remind himself that he differed from the brutes (which in the actions themselves he did very little).

At the same time people were thought-policing the boundary between humans and animals, they were paradoxically looking to the animal kingdom for justifications of human social arrangements. The thrift and industriousness of bees and ants were appealed to in 18th-century tracts about the poor. Indeed, honey bees were clearly an indication that monarchy was the form of government favored by nature, and, if favored, obviously ordained by the creator.

The rhetorical appeal to nature as evidence for cultural norms is still around today. One hears the argument that same-sex orientation “is not natural,” with natural used to mean morally in line with some universal standard. Many people today would turn my little house wrens into an argument for marriage and family (despite the fact that the conjugal relations of wrens more closely resemble serial adultery).

In my freshman honors seminar, the students begin the semester by writing their initial view of human nature and its relationship to nature. It’s astonishing how closely their views parallel the pre-scientific, anthropomorphic idea of nature as a static, universally ordained hierarchy with humanity at the top, right where God intended it. Talk about being homers for your own kind. Any fair estimation of successful species on this planet would rank bacteria higher than humanity. They are far more numerous, more resistant to extinction, and they thrive in a much wider variety of climates.

Sometimes you hear various pessimists talk about the onset of a new dark age, but just as often I think to myself that we haven’t really left the dark ages, middle ages or Renaissance behind. They are right there in the classroom staring back at me every single day.

Sunday, March 22, 2009

Yakking at Power Points

Sometimes when I am teaching at night I'll give the class a five-minute break to split up the evening. I’ll walk down the hall toward the water fountain and pass classroom after classroom with a professor up front yakking away before a PowerPoint slide. I made a promise to myself not to fall into this practice when I began teaching nights, but it has not always been easy.

It's taken me a while, but over the past year I have slowly developed a system that seems to work a bit better. Students are assigned reading before class, but we do not discuss the reading until they complete an in-class analysis task. I write four or five key questions about important passages in the reading before each class. I distribute the questions at the beginning of the period and assign each student the task of answering question one, two, or three, etc. Answering the questions requires students to re-analyze the material with a highly specific aim. They must identify patterns, look for connections or distinctions and make judgments about evidence. In short, I’m trying to teach critical reading skills as much as the material itself. After they have done this, they write an answer that uses evidence in the text to support their view.

The students have 20 minutes to half an hour to write their responses, which may be anywhere from a few paragraphs to two handwritten pages in length. Then I break the class into groups of four. Each person in the group has wrestled with the same question. The students explain their response and the evidence that supports it to the other members of the group, and then they collectively summarize their findings, noting any disagreements or points of consensus. I give them 15-20 minutes for this discussion, and I flit around the room listening in or prompting any of the groups that seem stuck or are getting off task. When the group work is complete, I lead a discussion of the four questions, using the groups who have discussed a particular question as “experts” on that subject. Others can chime in, but I mostly address the questions toward the experts. The last part of the method is the best. The students are assigned to revise and polish their written response and hand it in next time.

Here’s what I like about this method. It's active learning because students are doing something in class rather than just sitting there listening to me. It's interactive and collaborative because students have to share ideas and build group consensus. Best of all, it is reiterative. The students read the material once, and then re-read it with a specific analytical aim. A further iteration occurs when they wrestle their understanding into language, and still another when they do it again in small-group and large-group discussion. Lastly, they get one more iteration when they revise their in-class responses, which often change as a result of issues raised during class discussion. That’s five engagements with the material using different modes for communicating their understanding. Moreover, it makes them responsible to each other and not just me for producing an answer.

The method has certain advantages for adult night students as well. It makes efficient use of their time in class. And it gives them the most precious commodity for a working adult: thirty minutes of undistracted attention on their coursework (in other words, they are not trying to do course work over their lunch hour or when the kids need attention). Sometimes I say to them that their homework is 85 percent completed when they leave class. They really like that. Plus, it emphasizes a process approach to writing, which I further stress with an infinite revision policy. If they don’t like how I score their response, they can revise it as many times as they wish. They really like that too.

Best of all, the in-class discussions are far richer than they used to be because I’m not dealing with people who have skimmed or perhaps skipped the reading. Everyone has something to contribute because everyone has already said it on paper and aloud. Moreover, the students don't have to worry about nailing the "right" answer after a single reading because the process allows for (indeed, encourages) a lot of revision of one's views.

It’s not a panacea (nothing in teaching is), but it seems to work better than yakking at PowerPoint slides.

Thursday, March 19, 2009

Better Mendacities

A few years ago, when I was in Russia, I got into a conversation about Pushkin. He is the guy Russians effuse over and cite as their national poet, but ask people in the West which Russian authors they admire and you'll hear Tolstoy, Dostoyevsky, and Chekhov (the latter especially for his contributions to the short story form). Few Westerners, however, read Pushkin. I suspect it's a translation thing. I've often wondered why some authors translate and others don't.

People who can't read classical Greek are still blown away by Homer. Goethe, on the other hand, not so much. I know great admirers of Flaubert who can't speak a lick of French, but try as I might I can't get into an English translation of Rimbaud. All this makes me think it's related to genre. Poetry loses its "sound sense" in translation, a loss that is not as central to drama and the novel. So even though Shakespeare and Homer wrote in verse, there are compelling enough plots, characterizations, and themes to support them outside of their original languages (ever see Kurosawa's Ran?).

Still, can you even imagine reading Lear in Spanish? The loss would be immense, although, I suppose, still bearable to those who didn't appreciate what they were missing. But just try to imagine reading someone like Gerard Manley Hopkins, Wallace Stevens, or even e.e. cummings outside of English. It would be almost pointless. Here, the "sound sense" is vital to forming an understanding (and I would argue Whitman--our national poet--would be lost in translation as well).

I once heard a classics professor say that it was profitless to read Aeschylus without an awareness of the richness of Greek verb tenses. I don't know. Maybe, but I think there's still enough meat on the bones to matter with Greek drama. I also read somewhere that the difference between reading the New Testament in the original Greek and reading it in even a good English translation is like going from grainy black and white to the most intense Technicolor you could possibly imagine.

Sometimes, too, I recall a quote from Ezra Pound: "Better mendacities than the classics in translation!" But what are you going to do? It is a great pity that we can't spend our entire lives learning languages.

Wednesday, March 18, 2009

Whatever Happened to Short Stories?

Not too long ago a guy at work and I were talking about all the great short stories we read as kids: Melville’s Bartleby the Scrivener, London’s To Build a Fire, Conrad’s Typhoon. I used to get collections of short stories out of the school library: Saki, Twain, O. Henry. When I was older I read all of Hemingway’s short stories, and I still think of Big Two-Hearted River every time I heat up a can of pork and beans.

In my late teens I fell in love with Fitzgerald’s stories (The Ice Palace, Bernice Bobs Her Hair, The Diamond as Big as the Ritz). And Young Goodman Brown by Hawthorne is still one of the scariest stories I ever read. Then there are the stories whose authors I can’t recall but whose images are still with me. There was one called My Father Sits in the Dark about a young kid who keeps finding his father alone in the kitchen in the middle of the night – just sitting there in the darkness. This is not to mention all of the science fiction short stories I read as a kid: Clarke’s The Nine Billion Names of God, Ellison’s A Boy and His Dog, and many of the stories in Bradbury’s Martian Chronicles. I haven’t read sci-fi in years. In fact, the last short story passion I can recall was discovering Isaac Babel in my mid-20s.

I stopped reading short stories after that, and I’m not sure why. Maybe my interest in them died during an MA program in creative writing, when I had to read everyone’s dreadful efforts. Everybody said in those days that the short story was dying. There were fewer and fewer magazines buying the genre, and book publishers would never take a chance on an unestablished author’s collection. The short story, everyone agreed, was mere apprentice work for the novel.

Indeed, in all of the years I have read the New Yorker, I’ve never bothered much with its fiction. The “literary” stories that appear there always seemed such a rarefied taste: all those rudderless adulterers slouching toward some bleak irresolution and another divorce. One writing instructor in grad school told me the key to getting published in the New Yorker was to write a perfectly plotted story and then lop off the last three pages. That always struck me as about right.

“Literary” fiction always seemed so earnest. It lacked the zest of the stories I loved as a kid. But maybe that’s not right either. A “literary” short story (whatever that may be) just seems to lack something, but I really can’t say what. Maybe it just lacks the energy of being a part of a genre that doesn’t believe itself to be played out.

Funny then that when I went to the library last week to get a collection of short stories I should return with John Cheever, that quintessential New Yorker writer. Still, Cheever’s world of gin-swilling, upper-middle-class, manic-depressive screw-ups is an interesting world, if only because it’s one receding so rapidly into the past. At any rate, it feels good to be reading short stories again. It’s as if old reading synapses that haven’t fired in years are reawakening. I feel an enjoyment I seldom feel with novels. Maybe it’s because the pleasure is so immediate; maybe it’s because the time investment to aesthetic payoff ratio is so reader-friendly.

I am also in some strange way dimly remembering the reader I once was, the one who read with a kind of fiercely naive belief that a simple well-told story could change my life. Somewhere along the line, of course, that reader also began to recede into the past.

Tuesday, March 17, 2009

The book of moonlight is not written yet

I started teaching an eight-week accelerated version of the senior seminar a few weeks ago. The students are mostly adults, a few National Guard soldiers, people who work all day. It’s a dog watch course, too, which means it begins at 8:00 pm and runs to 10:20, perhaps the worst time to teach on the college’s scheduling system. So here I am, at the end of my students’ day, trying to gin up a discussion or engage the class with new ideas. By 10:00 I know everyone is thinking only one thought: “How much longer is this guy going to go? I have to get up and go to work in the morning.”

Ostensibly the college offers classes at this god-awful hour because today’s consumer-oriented adult market wants to complete a degree fast, faster, fastest. And offering late courses allows them to cram in an early and a late section, and do an entire year of credits in just 16 weeks. (Heck, why not just lock them into a rented hotel ballroom and shout at them round the clock like they used to do in EST seminars?) The other bit of conventional wisdom is that these students like “real world,” “hands-on,” “nuts and bolts” instruction by people who have been out there in the hands-on, nuts-and-bolts, real world.

Three words: Bo-log-ney.

If anything, this demographic of student is even hungrier for traditional liberal arts courses. Little that I teach is hands-on, and I defy you to find how my subject matter easily applies to the real world. I show them Cezanne, we read novels and poetry, discuss aesthetics. What I teach could not be more useless, and they love it. Last fall I taught The Iliad. One woman, a Bosnian immigrant, said to me, “I usually sell my books back at the end of the term, but I think I’m going to hang onto this one.”

I could have kissed her.

Saturday, March 14, 2009

Diogenes of Sinope, our nation turns its lonely eyes to you

As a kid I read about the Greek cynic Diogenes, and he became a kind of childhood hero. Diogenes famously argued that you don't own your possessions; your possessions own you. And since he valued his freedom, he decided not to own anything. For a spell he did own a cup from which he drank, but he threw it away after seeing a beggar drinking with cupped hands. There's also the story that Alexander the Great came to see him one day and found him sunning himself outside the upturned wine vat that he used for shelter. Alexander asked if there was anything he needed, and Diogenes said he needed the young king to move out of the way because he was blocking the sun.

So at 16, in imitation of Diogenes, I decided to throw away everything I owned. I got it down to a grocery bag of clothes. For a spell when I was single, I kept to the rule of never owning more than three things worth over $100. For many years I had a 10-year-old car, a computer, and a microwave oven. Nothing else I owned was worth much. I remember once moving into a new apartment and managing it with a single load in the trunk of my Toyota.

For a spell I didn't even bother to lock my apartment because there was nothing in it worth stealing. That all changed the weekend I got married. I remember buying a barbecue grill and some patio furniture that weekend, things I never thought I would own. And now our attic is crammed with boxes of who-knows-what. It reminds me of a line from Walden. Bemoaning how much his fellow Americans own, Thoreau writes, "It would defy a man today to pick up his bed and walk." But it's a waste of time trying to be an ascetic in American society. It's like being a prude at an orgy.

Besides, materialism is not devoid of a certain brand of spirituality, though the two are often counterposed. Each fall I have my Humanities sections read Satyricon. I introduce them to Veblen's concept of "conspicuous consumption" and let them use it to analyze Roman society as it appears in Petronius' work and contemporary America. They then have to decide whether we are more or less materialistic than the ancient Romans. One character in Satyricon, Trimalchio, is really disgusting but also sad and pathetic. Like any good materialist, he's a seeker longing for personal transformation. And, like Trimalchio, we Americans are really after something more than wealth and pleasure. Martha Stewart was not selling cookware or wreaths made from cedar twigs. She was selling a vision of "home" with all of the concept's attendant values of security, love, centeredness, connection, and freedom from anxiety.

There's a reason the working title of The Great Gatsby was Trimalchio in West Egg. Like Gatsby, we Americans believe that if we can get everything just right--get the car that is our freedom, the mutual fund that assures our newborn will be safe--then one fine day we'll finally burst into what Fitzgerald called the "orgastic future that year by year recedes before us." In The Affluent Society (1958), John Kenneth Galbraith argued that modern economies of scale more than meet our physical needs. Thus the only way for them to continue expanding is to manufacture psychological need, which has the wonderful benefit of being inexhaustible.

Modern advertising and marketing--advertising is a little over 130 years old and marketing is a post-war phenomenon--are simply engines for manufacturing these psychological needs. Heck, most TV commercials are structured like Biblical parables. They tell a little story about a problem in need of a solution. The difference is that the payoff is not spiritual insight; it's the promise that the solution to our spiritual anxiety is only one purchase away. Thus materialism is not the antithesis of spirituality. It's just a variant form of it that speaks to the same longings as popular religion: change me, bless me, make me new again...

Friday, March 13, 2009

The Judgment of the Young

Yesterday I engaged in a bit of cheap cynicism. In twitting my students for their over-dependence on the latest technology and fear of being "off the grid," I failed to mention that at least half of the room did crave the experience I described. I often forget how much the students are like me. Indeed, most of the time I feel alienated from their concerns and lives, which inevitably produces some middle-aged snorting about the next generation. A lot of times, too, I feel like my efforts go into a void. There have been so many moments in empty classrooms, as I'm packing up papers, erasing the board, and straightening the desks, when I think, "Well, that didn't work."

In The Courage to Teach, the educator Parker Palmer names the secret fear that permeates teaching. He writes,
In unguarded moments with close friends, we who teach will acknowledge a variety of fears: having our work go unappreciated, being inadequately rewarded, discovering one fine morning that we chose the wrong profession, spending our lives on trivia, ending up feeling like frauds. But many of us have a fear we rarely name: our fear of the judgment of the young.

Day after day, year after year, we walk into classrooms and look into younger faces that seem to signal, in ways crude and subtle, "You're history. Whatever you value, we don't--and since you couldn't possibly understand the things we value, we won't even bother to tell you what they are. We are here only because we are forced to be here. So whatever you have got to do, get it over with, and let us get on with our lives."
On some level to teach is to be constantly reminded that your time is passing. What Shakespeare's King Lear says of his own hand could easily be said of teaching: "it smells of mortality."

Thursday, March 12, 2009


Yesterday in the freshman honors seminar we were discussing Emerson’s The American Scholar Address, in which he calls for a new kind of intellectual heroism. At the time he gave the address, 1837, there was some question about the scholarly mettle of Americans. Giants like Hegel, Goethe, and Hume had lately bestridden the European intellectual stage.

So Emerson’s task was to call for a new kind of scholar, one as at home on the brawny frontier as the lecture hall. This scholar-- dubbed with the progressive tense name of Man Thinking--was to be influenced primarily by contact with nature and the seminal influence of the great minds of the past. Emerson famously warned against slavish adoration of books, arguing that “one must be an inventor to read well.” He also demanded his scholar lead an active life. No bookworms or note-taking recluses for him. Man Thinking was to be as much a brawniac as a brainiac.

So I asked the students how we might design a curriculum for the creation of Man Thinking. One idea was for the college to buy several isolated cabins in some beautiful wilderness and require all graduates to spend a month alone there with nothing but a shelf of five or six great works and a fat blank journal. No computers, no cell phones, Facebook, Twitter, iPods or TV. Just you, a few well-chosen books and nature. You could read the books or not. That was up to you. There would be a month’s worth of food and a pile of unchopped firewood (for the active life component). At the end of the month you would return to campus, hand in your journal, and that’s it. Course over.

Now I find this a deeply attractive idea, as did a few of my students, but you should have seen the look of horror that appeared on some faces. “You’d have to leave me a cell phone—just for emergencies,” one student insisted. "What if there were an accident or someone needed to reach me?"

“Could we make just one call a day?”
"Why allow books but ban an iPod? It would be great to have your tunes while you were cruising through the woods."

“Why would anyone pay tuition for that?”

(Sigh) So much for the new American brawniac.

Wednesday, March 11, 2009

The Art of Responding to Student Writing

I have something of a reputation among students for giving lots of written feedback on papers (sometimes writing more in response to student work than the student wrote to begin with). Indeed, by this semester's end I will have graded nearly 1,800 pieces of student writing. That's everything from a few paragraphs to full-blown major papers. I use no tests, no group projects, no presentations: it's write, write, write in my classes. So I feel that I have to write, write, write in response. It makes grading a slog, but there's just no way around it. It also pays off. At least that's what students tell me on my teaching evaluations.
My philosophy on writing comments has evolved over the years, but I've never formally laid it out. I'll admit, too, that I am not pedagogically virtuous all of the time, so what follows is only what I aim at, not always what I accomplish:

Affirm that you understood the content of what students wrote. After all, the writers tried to communicate something. Make sure they know you got it. I always try to mention a point they made in my commentary, and compliment them on the insight (or respond to it in some way).

Use paper comments as a teaching tool. Despite what professors often believe, students do read comments closely. Think about it. Didn't you read your professor's comments closely (if only to see how the old gasbag justified the grade)? I know I have their full attention in my comments, and that may be the only time I do. So why not exploit the opportunity? Paper comments are a great place to tie together large thematic strands of the course, to show them how an idea they surfaced in their work might tie into other material or issues. Paper comments can also provoke further thinking. My favorite phrase in writing student comments is "What think you?" I use it so often the students have begun to parrot it back at me in class.

Close with some strategy for improving the work. This only makes sense. Don't hit them with everything. Give them one or two things to work on, and then follow up on the next paper with either "I am thrilled to see that..." or "Hey! Learning curves have to start curving." I have been known to follow up on work that isn't improving this way: "Grrrrrrr!"

Don't be afraid to offer your own struggles, conflicts and frustrations with the material. Model the kind of reader and thinker you want them to be. I often admit that I have a problem with a text. It gives them permission to have a more nuanced reaction. Sometimes I confess to students that Plato scares the hell out of me. It's amazing how letting them know that you also struggle can change the student/professor dynamic.

Of course this is an enormous amount of work. My comments file on the freshman honors seminar this semester is now 60 pages long (single-spaced). That's 20,288 words as of this morning. And that's just one section of 16 students. It doesn't include revisions or my handwritten comments on daily responses. Admittedly, I can re-use and cut and paste some comments, but this still takes time. Some days I think there has got to be a better way.

I just haven't found one yet.

Thursday, March 5, 2009

Cheap Gimmicks? Oh yes, dear yes...

E.M. Forster famously lamented the need for plot in Aspects of the Novel. He imagined the response of a bus driver to the question "What does a novel do?" The fellow sputters and replies, "Well--I don't know--it seems a funny question to ask-- a novel's a novel--well, I suppose it tells a kind of story."

Then Forster envisioned asking a second man, one playing golf and put off by the interruption: "What does a novel do? Why tell a story of course, and I've no use for it if it doesn't. I like a story. Very bad taste on my part, no doubt, but I like a story. You can take your art, you can take your literature, you can take your music, but give me a good story. And I like a story to be a story, mind, and my wife's the same."

Finally he imagined a third man, who answers by drooping his head regretfully and saying "Yes--oh, dear yes, a novel tells a story." Forster says he respects and admires the first man, fears and detests the second, and is the third. He writes,

Yes--oh, dear yes--a novel tells a story. That is the fundamental aspect without which it could not exist. That is the highest factor common to all novels, and I wish it was not so, that it could be something different--melody, or perception of truth, not this low atavistic form.
When I first read Aspects of the Novel years ago, I immediately recognized myself as someone Forster would detest and fear. I am, for whatever reason, one of those people who have an affection for plot. Call it immaturity, call it artistic dishonesty, call it nascent philistinism, but I cannot deny it. I know the contrivances of storytelling are manipulative and gimmicky, but I can't help myself. I love artifice, clever button endings, surprises, and all manner of twists, turns and big fakey dramatic fade outs. I sometimes find myself saying with a drooping and guilt-laden voice, "Yes, dear yes, a novel should aspire to something greater."

Once upon a time I even imagined myself a fiction writer, but my efforts never amounted to much because I never aspired to anything beyond resolving the plot. There was no melody, no perception of truth. Eventually I gave up writing fiction. But when my son reached the age of three, he developed an insatiable appetite for stories. His mother and I often tried to satisfy this craving with stories made-up on the fly or retellings of whatever childhood myth or fairy tale we could bring to mind. I must have told the story of Jack and the Beanstalk from every conceivable point of view (the giant’s, Jack’s mother’s, even the cow’s). Still, the boy’s appetite for stories showed no sign of diminishing. Each afternoon when I picked him up from preschool, he asked to hear a new tale.

Inventing these from scratch, especially after a long day at work, was often more intellectual effort than I cared to exert. So one day I hit upon the idea of telling a single story that could stretch for days, weeks, or even months, which spared me from having to begin anew each afternoon. And so began the Marco Mystery Stories.

The gist was that Marco was a fellow whose telephone number was one digit different than the number for The Great Western Detective Agency. When people misdialed, he took the case. The first Marco stories were silly affairs. Marco solved the case of a missing caterpillar (it had become a butterfly), and he cracked the theft of some bug-shaped diamonds. I am not even sure I remember most of these early efforts. Eventually, however, the stories became longer and more involved. In one that lasted weeks, Marco tracked down the infamous mobster Charlie the Tuna, who was found smuggling yachts in Mexico with his moll, La-La Laroushe, the famous chanteuse.

Whenever I grew too tired (or too lazy) to devise an original Marco story, I shamelessly cannibalized the plots of just about every film and piece of literature I had ever encountered. Indeed, Marco saved Henry Baskerville from the hell hound, and he survived on a secluded island where the guests were vanishing one by one (a la Agatha Christie’s Ten Little Indians).

Once, while telling a very long story, it occurred to me that I had managed to steal elements from Raymond Chandler’s The Big Sleep, Ian Fleming’s Dr. No, John Buchan’s The 39 Steps, the Indiana Jones films and an episode of Gilligan’s Island. Indeed, the stories were always cobbled-together affairs. At one point I had Marco, his valet Rudy, and noted Sherpa guide Tenzing Norgay held captive by sky pirates in a secret Zeppelin base hidden inside Mount Everest. Oh, and did I mention Dr. Watson was there, too? (Yes, that Dr. Watson).

I was surprised by how satisfying it was to immerse myself again in all the gimcrack and phoniness of storytelling. As I mentioned, I once fancied myself something of a writer, but two years in a creative writing program disabused me of this notion. I realized just how much discipline and dedication it took to be a serious writer, and I also knew myself well enough not to suffer illusions about the caliber of my ambition. Even so, inventing and telling the stories did satisfy some itch that had long gone unscratched.

The best part was pulling into the driveway each night, dropping a cliffhanger on the boy, and then hearing him say, "Dad! You can't stop there!" In the end, there's just no hope for my cheesy, artless, gimcrack-riddled soul.

Wednesday, March 4, 2009

Under the Lofty and Beautiful

"Only man can curse (it is his privilege, the primary distinction between him and other animals)." --Fyodor Dostoyevsky

The idea of a purpose, aim, function or goal is summed up in the Greek word telos. And in one way or another, most thinkers on the subject of human nature assume we have one. Even Darwin saw a point to human survival and propagation (just not a unique one). During this past week I’ve been reading Emerson’s Nature for class, and it strikes me that what’s really radical about Emerson is his view that human beings are free to define their own telos. It’s we who decide what our purpose is. It’s neither immanent nor ordained. We just need to trust ourselves, he argues. For Emerson, a human being is an autotelic object, although few of us ever realize it.

After Emerson, of course, my class moves on to Mary Shelley's Frankenstein, but I suddenly wish it were Dostoyevsky, whose portrait of the Underground Man in his novel Notes from the Underground offers another view of autotelic freedom, one that’s isolated, lonely, and self-obsessed. The Underground Man wishes to be free like Emerson's artistic hero. He longs to realize his dreams of a unique “lofty and beautiful” telos, but his wishes confront an indifferent reality, cold equations, the otherness of brute nature. The mass of people who stroll St. Petersburg's Nevsky Prospekt blithely disregard him.

In the mirror in his cellar apartment he sees his own sneering, abject mouth and knows he is incapable of making anything outside of his mind conform to his dreams. For the Underground Man, a human being is a creature gifted with imagination enough to define its own telos, but so what? It will never happen. So he curses both his dreams and the world that defeats them.

In Part Two of the novel we see the younger Underground Man’s inept and self-defeating attempts to get the world to take notice of him and, ultimately, we discover his fear of actually being noticed for who he is (and not who he wishes to be). It’s true that our dreams and fantasies about life keep us going. Most of us could not get out of bed in the morning without some half-baked trust in the lofty and the beautiful.

Indeed, I have a friend who once related a vision of all of her dearest loved ones gathered lovingly around her deathbed. With each, she told me, she would share a private moment of laughter, remembrance and closeness. This woman is a minister who’s been in more than one hospital room with doped, unconscious, emaciated patients that have tubes jammed down their throats and up their backsides. She’s seen them unceremoniously flatline, or silently kick off while the nurses chatter down the hall. But that’s not what she imagines for herself. No, her death will be the lofty and the beautiful (as if death ever played fair).

In the end, what recommends the Underground Man to me (and not much does) is his willingness to be utterly undeluded about life and its desperate need for delusion, yet to curse them both.

Tuesday, March 3, 2009

The Nature of Things

We begin reading the American essayist Ralph Waldo Emerson on Friday, which will provide my freshman honors seminar with a sunny change of pace after the morbidity of King Lear and the caustic satire of Jonathan Swift. Some of the students may even have encountered Emerson in high school, where his essays sometimes remain required reading. Indeed, many critics see Emerson as the quintessential American thinker: forward-looking, optimistic and an advocate for individualism. His influence on later writers and thinkers has proven profound and long-lasting. He was certainly an important influence on Walt Whitman, a poet the students read last fall.

Born to a family of New England clergymen, Emerson was slated early for a career in the church. As a boy he had been nurtured with a rational version of Christianity, but over time he became disenchanted with attempts to justify faith through rational means. As a young man, he also suffered grievous personal tragedies, losing his brothers and his first wife to tuberculosis in a short span. After the death of his wife, he sailed to Europe where he met important literary and cultural figures. He remarried upon his return to America and began to write. His essay "Nature" was published in 1836 and has become a touchstone of American literature. Indeed, Emerson is considered the chief progenitor of American Romanticism, a cultural stance that he gave a particularly American slant, but whose broader roots stretch back to Europe and into the 18th Century.

Romanticism was in many ways a reaction to the 18th century. That century, also known as the age of reason, had witnessed a series of prolonged attacks on many fronts against the idea of an individual self. During these years, for example, Europe underwent rapid industrialization. This in turn created socioeconomic changes that shifted the population from a rural, agrarian way of life to a more depersonalized urban lifestyle. Consequently, people became distanced from nature. In addition, large-scale mechanized industries sprang up and changed the character of work. The factory system with its regimented hours and mindless repetition seemed to make human beings mere interchangeable parts in an impersonal process.

The 18th Century also saw the rise of radically skeptical philosophers who doubted the very existence of an individual self. The philosopher David Hume argued that the self Descartes believed himself to have found was nothing more than a bundle of fleeting perceptions, and the German philosopher Immanuel Kant treated the "I" not as a substance but as a formal unity accompanying on-going thought. In the face of these social and philosophical attacks, Romantic writers, philosophers and artists tried to reassert the importance of the self. Their art and criticism focused on self-analysis and self-reflection. They went inward to examine the human mind's relationship to the world. Oddly, even though he was not a Romantic philosopher, Kant's ideas about consciousness, especially those in his Critique of Pure Reason, influenced many Romantic thinkers.

Kant had been interested in how we can know that the world we experience is real and not just the product of our minds. As you may recall, this problem also concerned Descartes. In the end, Kant concluded that we could never be fully certain about the reality of the world outside of our minds. He did think, however, that we could be somewhat more certain about the categories our minds tended to impose upon the world.

Romantic thinkers like Emerson seized on this idea because it seemed to suggest that the individual mind does play a role in ordering, shaping and imposing meaning on the world. From Plato to Kant, one important goal for philosophers had been to describe the nature of the reality that we inhabit but cannot agree upon. The Romantics, however, saw the problem differently. They were not trying to grasp what was really out there. Rather, they sought to express the power of the individual mind to give shape to what was out there. In short, they wanted to put the "self" back in the driver's seat. Nature for them was a set of building blocks for the creative mind. Through it, the individual expressed his will and unique being.

As a result, nature became an important focus of Romanticism, but not in its naturalistic or scientific sense. Indeed, one important idea to keep in mind while reading Emerson is that the word "Nature" has more than one meaning. Commonly we use it to mean the external world in its entirety: trees, rocks, mountains, the ocean. But we also use the word to mean the inherent character or basic disposition of a person. For example, we might say, "It's just not in his nature to lie."

For Emerson, nature comprises both of these meanings simultaneously. Thus the forest and the mountainside exist, but they have no meaningful existence without a person's individual nature to behold them. In the end, Emerson argued, human consciousness is the giver of meaning to nature. It is our mind, our thoughts, and our imagination that are forever creating the meaning of this world.

Emerson's "transparent eyeball" represents the thin, permeable membrane between the external world in its entirety (the NOT ME) and the inherent personal nature of the human being who perceives it (the ME). What's tragic, in Emerson's view, is that so many people get locked into only one way of viewing the world. They fail to realize that they are radically free to re-envision the world's meaning, for nature is at once their own being and a playground for their creative minds.

So what does a Romantic like Emerson offer us in terms of an understanding of human nature? Well, he suggests a new way of conceiving our relationship to the world around us. He argues that what makes us distinctively human is our almost God-like ability to refashion the world with our imaginations, an ability only limited by the human predisposition to dull thinking and social conformity. He also argues (unlike Plato) that truth is an on-going discovery, one that can never be finalized; for each generation must experience creation like the first human beings who ever gazed upon the world.

This is heady, intoxicating stuff. You can open up to almost any page of Emerson and find something quotable and inspiring. This is not to say that Emerson is beyond criticism. Over the centuries the "otherness" of the external world has proven a fairly durable concept. Some critics also feel he verges dangerously toward solipsism (the philosophy that nothing exists beyond the self) or the heresy of pantheism (seeing God in everything). Others see him as overly optimistic and having no workable concept for evil. And yet, after Swift, Emerson does seem a bracing and even liberating force.

He certainly had this effect on Walt Whitman, who wrote, "I was simmering, simmering, simmering; Emerson brought me to a boil."

Monday, March 2, 2009

Doing the Job

Last Friday I had an 8:00 am meeting to discuss the revision of the core. Got there early and was reading the newspaper. In walks a colleague, so I rather automatically asked how she was. Tears came streaming down her face. "I so want to quit this place," she responded. It seems someone had said something cruel to her and she was feeling utterly bereft.

Later one of my senior students came to see me in my office. He had turned in some work filled with errors, and I had growled back in my paper comments that this was unacceptable for a 400-level course. He sat in the chair opposite me filled with remorse. He knew his writing was beset with problems, but he had managed to get to his senior year in spite of this. I reassured him that his problems weren't fatal. There was still time to take action in the Writing Lab. He left my office feeling a little better and resolved to tackle the issue. Then, just before noon, I strolled over to the book store, bought a card, and jotted a note to my bereft colleague. I wanted her to know that she was valued for her passion and creativity.

Another student showed up at my office door a few minutes before 1:00 pm. She's a quiet one and never says much in class. Even so, she has one of the sharper minds in the room. Enigmatic, though. I can't always read her. Anyway, I had gently called her out about "phoning it in on me" only the week before, and her most recent work has been much better. Something must have struck a chord because now she was asking me to write her a letter of recommendation for a scholarship opportunity. I told her I would be delighted.

It had been a long week and I so wanted to leave early Friday afternoon. My wife has been sick and I have not been feeling so great myself, but I happened to see another student as I was making my way across campus to my car. She's an art major in my capstone seminar and her senior show was opening last Friday afternoon.

"You will come, won't you?" she asked with a lip already starting to pout at the expectation of an excuse. I said I would be there, which meant hanging around until 5:00 when her reception opened. So I doubled back to my office and some more grading. At 5:00 pm I walked back across the now-deserted campus to her reception. I spent some time asking about her work and complimenting her on a very nice print.

And so that was Friday.
I didn't do a single thing that will ever show up on my annual faculty update, didn't publish anything, didn't use any newfangled teaching techniques, or even--for that matter--have a very good 1:00 class. Still I can't help thinking I was really doing my job last Friday.


One summer, long ago, during the Ford administration and the waning days of my parents' unhappy marriage, I lay each afternoon upon a...