Category Archives: philosophical

A little out of season, but an attempt at a return…

Last semester has supplied me with a lot of new “material,” alas. I am resolving to return to my blog.

But I want to begin by looking back, and offering a meditation on some of the changes that are affecting not only my students’ language skills, but also their lives.

This is an editorial I wrote two springs ago for the Connecticut AAUP newsletter I edit, VANGUARD. The title was “Silent Spring,” partly because this editorial was written for the Spring issue (2016), and partly because yes, I was thinking of Rachel Carson. This past semester my first-years all carried out research projects having to do with social media, and they have concluded that the kinds of things I addressed here have only gotten worse. Anyway, here goes:

Silent Spring

In 1962, Rachel Carson published the landmark environmental book Silent Spring, the result of her study of the consequences of indiscriminate use of man-made pesticides, particularly DDT. She envisioned a future in which the Springs would be silent because insect eradication by poison was passing the poison on up the food chain to birds, killing them outright or so weakening their calcium production that their eggs were not viable: hence, the glorious sound of birdsong that is their territorial and mating music would be no more.

The book and its vision appealed to the general public as well as to the scientific community, and created strong pressure to ban the use of DDT. As I sit in my kitchen with the deck door open and listen to the glad songs outside, I thank Rachel Carson (for whom my niece is named) for awakening us before it was too late.

Her title phrase resonates with new meaning for me 54 years later, when I walk into my classroom building and have a moment of wondering whether I have misread the calendar and shown up to teach on, perhaps, a Sunday morning. Or maybe there’s a holiday I missed…. The building is silent. Well, the ground floor is never very busy.

I ride the elevator up to the third floor, home of my busy department and my scheduled class. Silence greets me. Has The Rapture actually occurred?

Into my classroom I go, to find twenty students or so waiting for class. They seem to be alive, but they are soundless, each gazing into a tiny screen, heads down, thumbs a-gallop.

As I make my way to the front of the room, I remember other days. Long ago the halls were alive with chatter and murmur, and beginning a class session meant getting the students to wind up their conversations with neighboring students and focus on beginning the day’s work. I taught with my door closed to mute the random bursts of chatter in the hallways and deter the occasional passer-by who otherwise couldn’t resist putting his or her head in to see what was going on, or waving at a friend. As soon as the formal class was over, students would turn to one another and begin to talk—expressing reactions to the lesson, making plans to get together, sharing jokes… Sometimes one or two would ask me to join them for coffee so we could continue a class discussion.

Before things changed in the classroom, they started changing in the halls, as cellphones became more and more popular. Voices were raised because the phone connections were sometimes weak, or because the speaker couldn’t hear his or her own conversation over the phone-shouting of someone standing nearby. Signs appeared in the hallway: PLEASE USE CELLPHONES IN STAIRWELL ONLY. In the classroom, soon thereafter, electronic music would suddenly burst forth and a student would sidle out to talk to mother? friend? boss? Colleagues and I began announcing policies for cellphone use in the class.

And then cellphone use became silent, thanks to the innovation of texting. In the classroom my policy now is: “If I see you gazing happily into your lap and notice that your hands are moving, my first thought will NOT be ‘texting.’ You don’t want to know what my first thought will be.” (Amazing how effective this is, once they realize that first thought and blush….) A friend says she marks student cellphone users Absent: “Your mind is certainly not here.”

The halls are still full of students. There are students in the classrooms. But they make no sound. My colleague is right: they are not “multitasking,” at which it turns out none of us is very good despite our confidence to the contrary; they are absolutely elsewhere. Before class they aren’t thinking about the class, preparing their minds for some lively interaction; they’re thinking about whatever the mother? friend? boss? on the other end of the text is saying. (They haven’t even turned on the lights, since the phone screen makes its own illumination. At first glance they look, especially the ones wearing hoodies, like monks at prayer in a dim cloister.) After class, same: out come the phones, down go the heads. What has been going on in the classroom—let’s call it the “lesson”—is a little capsule framed by completely other concerns. Curiosity, challenge, reconsideration, reflection on lesson or assigned text: the texts of a different world take their place.

In the halls and on the sidewalks, students walk straight ahead, heads down, thumbs moving. They don’t see the leaves coming out, the birds flying by, the blue of the sky. They don’t see me jumping aside or hugging the wall so as not to be walked directly into. Sometimes they don’t even look for traffic in the street they’re crossing.

Yesterday one of my slightly-older students dropped by my office after class, “looking for someone to talk to.” I enjoy talking with him: he’s curious, reflective, funny. So of course I’m not talking about everyone. In fact, one thing he talked about was the creepy silence in the halls….

Like DDT, cellphones are man-made. Like DDT they are weakening our students in many ways: their ability to pay attention to one line of thought; their ability to discuss ideas and challenge interpretations; their comprehension and retention of class material; their willingness to engage with one another face-to-face. Will they realize this by themselves, and seek a different kind and quality of communication? Can new policies limit the danger and the damage? Or will our Springs, and Autumns, and Summer Sessions fall silent as our technology saps our students’ willingness, and ability, to participate in their own lives? —RAB

“Beowulf, like Everyman, accepted death towards the end of his life.”

That’s a pretty good time to accept it.

Actually, both of them fully accept death AT life’s end, not TOWARDS it. Furthermore, Beowulf makes a beginning at acceptance quite early in life, whereas Everyman waits until the last minute.

We see Beowulf as heroic partly because he accepts even in youth the very real possibility that he will die in one of his exploits. Wrestling with the ferocious and powerful Grendel in the Danes’ mead hall is fraught with danger; but even though Beowulf acknowledges this, he insists on meeting Grendel in barehanded single combat; although a dozen hand-picked Geats stand ready to assist him, he sees the battle as HIS fight. Either he will prevail, thus saving the lives of countless Danes and relieving King Hrothgar of the burden of guilt AND at the same time enhancing his own reputation for strength and courage; or he will fail, and failure means death of a particularly gruesome kind.

Similarly, when he takes a sword and pursues Grendel’s mother into her underwater cave to avenge her (revenge-) killing of Hrothgar’s best friend, he tells his Geats and the Dane warriors assembled at the brink of the mere that he goes into this alone, and their only task is to watch and, if necessary, report his death.

Fifty years later, when he goes to fight the dragon who has been despoiling his kingdom after a drunken lout disturbed the treasure-hoard the dragon existed to guard, he acknowledges that he will probably die in the attempt but insists that he must fight alone. Young Wiglaf enters the fight after the dragon has wounded Beowulf, but although he manages to wound the dragon he leaves the last knife-thrust for Beowulf. Both hero and dragon die as a result of this battle; but before Beowulf dies he distributes some of the treasure from the hoard among his people and gives them some good advice (through Wiglaf)—in effect, he makes his will. His people mourn him greatly, a “good king” who has ruled wisely and fairly. Beowulf, though, accepts death with the same grace with which he has accepted success before: it is in his nature to accept death.

This is nothing like the way Everyman “accepts” death, especially towards (as distinct from at) the end of his life. When God sends Death to Everyman to set him on the road to his final accounting at the grave, Everyman tries to talk Death out of it, asking him to come back later, give him just a little more time…. Death being adamant, Everyman then bemoans the terrible state of his accounting book and tries to persuade a series of friends and relatives to go with him to buck him up on the journey. They all refuse (one pleads a sore toe!); he sets out, but continues to ask such friends as Beauty and Strength to come along. He manages to restore Good Deeds to health after much too much neglect, and he embraces the promise of salvation and confesses his sins; he can’t actually be accurately said to “accept” death until the very end, though—his attitude is closer to resignation than acceptance.

So my student is wrong two ways: both on the timing of the acceptance of death, and on the supposed similarity between the two acceptances. She should have known better than to try to equate a HERO with an EVERYMAN, or “typical person.”

What an interesting discussion could have developed from a comparison between the two characters. She might have speculated on the relative philosophical stances of a hero and an everyday kind of guy, or on the role of an afterlife in the way a Christian should live life, as handled by a (probably) Christian monk writing about a pre-Christian hero, and another (probably) monk several centuries later writing about a not-very-diligent Christian. She could have discussed the value of remembering the inevitability of death (memento mori) even when life is at its richest, comparing Beowulf’s integrity even in his youthful adventures to Everyman’s moral and religious laxity until the last minute (“O Death, thou comest when I had thee least in mind”). What conclusions she might have reached I don’t know, since I admit I’ve only begun to think of these possibilities as a result of writing today’s post on today’s horror. But they seem to be worth exploring nevertheless.

Making a hasty generalization about a vaguely defined moment is not the way to find the road to revelation: I do know that.

Sometimes I look back on my college career and lament the opportunities I missed: courses I might have taken, papers I might have given more thought to, heights I might have reached…. I know we all have such regrets. It breaks my heart that my students seem to amass regrettable moments so quickly, and at such a trivial level, where they could instead have let themselves be tempted into taking more glorious risks.

Well, anyway, she sighed.

Let us accept the inevitable things while we can still throw joy at them.


“The question that is always wondered in everyone’s mind is…”

So the verb here is “to be wondered.” Do we have yet another inanimate agent? Not sure, because the question is wondered; that is, it is wondered by something, and that thing is the agent. The agent is certainly unclear, though: the wondering takes place in everyone’s mind, so the mind itself can’t be doing the wondering. Could the question be presenting itself to be wondered…by whatever else happens to be in the mind?

I suppose if my student had included a preposition—“the question that is always wondered about”—the phrase wouldn’t seem quite so bizarre, although the matter of agent would still be up for grabs, or for gropes in the dim recesses of the mind. Wondered about by whom or what?

Perhaps one of my readers more thoroughly informed in grammatical terminology can name this error. I throw up my hands, then put them down again and grab a pen so I can write “awkward and unclear” in the margin and move on.

And so, on to the question itself:

“The question that is always wondered in everyone’s mind is ‘Whose fault is obesity?'”

I had assigned five essays on the “American obesity epidemic” for the week’s reading. Apparently my student generalized from those examples and assumed that everyone was thinking about the issue, all the time. Now, as a perennially dieting person from the age of eleven on, I probably think about obesity more than a lot of other people do—and I don’t think about it all that much, at least compared to the other things I think about. I especially don’t spend a lot of time wondering whose fault it is. Two or three of the assigned readings did place blame: one accused the weak-willed or perverse individual; one accused pleasure-pushing fast-food joints; a third accused a hurried and thoughtless society that offered few convenient alternatives to junk food. It’s tempting here to echo a wonderful song by Jo Carol Pierce (Bad Girls Upset with the Truth) and add “I blame GOD!” But none of the readings did that…

So my student wasn’t really far off the mark, and an effort at more precise diction would have produced a more effective opening to a (probably accurate-enough) essay of his own. The quarrel I have with him is that he spawned that horribly awkward and unclear noun clause and then went blithely on with his verb of being and ill-defined predicate-nominative question. And that’s the sentence he used to launch an essay that staggered its way through a similarly awkward and ill-defined discussion.

I really, really believe that taking more time on that first sentence would have given him some control as he went forward.

Did he read what he had written? In the small draft-reading circles, did any of his partners object to, or ask about, this sentence? Or, horrible to contemplate, was this phrasing the result of polishing something even rougher as he finalized his paper to turn in?

All these speculations are too depressing as the second week of the semester chugs along and my brand-new first-years toil over Essay Number One, Draft One.

Many years ago, a professor on whom I had a blinding, suffocating crush came into class the day after, we later learned, his wife had left him and commented à propos of nothing that “Hope was the last thing released from Pandora’s Box…the last evil, and the worst.” I tell myself this characterization was as wrong as it was unorthodox, as I gaze hopefully at my students.


“An estimated 1.7 million to 3.4 million American women once were or are now married…”

In celebration of today’s dumping of DOMA by the Supremes (5-4), this garbled statement. It begins so authoritatively, with its statistics and alternatives (“once were or are now married”…); then it loses its grip entirely and falls into chaos:

“An estimated 1.7 million to 3.4 million American women once were or are now married to men who have found that their husbands have homosexual tendencies.”

According to this student, then, gay marriage has been going on for quite some time, and has been quite widespread, and some of the men in those marriages have also had bigamous marriages (not sure “bigamous” is quite the word, but I don’t know what would be better) to American women. Evidently those men were not originally aware that the men they had married had “homosexual tendencies,” either; they’ve just found that out. I’ve never met anyone in this complex situation, but I should be reassured by those statistics that such ménages à trois exist somewhere.

The problem is, of course, the relative pronoun “who.” If she had gone directly from “who” to “have,” she would have been fine. Or if she had written “and” instead of “who,” she’d be okay, although not very graceful. But in her sentence the “who” must modify its direct antecedent, which is “men,” and “their” should refer to the nearest appropriate noun, which again has to be “men.” The husbands of the men married to the women.

I’ve written before about sentences that invite the reader to imagine the writer deeply engrossed in a thought and then unexpectedly interrupted—perhaps by suppertime, perhaps by an alien invasion, perhaps by a fit of despair, perhaps by a bothersome roommate—to resume the sentence upon returning without rereading it. This is that kind of sentence.

I can’t recall where my student took the essay from this amazing statement; I’m not even sure what the assigned topic was.

I’ll just be grateful that, going forward in our nation, men “with homosexual tendencies” will not have to enter into complicated relationships, including heterosexual marriages, for the sake of social acceptance or insurance benefits. No matter what strange sentences my students write on various subjects hereafter, this is one sentence that will not appear again.


“He has a pension for fantasy.”

A simple hearing error.

How often anymore does the typical student encounter the word penchant? Still, somebody she heard had encountered it…or that person had heard it from someone who had encountered it…all the way down into the Quaker Oatmeal box, at some point in which sequence there was a person who actually knew the word was penchant. Whoever heard that person, though, didn’t know the word, and in came “pension.”

How strange it is that college undergraduates would be more likely to know the word “pension” than “penchant.” Are they thinking about retirement before they even enter the ranks of the employed? It’s possible to receive a pension without retiring, as Webster’s first and second variants on definition #1 show: “a fixed sum paid regularly to a person; a gratuity granted (as by a government) as a favor or reward.” But there’s our common understanding, in definition 1c: “a sum paid under given conditions to a person following his retirement from service or to his surviving dependents”—the latter should the employee die in harness, presumably.

[Just to be thorough: Webster’s definition #2 is “hotel or boardinghouse in Europe.” That one derives from the French pension, or boardinghouse, pronounced more like pon(g)-syON(g). But that word has nothing to do with what my student was trying to write.]

Back to definition #1. There’s something staid and settled about “pension.” PEN-shn. Even though a young person could receive a pension, the word would age him, I think.

“Penchant,” on the other hand, has that French je ne sais quoi about it. In real French it’s a form of the verb pencher, to lean, says Webster; in English it means “a strong leaning,” a liking. The definition isn’t terribly interesting, but the word itself…yes, there’s something. Even though the pronunciation isn’t anything special—PEN-chnt—the spelling is so nice. And in affected moments one can always give it a bigger French spin: “Yes, I do have a pon(g)-SHAN(G) for being pretentious!”

Can whoever committed the first mishearing of the word be blamed for confusing PEN-shn and PEN-chnt? Well, my high school French teacher would never have put up with sloppy hearing: his dictées were grueling, and corrected with precision. He would have expected my student (or whoever it was who got “penchant” wrong) to have listened more discerningly, no less in English than in French.

If we don’t blame the hearer, perhaps we should blame the speaker. His fault was, plainly, speaking good ol’ English. If he had but been a little more pretentious, he might have said the word so that my student heard something closer to the intended term—or, of course, the student might have accused him of using “hard words.”

But all of this ignores the true delight of the error. The idea behind this blog has been not only to try to understand the intellectual activity behind the student’s mistake, but also to show the kinds of distracting notions that interpose themselves between the writer’s intention and the reader’s comprehension. In this economic climate, at least for a writing instructor laboring in the hardscrabble vineyards of part-timer-dom, my student’s sentence achieves a poignancy, a poetry, that transports one into a world of revealing truths.

Yes, I have a fantasy pension. Or, receiving a pension from my current employers when I dodder off into the sunset is a fantasy. Or, my only pension after all these years is my finely honed gift for fantasy. I have fantasy for a pension.

Should I punish this student for taking me down this distracting lane, or reward him for giving me a new way of summarizing my life?


“They tried to force the ‘savages’ to convert to Christianity by…”

My student is writing about the missionaries who tried to take what they considered the Word of God to Native American tribes. He’s using the term “savages” because that’s the term frequently used in the Christian literature, especially the Puritan literature, that discusses the indigenous peoples of New England. In American Literature I, we have read a lot of this literature, and also some of the eloquent testimony and commentary of Indian leaders. My favorite is the reply of Red Jacket, an orator and negotiator of the Seneca people, to the Massachusetts missionary Joseph Cram in 1805: “BROTHER: We do not wish to destroy your religion, or take it from you. We only want to enjoy our own.” If only people in the 21st century could think like that!

It is perhaps unfair to take a sentence from a midterm exam for discussion here, but what interests me is not the relatively minor writing error, which should probably be excused on a test; it is the shining evidence of a tin ear (or a blind mind’s-eye), the kind that afflicts people writing under pressure.

Here’s the full sentence, the completion of what I’ve teased you with above:

“They tried to force the ‘savages’ to convert to Christianity by throwing the Christian bible in their face.”

We’re not talking friendly persuasion!

The actual error is the number disagreement between “savages”/”their” and “face.” Attending to the plural might have made my writer pause and rethink his cliché. But he did not notice the mistake, and so he gave me a moment of hilarity in the midst of my midterm tears.

“Don’t throw that in my face,” we say, when someone we’re arguing with refers to a past gaffe or stupidity and thereby scores a point. I think that’s the principal usage for this phrase, isn’t it? If so, it’s not the cliché my student should have chosen (as long as he was determined to choose a cliché at all). Still, any reader knows he wasn’t speaking literally: we have no record of someone actually throwing Bibles at Indians, nor would my student have tried to claim anyone actually did. He just meant, I’m sure, that missionaries and others pushed the Christian message again and again, unremittingly, brooking no protest and engaging in no debate—for the “savages'” own good, assuredly, the missionaries must have believed.

Still, I did laugh at the ridiculous image, the cartoon that flitted across my mind’s eye, Chingachgook throwing up his hands to protect himself from the barrage of airborne Bibles being flung by the hot-eyed, high-collared holy.

Well, we would all have been better off if the only ammunition had been Bibles. Still a spiritual assault, true, but causing much less bodily harm, and less permanent harm, than the bullets from the muskets of settlers and the rifles of soldiers.


“Once he started using drugs it was hard for him to stop because…”

This is the hapless son of yesterday and earlier posts. Actually the case write-up I gave the class(es) didn’t say the kid couldn’t stop using drugs, and the father, who killed him to save him from a junkie’s death, actually admitted that his son was NOT yet addicted. But the student who wrote this sentence was sympathetic to the boy, and preferred to assume, I’m sure, that he couldn’t stop—not that he just wouldn’t.

From the rest of the sentence we can see why stopping was so hard:

“Once he started using drugs it was hard for him to stop because he would start using them again.”

I don’t know what I can add here. Obviously the surest way to fail at quitting is to not quit, or to keep restarting.

I’m not sure how many student writers have trouble with the word “because,” but I can say with some certainty that many of my student writers do. If I asked this student what she meant, I’m pretty sure she would say she meant that the kid quit smoking dope more than once but kept going back to it. I do not think she would say that what was keeping him from quitting was not-quitting, even though that’s pretty much what her sentence says: the reason he couldn’t stop was that he kept on. She’s describing a cycle of behavior but inserting a false causality.

And that’s because she doesn’t seem to know what the word “because” actually means: it’s just a word to put between two actions. Proof of this assumption is that the facts in the case left a lot of room for speculation, a lot of clues. For instance, plenty of adolescents discover that their parents’ priorities are no longer their own priorities, and in this case the boy quit his high school varsity football team and lost interest in tossing the ball around with his (ex-pro-football-player) father; he began hanging around with kids from school instead of coming home to spend time with Dad; in defiance of school rules and probably Dad’s preferences he grew his hair long. Suspended for this dress-code infraction, instead of mending his ways he accepted expulsion and began to skulk in his room. Marijuana and cheap wine seem to be part of this effort to find his own way, or at least evade the path he suddenly didn’t want to tread any longer.

If he was rebelling against his father’s expectations of him, smoking dope may have been THE thing he knew would make his father crazy. In that case, Dad’s efforts to “help” him would have confirmed him in his refusal to cooperate.

Or, of course, at 17 he may have suddenly felt lost in a world he wasn’t sure he could deal with, and wanted to be left alone for a while to sort things out.

ANY of these might have been a “because”: “Once he started using drugs it was hard for him to stop because he wasn’t ready to deal with his confusion.” Or “once he started using drugs it was hard for him to stop because giving in to his father and the school would have been too embarrassing.” Or “once he started using drugs it was hard for him to stop because he liked the feeling of not having to care about anything.” Or “once he started using drugs it was hard for him to stop because for the first time in his life he felt really cool.”

By sticking a “because” where it made no sense, my student prevented herself from actually thinking about what might have been going on in that boy’s mind. Instead, she gave us a sentence that dooms itself to circularity and the whole essay to superficiality.

That’s why these errors, or “horrors,” or what-you-may-call-ems, matter: because rather than inviting thought they short-circuit it; because rather than opening new vistas of intellectual possibility, they pull the blinds and leave the writer in a situation like the boy’s in this case…safe and numb in the dark, getting through the day (or the assignment) but going nowhere.


“Nature possesses the ability to be seen in a multitude of perspectives.”

My students live in a panpsychist world.

How else explain the neediness of abstract or nonanimate things? Punishment needs to be dealt out. Merit needs to be rewarded. Attention needs to be paid.

With Pythagoras (reportedly the first to remark it), Spinoza, William James, and others, my students believe “everything is sentient.”

Here we have Nature, possessing the ability to be seen. She is visible! She is visible, in fact, “in” a variety of perspectives. This phrase evokes Andrew Marvell’s charming poem “The Picture of Little T.C. in a Prospect of Flowers.” It ends:

But O young beauty of the Woods,
Whom Nature courts with fruits and flow’rs,
Gather the Flow’rs, but spare the Buds;
Lest Flora angry at thy crime,
To kill her Infants in their prime,
Do quickly make th’ Example Yours;
And, ere we see,
Nip in the blossome all our hopes and Thee.

Little T.C. is “in” a prospect of flowers because she is “in” the picture, and the picture shows a prospect. I guess Nature could be seen “in” perspectives in the same sense.

But I believe my student meant that Nature can be seen from a multitude (why not “variety,” which is more to the point perhaps?) of perspectives: hence poetry.

Now, “can” might also imply sentience, or capability, on the part of Nature; acceptable usage lets that one slide. “Possesses the ability to” cannot be grandfathered in, though.

Well, on Thanksgiving Day, with a pie in the oven, one should not carp.

One should look up from one’s plate and gaze upon the variety (and, if you’re lucky, multitude) of faces looking back. One should consider the seemingly infinite variety of Nature, of which those dear faces are examples. One should be grateful not only that such variety—and such loveliness—exist, but also that they are visible. Whether everything is sentient or not, WE are sentient. Celebrate it.

Today, give thanks for everything. Tomorrow, the red pen. Tomorrow, gather the blossoms. Root out the weeds, by the way! But try even then to spare the buds. Mantra for a writing teacher?


“The artwork that was placed there for deceased loved ones was put there for a reason.”

My student is clearly indignant that the lover of graveyard art appropriated some objects from the cemeteries where he was employed.

I know she was indignant partly because the phrase “___ for a reason” is usually spoken in tones of indignation: “I told you not to touch the stove for a reason, Missy!” “I give homework for a reason, young man.” Implied is “for a very GOOD reason, and now you know, don’t you?”

And so I’m confident that indignation, rather than complete obliviousness, has produced this seemingly circular sentence. “For deceased loved ones” isn’t quite the reason the artwork was “placed there,” although it is perhaps the occasion, or the practical purpose. In the case of the assignment, she’s discussing a Tiffany window that had graced a family mausoleum until, neglected and evidently forgotten by any remaining kin, the mausoleum lost part of its roof and the walls began to shift, and the window itself sagged out of its frame, its glass stress-cracked and its leading weather-softened. The art-lover removed the window, took it home and restored it, and then sold it to an antiques fence (and was caught in the trap set to catch the fence).

Once upon a time, the window was put into the mausoleum, and the mausoleum was “placed there” to house the beloved remains—that is, the mausoleum was “for” deceased loved ones, and the window was part of the edifice. Both window and mausoleum commemorated the dead. But, at least according to western belief, the dead weren’t likely to be able to admire the window or the handsome stone-and-mortar work; they weren’t likely to celebrate being laid on shelves instead of buried in the ground, either, or consciously bask in the pools of colored light. What, then, could the “reason” be for erecting a lovely little building and installing a window made by a famous (and fashionable) artist?

The reason must have been multidimensional: to honor the dead (not merely to deposit them); to comfort the surviving family that they had “done right by” their forebears; to console and delight survivors when they came to lay wreaths or to bring more company to the deceased; … and to impress passersby with the dignity and wealth of the family.

In the context of the story, one might question the success of the installation. The mausoleum was in what the news report called a “neglected corner” of the cemetery, where passersby would be unlikely. The tomb itself was in a state of neglect and decay, meaning that the family had died out, moved away, or just forgotten about it: at any rate, nobody was laying wreaths, adding new “loved ones,” or stopping in for some private grieving. Nobody among the living was enjoying the window as it inched toward disintegration. No aspect of the reason was being fulfilled.

If I had been writing the essay, I might have followed the sentence with a discussion that went in exactly that direction: for a reason, but the reason is long forgotten. I probably would have been sad rather than indignant, and I probably would have been arguing that the theft hurt no one and should not be prosecuted as a felony. I believe such a case can be made.

My student didn’t go in that direction, though. She wrote her statement, and then she put her figurative hands on her figurative hips and went on to argue that no one had the right to take the artwork away, since it had been put there “for a reason.” She didn’t go into the reason at all, and she didn’t meditate on the pitiless tooth of Time and the decay of all earthly things.

This was, after all, comp class, not lit or creative writing.

And because she did NOT go into the reason, I suggested in my comments that the sentence was somewhat circular (or self-reflective) and seemed to belabor the obvious rather than making a point.

If students would come to office hours, so many interesting conversations might occur! But at the end of a paper in a stack of 40 papers, a sentence or two in cursive (which I write but which many of my students seem unable to read) can’t do much. I write my comments for a reason, but I’m not sure that the reason is fulfilled.


“Gods play a part in humans’ affairs, which I think is unfair…”

The plural “gods” suggests (accurately) that he was writing about the Greek and Roman gods, who certainly did play a part in humans’ affairs, and frequently were the humans’ affairs. Zeus in particular was pretty busy in the affair department. Less literally, many of the other gods took sides in human conflicts and manipulated events to suit themselves, or their adherents (or whoever had provided the most fragrant barbecue).

Readers who admire Hector and have little sympathy for Achilles in The Iliad tend to feel strongly about the gods’ meddling. They are probably inclined to find it unfair.

My student isn’t just taking sides, though; he has a good reason for his opinion:

“Gods play a part in humans’ affairs, which I think is unfair because they are inevitably immortal and won’t be affected.”

I agree with him! Intervening in a situation in a way that is sure to victimize—perhaps severely and irrevocably—one side may sometimes be justified; but when the intervenor knows all along that he or she cannot be harmed in the process or as a consequence, “unfair” is a word that can reasonably come to mind.

What gives a reader pause here is the word “inevitably.” Perhaps my student meant “invariably”? Actually, if the lore, legends, and teachings are to be believed, not all gods are immortal. Pan dies. Nietzsche even proclaimed of the great Western god (although he and countless thinkers since have made the statement far more nuanced and complex than it sounded), “Gott ist tot.” Ragnarök gets rid of them wholesale. So, depending on the god or gods one chooses, that immortality may not be “inevitable.”

And usually when we say “inevitably,” we imply that efforts have been made or could be made, all in vain. So have any gods tried to avoid being, or stop being, immortal? We might mention Jesus here, but his death was the precondition for his resurrection and therefore, although painful, risk-free. Yes, death certainly affected him in the moment, and changed his physical being; but it didn’t alter his immortality.

Would my student yell “NO FAIR!” at someone who claimed to “have God on our side”? Should we recall all those championships and medals won, according to their winners, “with God’s help”? Didn’t they have an unfair advantage?

Next time some god shows up at your house wanting to kibitz, should you tell him or her to move along and stay out of the affairs of mortals, “go back to your own kind”? Well, being human, I’d probably feel that my side was the right side, and welcome the aid of the divine to make sure things turned out “right.” That would be only fair.

Worked for Achilles.