You’ve Read This Post Before

The Glossary, a Los Angeles-based audiovisual marketing firm, has reinvented David Foster Wallace as a motivational speaker. This “fine purveyor of STIMULATING VIDEOGRAMS” edited together the best sound bites from Wallace’s graduation speech at Kenyon College, “This Is Water,” and then dressed them up with video, trendy animated scribbles, and sprightly background music.

The Glossary included the lines from the speech that haunted Wallace’s readers after he hanged himself:

Think of the old cliché about “the mind being an excellent servant but a terrible master.”

This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in the head. They shoot the terrible master. And the truth is that most of these suicides are actually dead long before they pull the trigger.

Returned to its original context as part of an exhortation to graduates to work towards mastery of their own perceptions – considering, for instance, that the overweight woman losing her temper in a checkout line might have spent the night with a dying husband and was not, in fact, just put on earth to annoy everyone in line behind her – the passage serves as a sort of radical motivation in which reimagination is the only way to keep oneself alive. Some critics, including Leslie Jamison, in her review of a Wallace biography, have rewritten Wallace’s suicide as a piece of postmodern performance art, with the “terrible master” passage a snippet of autobiography concealed by being waved in front of a crowd.

The less esoteric version has Wallace suffering from lifelong depression, forced to go off his medication because of severe side effects, and then, after falling into an even more severe depression and restarting the poison pills, discovering that they were no longer effective for him. Apparently, even if you are a genius, you still also have to be a person and a body with an uncooperative brain. Irreconcilable differences are bound to occur.

What surprises me about The Glossary video that has gone viral this week is that people find Wallace’s views so inspiring and revolutionary. In essence, he argues that most people ricochet back to the same mental point of origin, the panoramas that are so familiar we have stopped seeing them; but by prodding ourselves to consider other versions of what looks like reality, we are free to become better masters of our minds. He also acknowledges that getting outside ourselves is difficult, exhausting work, and he admits that sometimes he himself is too tired to engage in it.

To me, this celebration of possibilities is as good a definition of creativity as I’ve ever come across – something like mental Cubism, in which all realities can be embodied at the same time. But it also makes perfect sense to me that Wallace’s call to reinvent and reenvision, and the massive effort it takes to do so, would come from someone who was suicidal for enough of his life for a bullet in the brain to become a metaphor. With depression as the random point in space from which you view the world, death is always right in front of you, blocking your view. To survive, you have to imagine a different frame, in which the option of suicide is somewhere far in the distance, behind a closed door, somewhere you might visit sometime when you don’t have so many other things to do. Once you know where the door is located, though, it is impossible to forget it exists or how to open it.

In a speech at the 2011 National Book Festival, Toni Morrison briefly discussed her dissertation, which compared William Faulkner’s and Virginia Woolf’s conceptions of suicide. Faulkner viewed suicide as the ultimate defeat, Morrison explained, while Woolf saw it as a reasonable choice, in her case a rational alternative to putting herself and her husband through another period of psychosis. I tend toward Woolf’s view, and, I would guess, so did Wallace. Wallace’s “This Is Water” speech offers instructions for making other choices.

However, it is more than a little paradoxical that the speech has been appropriated by a marketing firm. As a former (mostly mediocre) ad writer, I’m in a position to know that the whole objective is to create materials that act as magnets, pulling thoughts in the intended direction without infringing on viewers’ certainty of their own free will. Within a few days, the video had attracted 2.7 million views, dwarfing the popularity of previous projects (and, incidentally, using audio of Wallace’s Kenyon speech without permission). In an Adweek interview, the creators claim, disingenuously in my opinion, “Our main goal was to expose people to the content of the speech.” Later in the interview, though, the creators concede, “…as a tiny company in an industry filled with so much talent and competition, it’s extremely difficult to get your work noticed…so we’d welcome anyone who enjoyed ‘This Is Water’ to get in touch with us.”

I’m reminded of the perennially puzzling sentences, “This statement is untrue” and “Question authority.” Wallace’s legacy will almost certainly transcend this little ripple in the information ecosystem, but I’m also fairly sure its undertow is meant to pull us down into the water.

Total B.S.: The Resurrection

Careercast, joining the multitudes with an unnatural fixation on professor-bashing, has declared “University Professor” to be the least stressful career because of “high growth opportunities, low health risks and substantial pay.” High-level corporate executive jobs – yes, the ones that gobble up bonuses and sail away in golden parachutes, even when they lure both their companies and taxpayers into economic netherworlds – were for some reason declared among the most stressful.

As with David C. Levy’s editorial claiming that faculty at my college are underworked and overpaid based on spurious information, the Careercast article is notable both for its disconnection from reality and for the Schadenfreude with which it is forwarded by people with much cushier careers. I love my job, and I’m not going to argue that it belongs on the Top 10 most-stressful-jobs list, but déjà vu moments are becoming more common than apocalypse predictions. Once again, the good writers have based their claims on faulty assumptions:

  1. Professors have high pay. For support, Careercast cited the (yes, dizzying) compensation of faculty from Harvard, the University of Chicago, and UCLA, ignoring the fact that most full time faculty at middle-tier institutions earn about half this amount and full time faculty at two year colleges earn about a third of it, even using the mysteriously inflated data in the study. The data for my own institution, Montgomery College, include only the salaries of full time faculty and omit adjunct salaries, which are measly.
  2. Faculty jobs are multiplying. Careercast misleadingly declares, “To maintain the quality of education while meeting the increased demand, universities are expected to add 305,700 adjunct and tenure-track professorial positions by 2020.” The article briefly mentions that competition is fierce for full time faculty positions (at MC, we usually receive at least 100-150 applicants for every opening) and cites the “new emphasis” on adjunct positions. However, most of the growth in faculty hiring has been in low-paid adjunct positions, with a good proportion of full time hiring to replace retiring faculty who were hired during community colleges’ hiring heydays in the 1970s.
  3. The job has few physical risks. I don’t have any official statistics, but with the massive teaching loads at institutions below the top tier of colleges and universities, overuse injuries and stress-related illnesses (like migraines) are rampant among faculty teaching more than 100 students a term. I’m not talking about whining over headaches, but about injuries that have interfered with work and have needed ongoing medical attention. We’re not saving lives here, but my neck, wrist, and shoulders are never going to be normal again, and huge numbers of my colleagues say the same thing. (Also, if the wacky proposals to require guns in classrooms are successful, the lethality of the job could increase quickly.)

Careercast considered these other factors in calculating stress levels, although they didn’t provide scores for each category:

  • Travel. Most of my long-distance travel is nominally optional, but necessary to stay current in my field. My short-distance travel is almost always reasonable – but if you’d asked me when I was an adjunct driving 500+ miles a week to different campuses, you would get a much different answer.
  • Deadlines. As far as I can tell, our faculty lives are one big calendar of deadlines – for conference proposals, articles, reviews, committee work, collaborations, class preparation. Most notably, courseload directly determines the amount of deadline pressure to respond to student work. A professor who assigns 25 pages of writing during a semester to 125 students is going to grade more than 3000 pages of writing in a semester, not counting homework.
  • Working in the public eye. Even when the public eye is mostly closed – as for this latest article – our work is scrutinized. Public speaking is the #1 most common phobia, and when faculty step in front of a classroom, watched by dozens of skeptical students and sometimes by their helicopter parents, it feels very, very public…witness the persistence of teaching nightmares.
  • Environmental conditions. Environmental risks vary by discipline, ranging from catching the illnesses students pick up from their children to dealing with toxic chemicals and toxic people.
  • Hazards encountered. We’re not exactly on the front lines, but considering that most of the mass shootings have involved students and colleges in some way and that we’re “first responders” for a variety of situations, I wouldn’t say our jobs are hazard free.
  • Own life at risk. Thankfully, our lives are usually not at risk, but occasionally domestic violence, gang violence, or mental instability can create dangerous situations. These are handled confidentially, so I don’t have statistics, but at my own college several threatening situations arise each semester.
  • Life of another at risk. I can say with absolute confidence that when I worked in advertising, nobody ever came to me afraid that she would be killed by a spouse or ex-spouse, that he or she was about to attempt suicide, that he had nothing to eat, that her parents had kicked her out, or that an addiction made him a danger to himself. Now that I am a professor, these situations come up several times a semester, and once in a while I’ve probably even had a small role in saving a handful of lives. That doesn’t even count the kind of lifesaving that happens when the support of a faculty member helps a student escape a soul-killing future, which is what I consider to be a good-sized chunk of my job.
  • Meeting the public. What is it that Careercast thinks we do at the start of every semester?

As I said, I am certainly not arguing that my job is the most stressful, and in fact, because my work is so satisfying, the stress doesn’t seem to matter as much as it would in a job that was empty of social value. On the other hand, knowing that I am doing something important in a population whose last best hope is often education carries its own sort of stress, because every day I must weigh my own needs against the needs of others and balance all sorts of competing projects that represent competing values.

It is interesting to me that level of responsibility, amount of prioritizing necessary to get the job done, and public bias (few have knowledge of what we do, but everyone has an opinion about it) weren’t considered in the criteria, since they’re known stress factors, but I’m not a pollster, so whatever. As I tell my students, I am privileged to have the best job in the world. It’s just not the best job in the world for the reasons some people think.

No, Really and Truly – The Absolutely, Positively Worst Ideas of 2012

For some reason, The Washington Post prematurely nominated its worst ideas of 2012 way back on October 1. All the Post’s bad ideas had to do with sexual indiscretion by powerful men, political incorrectness, hubris, or all three. The one bad decision in the bunch made by a woman was the failed ouster of University of Virginia president Teresa Sullivan, which was spearheaded by that self-appointed defender of vision, the unfortunately named Helen Dragas.

Speaking of hubris, though, the Post left out almost three months of bad ideas and almost an entire gender – which is sort of amusing, considering that some of the worst ideas of the year were about women. Here goes:

Do-it-yourself birth control: First, Foster Friess, a billionaire and mutual fund manager, kicked off the war on women when he suggested Bayer aspirin could prevent pregnancy: “The gals put it between their knees, and it wasn’t that costly.” In case we excused Friess’s comment as anomalous, Missouri Republican Todd Akin – also known for trying to eliminate school lunches for embryos that make it to grade school – defended prohibitions on abortion for rape victims by declaring, “If it’s a legitimate rape, the female body has ways to try to shut that whole thing down.”

Rape as God’s will: Not to be outdone, Indiana Republican Richard Mourdock argued – several times! – that any life resulting from rape was “something God intended to happen.” His idea manages to be terrible on several levels: first, that (despite its frequent appearance in the Bible) rape is acceptable because the ends justify the means; second, that God means to torture women; and third, that Mourdock somehow knows what God intends.

Ayn Rand: From Rand’s excruciating prose, eugenically selected protagonists, contempt for acts of generosity on the grounds that they enable helplessness, and glorification of selfishness, we learned that the Romney-Ryan defeat stemmed from the triumph of mediocrity rather than from Romney’s staggering ignorance of the world inhabited by the ordinary riffraff. (Dana Milbank’s piece in The Washington Post, “At Romney Headquarters, the Defeat of the 1%,” does the best job I’ve seen of showing that Romney’s insensitivity comes straight from the heart.)

Teachers bearing arms: If I actually have to explain why this is a terrible idea, please stop reading now.

The Second Amendment: If you skip the “well-regulated” and “necessary to a free state” parts, assault weapons make perfect sense.

Jonathan Franzen’s opinion of Edith Wharton: On the grounds that Wharton was unattractive and sexless, America’s most popular purveyor of unpleasant characters dismisses her entire body of work. The bad idea – which you really might expect someone at The New Yorker to question – is the underlying assumption that women have no artistic legitimacy without sex appeal.

New Yorker cartoons: Looking for sexism? Women carping at their downtrodden husbands? Gender dynamics that haven’t changed since the 1920s? I love The New Yorker, but I wish it would reconsider its tradition of phallocentrism.

Women are helpless, except when they’re not: Okay, I’m supposed to believe that the general of the most powerful military in the world was prostrate before the siren song of Paula Broadwell? Either he couldn’t resist – which I highly doubt, given that Petraeus was entrusted with our national security – or he could have resisted, but didn’t bother since the popular press would blame the woman anyway.

Voyeurism. Maybe Invisible Children was a showcase for the arrogance of Jason Russell, but when TMZ broadcast him staggering naked through the streets of San Diego and ridiculed what was clearly a mental breakdown, it didn’t exactly show the public in a flattering light when we played along. Same with the photograph of a man about to be hit by a NYC subway car. And same with the anguished photo of a woman trying to find out the fate of her sister, who had already been killed by the Sandy Hook shooter.

Illusions of privacy. Yes, my privacy has gone the way of the Twinkie, without the anti-union rhetoric. I value privacy, but not when it gets in the way of seeing the cartoons and photos my friends post or being able to avoid entering twice as many addresses into Google Maps on my phone.

The end of the world. The true bad idea here is that I didn’t plan an end-of-the-world potluck holiday party this time around; I hosted one in 1999, asking guests to bring the dish they would want to eat if the world really ended at the turn of the millennium. Good times. P.S. Runner-up: blaming the prediction on the Mayans.

The end of the list. And if you believe that these are the only worst ideas of 2012, I have something I want to sell you. Close your eyes, hold out your hands, and count to ten.

Survival Guide for the U.S. Election Season

As a U.S. citizen, I am fortunate to live in a country in which gargantuan ethical and civic questions can be decided by an election. According to the U.S. Census, 71% of eligible citizens are registered to vote, but only 57% of the voting-age population voted for president in the 2008 elections. Only about a quarter of eligible voters in the age range of most of my students vote, which means that each voter under 30 gets to make decisions for three nonvoting peers.

In each election cycle in my 10+ years of teaching, I have urged my students to vote. Early on, I tried to cultivate a sense of civic responsibility, and they countered with arguments that the candidates were not substantially different, that all politicians lie, that they (the students) felt they were not informed enough about issues to give an opinion at the polls, that their votes wouldn’t matter to the outcome, and that no candidate’s point of view represented their own.

And, these days, in each election cycle I have argued that the simple act of voting – even if their candidates and initiatives lose – makes it more likely that politicians will pay attention to the needs of their demographic. About 70% of older voters cast ballots, which can’t possibly be unrelated to the way social supports (such as they are) play out. Why, I ask my students, do you think tuition is skyrocketing, childcare is basically unaffordable, and student loans enrich the bankers and impoverish underemployed college graduates? Why aren’t there more jobs for young people just entering the workforce? If you were a politician, I go on, why would you spend your time on legislation to help people your age when people four times your age are almost three times as likely to vote?

Usually, at this point, I can look up and see a room full of mildly shocked eyes. I like to tell myself that I have made a compelling argument, and I never see anyone sleeping or text messaging for this particular speech, but I have never once had a single student tell me that I convinced her to vote, either, so the shock must be that it’s the middle of the semester and they have only just realized their professor sometimes makes stuff up.

Complaining, on the other hand, is a truly participatory American process. I am not one of those people who goes around telling non-voters that they have no right to complain. First, I sincerely believe that everyone has a right to complain; but second – and more to the point – getting people to stop complaining is like getting DC drivers to stop running through crosswalks: you may be absolutely right, down to your last molecule, but you’re still going to get obliterated by a vehicle whose driver can feel the steering wheel in his hands.

I don’t know whether it’s a function of maturity (or was that a euphemism for cynicism?), the parting of the red sea from the blue sea in American politics, or exasperation with a political system that is far to the right of my own beliefs, but I have also stopped enjoying conversations about politics. All such conversations end exactly like my impassioned pleas to get my students to vote – that is to say, with a high probability that everyone’s minds will be just as unmoved as before I used up all that oxygen.

Political conversations have become a bore, because they have such limited possible outcomes:

  1. You express outrage to people who are outraged about the exact same things…and nobody’s mind changes. (1.a. is that you offer new facts to add to someone’s pre-existing outrage.)
  2. You express your fabulously well-thought-out opinion to someone with whom you disagree, you argue, and, if you’re particularly tactless or impassioned, you discover you can’t talk about politics…and nobody’s mind changes. (2.a. is that you decide you are so horrified by the other person’s politics that you will never speak to each other again, at least until the election is over. 2.b. is that you are secretly horrified that you know and like someone who would have opinions you think should have gone the way of bloodletting as a cure for illness.)
  3. You listen while someone passionately tells you to believe something you already believe, vote in a way you will already vote, or regard the other side as stupid and crazy.
  4. You listen while someone passionately tells you to believe something you are not going to believe anyway, and you realize the other side is stupid and crazy.
  5. The person who disagrees with you makes good points, but you still disagree.
  6. You express your not-so-well-thought-out opinion and refine it so that you gain a better understanding of what you believe. (In my opinion, this is the only good reason to make a political argument these days.)
  7. The dream that you might convince someone who disagrees with you makes you go on and on and on and on and on and on about what you believe.

I have several friends who are both political junkies and chain smokers. My unscientific estimate is that a political conversation results in changing someone’s mind about as often as a conversation about smoking convinces a smoker to quit. It’s not impossible, but you might see a unicorn first.

To me, the only realistic option is 8: You already know what you believe and accept that you can’t convince anyone, so you don’t bother talking about politics.

But, someone will argue, can’t you sway someone who is undecided to take your side?

Um, no. Not really. If someone has trouble choosing between Obama and a presidential candidate who believes that only some people deserve food and healthcare; that science and history should conform to one’s ideology; that some pigs are more equal than others; that Atlas Shrugged should replace the Bill of Rights; that 47% of Americans who are retired, raising kids, or going to college are freeloaders; that government control is bad except when it pertains to women’s bodies; and that it’s refreshingly resourceful to strap the family dog to the roof of the car so the luggage can ride inside, there’s not a whole lot to talk about.

In other words, shut up and vote.

Mommy Issues and the War on Teaching Faculty

First of all, thank you to everyone who read, forwarded, and commented on “The Shelf Life of Total B.S.” I am awed and honored.

When I wrote my response to David C. Levy’s salvo against teaching faculty, I expected that its readership would be limited to the same dozen or so long-suffering friends who’d hung on for the last few entries. I had even thought of retiring the blog, and I might have done so were it not for the encouragement of these readers.

I would absolutely never have predicted that journalist Kaustuv Basu would call my office the next morning to interview me for an article in Inside Higher Ed – my first thought was, “Do these people know what a nobody I am?” – nor that my blog entry would gain support from higher ed colleagues across the country, and certainly not that I would seem worthy of ridicule in Gawker.

The Gawker article, “College Professors Find Plenty of Time to Be Outraged About Being Called Not Busy,” was dwarfed by a photograph of a balding, disheveled white man evidently snoring in a recliner, cat by his side:

True? Untrue? It doesn’t matter. (Except to academia [=boring].) When it comes to winning these public debates, all that matters are the “optics” of the thing. From Inside Higher Ed:

Jill Kronstadt, an associate professor of English at Montgomery College, was in the middle of grading papers Sunday when she came across a Washington Post opinion piece questioning whether college professors work hard enough.

She was upset.

Kronstadt spent the next few hours writing a rebuttal to the piece

“I am so outraged about your piece insinuating that I do not have way too much work to do that I just stopped doing my little bit of work and spent hours crafting a response to you, because hey, I have the time for that,” is what I imagine her intro said.

Apart from author Hamilton Nolan’s not bothering to link to – or, it seems, even read – my response, and apart from the fact (which I emphasized to Basu several times when he interviewed me, but which he nevertheless misrepresented) that I finished grading and then wrote the blog, not the other way around, I find a much bigger and arguably more sinister message.

Since when does it show a poor work ethic to take a few hours on a Sunday to do something not strictly work related? The underlying assumption is that teaching faculty are not working hard enough unless they are working every single minute of every single day and weekend. On the other hand, Levy’s article argues, “The faculties of research universities are at the center of America’s progress in intellectual, technological and scientific pursuits, and there should be no quarrel with their financial rewards or schedules.”

Levy’s Ayn Rand-like contention that researchers and not teachers are central to national prosperity – carrying with it the idea of trickle-down prosperity rather than robust education for all – is arguable at best. In Levy’s formulation, researchers deserve unquestioned reverence, whereas teachers should be followed around with stopwatches. But why is the stereotype of Ivory Tower slackers so enduring?

In pondering this question, which Basu also posed to me in our interview, and to which I then had no answer, it occurred to me that there is another group of people who are disparaged and even hated unless they work incessantly, give infinitely, and sacrifice endlessly without crass hopes for things like compensation, appreciation, or societal supports. And, in the likely event that life is not 100% certifiably perfect, members of this group are the first ones assigned blame.

That’s right: mothers.

It so happens that Levy’s double standard falls along gender lines. According to a 2006 study, “AAUP Faculty Gender Equity Indicators,” full professors at doctorate-granting institutions – the faculty Levy singles out for particular reverence – are only 19.3% female even though women earn nearly 50% of doctoral degrees awarded. At community colleges, by contrast, women compose 50.8% of full time faculty and 51% of part time faculty.

The gender breakdown of K-12 teachers, however, dwarfs the disparities in higher education. A 2006 Harvard University report, “The Segregation of American Teachers,” states that women occupy 75% of the teaching positions in public schools. Based on these statistics, it doesn’t seem like a big leap to say that the word “teacher” conjures images of women, and the word “professor” elicits images of men. A 2000 article in Teaching Sociology (Oct 2000: 28.4) confirms this suspicion. Perhaps coincidentally, the public has assigned the majority of the blame for the crisis in American education to K-12 teachers, with undergraduate faculty making rapid gains, as documented in the oft-quoted book by Richard Arum and Josipa Roksa, Academically Adrift: Limited Learning on College Campuses.

In fact, many studies have documented gender bias in student evaluations of faculty. For example, one study conducted by researchers at Harvard University, Clemson University, and the University of Virginia found, “In biology and chemistry, male students tended to underrate their female teachers, but female students did not. In physics, both male and female students tended to underrate their female teachers.” An October 2008 article in Political Science & Politics, “All-Knowing or All-Nurturing? Student Expectations, Gender Roles, and Practical Suggestions for Women in the Classroom” (41:4) lists multiple studies showing gender bias and offers advice to female faculty for mitigating the effects of this bias on their own evaluations.

These many student evaluation and demographic studies seem to imply that systemic gender bias is a factor in perceptions of the value of teaching faculty. In my cursory, inexpert view, attacks on educators and educational institutions seem to be directly proportional to the percentage of women in these institutions. If students unconsciously expect their female professors to act like mothers, it seems plausible that the average unreflective, media-saturated member of the general public would impose his or her expectations of mothers onto teaching faculty as a group.

Gender bias could also explain how Levy’s unqualified endorsement of research over teaching faculty found publication despite a lack of factual and logical support. Men contribute to society; women nurture. Male faculty conduct important research; female faculty teach.

It is true that we have a higher education crisis. Enrollments have skyrocketed, particularly in community colleges like Montgomery College, and counties and states, which have lost revenue in the economic downturn, have underfunded education, and especially higher education. Consequently, the cost of education has been offloaded onto students in the form of increases in tuition and class size and the hiring of underpaid part time faculty (a population which, incidentally, tends to be balanced by gender).

Recently, we have seen the devaluation of women play out everywhere from Arizona’s legislative attacks on contraception and reproductive choice to Hilary Rosen’s spurious claim that Ann Romney never worked a day in her life. If students routinely perceive female faculty as less competent than males, is it wrong to wonder whether the public perceives female-dominated institutional roles to be less valuable than male-dominated ones? And, as we try to control education costs and improve outcomes, are we letting societal mommy issues obscure the real solutions?

Gladly Beyond: A Place for Literature

The e.e. cummings poem that begins, “somewhere i have never travelled,gladly beyond/any experience,your eyes have their silence” and ends “nobody,not even the rain,has such small hands” is unabashedly about love: The poem figures prominently in Woody Allen’s film Hannah and Her Sisters and was swapped between lovers in my college dorm. The idea of finding the one person whose looks can “unclose” you seduces anyone capable of undressing in the throes of a soul-mate fever dream.

Nonetheless, the lines of this poem – which seem so tenderly meant for a lover – immediately sprang to mind when, preparing for the beginning of the semester last week, I thought about how to articulate my love of literature to my new students. Reading: so much like the unfolding between cummings’ lovers, only exponentially more promiscuous. Every work of literature opens its own universe. I have only one bricks-and-mortar life, but literature gives me thousands of consciousnesses in hundreds of times and places. Each book uncloses me, transports me out of myself and into lives that are absolutely, impossibly not my own.

When I talked to my students that first day, I shared the reasons for my passion for literature and saw that my students appreciated my love of my subject but did not share it. Seeing their skepticism, I dutifully trotted out the pragmatic reasons for careful, thoughtful reading and how they might apply to the career aspirations of the students in the class, but now I regret falling back on salesmanship.

Literature, it seems to me, is the antithesis of the agenda embedded in public discourse, of social networking and Web 2.0, of everything on demand 24/7. At least in the United States, we live in an age that exalts the individual; we devote more and more of our ingenuity towards customizing our own experiences – in other words, to limiting awareness to what we have already imagined and requested. Even in education, we judge success through measurable outcomes and whether college has conferred skills that mean something in “the real world.” On our separate phones and laptops, all password protected, we can choose the apps we want, the news sources whose views we espouse, and the people who share our own interests.

I think about the trouble some students have with reading – in 2011, only half of students had ACT reading scores predictive of college readiness – and suspect that at least part of the problem is that reading demands we enter someone else’s consciousness, that we desire to understand what is inaccessible to us and learn to decipher it. The hyperlinked Web 2.0 world, by contrast, privileges the self over the other and rewards predictability, even customizing ads and offers based on a user’s browsing history. In that world, reading becomes an act of receiving rather than of seeking: as Netflix puts it, “More like this.”

Literature, on the other hand, entices us from our own separate worlds into someone else’s, “whose texture/compels me with the color of its countries.” Reading, at its best, keeps us from emotional and intellectual celibacy. It gives us thousands of eyes, all unclosed, and, as we turn the pages of a book, allows us to transcend our own small hands.

The Stranger at the Table

Before she flew to her native Poland for the holidays, my doctor told me that, on Christmas Eve, Polish families set an empty place for “the stranger,” a person who, symbolically or actually, has nowhere else to go. In the United States, she lamented, Christmas has become so commercialized and gift-focused that Americans have lost focus on the celebration of family and friends that makes the holiday meaningful.

Supposedly I can trace some of my ancestry to Poland, but my family is Jewish, not Christian, and so for most of my life, the holidays have had a neither-nor quality. Hanukkah, indifferently promoted in gift catalogs and spread out over eight days that only sometimes intersect with Christmas, doesn’t have a prayer – forgive the pun – of competing with Christmas.

To be honest, I like it better that way. I am one of those people who describe themselves as more spiritual than religious, but I can see how Judaism has shaped my outlook. Some years I light Hanukkah candles, some years not. In most Jewish celebrations, as in Poland, particular objects have symbolic meaning. The menorah, which commemorates one day’s worth of oil lasting for eight after the rededication of the Temple in Jerusalem, celebrates (at least for me) the miracle of enduring spiritual light. The symbolism of a gift-buying blowout does not have meaning I care to celebrate. In that sense, my Polish doctor and I can find common ground despite having very different beliefs.

I am also fortunate to have been welcomed as the stranger at the table many, many times. When I lived in Seattle, I spent most Christmases with close friends. I don’t think I exaggerate when I say that their spectacular cooking was as good a way as any to celebrate our varied beliefs. My friends made crown roasts; I always brought homemade challah. One year, when the family piled into a car for midnight Mass at St. Mark’s Cathedral, whose choir is locally renowned, I even joined them. I don’t think I’ll ever forget the moment when, seeing me hover alone near the entryway while my friends took Communion, a priest approached to ask if there was anything he could do for me. I shook my head, smiled, and thanked him, not feeling the need to explain. Even after years of continuing to wander between holidays, his small kindness – his offer of the stranger’s seat at the table – still warms me with gratitude.

Compassion, no matter what its spiritual foundation, is the true miracle.