Rubbernecking

from the Boston Globe

It is fashionable to express contempt for those who drive past an accident and slow down to look. According to the critics, rubbernecking signifies a prurient interest in the misfortunes of others, a fundamental and irresistible inhumanity automatically triggered by the prospect of blood, gore, and emotional wreckage. The same principle applies to other varieties of voyeurism activated by celebrity meltdowns, tell-all memoirs, sexual indiscretions, mass tragedies, noble sacrifices, and spectacular acts of strength and courage. If we were a better species, not so prone to viewing destruction and exposure as entertainment, so the story goes, our curiosity would not be so much on display.

Personally, I’m not convinced that our human interest in calamity (and calamity barely averted) stems from something sordid that sprouts from the brickwork of civilization. In a work of literature, captivation begins where good luck runs out, and we attribute the burning compulsion to turn the page to curiosity or a search for meaning rather than bad character. When disaster hits bricks-and-mortar reality, though, the same impulse seems outré. If the medium is the message, then Twitter, Facebook, Reddit, and the blogosphere seem to lead us towards the worst of both fiction and reality, where facts and meaning are equally elusive.

Yes, I am talking about the Boston Marathon bombings.

When I see a car accident, I always, always look. I am not ashamed of looking. I want to know two things: Is it someone I know? and Are the victims okay? I do not seek the frisson of adrenaline that comes from contorted metal or from imagining something worse behind the ambulances and fire trucks. When I was an undergraduate with a work-study job cataloguing historical photos, police photos of local car crashes made up a good portion of the collection, but I couldn’t bear to look at them; and in high school Driver’s Ed, when we were forced to watch several editions of the car-crash scare series Red Asphalt, I became so terrified I would kill someone that once I finally got my license I didn’t want to drive. In other words, I am looking for reassurance, not a cheap thrill at someone else’s expense.

I think that something similar happens when someone seemingly “normal”—or at least normal enough—commits a large-scale atrocity. Some people complain that we are more interested in the perpetrators than in the victims, who are more deserving of media attention. But, to me (and, I suspect, to others), the victims’ role is not nearly as frightening as the perpetrators’. Certain horrific acts, like what took place at the Boston Marathon, or Sandy Hook, or Aurora, or Tucson, make us seek answers to our most terrifying questions: Who could be capable of such a thing? Could I? Could someone I know? Would I recognize such a person? How does someone make the decision to become a terrorist? Could he have been stopped?

At least according to preliminary reports, both of the Boston Marathon bombers turned to violence in response to ordinary human pain: their parents’ divorce, immigration, a best friend’s murder. The evidently more volatile brother, who already felt out of place in the United States, lost the possibility of citizenship when he committed domestic violence and, in response, threw away his own humanity to retaliate with terrorism. He went to Bunker Hill Community College (where I have colleagues) and then dropped out while immigrants with similar problems kept going. The younger brother, the one almost universally described as warm, kind, and popular, bafflingly went along with his brother’s plans—why?

Peter Brown Hoffmeister, a high school teacher and former Huffington Post blogger, lost his HuffPost blogging gig when he submitted a post recounting his past as an angry, lonely, gun-obsessed young man. After being expelled for carrying a loaded, stolen handgun to high school, he got kicked out of two more schools before “the support of some incredible adults” and an outdoor program for troubled teens inspired him to straighten out. Compassion saves, at least sometimes. Maybe there will always be Loughners and Holmeses who spiral out of reach, but on the other side there are also Hoffmeisters who force us to ask, Couldn’t something have been done?

I have noticed that it’s much easier to throw around the “evil” label, to dehumanize, to call for the torture and death of the “monsters,” than to ask such questions—at least judging by the talk shows, media rhetoric, and inflammatory Facebook posts that have rippled through my feed the past few days. Now that the victims are maimed or dead, it’s too late for compassion to make a difference in the outcome, but to look for reasons is to acknowledge that there might have been a moment, or even moments, when someone might have intervened, or some time when a few kind words might have helped prevent so many worlds from breaking.

Total B.S.: The Resurrection

Careercast.com, joining the multitudes with an unnatural fixation on professor-bashing, has declared “University Professor” to be the least stressful career because of “high growth opportunities, low health risks and substantial pay.” High-level corporate executive jobs – yes, the ones that gobble up bonuses and sail away in golden parachutes, even when they lure both their companies and taxpayers into economic netherworlds – were for some reason declared among the most stressful.

As with David C. Levy’s editorial claiming that faculty at my college are underworked and overpaid based on spurious information, the Careercast article is notable both for its disconnection from reality and for the Schadenfreude with which it is forwarded by people with much cushier careers. I love my job, and I’m not going to argue that it belongs on the Top 10 most-stressful-jobs list, but déjà vu moments are becoming more common than apocalypse predictions. Once again, the good writers have based their claims on faulty assumptions:

  1. Professors have high pay.  For support, Careercast cited the (yes, dizzying) compensation of faculty at Harvard, the University of Chicago, and UCLA, ignoring the fact that most full-time faculty at middle-tier institutions earn about half this amount and full-time faculty at two-year colleges earn about a third of it, even using the mysteriously inflated data in the study. The data for my own institution, Montgomery College, include only full-time faculty salaries and omit adjunct pay, which is measly.
  2. Faculty jobs are multiplying. Careercast misleadingly declares, “To maintain the quality of education while meeting the increased demand, universities are expected to add 305,700 adjunct and tenure-track professorial positions by 2020.” The article briefly mentions that competition is fierce for full-time faculty positions (at MC, we usually receive 100-150 applicants for every opening) and cites the “new emphasis” on adjunct positions. However, most of the growth in faculty hiring has been in low-paid adjunct positions, with a good proportion of full-time hiring going to replace retiring faculty who were hired during community colleges’ hiring heyday in the 1970s.
  3. The job has few physical risks. I don’t have any official statistics, but with the massive teaching loads at institutions below the top tier of colleges and universities, overuse injuries and stress-related illnesses (like migraines) are rampant among faculty teaching more than 100 students a term. I’m not talking about whining over headaches, but about injuries that have interfered with work and needed ongoing medical attention. We’re not saving lives here, but my neck, wrist, and shoulders are never going to be normal again, and huge numbers of my colleagues say the same thing. (Also, if the wacky proposals to require guns in classrooms are successful, the lethality of the job could increase quickly.)

Careercast considered these other factors in calculating stress levels, although it didn’t provide scores for each category:

  • Travel. Most of my long-distance travel is nominally optional, but necessary to stay current in my field. My short-distance travel is almost always reasonable – but if you’d asked me when I was an adjunct driving 500+ miles a week to different campuses, you would have gotten a much different answer.
  • Deadlines. As far as I can tell, our faculty lives are one big calendar of deadlines – for conference proposals, articles, reviews, committee work, collaborations, class preparation. Most notably, courseload directly determines the amount of deadline pressure to respond to student work. A professor who assigns 25 pages of writing to 125 students is going to grade more than 3,000 pages in a semester, not counting homework (see the quick math after this list).
  • Working in the public eye. Even when the public eye is mostly closed – as for this latest article – our work is scrutinized. Public speaking is the most common phobia, and when faculty step in front of a classroom, scrutinized by dozens of skeptical students and sometimes by their helicopter parents, it feels very, very public…witness the persistence of teaching nightmares.
  • Environmental conditions. Environmental risks vary by discipline, ranging from catching the illnesses students pick up from their children to dealing with toxic chemicals and toxic people.
  • Hazards encountered. We’re not exactly on the front lines, but considering that most of the mass shootings have involved students and colleges in some way and that we’re “first responders” for a variety of situations, I wouldn’t say our jobs are hazard-free.
  • Own life at risk. Thankfully, our lives are usually not at risk, but occasionally domestic violence, gang violence, or mental instability can create dangerous situations. These are handled confidentially, so I don’t have statistics, but at my own college several threatening situations arise each semester.
  • Life of another at risk. I can say with absolute confidence that when I worked in advertising, nobody ever came to me afraid that she would be killed by a spouse or ex-spouse, that he or she was about to attempt suicide, that he had nothing to eat, that her parents had kicked her out, or that an addiction made him a danger to himself. Now that I am a professor, these situations come up several times a semester, and once in a while I’ve probably even had a small role in saving a handful of lives. That doesn’t even count the kind of lifesaving that happens when the support of a faculty member helps a student escape a soul-killing future, which is what I consider to be a good-sized chunk of my job.
  • Meeting the public. What is it that Careercast thinks we do at the start of every semester?
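
To put numbers on that deadline pressure – a back-of-the-envelope calculation using the hypothetical load from the Deadlines item above:

125 students × 25 assigned pages each = 3,125 pages of grading per semester

And that total still excludes homework.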

As I said, I am certainly not arguing that my job is the most stressful; in fact, because my work is so satisfying, the stress doesn’t seem to matter as much as it would in a job that was empty of social value. On the other hand, knowing that I am doing something important for a population whose last best hope is often education carries its own sort of stress, because every day I must weigh my own needs against the needs of others and balance all sorts of competing projects that represent competing values.

It is interesting to me that the level of responsibility, the amount of prioritizing necessary to get the job done, and public bias (few have knowledge of what we do, but everyone has an opinion about it) weren’t considered in the criteria, since they’re known stress factors – but I’m not a pollster, so whatever. As I tell my students, I am privileged to have the best job in the world. It’s just not the best job in the world for the reasons some people think.

Not with a Whimper but a Bang

This is the way the world ends
This is the way the world ends
This is the way the world ends
Not with a bang but a whimper.

-T.S. Eliot, “The Hollow Men”

Something about the slaughter of twenty first-graders and seven adults in Newtown, Connecticut makes me want to state the obvious rather than strive for eloquence. The dead deserve eloquence, but they will be honored more by a thoughtful response to our country’s dysfunctional relationship with guns.

Massacres of innocents with semiautomatic weapons have become so frequent that recent articles on the Sandy Hook shooting haven’t even had space for them all in their ledes. Grisly greatest hits like Columbine, Virginia Tech, Tucson, and Aurora usually get a mention, but so far there have been eight mass shootings in 2012, not including a bow-and-arrow attack at Casper College in Wyoming last week or the man who opened fire this morning at a hospital in Birmingham, Alabama. Also left off most lists of past shootings is Kip Kinkel, a depressed 15-year-old who killed his parents and two classmates and wounded 22 others in Springfield, Oregon in 1998 – yes, before Columbine – whose story was featured in Frontline but has been all but lost in the crowd of other shooters.

An objective observer might conclude that we have a problem.

It’s not a some-people-are-evil problem, a constitutional problem, or even a mental health system problem. It’s a gun problem.

How many times have you heard that the mass murderer of the moment was “always polite,” “perfectly normal,” or “doing well”? Kinkel’s parents were dimly aware of his psychiatric problems and tried to help him; Seung-Hui Cho and Jared Lee Loughner had attracted the attention of school officials who were unable to compel treatment; James Eagan Holmes had been seeing a psychiatrist. In most cases, the guns used in mass shootings were legally obtained. The overwhelming majority of people suffering from mental illness are not dangerous and never will be. However, the overwhelming majority of people, period, are clueless about what is going on with other people, period; and those who are not clueless are often reluctant to intervene, unsure of how to intervene, or helpless to intervene.

Meanwhile, shots continue to be fired. Firearms in the home significantly increase the risk of death from domestic violence, crime, suicide, and accidents. Gun-rights advocates rightly say that gun owners who are careful, properly trained, and law-abiding can safely use guns; they add that Second Amendment rights trump the risks. But since when are humans consistent about being careful, properly trained, and law-abiding? There are more than 17,000 car accidents per day in the U.S., with a crash-related death, on average, every 13 minutes.

With cars, though, the driver who makes the mistake is roughly at as much risk as the other drivers and passengers involved, which theoretically acts as a counterbalance to carelessness and stupidity. Not so with guns. Also, cars have keys, meaning that it is difficult for anyone but the lawful owner to use them. Again, not the case with guns. In a perfect world, only people kill people, and on purpose. But our world, the real one with routine violence and accidental death, is filled with rampant imperfection and frequent errors in judgment. It’s nice to think that only responsible people will use guns, or that these good citizens can somehow deter killers who have abandoned civility or reason, but reality is not on the side of idealism.

The gun control topic has come up regularly in my classes since I started teaching. At first, I adamantly opposed all guns in all circumstances, and I regarded the fiery psyches of my gun-owning students with suspicion. From talking with students, though, I realized that while in cities guns are used overwhelmingly for violence, in rural areas they are necessary for protecting and euthanizing livestock and sometimes for defending humans against large predators. When I moved to DC and commuted on a highway to work, seeing so many deer disemboweled by cars even made me sympathetic to hunting: Which is more cruel, a clean shot or a painful and terrifying evisceration by accident?

But semiautomatic weapons? Seriously? In Newtown, the six- and seven-year-olds were shot multiple times, presumably because the guns Adam Lanza used continued to fire after the children were hit. In this as in so many other things, the bullets are speeding towards their victims much more rapidly than a shooter can think.

We’ve had almost fourteen years to think about Columbine, though, and as the gratuitous death toll has mounted, the political environment has become more hostile to gun control. So many families will go through the holidays missing loved ones who died for no reason – or, rather, who died because skewed notions of self-defense and the right to hunt have overshadowed the reality of the world we live in, in which the killers right in front of us are far more dangerous than the ones from which we imagine guns will protect us.

Reality is on the side of reinstating the ban on semiautomatic weapons, keeping guns out of schools and other public places, requiring robust background checks and review of owners’ continued ability to use guns responsibly (we do it for driver’s licenses!), and considering possession of lethal weapons as a factor in judging whether a mentally ill patient is a danger to him/herself or others.

According to the ancient Mayans (or at least catastrophizers crediting the ancient Mayans), the world is supposed to end on 12/21/12. If the world really ends, I may die regretting my blithe attitude – another day, another apocalypse that hasn’t materialized – but really I’ve hardly given the date much more airtime than it takes to roll my eyes.

In the case of guns, on the other hand, it’s time to stop pretending that we can do nothing to prevent another apocalypse like the many others that have unfolded in the past year. For every family dealing with the aftermath of the dozens of shootings that have cumulatively caused hundreds of avoidable murders, the apocalypse has already come and gone – and any of the first graders who were killed at Sandy Hook, if they had lived, could have told the rest of us what we should do to stop the next one.

The Spectre of Sandy

Hurricane Sandy’s wind field as of 6 p.m. Sunday evening (Brian McNoldy), via Capital Weather Gang

On Friday, I shared a climate change case study with my transfer composition class, simulating what would happen in an eight-foot storm surge in New York City. The case study came from a four-day workshop at Dickinson College a couple of years ago, Cooling the Curriculum, which aimed to help liberal arts faculty integrate climate science into their courses in meaningful ways.

The storm surge simulation had been on the syllabus since last summer, but it ended up coinciding with the approach of Hurricane Sandy, which by then had already killed 29 people en route to the U.S. East Coast. While my students picked apart the data in the case study and discussed what it meant in an emergency, I pulled up the New York Times on the computer projector to show students how New York City planned to respond to Sandy. As of the middle of class, planners seemed nonchalant, saying they had no plans for subway closures and that they anticipated the storm would be much less severe than Irene.

Two nights later, the simulation has become a horrifying reality, with massive evacuations and citywide closures. For a while, it seemed like Washington, D.C., where I live, would not need to take such drastic measures, but by dusk, the National Weather Service had predicted hurricane-force winds, feet of snow to the west, and probable flooding of the Chesapeake and Potomac. Then the DC Metro, too, announced closures beginning at midnight; the rain totals shot up to a possible ten inches, and the Washington Post’s Capital Weather Gang blogged that people shouldn’t venture outside after tomorrow afternoon because of the risk of falling trees and flying debris. Almost as alarmingly, I received robo-calls from the power company and Comcast, plus emails from a credit card company, warning of extended outages and waiving fees, respectively. (Few things are as scary as a credit card company having a fit of generosity.) We were bombarded with messages on how to prepare, along with dire advisories on how to protect pets in hurricanes. I wondered: could a hurricane-force wind lift a small dog?

Suddenly, the storm threat we’d discussed in class seemed starkly real, and the giant lollipop dwarfing the coastline looked nightmarish and psychedelic. Until I moved to DC (the quake that damaged the Washington Monument and National Cathedral notwithstanding), I had always lived in earthquake country – California, and then Seattle – where catastrophe could strike without warning, which spared us the spectre of watching it approach.

To the many, many friends, students, colleagues, and family members I know who will be impacted by this storm: I am scared with you and for you. May we all find shelter, and may we all emerge from it safely.

Survival Guide for the U.S. Election Season

As a U.S. citizen, I am fortunate to live in a country in which gargantuan ethical and civic questions can be decided by an election. According to the U.S. Census, 71% of eligible citizens are registered to vote, but only 57% of the voting-age population voted for president in the 2008 elections. Only about a quarter of eligible voters in the age range of most of my students vote, which means that each voter under 30 gets to make decisions for three peers who stayed home.

In each election cycle in my 10+ years of teaching, I have urged my students to vote. Early on, I tried to cultivate a sense of civic responsibility, and they countered with arguments that the candidates were not substantially different, that all politicians lie, that they (the students) felt they were not informed enough about issues to give an opinion at the polls, that their votes wouldn’t matter to the outcome, and that no candidate’s point of view represented their own.

These days, I argue instead that the simple act of voting – even if their candidates and initiatives lose – makes it more likely that politicians will pay attention to the needs of their demographic. About 70% of older voters cast ballots, which can’t possibly be unrelated to the way social supports (such as they are) play out. Why, I ask my students, do you think tuition is skyrocketing, childcare is basically unaffordable, and student loans enrich the bankers and impoverish underemployed college graduates? Why aren’t there more jobs for young people just entering the workforce? If you were a politician, I go on, why would you spend your time on legislation to help people your age when people four times your age are almost three times as likely to vote?
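
The quick math behind those figures:

25% turnout = 1 ballot for every 4 eligible voters under 30, i.e., each young voter deciding for three others
70% ÷ 25% = 2.8, i.e., older citizens almost three times as likely to vote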

Usually, at this point, I can look up and see a room full of mildly shocked eyes. I like to tell myself that I have made a compelling argument, and I never see anyone sleeping or text messaging for this particular speech, but I have never once had a single student tell me that I convinced her to vote, either, so the shock must be that it’s the middle of the semester and they have only just realized their professor sometimes makes stuff up.

Complaining, on the other hand, is a truly participatory American process. I am not one of those people who goes around telling non-voters that they have no right to complain. First, I sincerely believe that everyone has a right to complain; but second – and more to the point – getting people to stop complaining is like getting DC cars to stop running through crosswalks: if you try to stop them, every absolutely-in-the-right molecule in your body is still going to get obliterated by a vehicle whose driver can feel the steering wheel in his hands.

I don’t know whether it’s a function of maturity (or was that a euphemism for cynicism?), the parting of the red sea from the blue sea in American politics, or exasperation with a political system that is far to the right of my own beliefs, but I have also stopped enjoying conversations about politics. All such conversations end exactly like my impassioned pleas to get my students to vote – that is to say, with a high probability that everyone’s minds will be just as unmoved as before I used up all that oxygen.

Political conversations have become a bore, because they have such limited possible outcomes:

  1. You express outrage to people who are outraged about the exact same things…and nobody’s mind changes. (1.a. is that you offer new facts to add to someone’s pre-existing outrage.)
  2. You express your fabulously well-thought-out opinion to someone with whom you disagree, you argue, and, if you’re particularly tactless or impassioned, you discover you can’t talk about politics…and nobody’s mind changes. (2.a. is that you decide you are so horrified by the other person’s politics that you will never speak to each other again, at least until the election is over. 2.b. is that you are secretly horrified that you know and like someone who would have opinions you think should have gone the way of bloodletting as a cure for illness.)
  3. You listen while someone passionately tells you to believe something you already believe, vote in a way you will already vote, or regard the other side as stupid and crazy.
  4. You listen while someone passionately tells you to believe something you are not going to believe anyway, and you realize the other side is stupid and crazy.
  5. The person who disagrees with you makes good points, but you still disagree.
  6. You express your not-so-well-thought-out opinion and refine it so that you gain a better understanding of what you believe. (In my opinion, this is the only good reason to make a political argument these days.)
  7. The dream that you might convince someone who disagrees with you makes you go on and on and on and on and on and on about what you believe.

I have several friends who are both political junkies and chain smokers. My unscientific estimate is that a political conversation results in changing someone’s mind about as often as a conversation about smoking convinces a smoker to quit. It’s not impossible, but you might see a unicorn first.

To me, the only realistic option is 8: You already know what you believe and accept that you can’t convince anyone, so you don’t bother talking about politics.

But, someone will argue, can’t you sway someone who is undecided to take your side?

Um, no. Not really. If someone has trouble choosing between Obama and a presidential candidate who believes that only some people deserve food and healthcare; that science and history should conform to one’s ideology; that some pigs are more equal than others; that Atlas Shrugged should replace the Bill of Rights; that 47% of Americans who are retired, raising kids, or going to college are freeloaders; that government control is bad except when it pertains to women’s bodies; and that it’s refreshingly resourceful to strap the family dog to the roof of the car so the luggage can ride inside, there’s not a whole lot to talk about.

In other words, shut up and vote.

The Million Meeting March

On the first day of class, I always try to avoid reading through the syllabus. One reason is that students, anticipating a day of tedium, sometimes skip the first day and miss hearing the course requirements. The main reason, though, is that I don’t want to start the semester with a day of tedium.

For faculty, however, the academic year generally starts with several days of tedium, most of which have no relevance to the things that excite us about teaching. At the start of the year, what I find most invigorating are discussions of pedagogy, innovative assignments, and shared insights from a summer of reflection; but where we spend most of our time is sitting in large halls, listening to announcement after announcement and wishing we were back in the classroom.

In my opinion, the just-kill-me-now endless rounds of meetings are a wasted opportunity. When we teach a course, we have course outcomes and activities that support those outcomes. The more interactive and active the classroom activities are, the more likely participants are to meet the outcomes. When we go to conferences, we have specific disciplinary or pedagogical issues and problems that the presentations address. In most of the everyone-must-go meetings, the intended outcomes are not clear, the activities are not tied to student success and don’t actively involve participants, and the speakers generally repeat information that has already been delivered via email, at other meetings, and even within the same meeting.

I have heard staff complain that the meeting content is too academic, and faculty routinely complain that the meetings rob us of valuable start-of-semester prep time. Administrators, as far as I can tell, are not free to give their opinions. I will be honest: I am writing this blog entry from an auditorium in a meeting that has already gone on nearly two hours and shows no sign of ending. The highlights were the delightful acceptance speeches for the staff and faculty excellence awards – but they didn’t come up in the agenda until well after the meeting was scheduled to end.

While we’re sitting in here, our college wrestles with a reorganization, an upcoming accreditation review, budget crises, public pressure to demonstrate that our students are learning, and sometimes-flailing efforts to bolster student retention, persistence, and completion. As of this writing, midway through the week, I have already attended more than 12 hours of meetings, with at least six more hours to come before the end of the week. The clock is ticking, and my fingernails and toenails are turning blue.

As I have said more than once, I am blessed to work at a college where everyone – faculty, staff, and administration – is talented, intelligent, innovative, collegial, and dedicated to improving the lives of our students. So why are we wasting our own time?

The Teacher’s Pet

After about twenty years of thinking about it and ten years of talking about it, I finally adopted a dog from the Washington Animal Rescue League, which looks and acts like a different species from the typical animal shelter. Because I’ve had cats for my entire adulthood, people’s reactions have ranged from outrage (“But I thought you were a cat person!”) to misty idealism (“Having a dog will completely change your life”) to tears (mine, anyway).

As I said many times before I ever adopted a dog, I am an animal person who (up until now) has had a cat lifestyle. Every creature with four legs and fur has been at risk of adoption since the day I was born. My life has definitely changed, but not in the unconditional-love-at-last sense, since the five different cats I’ve had in my lifetime have all done their solicitous best to defy every myth of feline independence.

For some reason, people also feel compelled to give commentary on my dog’s name, Kerfuffle. About a third of the people who hear it think it’s the best name ever, a third have no idea what “Kerfuffle” means and say something along the lines of, “That’s a mouthful!” and the rest suggest that the name has too many syllables. “Cockapoo” also has three syllables, so I’m not sure what the problem is. Secretly, I call him Kerfuffle Cappuccino, Kerfuffle Puffle, Kerfuffleupagus, or often, just “Fuff.”

People also feel compelled to opine on Kerfuffle or me, either directly or obliquely. One elderly walker of two chubby Shih Tzu mixes told him sternly and insistently, “Your tail should be wagging!” When I tried to quiet Kerfuffle at the dog park, surrounded by dogs five times his size, another owner said, “Oh, let him bark!” I’ve noticed, though, that nothing breaks up a conversation like a barking dog. Some of the advice, like other owners’ informal reviews of dogwalkers and doggie daycares and groomers, is welcome and useful. Most of the owners are responsible and loving towards their dogs. One of my favorite dogs in the neighborhood, though, scared Kerfuffle with a rough invitation to play, and other large-dog owners seem mystified that Kerfuffle might feel a little hesitant around the gigantic Cerberuses strutting through the streets.

It’s not at all that I don’t need advice, since my transition to dog owner – while far from tempestuous – has not been entirely smooth. Kerfuffle arrived with an ear infection, a cold that needed antibiotics a few days after he came home, separation anxiety, and a predilection for stealing cat toys and eating paper products. One of the first things he ate was a rough draft of a story, and it occurred to me that being an English professor and writer with a paper-eating dog might eventually pose some problems. Next, he started what I learned was “resource guarding” (a common issue in which dogs defend their food and toys), snapped at the vet when she tried to look inside his ears, began barking at new people in the building hallways, chased my cats, refereed their tumbling play with more barking, and developed a habit of barking at a certain corner in the neighborhood where dogs and owners tend to congregate. At times I feel like I have a misbehaving toddler at the end of the leash!

You would think that someone who can teach students to write 10-page essays or love Hamlet would be able to train a 17-pound cockapoo. I grew up around family dogs, rode and helped out with horses, and managed to teach my cats some house rules. Kerfuffle, for his part, seems sweet-natured and eager to please, and I’m fairly sure that someone put a lot of effort into training him, based on his generally excellent leash manners and apparent prior knowledge of “Stay.” A couple of years ago, I read David Wroblewski’s novel, The Story of Edgar Sawtelle, and was absolutely captivated (among other things) by the information on dog training and psychology embedded in the book. I was determined to have a happy, well-behaved dog, I felt able to be clear and consistent, and I understood that most dogs feel most secure when their owners provide them with a sense of purpose.

Yeah, right. Can you hear Anubis snorting at me from where you’re sitting? There is something uniquely humbling about having the best of intentions and yet still managing to confuse a generally willing, intelligent dog. So far, Kerfuffle has learned (or maybe relearned) to sit, sit-stay, stop at street corners instead of rushing out into traffic, not jump on me or the furniture (I would love to have him up there, but aboveground is cat country), not chase the cats, release his toys, quit begging for human food, and start walking on the leash when I say “Let’s go.” From Kerfuffle and the many people I have consulted about Kerfuffle, I have learned that I am not quick or gushy enough with praise, that forgetting to pick up a dog dish when you have cats is just asking for trouble, that raw honey is a good treatment for itchy skin (I haven’t tried that one yet), that walking a couple of hours a day is good for both dogs and humans, that I’m not as consistent as I think I am, that to teach a dog to stop barking you should first teach him “Speak,” and that training with positive reinforcement requires my constant, focused attention.

Back to school, indeed.

What to Wear to the Apocalypse

Average temperature change since 1970, from http://www.climatecentral.org

I remember when the weather used to be a safe topic of conversation. Actually, the weather hasn’t been a safe topic of conversation for about 150 years, but around the time Al Gore released An Inconvenient Truth, the subject became downright inflammatory.

I was an adjunct at the time, and when I heard that one of the feeder districts for the community college where I was teaching – Federal Way School District in Washington State – had placed a “moratorium” on showing the DVD and required teachers using the film in class to include a “credible, legitimate opposing view” (which, according to all but a handful of climate scientists, does not exist), I decided to incorporate it into my syllabus. The now-extinct Seattle Post-Intelligencer reported with barely concealed sarcasm that the regrettably named Frosty Hardison had contacted the School Board to oppose the film:

“Condoms don’t belong in school, and neither does Al Gore. He’s not a schoolteacher,” said Frosty Hardison, a parent of seven who also said that he believes the Earth is 14,000 years old. “The information that’s being presented is a very cockeyed view of what the truth is. … The Bible says that in the end times everything will burn up, but that perspective isn’t in the DVD.”

Frosty, in name and deed, seemed to embody all that was ridiculous about the Federal Way policy. But, in class, the film generated not controversy but daytime sleepiness. The students argued, legitimately, that an argument attached to a politician was suspect; and the lack of popular coverage of the issue, combined with my lack of knowledge at the time, made the whole topic less than engaging in class. Soon, media outlets began picking apart the facts in the film, with the drowning polar bear animations drawing particular skepticism. By the time the “Climategate” scandal hit, naysayers had so thoroughly bombarded the media with talking points that the exoneration of the UK’s Climatic Research Unit had become a lot like the absence of WMDs in Iraq: something factually true that was believed to be false. Besides, the scientists’ failed public relations strategies didn’t change scientific reality.

Awareness of global warming has finally filtered down to the general public. I was at a store earlier this week and commented at checkout, “It’s a lot hotter than advertised outside,” and, rather than a bland response, the clerk said, “Global warming! Well, we were warned the apocalypse would come in 2012…”

Here in DC, we have already had seven days over 100 degrees, including a record-breaking four in a row (so far). Millions lost power in a massive derecho; heat buckled Metro tracks, causing a derailment; we’re in the midst of the worst drought since the 1950s, with 80% of the country “abnormally dry”; up to 15 inches of rain fell in Texas; the Quileute Indian Nation of Washington State, featured in the Twilight film and book series, is facing the obliteration of tribal lands due to rising sea levels; twenty-nine square miles of Colorado is now burnt to ash. Outside the U.S., a chunk of ice the size of two Manhattans has broken off Greenland. While scientists can’t claim conclusively that warming is directly responsible for a particular weather event, they have said repeatedly that the violent weather we’ve experienced in recent years is “consistent with” what would be expected as the planet overheats. In other words, global warming has become obvious even to the average unconcerned citizen.

Unfortunately, as Elizabeth Kolbert points out this week in her impassioned New Yorker essay, “The Big Heat” (July 23, 2012), politicians have barely mentioned the climate. With the exception of Bill McKibben’s much-forwarded Rolling Stone article, “Global Warming’s Terrifying New Math,” which has appeared in my Facebook feed at least a dozen times now, the climate news that appears in the outlets nearly every day has been largely ignored in favor of a constant barrage of election news (and now, as we process the tragic Batman shooting in Aurora, CO, arguments about gun control). If you read only one article I’ve mentioned in this blog entry, McKibben’s is a good one, his main point being that if we burn all the fossil fuels currently in the ground, we will end up as science fiction characters in a mostly uninhabitable world.

We are rightly concerned with sublunary national issues – the 1%’s plunder of the rest of the nation, the war on women, Mitt Romney’s cruelty to both animals and humans, the fall of the JoePa statue, advancements toward legal gay marriage, the hegemony of the super PACs, attacks on and defenses of Obamacare. The divisions between red states and blue states have never seemed so livid, and they pour into my email and news feeds in gushes of lurid partisanship.

If you’ve read this blog once or twice, you probably have a good idea of my political leanings. But I find myself worrying about the decimation of species – in the most likely scenario, 25% of species will be extinct by 2050 – and the disappearance of the world’s coral reefs. When I’m out walking or driving, I try to memorize the contours of plants and the beauty and sounds of birds, because many of them will probably disappear before I die. I see that I live in a wealthy country that will most likely manage to keep its grocery stores filled, unlike Kivalina, Alaska, which will soon be under the ocean, or the Sahel, which is withering away into desert. These effects will both outlast and dwarf the impact of our other social policies.

There can be no lasting social justice without environmental justice. Since life has existed on Earth, every other species has responded to shortages of food, water, and shelter with massive die-offs. Unless we make use of the science we have, the human species will be no different. We can look out at our blue universes and our red universes, but if we are living on a brown, dead planet, our politics will no longer matter.

On the other hand, maybe by fighting together to save the world, we can save ourselves; and maybe by fighting to save ourselves, we can save what’s best in our shared humanity. The first step is to try to see past the red and blue so that we can do what it takes to hold on to what’s left of the green.

My Great Freeway-Flying Adjunct Apprenticeship Adventure

When I was an adjunct, I was always mightily annoyed when full-time faculty fell into a piteous lament about the “plight” of part-time faculty. It wasn’t just that these moments of disingenuous sympathy often occurred right after the full-timer in question had bumped me from a class; more, it was the idea that my fellow freeway flyers and I were helpless victims of a system that neither we nor the tenure-track faculty could influence.

In some ways, of course, this assessment of the over-reliance on part-time faculty in higher education is absolutely accurate. Part-time faculty are underpaid, and hence they often overwork – often at several colleges, often long commutes apart – in order to pay the bills. Students, in at least half the classes at most institutions, end up with exhausted professors whose energy must be scattered among a more-than-reasonable number of classes. The faculty themselves, moreover, are expected to provide a quality education even though they do not have a permanent workspace, dedicated computers, or even, sometimes, a file cabinet for stashing their stuff. As if these disadvantages were not enough, in many institutions the adjunct offices are housed in overcrowded Siberias relegated to the most distant edges of the campus, where they are conveniently removed from the life of their home departments.

In these Siberias, legend has it that adjuncts never get full-time jobs, that they sometimes get full-time jobs but never where they have worked as part-time faculty, and that they sometimes get full-time jobs, but only if they are anointed to these jobs within a certain number of years, which varies depending on whom one asks. When I started as an adjunct, it seemed to me that the dusty, grayish film that coats everything in these offices had penetrated deep into the morale of my colleagues, who were dedicated and skilled teachers staggering underneath a massive lead cape of injustice.

It is true that the number of part-time faculty vastly eclipses the number of full-time positions available, and that this situation will persist until public higher education receives adequate funding. It is also true, however, that nearly all the full-time faculty everywhere I have ever worked – six different community colleges and a for-profit, but who’s counting? – were once part-time faculty. If part-time faculty never get full-time positions, how can this be?

Before I got into teaching, I worked in the private sector, where, at least when I entered the workforce, there were two prevailing views: one, employees should get promoted for loyalty and seniority; or two, they should get promoted for doing superior work, regardless of seniority. Well, we all know how that dichotomy played out.

Before I worked in the private sector, I was a student, paying dearly (with the help of my parents) for the privilege of slaving over papers and scribbling notes in overpriced books. Why do we students pay for an education, even though we’re overworked and largely uncompensated? Because we’re not uncompensated. We’re getting an opportunity to learn things, and from people, that we would otherwise not be able to access. A degree itself has no particular value; as I’ve occasionally quipped to whiny students, “If you don’t want to learn, just make a diploma in Photoshop.” It is the ability to apply and synthesize an education that makes it valuable.

I have written in a different blog entry about the incredible mentorship I received from full-time faculty during my adjunct years. I was amazingly fortunate to work with colleagues who were so generous with their support. At the same time, though, I was the consummate freeway flyer, routinely zipping to three campuses and teaching five or six classes in a day, sometimes covering 200 miles from the time I left in the morning until the time I returned at night. I kept a file box behind my front seat, since I usually didn’t have file space; at one campus, I held office hours in the copy room because I didn’t have an office or a desk. It was only when I became full time that I realized how much time had been usurped as I sorted and moved files from campus to campus. Several times I had classes taken away at the last minute. More often, I accepted more classes than I could handle because I was afraid that if I said no I would never be hired again at that college. In a given year, I worked 166% of a full-time load to make less money than a full-timer.

But I still refused to call it “plight.” In colonial times, fourteen-year-old boys signed up for seven-year stints with master craftsmen, their only compensation the skill they would take with them when they completed their apprenticeships. Early child labor laws limited workdays to twelve hours, at least on paper, the implication being that the actual work hours were probably longer.

I found it helpful for both my professional growth and my morale to think of adjunct teaching in similar terms: apprenticeship, not servitude. As in college and graduate school, I was paying to gain knowledge, experience, professional development, and familiarity with the discipline. I could use the time passively, teaching my classes and then going home; or I could use my scattered existence as a pretext for learning how different colleges solved common problems. Teachers constantly talk about “stealing” ideas from colleagues. How amazing that I could steal from colleagues at six different campuses! I might have limited opportunities at any given college, but by pooling what I learned across all of them, I was able to piece together a fairly rich professional existence.

It is the time of year when a lucky few are notified that they have crossed the threshold into full-time employment, while the much greater numbers of the not-as-lucky sink into a seasonal rite of discouragement. This discouragement motivated me, finally, to broaden my search to out-of-state colleges. In the private sector, it was a given that just doing my job well was not enough. In the summer months, when disappointment flowed and work opportunities ebbed, I found it helpful to remember how much I was learning from my itinerant existence, and to remind myself that by learning a little bit more, I would improve my chances in the next hiring season.

Many of the benefits of my approach weren’t evident until I started working full time and using what I’d learned. Because there are many aspects of contingent employment that are truly exploitative, I can understand why some of my colleagues might find my attitude controversial. I can say, however, that (to paraphrase Simon & Garfunkel) I’d rather be a hammer than a nail…or, in other words, that my apprenticeship had far more dignity than my plight.

The Art of Not Knowing

Every time I teach my online fiction writing course, several students introduce themselves by saying, in one form or another, that by the end of the class they hope to find out whether they have talent.

I, too, would like to know whether I have talent. Every time I sit down to write, I have the urge to gaze at my own work like Narcissus gawking at his own image in a pool, and I wonder whether what I create is beautiful, horrifically bad, or simply in need of substantial revision. I am not alone. For instance, Lynda Barry, in her autobiographical comic, “Two Questions,” recounts how the dichotomy “Is this good?”/“Does this suck?” nearly destroyed her ability to do art because she began to see each piece she created as a judgment on her worth as an artist and a human being.

Having spent about a decade of my precious days on earth asking myself similar questions, I wish I could help my students avoid this particular creative death spiral. I tell them that practicing any sort of art is a long process and that they are at the beginning of the process. I tell them that the course will occupy only a few weeks of their lives, and I warn them against using these weeks as an oracle that will tell them whether they should keep writing or not. I tell them, too, that I make a point of not answering questions about my opinion of their potential.

The more I write and slog through the uncertainty of writing, the more I realize that Barry’s two questions are the very last ones I should be asking because they’re just not relevant to the work itself. I have been told all sorts of things about my writing – everything from “You’re not James Joyce” to “I don’t see why you would care so much about things that aren’t even real” to the coveted “This is very strong work” – and nothing anyone has said has made much difference to my confidence level. (In one of my first college writing workshops, on the other hand, the professor recounted an incident from her own college years, in which an embittered professor told a student, “If I wrote like you, I’d slit my throat,” which almost certainly would have had an impact – but I desperately hope that story is an urban legend.) In response to various negative reactions and rejections, I’ve spent long periods of Not Writing, but I have always gone back to it eventually; and when I have received praise and encouragement, I’ve glowed for a few days and then spent weeks and months convinced that I would never write anything good ever again.

Like I said: creative death spiral.

Every time I begin to write, I start at zero. I feel that I am inventing not just a story, but myself as a writer. It is as though I have to relearn everything I have ever known, every single time. I have to accept – again – that what I have to say, should I even succeed in saying it, may not be worth saying. I may be a better writer than when I started out, but that doesn’t mean I won’t write something terrible, and I am fairly sure I will fall far short of what I wish I could write. If I want to keep going, I have to embrace zero and everything it doesn’t mean. I have had to stop believing that my feelings have any relationship whatsoever to the quality of what I produce. I have to focus on the work itself, not ponder whether it is any good.

Lynda Barry’s comic dramatizes her search for “what is missing” in her art. In the last frames of the strip, she is taunted by ghosts whose frenzy increases the more she resists, until she inadvertently cries out the answer: “I don’t know!” and liberates her work from questions of meaning and worth.

Inevitably, my students will ask themselves the two questions no matter what I say to them, just as I did in my first writing class and for many years afterwards; and some students will read every comment they receive as though it’s a prophecy of what is possible. Some will become angry when what they think of as the oracle suggests that years of practice may stand between them and instant brilliance. I can’t stop my students from wanting answers to the question, “Do you think I’m any good at this?” any more than I could stop myself from asking the same thing when I was in my first workshop. But just because I’ve had – and continue to have – my own struggles with Barry’s lesson doesn’t mean I can’t try to bequeath it to my students.

At my college, the creative writing faculty have been charged with coming up with a way to measure what students learn in art and performance classes they take for general education credit. We had a spirited discussion about what was attainable in one beginning writing course, but we all agreed that it was not reasonable to expect a piece of high artistic quality the first time through the process. None of us, I suspect, are as good as we would like to be, which gives us common ground with our students. The difference is that those of us with more experience have by now swapped our fantasies of genius for a long, lonely march along an unmarked path through unmapped terrain, in search of a hypothetical treasure that may or may not have value. This expanse of untrodden mystery, however, is what freedom actually looks like.

Take it from someone who doesn’t know.