It Seemed Like a Bad Idea at the Time


My first job after college was at a health insurance company, making copies, printing out letters to reject claim appeals, filing, sending out mailings, and answering phones…and those were the fun parts. For this position, across from Westlake Park in downtown Seattle, I’d turned down an offer for the proofreading night shift at a tiny publisher situated in a bad part of town and a position assisting a quadriplegic entrepreneur, which sounded fascinating until every last member of his staff made a point of telling me he regularly insulted them until they cried.

By the time I capitulated and took the health insurance job, I was feeling the strain of hunching over a computer on a flea-infested rug in an apartment without furniture (yes, in the same building where I was later threatened by a shotgun-wielding retiree), living in a city where I knew no one and couldn’t afford to go out. I took a part-time telemarketing gig that involved calling up unemployed people without health insurance and trying to get them to buy season theater tickets. A bubbly blonde actress who started the same night I did began reeling in customers almost immediately, while I rapidly proved myself to be the world’s worst telemarketer. I didn’t even make it through the training, and after two nights on the phone I stopped coming, too humiliated to ask if I would have a paycheck.

I did not have much more aptitude for clerical work than I did for telemarketing. I learned that my high GPA and undergraduate degree did not necessarily mean I could survive in an office without losing my mind. I thought health insurance was ethically wrong – insurance companies profited from people’s vulnerabilities and fears, weaseled out of paying for illnesses, and had an entire department of registered nurses and doctors whose sole purpose was to ensure that treatments were medically necessary, which turned out to be code for finding reasons to deny coverage – but by then I needed the money.

Essentially, my first lesson after graduation was one that most people grow up knowing: idealism is hard to maintain when you need to pay the rent. Though I looked down on my coworkers for their unquestioning loyalty to a company that made its money from the pain and suffering of its customers, I quickly realized that I had fallen into the same moral compromise as the insurance profiteers. I dragged through the filing, stealing chances to read the documents as I went, and was then justifiably chastised for my lack of productivity.

I found a fiction workshop to take at night, but the couple of hours a week devoted to writing withered under the pressure of 40-hour weeks devoted to work I hated. The job was spectacularly dull, but worse than that it sucked me into a bottomless pit of self-loathing. My supervisor, a truly kind person trying to do the right thing, began to express concern about my emotional state. One morning, filling in while the receptionist was on a coffee break, she told me that my voice was too soft on the intercom and asked me to speak more loudly. When she turned away from the reception desk, my eyes filled with tears. I can’t even answer the phone right, I thought. How am I supposed to get a decent job?

A moment later, though, I had one of those merciful thoughts that from time to time have saved me from myself: What does answering the phone right have to do with anything? A day or so later, I gave notice. My last day was Valentine’s Day, some five months after I’d started, and I celebrated with a date at the Pink Door with a man from the writer’s workshop.

Those five months, however, ended up more than paying for themselves as the health insurance system got more and more complex. Because I played Harriet the Spy with the filing, I understood the labyrinthine rules of insurance games, even as they grew more convoluted in the decades after I graduated from college. I knew definitions, exclusions, preexisting conditions, capitation; I knew the difference between copays, coinsurance, deductibles, and lifetime maxes. I knew that, as people suspected, the regulations were, in fact, meant to give insurance companies reasons not to pay. And, from doing filing in the provider services department, I saw the kinds of malpractice and sexual harassment that the medical establishment would tolerate without serious repercussions. (When, a dozen or so years later, a gynecologist faced criminal charges for drugging and then raping female patients, I thought of those files.)

As I’ve been absorbing today’s news that the Affordable Care Act defied the pundits and survived its constitutional challenge in the Supreme Court, I have thought a lot about my insurance years. Yesterday, Sarah Palin repeated her much-debunked claim that “death panels” would determine whether patients received care. Meanwhile, those who would benefit most from health care reform misunderstand the legislation, and those who oppose reform actively distort and mischaracterize its provisions. Even as the quality and availability of American health care have plummeted below those of every other advanced Western nation, factual information has failed to counter ideological misrepresentations.

One reason for the public’s confusion may be that the law’s advocates have not adequately explained the ACA or persuaded citizens of its benefits. The main reason, though, I think, is that insurance in general is difficult to understand unless you have spent substantial time immersed in it. The whole system is a sleight of hand, meant to fool the unwary, and, despite the victory in the Supreme Court, a large segment of the public is still too easily fooled.

A Failed History of Flight

Nearly a year ago, partway through my recovery from a neck and shoulder injury, my orthopedist waved me out of his office with vague instructions to avoid car accidents, roller coasters, and hang gliding. When I asked for more detail about what might happen if I went hang gliding, he dodged specifics, then fumed, and then finally, when I explained that I had always planned to try hang gliding someday and wanted to know the risks, told me that a bad landing might result in my needing neck surgery.

“Thank you,” I said. “That’s all I wanted to know.”

I can’t say I mind trying to avoid car accidents. I will miss roller coasters, although they are not so important to me that I am willing to risk catastrophic injury. Hang gliding, though, is another story.

For as long as I can remember, I have wanted to fly. I think I was eight years old, inspired by a summer-camp reading of Jonathan Livingston Seagull, when I decided that humans had failed to fly only because they had convinced themselves they couldn’t. With proper concentration and determination, I insisted, I would be the first human child ever to take to the air unaided.

I put myself on a rigorous program of fierce concentration and practice. A friend and I constructed cardboard wings, which we tied to our arms with string, and then ascended the steep hill at the top of the local park and ran down it as fast as we could, our arms outstretched. My friend, who broke her arm attempting a different flight-related activity on her own, missed the next stage of training, which involved jumping off a branch of the biggest California oak tree in the park, then about eight or ten feet off the ground.

Nearly every day that summer, I ran down the hill and jumped off the branch, thinking every time, This time this time this time this time. One day, several weeks into my regimen, I tipped off balance and sprained my ankle when I landed. The sky – or maybe my hopes – had, for the first time, betrayed me.

I had heard that you should get back on a horse after you fall, so I jumped out of the tree one more time, as a farewell to the impossible. Part of me believed that it could not possibly be true that something I wanted so badly could be forever unattainable. I jumped, and gravity, the same as always, pulled me back to earth.

At a fiction reading I attended shortly before The Brief Wondrous Life of Oscar Wao won the Pulitzer Prize, Junot Díaz said, almost as an aside, that all creative people have personal origin myths about their creativity, but they don’t as often have myths about the origins of their inner critics. (He said many amazing things, but most of them have nothing to do with this blog entry.) My own creation myth is one of flying and falling, making uneasy negotiations with hope and then having to accept my own limitations.

What did I learn from my failed experiment?

Basically, I learned almost nothing. Over the years, I have learned nothing over and over again. I can’t watch even the scruffiest sparrow dip in and out of shrubs without feeling the same old longing for flight. I wrecked my knee in a dance class, then danced again; when I fly in airplanes, I always ask for the window seat so that I can imagine what the wind would feel like if my arms were wings. All I have learned is that hope is very, very difficult to kill.

If being what I am not is impossible, though, being what I am is not all that easy, either, nor is it really as prosaic as it sounds. The first year I had a vegetable garden (in Seattle’s P-Patch community gardens), I was astonished that plants could so thoroughly be themselves. The first thing a carrot seed did was send down a long, fragile root; the first thing a lettuce seed did was make leaves. At the time I started gardening, I was recovering from knee surgery and had not yet been cleared to go to dance class. I was just starting to know what my reconstructed knee would be able to do, and just beginning to understand that its abilities would fluctuate from day to day. It is humbling to become the student of a radish, but the plants had an insouciant self-acceptance that I did not.

I hope that eventually I will have the opportunity to decide whether to try hang gliding despite the risks. One side of the balance represents fulfillment of a dream of flight; the other side offers a promise of seeing the world – without embellishment – precisely as it is. Both possibilities, in the end, seem equally profound; and both still seem almost, but not quite, beyond my reach.

Pass on the Robo-Pet…and Hold the Animals

In a gorgeous scene early in Marilynne Robinson’s novel Gilead, children bring a litter of kittens to a river and baptize some of them before an adult stops them. The kittens all find homes, but nobody remembers which were baptized. The narrator, a minister, always wonders whether there is any theological difference between the baptized and non-baptized kittens. I don’t find this dilemma troubling at all, since I have never for a moment doubted that animals have souls.

My fifth-grade teacher, who was also a former minister (and, as far as I’m concerned, a saint as well), tried unsuccessfully to convince me otherwise. He’d filled his classroom with assorted fauna – rats, tarantulas, gopher snakes, chicks, lizards, crawfish – and allowed us to hold them during lessons. I was nonchalant about snakes, couldn’t bring myself to hold a tarantula, and fell in love with the rats, especially after reading Robert C. O’Brien’s classic, Mrs. Frisby and the Rats of NIMH.

Trying to turn me into a good scientist, he showed me several science books that presented it as fact that the capacity to feel emotion is one of the things distinguishing humans from animals. He taught me a word that felt clunky and collegiate on my tongue: anthropomorphize – to attribute human qualities to animals and other things that are not human.

His contention that the rats had no personalities and no emotions was the one thing he ever told me that I didn’t believe. Research since then has suggested that I was right that the divide scientists then drew between humans and animals was artificial and anthropocentric. One of the experts on animal emotion, Jaak Panksepp, a professor and researcher at Washington State University, says “people don’t have a monopoly on emotion; rather, despair, joy and love are ancient, elemental responses that have helped all sorts of creatures survive and thrive in the natural world.”

The human tendency to anthropomorphize inanimate objects, however, is also well documented. In a famous 1960s experiment I studied in college, students confided in a computer program, ELIZA, that spat out responses based on Rogerian therapy. Many participants mistook ELIZA for a human and grew emotionally attached to “her.” More recently, a New York Times editorial by branding consultant Martin Lindstrom contended (controversially and possibly falsely) that brain scans revealed that “the subjects’ brains responded to the sound of their phones as they would respond to the presence or proximity of a girlfriend, boyfriend or family member…they loved their iPhones.”

The implication here is something like, “Humans anthropomorphize both objects and animals; objects don’t have emotions; therefore it is likely that animals don’t have emotions either.”

For those of us who spend time around them, though, it seems glaringly obvious that animals have an emotional life. Pets clearly show jealousy, anger, affection, and joy; and when their owners feel strong emotions, they rarely fail to appear with a cold nose, a warm tongue, or a snuggle. Our pets declare their personalities and desires as plainly as any child, and they share the child’s impulse to touch and give comfort in the face of human emotions they don’t understand. Researchers point out that only humans have the ability to think about their own emotions…at least as far as they know.

And then there’s Amy Hempel’s story, “In the Cemetery Where Al Jolson Is Buried,” about a woman’s failure to acknowledge her best friend’s terminal illness, which ends with this devastating passage about a real-life chimpanzee who has been taught sign language:

I think of the chimp, the one with the talking hands.

In the course of the experiment, that chimp had a baby. Imagine how her trainers must have thrilled when the mother, without prompting, began to sign to her newborn.

Baby, drink milk.

Baby, play ball.

And when the baby died, the mother stood over the body, her wrinkled hands moving with animal grace, forming again and again the words: Baby, come hug, Baby, come hug, fluent now in the language of grief.

I have never been able to read this story – or, I have just discovered, write about it – without crying.

So, when I read Clay Risen’s brief article in the New York Times Magazine, citing “Robo-Petting” as an innovation we can anticipate in the next four years or so, I couldn’t help thinking that the idea was not just disgusting but a poor approximation of the love of a good animal:

Petting a living animal has long been known to lower blood pressure and release a flood of mood-lifting endorphins. But for various reasons — you’re at work, or you’re in a hospital, or your spouse is allergic to dogs — you can’t always have a pet around to improve your mental health. So researchers at the University of British Columbia have created something called “smart fur.” It’s weird-looking (essentially just a few inches of faux fur) but its sensors allow it to mimic the reaction of a live animal whether you give it a nervous scratch or a slow, calm rub. Creepy? Yes. But effective.

Right. I prefer my “smart fur” on a live animal, thank you very much…and I would be willing to bet that animals do, too.

The Art of Not Knowing

Every time I teach my online fiction writing course, several students introduce themselves by saying, in one form or another, that by the end of the class they hope to find out whether they have talent.

I, too, would like to know whether I have talent. Every time I sit down to write, I have the urge to gaze at my own work like Narcissus gawking at his own image in a pool, and I wonder whether what I create is beautiful, horrifically bad, or simply in need of substantial revision. I am not alone. For instance, Lynda Barry, in her autobiographical comic, “Two Questions,” recounts how the dichotomy “Is this good?”/“Does this suck?” nearly destroyed her ability to do art because she began to see each piece she created as a judgment on her worth as an artist and a human being.

Having spent about a decade of my precious days on earth asking myself similar questions, I wish I could help my students avoid this particular creative death spiral. I tell them that practicing any sort of art is a long process and that they are at the beginning of the process. I tell them that the course will occupy only a few weeks of their lives, and I warn them against using these weeks as an oracle that will tell them whether they should keep writing or not. I tell them, too, that I make a point of not answering questions about my opinion of their potential.

The more I write and slog through the uncertainty of writing, the more I realize that Barry’s two questions are the very last ones I should be asking because they’re just not relevant to the work itself. I have been told all sorts of things about my writing – everything from “You’re not James Joyce” to “I don’t see why you would care so much about things that aren’t even real” to the coveted “This is very strong work” – and nothing anyone has said has made much difference to my confidence level. (In one of my first college writing workshops, on the other hand, the professor recounted an incident from her own college years, in which an embittered professor told a student, “If I wrote like you, I’d slit my throat,” which almost certainly would have had an impact – but I desperately hope that story is an urban legend.) In response to various negative reactions and rejections, I’ve spent long periods of Not Writing, but I have always gone back to it eventually; and when I have received praise and encouragement, I’ve glowed for a few days and then spent weeks and months convinced that I would never write anything good ever again.

Like I said: creative death spiral.

Every time I begin to write, I start at zero. I feel that I am not just inventing a story, but myself as a writer. It is as though I have to relearn everything I have ever known, every single time. I have to accept – again – that what I have to say, should I even succeed in saying it, may not be worth saying. I may be a better writer than when I started out, but that doesn’t mean I won’t write something terrible, and I am fairly sure I will fall far short of what I wish I could write. If I want to keep going, I have to embrace zero and everything it doesn’t mean. I have had to stop believing that my feelings have any relationship whatsoever to the quality of what I produce. I have to focus on the work itself, not ponder whether it is any good.

Lynda Barry’s comic dramatizes her search for “what is missing” in her art. In the last frames of the strip, she is taunted by ghosts whose frenzy increases the more she resists, until she inadvertently cries out the answer: “I don’t know!” and liberates her work from questions of meaning and worth.

Inevitably, my students will ask themselves the two questions no matter what I say to them, just as I did in my first writing class and for many years afterwards; and some students will read every comment they receive as though it’s a prophecy of what is possible. Some will become angry when what they think of as the oracle suggests that years of practice may stand between them and instant brilliance. I can’t stop my students from wanting answers to the question, “Do you think I’m any good at this?” any more than I could stop myself from asking the same thing when I was in my first workshop. But just because I’ve had – and continue to have – my own struggles with Barry’s lesson doesn’t mean I can’t try to bequeath it to my students.

At my college, the creative writing faculty have been charged with coming up with a way to measure what students learn in art and performance classes they take for general education credit. We had a spirited discussion about what was attainable in one beginning writing course, but we all agreed that it was not reasonable to expect a piece of high artistic quality the first time through the process. None of us, I suspect, are as good as we would like to be, which gives us common ground with our students. The difference is that those of us with more experience have by now swapped our fantasies of genius for a long, lonely march along an unmarked path through unmapped terrain, in search of a hypothetical treasure that may or may not have value. This expanse of untrodden mystery, however, is what freedom actually looks like.

Take it from someone who doesn’t know.

Caution: The Moving Walkway Is Ending

Early one morning a little over a week ago, the DC Metro deposited me at Reagan National Airport, where I would depart for an intensive fiction workshop in San Francisco. Only a few months had passed since my last flight from Reagan, but I had already forgotten the familiar robo-female voice that met me at the airport entrance before I reached the moving sidewalk, repeated its message ad infinitum, and followed me around for days afterwards: “Caution! The moving walkway is ending!”

If a message repeats itself that many times, it functions something like an advertising jingle or a mantra. It snakes into ordinary thoughts and insinuates itself into travel destinations. Eventually, new, unintended meanings stick to it like burrs on a tube sock.

In my case, the prickly tube sock has morphed into a metaphorical statement about the second week of May, the last of this academic year. The moving walkway – something like a tunnel I’d stepped onto last August, which accelerated me in a predetermined direction and then deposited me in the precise spot the engineers intended – was ending. For months, I’d sped through most days, hopscotched through classes, workshops, conferences, committee meetings, planning, mentoring, and grading, constantly sprinting toward the next point on the calendar.

After all the uproar about workload at Montgomery College, I am not going to mount a defense of summer, which for me will include teaching an online class, serving on a couple of time-intensive committees, co-facilitating a workshop, helping in academic advising, writing an article or two, and, it now seems, helping to compile a handbook for faculty teaching transfer composition. In other words, I will be working this summer. But what I won’t have, at least most of the days, is the moving walkway of obligation to appear in person, dressed presentably, at a specific time and place.

Yes, three months of modified entropy is a luxury. And yes, I will be getting paid for most of the work. However, I learned more about fiction in my four days in San Francisco than I probably have in all my years of writing, and more than anything else I am grateful for the opportunity to be a writer for a couple of months. Among the things I love about teaching is the chance to counterbalance the self-absorption required for writing with work that has a direct and immediate benefit to others. In other words, one of the advantages of a moving walkway is that I have a destination, a clearly marked path, and an arrival time: everything my writing is (usually) not.

It’s time I embraced potential uselessness, fruitlessness, pointlessness, and aimlessness, at least for a little while. It’s true I may get lost, but it’s also true I may end up somewhere the moving walkway can never take me no matter how fast I run.

Lessons from the First Decade

The Friday before last, I celebrated the tenth anniversary of the first class I ever taught. Well, to be perfectly honest, I thought about celebrating – in between answering email, grading essays, and scrambling to finish three separate projects by their deadlines – and wished I had time to blog in honor of the occasion.

There seems to be a widespread misconception that, unlike in every other profession, good teachers spring into shape like instant ramen rather than going through a period of training, learning, and sometimes-painful introspection. As a mentor and superb teacher who had been in the classroom more than twenty years told me, “My first year, I thought I did pretty well. Then after a few more years passed, I thought, ‘Well, the students learned in spite of me.’” Even though multiple colleagues told me I was “a natural,” I still had a huge amount to learn. I assume that once I’ve taught for two decades I will look ruefully back on how little I knew after my first one, precisely because experience matters. Here are some of the (hard) lessons I’ve learned so far.

10. Don’t put policies in your syllabus that you don’t have the heart to enforce. During my first year teaching, I tried to strike a balance between the faculty who wanted me to impose military-style discipline (there were a lot of veterans at my first job) and those who told me, “But your emotions are part of your pedagogy!” Classroom management is like training a cat: either be consistent, or just let the cat take charge.

9. It’s best to be gullible. I am an English professor, not an FBI agent. I prefer to believe what my students say, even if it is likely that they are not telling me the truth. And, if I can’t prove that a statement is untrue, it’s best to pretend to believe it. The most outrageous example occurred when a student known to be a compulsive liar claimed to have a brain tumor. When he showed up at class, seemingly unable to walk without assistance from other students, I took his word for it. One of my colleagues, on the other hand, required medical documentation. By the next class, he was fully recovered. However, I would not have wanted to be the one to say “I don’t believe you” to someone with a serious illness.

8. Most plagiarism is accidental. When I learned to cite sources, there were more or less only three kinds we were allowed to use in essays: books, magazines, and newspapers. Today’s students are exposed to literally hundreds of different genres, some of which themselves contain plagiarism, sampling, and remixing. The idea that ideas as well as words can be plagiarized comes as a particular shock when I cover academic integrity. While some students flagrantly copy whole papers and hope to get away with it, most are genuinely confused.

7. There is no such thing as review. You are either teaching – as if students have never learned or have forgotten what you’re talking about – or reinventing. One of my worst-ever teaching mistakes (Fall 2004, my first time teaching developmental English, is burned in my memory) involved rushing through material I thought students would know if they had met the course prerequisites. It took us about a month to recover, but I never made that mistake again.

6. Less is more. One of my favorite student comments of all time (I think it was Winter 2004) came during the class before an essay was due. I knew some students had been confused by the assignment, but I had explained it multiple times and thought they now understood. About a third of the way through class, though, one of my most conscientious students asked, “I know you have said we need to include X and Y. But what are we supposed to do in this paper?” Oops.

5. Don’t work harder for your students to pass than they do themselves. As I have mentioned, I teach at a community college, and many students face serious obstacles to completing an education. There’s a fine line between reaching out to a struggling student and, well, overreaching. Films about teachers often focus on recalcitrant students who respond to a teacher’s caring and mentorship (Good Will Hunting, with multiple people chasing after the troubled-genius janitor played by Matt Damon, stands out especially), but in real life, I have found that if I put more effort into the student’s passing than the student herself does, outcomes are almost always bad no matter how noble my intentions. Consequently, I have learned to let passing be the student’s accomplishment, not mine.

4. Teaching is the fun part of your job. Committee work is the price of admission to a classroom, and there is no better antidote to frustrating college politics (or, for that matter, exhaustion, aggravating personal situations, etc.) than an hour spent teaching students.

3. Dress for the mess. I truly admire my colleagues who can wear white to work and not stain it with coffee, dry-erase markers, ink, copier toner, or any of life’s other little accidents. Come to think of it, I’m not sure I actually have colleagues who wear white to work. Public speaking may be the most common phobia, but even faculty who get up in front of people every day don’t want to feel like the spotlight is on…spots. Prints and layers are good. Black is usually good unless you are teaching with old-fashioned chalk, in which case you’ll look like you’ve been in a paintball fight by the end of class.

2. Leap. Even a well-planned class can sometimes go awry, and at such times, you are lucky to be in a profession where you can change direction without warning or approval.

1. Don’t let what you’ve learned override your passion. When I was a new teacher, I had love for my subject, but no experience. Once I had some experience, I became obsessive about getting class right – being structured, sequential, and clear – and about not leaning on spontaneity when what I needed was good planning. While these goals were all worthwhile, I realized last summer that, somewhere along the line, my perfectionism had led me to leave my passion at the door when I stepped inside a classroom. This year, I realized I’d come full circle: my magic ingredient was the one I’d had with me all along.

I feel like I could easily come up with several dozen more lessons. So, colleagues, what bits of hard-won knowledge would you include on your list?

The Other Side of the Fence

As I cleaned my office in preparation for the start of the semester, a small yellow slip of paper somehow rose to the surface: a parking permit request form from Highline Community College in Des Moines, Washington, where I spent my first and last quarters as an adjunct. The paper bears my old zip code and the license number of a 1991 Civic hatch that most likely no longer runs. I will leave for another blog entry the story of how I came to leave the Civic in one Washington when I flew off to the other one, but finding an artifact of my adjunct days seemed especially fortuitous this week.

It has been sort of a big week for me. About ten days ago, a wonderful teacher and colleague announced that he would be resigning to follow his wife to the Pacific Northwest. He was the coordinator for our transfer composition and literature courses, which involved mentoring and overseeing nearly thirty adjunct faculty members, conducting assessments of our learning outcomes, reviewing course descriptions and requirements, and working with department chairs and coordinators on our college’s other two campuses.

Those duties have now fallen to me. Nearly ten years have passed since the first day I strode to the front of a classroom, saw a row of faces aimed in my direction, and thought, “Wow, they’re looking at me like they think I’m a real teacher. I guess I’d better teach.”

Similarly, when I went to yesterday’s beginning-of-the-semester adjunct orientation meetings, my part-time colleagues looked at me like I was a real coordinator. Just as I discovered by acting like a real teacher that I could become one, I somehow found myself – despite self-doubt, nerves, hesitation to advise faculty who had been teaching far longer than I have, ambivalence about thinking of myself in a leadership role – unexpectedly transformed from the neophyte who asked all the questions into a professional tasked with answering them.

I realized something amazing this week: To my surprise, I can do the job.

I fielded questions. I commented on syllabi and assignments, offered sample handouts my colleague had left on a disk, suggested approaches that had worked for me, and reassured newcomers. I even said “No” a few times, and nobody seemed to hate me afterwards. I navigated personalities, facilitated discussions, and led a workshop. People treated me like someone who knew what she was doing. It was a little weird.

If I can do the job, though, it’s because of what I have stolen from or been given by others. When I advocated for stipends for part-time faculty who facilitate workshops or serve on committees, I thought of a colleague at Green River Community College who assigned adjuncts to leadership positions for short-term assessment projects and refused to let us work without pay, and I remembered the department chairs at South Seattle Community College who picketed on behalf of adjuncts while I scurried to the parking lot to drive to my next teaching gig.

When a part-timer asked me for help applying for full-time positions, I thought of the former department chair who fired interview questions at me, critiqued my responses, and gave me tips on my teaching demo; the coordinator who advised me to learn to teach developmental English and initiate visible projects; and the brilliant tenured instructor who took at least two hours out of her winter break to scrutinize my CV and cover letter. The job search advice I give is their advice.

I think of my fellow “freeway flyers” in our various part-time faculty offices who took hours out of their breakneck schedules to talk through assignments, advise me on classroom management, and let me plunder their best ideas. I think about my first dean, whose response to most of my teaching questions was, “Well, what have you thought about doing?” and an electronics instructor who was my unofficial mentor through the bruising first years of learning to teach. I think of my current department chair, who has tirelessly answered my questions, gracefully negotiated department and college politics, and encouraged me to grow as a professional.

And I think of the kind words of a colleague who was then a stranger, a compliment that sustained me when I was ready to give up my full-time job search. At the time, I had no idea that a year after I signed that parking permit request, I would be living on the opposite coast and starting my first semester as a full-time faculty member.

This morning, having survived my first week as coordinator, I pinned the slip of paper on my bulletin board to remind me of the generosity of many, many others. I have no way to thank them – except, perhaps, to do my job as they would do it.

The Persistence of Teaching Nightmares

I have never once come to the first day of class without a syllabus, but at the start of every single semester I have nightmares that I show up with materials for the wrong class, that I forget to show up to teach one of my classes for an entire semester, that the campus changes shape so that I can’t find my classroom, and on and on and on.

Salvador Dali, "The Persistence of Memory"

March 2 of this year marks the tenth anniversary of the day I taught my first class. My current college is on the semester system, but before that, I taught four quarters a year. That’s a lot of nightmares. But, as in the adage that you can’t be a good horseback rider until you get thrown 99 times, I stopped counting long before I hit the magic number.

I am not alone, either. If you visit our department during the first week of classes, you’ll see a whole bunch of extremely competent faculty pretending that they don’t feel like teenagers trying to open their lockers on the first day of high school. Some of them have taught twice as many years as I have, and some of the coolest ones will still admit that they always feel nervous on the first day. The rest of them just look like they do.

So what if I spend the start of the semester with a vertiginous feeling that I am going to fall on my face? I think that if I ever stop feeling nervous on the first day, it will mean one of two things: one, that I am actually dead and don’t realize it (I know, I know, The Sixth Sense has warped an entire generation); or two, that I have stopped caring. Both of these options are undesirable, but I think I would prefer death to apathy. That finger-in-socket zap of stage fright means I’m still alive (which, incidentally, is more than I can say for M. Night Shyamalan’s last few movies).

On Friday night – yes, Friday night! – I had one dream after another about writing syllabi, each wave of syllabus-writing perfectly mundane and lifelike. I woke up on Saturday surprised and a little outraged that my classes were still not prepped, considering how much time I’d already spent on them. I did what I do every semester: I sat down at the computer and got to work.

The week before classes, full-time faculty face nearly back-to-back meetings, workshops, retreats, and meet-and-greets, and, sometimes, last-minute class schedule changes that might necessitate writing a new syllabus on the fly. Most of the meetings are necessary and valuable, but the schedule is exasperating. This semester I have some new responsibilities, so I took a look at my Professional Week calendar, narrowly avoided hyperventilating, and decided – perhaps for the first time – to prep my classes early. Nevertheless, I fully expect the nightmares to commence on their usual schedule.

Sweet dreams, colleagues!

The Whole Dang Pie

I made pizza a few nights ago (yes, that’s it in the photo). My three-year-old cats sat on the windowsill and observed with mild interest as I simmered sauce, stirred poufs of flour into water, kneaded dough, and grated cheese. Dough’s nature is to abandon itself to your hands, and the moment it yields under my fingers is one of my purest, simplest pleasures.

My cats looked at me. I looked at my cats. And then I realized: Although cooking is one of my favorite things to do and pizza is probably my favorite thing to cook, I hadn’t made it in over three years.

That just seems wrong. The process of pizza-making is the opposite of teaching, writing, thinking, and reading, which is how I spend most of my time. When I am cooking, my hands, nose, and tongue experience every sensation while my head empties and floats out of the kitchen like a helium balloon. Also, almost everyone of any age can be made happy with pizza, and I have no particular qualms about exchanging homemade pizza for affection. I am reminded of Monty Python’s The Meaning of Life, when Gaston the waiter explains, “The world is a beautiful place. You must go into it and love everyone. Try to make everyone happy, and bring peace and contentment everywhere you go. And so I became a waiter….”


I love teaching. In fact, I’m crazy about it. Even when I “go into it and love everyone,” I can’t make everyone happy, nor can I bring peace and contentment everywhere I go. Knowledge – when I am skilled enough to impart it – is unsettling and challenging more often than not. The slice of me that thinks, reads, plans, and grades feeds one set of mouths, and the slice that wants to serve peace and contentment topped with grilled eggplant and sundried tomatoes feeds others.

Every January and every August I tell myself that if I scrupulously manage my time and energy, I can have the whole pie. I can return papers on time and read lush, fat novels; I can be brilliant in the classroom every single day; I can fulfill the CDC’s physical activity guidelines; I can write twenty pages a week; I can sleep eight hours a night, entertain every weekend, and launch a wildly satisfying and time-consuming love affair with no impact on any of the above.

If a meeting runs late, an emergency comes up at work, a home repair becomes urgent, or a cat has to go to the vet, my January/August illusions fall apart. Usually, the disillusionment process occurs sometime during the first week of classes and accelerates until it’s time to refuel my illusions again.

I am like many other teachers in that I have a hard time making a commitment to fulfill my own needs when my students’ needs are so often so much greater. I can tell myself “Put your own oxygen mask on first” as often as I like, but frankly I don’t like watching other people turn blue while I take time for myself. I don’t think the habit of gasping for air makes me a better teacher, though, and I’m fairly certain it doesn’t make me a better person, either.

Lately I’ve started thinking that maybe it’s not better to lavish all my attention on one slice of pizza when I could be serving up an entire sumptuous pie.

Blah, blah, blah.

First, I need my January/August syndrome to last until February.

My 2011 Cosmic Footprint

Rare Cosmic Footprint from the Hubble Space Telescope, July 2011

The past couple of years, I have avoided most New Year’s resolutions. What resolutions I did make last January have only a tangential relationship to what I accomplished this year, or, for that matter, to how I spent my time. For example, 2011 was the first year in a long time that I didn’t make a resolution to lose weight, but for once I actually did. Nevertheless, I don’t think the world really cares whether my jeans fit.

This New Year’s, I am feeling just a tiny bit skeptical about our national obsession with self-improvement resolutions. It’s not that I don’t need improvement – you could come up with a substantial list of my faults faster than I can say “2012” – but, frankly, we can all think of oodles of things that need improvement far more than I do. Lately, instead of re-evaluating the goals I set last January, I’ve been asking myself what I feel is a more important question: “What would have been different about the universe this year if I hadn’t been in it?”

I imagine all the people I care about, all the lives I touch as a teacher, and my impact on strangers I will never meet, and I wonder: What have I done for them? What have I made more difficult? When I say I believe in something, what have I done about it? When have I been constructive, and when complacent? If teaching at a community college has taught me anything, it’s that caring words at the right moment have the power to change lives. This year, there were times I paid attention at the right moments, and times when I was something less than mindful.

I think of my 2011 Cosmic Footprint as something like the carbon footprint calculators that have proliferated around the web, which estimate how much carbon you emit each year and how much you save through energy conservation. The first time I used a carbon footprint calculator, my self-righteousness toppled like so many clearcut trees in the Amazon rainforests. (Lesson #1: Caring about the environment does not reduce carbon emissions.) In the past two years, I’ve reduced my personal emissions by roughly 30%, mostly by telecommuting one day a week and teaching online during the summers. My footprint is still much too large to help save the world, but I have made steady progress.

Similarly, when I contemplate my cosmic footprint, I’m not trying to decide whether on balance I’m better or worse than I wish I were. Instead, I am trying to reflect, as neutrally as possible, on how my existence impacts the universe outside myself. Knowingly and unknowingly, I am sure I did both good and bad over the past year, but thinking about my cosmic footprint is more like trying to trace the circles that radiate from a pebble dropped in water. There were pebbles I threw into the water and pebbles that accidentally fell out of my hands, but all of them made ripples.

The new year is a time when we indulge our desire to be perfect versions of ourselves, and because we’re human, we fail more often than not. The question is what about ourselves we are trying to perfect. Plenty of people have left cosmic footprints that stride far ahead of mine, and plenty of people make New Year’s resolutions that focus on helping others and trying to change the world. The media’s “New Year, New You” mania for self-improvement, however, discourages us from trying to improve all the things that exist beyond our own skins.

I am not surprised that each year, after a few weeks turned inward for the holidays, our ritual discussion of New Year’s resolutions focuses on our outer selves: how to make more money, how to be more attractive, how to turn resolutions into reality. What surprises me a bit is that it took me so long to realize that the footprint I leave on the universe is far more important than the ones I put on the scale.