You’ve Read This Post Before


The Glossary, a Los Angeles-based audiovisual marketing firm, has reinvented David Foster Wallace as a motivational speaker. This “fine purveyor of STIMULATING VIDEOGRAMS” edited together the best sound bites from Wallace’s graduation speech at Kenyon College, “This Is Water,” and dressed them up with video, trendy animated scribbles, and sprightly background music.

The Glossary included the lines from the speech that haunted Wallace’s readers after he hanged himself:

Think of the old cliché about “the mind being an excellent servant but a terrible master.”

This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in the head. They shoot the terrible master. And the truth is that most of these suicides are actually dead long before they pull the trigger.

Returned to its original context as part of an exhortation to graduates to work towards mastery of their own perceptions – considering, for instance, that the overweight woman losing her temper in a checkout line might have spent the night with a dying husband and was not, in fact, just put on earth to annoy everyone in line behind her – the passage serves as a sort of radical motivation in which reimagination is the only way to keep oneself alive. Some critics, including Leslie Jamison, in her review of a Wallace biography, have rewritten Wallace’s suicide as a piece of postmodern performance art, with the “terrible master” passage a snippet of autobiography concealed by being waved in front of a crowd.

The less esoteric version has Wallace suffering from lifelong depression, forced to go off his medication because of severe side effects, and then, after falling into an even more severe depression and restarting the poison pills, discovering that they were no longer effective for him. Apparently, even if you are a genius, you still also have to be a person and a body with an uncooperative brain. Irreconcilable differences are bound to occur.

What surprises me about The Glossary video that has gone viral this week is that people find Wallace’s views so inspiring and revolutionary. In essence, he argues that most people ricochet back to the same mental point of origin, the panoramas that are so familiar we have stopped seeing them; but by prodding ourselves to consider other versions of what looks like reality, we are free to become better masters of our minds. He also acknowledges that getting outside ourselves is difficult, exhausting work, and he admits that sometimes he himself is too tired to engage in it.

To me, this celebration of possibilities is as good a definition of creativity as I’ve ever come across – something like mental Cubism, in which all realities can be embodied at the same time. But it also makes perfect sense to me that Wallace’s call to reinvent and reenvision, and the massive effort it takes to do so, would come from someone who was suicidal for enough of his life that a bullet in the brain could become a metaphor. With depression as the random point in space from which you view the world, death is always right in front of you, blocking your view. To survive, you have to imagine a different frame, in which the option of suicide is somewhere far in the distance, behind a closed door, somewhere you might visit sometime when you don’t have so many other things to do. Once you know where the door is located, though, it is impossible to forget it exists or how to open it.

In a speech at the 2011 National Book Festival, Toni Morrison briefly discussed her master’s thesis, which compared William Faulkner’s and Virginia Woolf’s conceptions of suicide. Faulkner viewed suicide as the ultimate defeat, Morrison explained, while Woolf saw it as a reasonable choice, in her case a rational alternative to putting herself and her husband through another period of psychosis. I tend toward Woolf’s view, and, I would guess, so did Wallace. Wallace’s “This Is Water” speech offers instructions for making other choices.

However, it is more than a little paradoxical that the speech has been appropriated by a marketing firm. As a former (mostly mediocre) ad writer, I’m in a position to know that the whole objective is to create materials that act as magnets, pulling thoughts in the intended direction without infringing on viewers’ certainty of their own free will. Within a few days, the video had attracted 2.7 million views, dwarfing the popularity of the firm’s previous projects (and, incidentally, using audio of Wallace’s Kenyon speech without permission). In an Adweek interview, the creators claim, disingenuously in my opinion, “Our main goal was to expose people to the content of the speech.” Later in the interview, though, the creators concede, “…as a tiny company in an industry filled with so much talent and competition, it’s extremely difficult to get your work noticed…so we’d welcome anyone who enjoyed ‘This Is Water’ to get in touch with us.”

I’m reminded of the perennially puzzling sentences, “This statement is untrue” and “Question authority.” Wallace’s legacy will almost certainly transcend this little ripple in the information ecosystem, but I’m also fairly sure its undertow is meant to pull us down into the water.


No, Really and Truly – The Absolutely, Positively Worst Ideas of 2012

Copernicus’s heliocentric model of the solar system

For some reason, The Washington Post prematurely nominated its worst ideas of 2012 way back on October 1. All the Post’s bad ideas had to do with sexual indiscretion by powerful men, political incorrectness, hubris, or all three. The one bad decision in the bunch made by a woman was the failed ouster of University of Virginia president Teresa Sullivan, which was spearheaded by that self-appointed defender of vision, the unfortunately named Helen Dragas.

Speaking of hubris, though, the Post left out almost three months of bad ideas and almost an entire gender – which is sort of amusing, considering that some of the worst ideas of the year were about women. Here goes:

Do-it-yourself birth control: First, Foster Friess, a billionaire and mutual fund manager, kicked off the war on women when he suggested Bayer aspirin could prevent pregnancy: “The gals put it between their knees, and it wasn’t that costly.” In case we excused Friess’s comment as anomalous, Missouri Republican Todd Akin – also known for trying to eliminate school lunches for embryos that make it to grade school – defended prohibitions on abortion for rape victims by declaring, “If it’s a legitimate rape, the female body has ways to try to shut that whole thing down.”

Rape as God’s will: Not to be outdone, Indiana Republican Richard Mourdock argued – several times! – that any life resulting from rape was “something God intended to happen.” His idea manages to be terrible on several levels: first, that (despite its frequent appearance in the Bible) rape is acceptable because the ends justify the means; second, that God means to torture women; and third, that Mourdock somehow knows what God intends.

Ayn Rand: From Rand – her excruciating prose, her eugenically selected protagonists, her contempt for acts of generosity on the grounds that they enable helplessness, her glorification of selfishness – we learned that the Romney-Ryan defeat stemmed from the triumph of mediocrity rather than from Romney’s staggering ignorance of the world inhabited by the ordinary riffraff. (Dana Milbank’s piece in the Washington Post, “At Romney Headquarters, the Defeat of the 1%,” does the best job I’ve seen of showing that Romney’s insensitivity comes straight from the heart.)

Teachers bearing arms: If I actually have to explain why this is a terrible idea, please stop reading now.

The Second Amendment: If you skip the “well-regulated” and “necessary to a free state” parts, assault weapons make perfect sense.

Jonathan Franzen’s opinion of Edith Wharton: On the grounds that Wharton was unattractive and sexless, America’s most popular purveyor of unpleasant characters dismisses her entire body of work. The bad idea – which you really might expect someone at The New Yorker to question – is the underlying assumption that women have no artistic legitimacy without sex appeal.

New Yorker cartoons: Looking for sexism? Women carping at their downtrodden husbands? Gender dynamics that haven’t changed since the 1920s? I love The New Yorker, but I wish it would reconsider its tradition of phallocentrism.

Women are helpless, except when they’re not: Okay, I’m supposed to believe that the general of the most powerful military in the world was prostrate before the siren song of Paula Broadwell? Either he couldn’t resist – which I highly doubt, given that Petraeus was entrusted with our national security – or he could have resisted, but didn’t bother since the popular press would blame the woman anyway.

Voyeurism: Maybe Invisible Children was a showcase for the arrogance of Jason Russell, but when TMZ broadcast him staggering naked through the streets of San Diego and ridiculed what was clearly a mental breakdown, we played along, and that didn’t show the public in a flattering light. Same with the photograph of a man about to be hit by a NYC subway car. And same with the anguished photo of a woman trying to find out the fate of her sister, who had already been killed by the Sandy Hook shooter.

Illusions of privacy: Yes, my privacy has gone the way of the Twinkie, without the anti-union rhetoric. I value privacy, but not when it gets in the way of seeing the cartoons and photos my friends post or being able to avoid entering twice as many addresses into Google Maps on my phone.

The end of the world: The true bad idea here is that I didn’t plan an end-of-the-world potluck holiday party; I hosted one in 1999, asking guests to bring the dish they would want to eat if the world really ended at the turn of the millennium. Good times. P.S. Runner-up: blaming the prediction on the Mayans.

The end of the list: And if you believe that these are the only worst ideas of 2012, I have something I want to sell you. Close your eyes, hold out your hands, and count to ten.

What to Wear to the Apocalypse

Average temperature change since 1970, from http://www.climatecentral.org

I remember when the weather used to be a safe topic of conversation. Actually, the weather hasn’t been a safe topic of conversation for about 150 years, but around the time Al Gore released An Inconvenient Truth, the subject became downright inflammatory.

I was an adjunct at the time, and when I heard that one of the feeder districts for the community college where I was teaching – Federal Way School District in Washington State – had placed a “moratorium” on showing the DVD and required teachers using the film in class to include a “credible, legitimate opposing view” (which, according to all but a handful of climate scientists, does not exist), I decided to incorporate it into my syllabus. The now-extinct Seattle Post-Intelligencer reported with barely concealed sarcasm that the regrettably named Frosty Hardison had contacted the School Board to oppose the film:

“Condoms don’t belong in school, and neither does Al Gore. He’s not a schoolteacher,” said Frosty Hardison, a parent of seven who also said that he believes the Earth is 14,000 years old. “The information that’s being presented is a very cockeyed view of what the truth is. … The Bible says that in the end times everything will burn up, but that perspective isn’t in the DVD.”

Frosty, in name and deed, seemed to embody all that was ridiculous about the Federal Way policy. But, in class, the film generated not controversy but daytime sleepiness. The students argued, legitimately, that an argument attached to a politician was suspect; and the lack of popular coverage of the issue, combined with my own lack of knowledge at the time, made the whole topic less than engaging. Soon, media outlets began picking apart the facts in the film, with the drowning polar bear animations drawing particular skepticism. By the time the “Climategate” scandal hit, naysayers had so thoroughly bombarded the media with talking points that the exoneration of the UK’s Climatic Research Unit had become a lot like the absence of WMDs in Iraq: something factually true that was believed to be false. Besides, the scientists’ failed public relations strategies didn’t change scientific reality.

Awareness of global warming has finally filtered down to the general public. I was at a store earlier this week and commented at checkout, “It’s a lot hotter than advertised outside,” and, instead of giving a bland response, the clerk said, “Global warming! Well, we were warned the apocalypse would come in 2012…”

Here in DC, we have already had seven days over 100 degrees, including a record-breaking four in a row (so far). Millions lost power in a massive derecho; heat buckled Metro tracks, causing a derailment; we’re in the midst of the worst drought since the 1950s, with 80% of the country “abnormally dry”; up to 15 inches of rain fell in Texas; the Quileute Indian Nation of Washington State, featured in the Twilight film and book series, is facing the obliteration of tribal lands due to rising sea levels; twenty-nine square miles of Colorado are now burnt to ash. Outside the U.S., a chunk of ice the size of two Manhattans has broken off Greenland. While scientists can’t claim conclusively that warming is directly responsible for a particular weather event, they have said repeatedly that the violent weather we’ve experienced in recent years is “consistent with” what would be expected as the planet overheats. In other words, global warming has become obvious even to the average unconcerned citizen.

Unfortunately, as Elizabeth Kolbert points out this week in her impassioned New Yorker essay, “The Big Heat” (July 23, 2012), politicians have barely mentioned the climate. With the exception of Bill McKibben’s much-forwarded Rolling Stone article, “Global Warming’s Terrifying New Math,” which has appeared in my Facebook feed at least a dozen times now, the climate news that appears in the media almost daily has been largely ignored in favor of a constant barrage of election coverage (and now, as we process the tragic Batman shooting in Aurora, CO, arguments about gun control). If you read only one article I’ve mentioned in this blog entry, make it McKibben’s; his main point is that if we burn all the fossil fuels currently in the ground, we will end up as science fiction characters in a mostly uninhabitable world.

We are rightly concerned with sublunary national issues – the 1%’s plunder of the rest of the nation, the war on women, Mitt Romney’s cruelty to both animals and humans, the fall of the JoePa statue, advancements toward legal gay marriage, the hegemony of the super PACs, attacks on and defenses of Obamacare. The divisions between red states and blue states have never seemed so livid, and they pour into my email and news feeds in gushes of lurid partisanship.

If you’ve read this blog once or twice, you probably have a good idea of my political leanings. But I find myself worrying about the decimation of species – in the most likely scenario, 25% of species will reach extinction by 2050 – and the disappearance of the world’s coral reefs. When I’m out walking or driving, I try to memorize the contours of plants and the beauty and sounds of birds, because many of them will probably disappear before I die. I see that I live in a wealthy country that will most likely manage to keep its grocery stores filled, unlike Kivalina, Alaska, which will soon be under the ocean; or the Sahel, which is withering away into desert. These effects will both outlast and dwarf the impact of our other social policies.

There can be no lasting social justice without environmental justice. Since life has existed on Earth, every other species has responded to shortages of food, water, and shelter with massive die-offs. Unless we make use of the science we have, the human species will be no different. We can look out at our blue universes and our red universes, but if we are living on a brown, dead planet, our politics will no longer matter.

On the other hand, maybe by fighting together to save the world, we can save ourselves; and maybe by fighting to save ourselves, we can save what’s best in our shared humanity. The first step is to try to see past the red and blue so that we can do what it takes to hold on to what’s left of the green.

My 2011 Cosmic Footprint

Rare Cosmic Footprint from the Hubble Space Telescope, July 2011

The past couple of years, I have avoided most New Year’s resolutions. What resolutions I did make last January have only a tangential relationship to what I accomplished this year, or, for that matter, to how I spent my time. For example, 2011 was the first year in a long time that I didn’t make a resolution to lose weight, but for once I actually did. Nevertheless, I don’t think the world really cares whether my jeans fit.

This New Year’s, I am feeling just a tiny bit skeptical about our national obsession with self-improvement resolutions. It’s not that I don’t need improvement – you could come up with a substantial list of my faults faster than I can say “2012” – but, frankly, we can all think of oodles of things that need improvement far more than I do. Lately, instead of re-evaluating the goals I set last January, I’ve been asking myself what I feel is a more important question: “What would have been different about the universe this year if I hadn’t been in it?”

I imagine all the people I care about, all the lives I touch as a teacher, and my impact on strangers I will never meet, and I wonder: What have I done for them? What have I made more difficult? When I say I believe in something, what have I done about it? When have I been constructive, and when complacent? If teaching at a community college has taught me anything, it’s that caring words at the right moment have the power to change lives. This year, there were times I paid attention at the right moments, and times when I was something less than mindful.

I think of my 2011 Cosmic Footprint as something like the carbon footprint calculators that have proliferated around the web, which estimate how much carbon you emit each year and how much you save through energy conservation. The first time I used a carbon footprint calculator, my self-righteousness toppled like so many clear-cut trees in the Amazon rainforest. (Lesson #1: Caring about the environment does not reduce carbon emissions.) In the past two years, I’ve reduced my personal emissions by roughly 30%, mostly by telecommuting one day a week and teaching online during the summers. My footprint is still much too large to help save the world, but I have made steady progress.
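For the curious, the arithmetic behind the commuting piece of such a calculator is simple enough to sketch. This is a toy illustration, not any particular calculator’s method; the mileage, work weeks, and emission factor below are all assumed round numbers:

# A toy sketch of the commuting arithmetic behind a carbon footprint
# calculator. The emission factor is an assumption for illustration only,
# not a figure taken from any real calculator.

KG_CO2_PER_MILE = 0.4  # assumed average for a gasoline car

def annual_commute_emissions(round_trip_miles, workdays_per_week,
                             telecommute_days_per_week=0, weeks_per_year=48):
    """Estimate yearly commuting CO2 in kilograms."""
    driving_days = (workdays_per_week - telecommute_days_per_week) * weeks_per_year
    return round_trip_miles * driving_days * KG_CO2_PER_MILE

baseline = annual_commute_emissions(30, 5)  # driving in five days a week
reduced = annual_commute_emissions(30, 5, telecommute_days_per_week=1)
print(f"Telecommuting one day a week saves {baseline - reduced:.0f} kg of CO2 a year")
# With these assumed numbers, that is a 20% cut in commuting emissions alone.

With those made-up inputs, one telecommuting day a week trims a fifth of the commuting total, which is the general shape of the savings I’m describing, even if the real calculators use more refined factors.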

Similarly, when I contemplate my cosmic footprint, I’m not trying to decide whether on balance I’m better or worse than I wish I were. Instead, I am trying to reflect, as neutrally as possible, on how my existence impacts the universe outside myself. Both knowingly and unknowingly, I am sure I did both good and bad over the past year, but thinking about my cosmic footprint is more like trying to trace the circles that radiate from a pebble dropped in water. There were pebbles I threw into the water and pebbles that accidentally fell out of my hands, but all of them made ripples.

The new year is a time when we indulge our desire to be perfect versions of ourselves, and because we’re human, we fail more often than not. The question is which parts of ourselves we are trying to perfect. Plenty of people have left cosmic footprints that stride far ahead of mine, and plenty of people make New Year’s resolutions that focus on helping others and trying to change the world. The media’s “New Year, New You” mania for self-improvement, however, discourages us from trying to improve all the things that exist beyond our own skins.

I am not surprised that each year, after a few weeks turned inward for the holidays, our ritual discussion of New Year’s resolutions focuses on our outer selves: how to make more money, how to be more attractive, how to turn resolutions into reality. What surprises me a bit is that it took me so long to realize that the footprint I leave on the universe is far more important than the ones I put on the scale.

They Shouldn’t Have to Die to Get Our Attention

Several times a day, online news, a Facebook status update, or another “It Gets Better” clip reminds me that bullying can end in death. Just the other day, I was at the doctor’s office, where a gigantic screen that dwarfed the room featured photos of a young teen who had committed suicide after being bullied. Pinned in my seat as the tearful mother described finding her son dead and insisted that she must tell her story to prevent similar tragedies, I wanted to weep myself. My emotions were mixed: horror and sadness at the mother’s loss and the tragedy of a young person’s suicide, a sense of violation that this story was forced on me at that particular moment, and outrage, not just at the bullies, but at the notion that bullying is only noteworthy when it ends in suicide.

I don’t need to enumerate the ways that bullying has become more public, and more permanent, than when I was a child. Bullying through note-passing and whispering in my own teen years has turned into a blowtorch of humiliation on social networking sites, and the virtual “Kick Me” signs are nearly impossible to remove. Dan Savage’s incredible “It Gets Better” project has drawn attention to the particular suffering of young gay kids, but it seems important to remember that most victims of bullying will never get media attention. With 30% of kids reporting that they have been the perpetrators or victims of bullying, according to a September 2011 report by the Democratic Independent Congress, the number of victims dwarfs the few dramatic and tragic stories reported in the media. The number of young people involved in bullying may be far higher, however, according to Danah Boyd and Alice Marwick’s New York Times editorial, “Why Cyberbullying Rhetoric Misses the Mark.” Boyd and Marwick observed and interviewed teenagers and found that, despite nearly unanimous insistence that bullying didn’t exist, once the word “bullying” was changed to “drama,” the percentage of teens admitting their involvement “as victims and/or perpetrators” skyrocketed.

I don’t mean to diminish the anguish of those teens who chose suicide as the final escape from their tormentors, nor the grief of their families and friends, and I applaud everyone who has released an “It Gets Better” video or has taken action to draw attention to the prevalence of bullying. But the problem is like fleas: the one we see means that 300 more are lurking in the rug, and the media’s fetishistic fascination with death-by-bullying has the potential to cost more lives than it saves.

Here’s why: Put yourself in the mind of a bullied teenager for a moment. In their editorial, Boyd and Marwick discuss the rhetorical divide between the words “drama” (everybody does it) and “bullying” (which carries a gigantic stigma): “For a teenager to recognize herself or himself in the adult language of bullying carries social and psychological costs. It requires acknowledging oneself as either powerless or abusive.” In other words, a teen who admits to being a victim of bullying is implicitly acknowledging her position at the bottom of the social hierarchy, and, as Boyd and Marwick find in their study, “Many teenagers who are bullied can’t emotionally afford to identify as victims, and young people who bully others rarely see themselves as perpetrators.” As those of us who were bullied know all too well, the shame of admitting you’re the kid who is left out of every game and social event is bad enough.

The media portrayals of bullying suicides add to this shame, though, by sending the message that teens aren’t really suffering – and that therefore they can dismiss their pain as unimportant – unless (or until) they’re considering suicide. Yet another way the media portrayals might be counterproductive is by creating an incentive for bullied teens to attempt suicide because they don’t see any other way to voice their pain or ask for adult help. In the 15-24 year-old age group, according to the American Foundation for Suicide Prevention, 1 in 5 have “seriously considered” suicide in the past 12 months. Chances are most of them have said nothing to an adult, and possibly nothing to anyone. I am not implying that suicidal kids are all bullied kids, but depressed or bullied teens (and adults, for that matter) are more likely to conceal suicidal thoughts than not. The message I am afraid the media is giving kids is that they haven’t really suffered unless their trauma ends in suicide. A dead child is beyond our help, but the living kids are all around us, telling themselves that things really aren’t bad enough, or unusual enough, to deserve adult notice. The media’s fixation on the most extreme consequences of bullying has the potential to do further harm, in contrast to the “It Gets Better” videos, which offer encouraging messages of survival.

How many of the (admittedly few so far) readers of this blog were bullied themselves? And how many of us tell ourselves, “Well, if I got through it, so should they”? How many feel that the bullying went beyond being a rite of passage and crossed the boundary between painful experience and trauma? Sure, it gets better. Sometimes it doesn’t. And yet here we all are, still alive. But you know what? I’m pretty sure very few of us told. I’m pretty sure even fewer were taken seriously when we did. We can’t be everywhere, and we can’t see everything – but we can see past the media drama to the more subtle instances of suffering close to home, and when we do, we should be ready to listen.

Time Worth Wasting

As of this writing, the Occupy Wall Street movement is spreading – some would say metastasizing – all over the country and now all over the world. The list of reasons I could choose not to participate is long: I have stacks and stacks and stacks of grading remaining to do; at 42, I’m too old to join what seems like a youth movement; Occupy’s lack of a coherent message makes it seem like a weekend project for disaffected millennials; the marchers and campers are predominantly white and I’m white, and I don’t want to live up to a stereotype of myself; and I’m just one person, so my presence doesn’t add much to the effort. Beyond the circumstances of this particular movement, I am not a person who enjoys being in crowds, shouting simplistic slogans, hearing ideas I believe in reduced to accusatory black-and-white statements, or listening to earnest speech after earnest speech after earnest speech. There are activists who enjoy activism, but I am not one of them.

But still, my life since the few days of abortive antiwar rallies in 1991 has been punctuated by protests of various kinds. I have rallied in the dark, on glorious days and gray ones, and once in a deluge with Seattle police in riot gear standing shoulder to shoulder. I have been cheered and I have been heckled. I’m not going to pretend I’ve been out every weekend, and I’ve gone through long periods when writing letters to the editor has seemed like a better use of my time. Nevertheless, I was at the Occupy marches today, Saturday, October 15, representing the long-suffering 99% – even though I’m an employed, not particularly remarkable Gen-Xer who would just as soon have been enjoying the sunshine somewhere other than at the Washington Monument.

But still. I was there, well before I was inspired and well before horns started honking in support, before tourists gave us the thumbs-up from open-roofed buses, and before a yellow schoolbus full of kids hung out the windows to cheer us on, their faces painted with excitement. Before all that, I was there because, when voting is not enough, people have to vote with their words, and when that’s not enough, we have to vote with our feet. When I march, I am not expressing myself, but telling the world that I am part of a disregarded whole. I am doing my part to combat the media narrative that direct action and civil disobedience are committed only by the scruffy, the pierced, and the disreputable. If I am not there, my opinion is no better than if I didn’t give a rip, so I have a responsibility to support the causes I believe in. By showing up, I am telling the world that my point of view exists, even if it is destined to be defeated or diminished. I am telling the world that I am not going to go silently.

In this case, though, the simplicity of the message – we want governments to consider our welfare, not just corporate profits and political expediency – may end up surviving its assault by the media. What is frustrating about this movement is also what makes it unique: Rather than advocating a specific course of action, Occupy is challenging the cultural and political assumption that individual rights can be sacrificed in the name of political stability. We are the 99%, and we are finally speaking up for ourselves. If I can’t spend a few hours on a beautiful Saturday afternoon voicing my support for a more just world, why should I expect someone else to fight the fight on my behalf?