The Other Elephant in the Room

At this point, we have all heard plenty about James Eagan Holmes’ mass murder of an Aurora, Colorado audience at a midnight opening of The Dark Knight Rises. We have heard very little about Holmes himself, however, other than that he was a neuroscience graduate student at the University of Colorado, that he killed 12 people and wounded 58, that he was struggling in his program and was in the process of withdrawing, that he booby-trapped his apartment with explosives, and that, oddly, he left only the barest traces on the internet.

In a way, I have been relieved not to be bombarded with the sordid details of Holmes’ quiet life. I’ve been thinking quite a bit this week about a point that protection expert Gavin de Becker makes in his book, The Gift of Fear: “Reporters usually refer to assassins with triple names, like Mark David Chapman, Lee Harvey Oswald, Arthur Richard Jackson…Our culture presents many role models, but few get as much hoopla and glory as the assassin.” Add to this list Seung-Hui Cho, Jared Lee Loughner, and James Eagan Holmes.

In the absence of concrete information about Holmes and his motivations, often-acrimonious debates about gun control have proliferated on editorial pages, blogs, and – naturally – Facebook. (My favorite Facebook feature these days is the “Unfollow Post” button.) Many, many gifted writers have argued eloquently in favor of increased enforcement of existing gun control laws, reinstatement of the Assault Weapons Ban, and more conscientious background checks.

Other than David Brooks’ column, “More Treatment Programs,” though, I have seen very little about the role of mental illness in mass shootings. In his article, Brooks criticizes those who use the Aurora shooting as a pretext for calling for more gun control, saying, “These killers are primarily the product of psychological derangements, not sociological ones” and arguing that “The best way to prevent killing sprees is with relationships — when one person notices that a relative or neighbor is going off the rails and gets that person treatment before the barbarism takes control.” Just a few weeks ago, however, The New York Times Magazine featured an article by journalist Jeneen Interlandi, “When My Crazy Father Actually Lost His Mind,” describing her and her family’s futile efforts to hospitalize her father long enough for him to regain stability.

Though the Virginia Tech shooting in particular has provoked huge changes in protocols for how mental health issues on campus are addressed, I have noticed an uneasy silence among my colleagues about these mass killings, especially considering that most of the shooters have had some sort of connection to an educational setting. When Jared Lee Loughner – who, incidentally, is still not stable enough to stand trial as of this writing – shot 19 people, including Congresswoman Gabrielle Giffords, I read with anxiety about the many, many attempts by teachers and students to alert the college to Loughner’s threatening behavior. My first thought was “There but for the grace of God go I,” because Loughner could easily have been one of my own students. Likewise, administrators at Virginia Tech asked Seung-Hui Cho to get mental health treatment, but because he was an adult, the college had no real way to enforce their request.

In 1997, the summer between my two years of graduate school, I worked as a temporary executive assistant for a mental health agency, and the site where my office was located also served as a day program for severely mentally ill patients who could not function on their own and were unlikely ever to recover. In August of that summer, a man with a sword held downtown Seattle at a standstill for 11 hours before police finally subdued him by blasting him with a fire hose. It turned out he was a client at the agency where I was temping, and one of the administrators and I had a long conversation about what it was like to devote your life to social work with a population of people who would never really be functional. Violence among mentally ill patients is extremely rare, but, in that particular setting, so was recovery.

Many people have commented that what Holmes did is evil, an impression certainly fostered by his emulation of Heath Ledger’s portrayal of The Joker. My observation is that it seems easier to accept that someone might be evil than that someone might be incurably, dangerously psychotic. It is almost un-American to consider that there are some illnesses that have no cure, or sufferers whose bootstraps are terminally inadequate, or people who cannot keep themselves from harming others. Even when a mental breakdown seems patently obvious, as in Jason Russell’s post-Kony meltdown in San Diego, the default reaction is contempt or mockery. I have had many students with criminal histories in my classes, including occasional sex offenders, and many students with a variety of mental illnesses, some of them serious. So far, fortunately, there has been only one time, years ago, when I felt I was in danger, though once in a while a student has been. Every time there’s a media blitz about a student assassin, I wonder what I would have done if the shooter had been in one of my own classes.

This afternoon, Fox News reported that Holmes sent a psychiatrist at his school a notebook with the details of the massacre he was planning, but that it sat in a mailroom unopened for over a week. It’s not clear whether the psychiatrist was treating Holmes or was one of his professors, but it’s yet another instance of clues to a dangerous young man’s mental state being ignored, dismissed, denied, or just plain missed. (As for the news that gun sales have gone up since the shooting, I’m not even going there.) A judge has issued a gag order on the case, so neither the school nor the police will comment on the record.

I am not sure whether Holmes is evil or not, but it seems futile to treat out-of-control situations and people as though they have control. Interlandi’s article about her father makes it painfully clear how difficult it is to get treatment for an adult who doesn’t want it; and the reporting on the Aurora massacre makes it clear how easy it is for a murderer to get guns, ammunition, and explosives. David Brooks blithely talks about “getting [a mentally ill person] treatment,” but without the means to compel someone to get treatment – and adequate treatment facilities for those who need it – we can expect more violence, not less. It is easy to slap the “EVIL” label on senseless violence, but when violence occurs so frequently, it’s not an aberration: Instead, it’s an inevitable collision between lax-to-nonexistent gun law enforcement, a dysfunctional mental health system, and the tragic consequences of letting ideology trump compassion…over and over again.

What to Wear to the Apocalypse

(Image: Average temperature change since 1970, from http://www.climatecentral.org)

I remember when the weather used to be a safe topic of conversation. Actually, the weather hasn’t been a safe topic of conversation for about 150 years, but around the time Al Gore released An Inconvenient Truth, the subject became downright inflammatory.

I was an adjunct at the time, and when I heard that one of the feeder districts for the community college where I was teaching – Federal Way School District in Washington State – had placed a “moratorium” on showing the DVD and required teachers using the film in class to include a “credible, legitimate opposing view” (which, according to all but a handful of climate scientists, does not exist), I decided to incorporate it into my syllabus. The now-extinct Seattle Post-Intelligencer reported with barely concealed sarcasm that the regrettably named Frosty Hardison had contacted the School Board to oppose the film:

“Condoms don’t belong in school, and neither does Al Gore. He’s not a schoolteacher,” said Frosty Hardison, a parent of seven who also said that he believes the Earth is 14,000 years old. “The information that’s being presented is a very cockeyed view of what the truth is. … The Bible says that in the end times everything will burn up, but that perspective isn’t in the DVD.”

Frosty, in name and deed, seemed to embody all that was ridiculous about the Federal Way policy. But, in class, the film generated not controversy but daytime sleepiness. The students argued, legitimately, that an argument attached to a politician was suspect; and the lack of popular coverage of the issue, combined with my lack of knowledge at the time, made the whole topic less than engaging in class. Soon, media outlets began picking apart the facts in the film, with the drowning polar bear animations drawing particular skepticism. By the time the “Climategate” scandal had run its course, naysayers had so thoroughly bombarded the media with talking points that the eventual exoneration of the UK’s Climatic Research Unit became a lot like the absence of WMDs in Iraq: something factually true that was widely believed to be false. Besides, the scientists’ failed public relations strategies didn’t change scientific reality.

Awareness of global warming has finally filtered down to the general public. I was at a store earlier this week and commented at checkout, “It’s a lot hotter than advertised outside,” and, rather than a bland response, the clerk said, “Global warming! Well, we were warned the apocalypse would come in 2012…”

Here in DC, we have already had seven days over 100 degrees, including a record-breaking four in a row (so far). Millions lost power in a massive derecho; heat buckled Metro tracks, causing a derailment; we’re in the midst of the worst drought since the 1950s, with 80% of the country “abnormally dry”; up to 15 inches of rain fell in Texas; the Quileute Indian Nation of Washington State, featured in the Twilight film and book series, is facing the obliteration of tribal lands due to rising sea levels; twenty-nine square miles of Colorado is now burnt to ash. Outside the U.S., a chunk of ice the size of two Manhattans has broken off Greenland. While scientists can’t claim conclusively that warming is directly responsible for a particular weather event, they have said repeatedly that the violent weather we’ve experienced in recent years is “consistent with” what would be expected as the planet overheats. In other words, global warming has become obvious even to the average unconcerned citizen.

Unfortunately, however, as Elizabeth Kolbert points out this week in her impassioned New Yorker essay, “The Big Heat” (July 23, 2012), politicians have barely mentioned the climate. With the exception of Bill McKibben’s much-forwarded Rolling Stone article, “Global Warming’s Terrifying New Math,” which has appeared in my Facebook feed at least a dozen times now, the climate news that appears in news outlets nearly every day has been largely ignored in favor of a constant barrage of election news (and now, as we process the tragic Batman shooting in Aurora, CO, with arguments about gun control). If you read only one article I’ve mentioned in this blog entry, McKibben’s is a good one, his main point being that if we burn all the fossil fuels currently in the ground, we will end up as science fiction characters in a mostly uninhabitable world.

We are rightly concerned with sublunary national issues – the 1%’s plunder of the rest of the nation, the war on women, Mitt Romney’s cruelty to both animals and humans, the fall of the JoePa statue, advancements toward legal gay marriage, the hegemony of the super PACs, attacks on and defenses of Obamacare. The divisions between red states and blue states have never seemed so livid, and they pour into my email and news feeds in gushes of lurid partisanship.

If you’ve read this blog once or twice, you probably have a good idea of my political leanings. But I find myself worrying about the decimation of species – in the most likely scenario, 25% of species will face extinction by 2050 – and the disappearance of the world’s coral reefs. When I’m out walking or driving, I try to memorize the contours of plants and the beauty and sounds of birds, because many of them will probably disappear before I die. I see that I live in a wealthy country that will most likely manage to keep its grocery stores filled, unlike Kivalina, Alaska, which will soon be under the ocean; or the Sahel, which is withering away into desert. These effects will both outlast and dwarf the impact of our other social policies.

There can be no lasting social justice without environmental justice. Since life has existed on Earth, every other species has responded to shortages of food, water, and shelter with massive die-offs. Unless we make use of the science we have, the human species will be no different. We can look out at our blue universes and our red universes, but if we are living on a brown, dead planet, our politics will no longer matter.

On the other hand, maybe by fighting together to save the world, we can save ourselves; and maybe by fighting to save ourselves, we can save what’s best in our shared humanity. The first step is to try to see past the red and blue so that we can do what it takes to hold on to what’s left of the green.

My Great Freeway-Flying Adjunct Apprenticeship Adventure

When I was an adjunct, I was always mightily annoyed when full time faculty fell into a piteous lament about the “plight” of part time faculty. It wasn’t just that these moments of disingenuous sympathy often occurred just after the full-timer in question had bumped me from a class; more, it was the idea that my fellow freeway flyers and I were helpless victims of a system that neither we nor the tenure-track faculty could influence.

In some ways, of course, this assessment of the over-reliance on part time faculty in higher education is absolutely accurate. Part time faculty are underpaid, and hence they often overwork – often at several colleges, often long commutes apart – in order to pay the bills. Students, in at least half the classes at most institutions, end up with exhausted professors whose energy is scattered among an unreasonable number of classes. The faculty themselves, moreover, are expected to provide a quality education even though they do not have a permanent workspace, dedicated computers, or even, sometimes, a file cabinet for stashing their stuff. As if these disadvantages were not enough, in many institutions the adjunct offices are housed in overcrowded Siberias relegated to the most distant edges of the campus, where they are conveniently removed from the life of their home departments.

In these Siberias, legend has it that adjuncts never get full time jobs, that they sometimes get full time jobs but never where they have worked as part time faculty, and that they sometimes get full time jobs, but only if they are anointed to these jobs within a certain number of years, which varies depending on whom one asks. When I started as an adjunct, it seemed to me that the dusty, grayish film that coats everything in these offices had penetrated deep into the morale of my colleagues, who were dedicated and skilled teachers staggering underneath a massive lead cape of injustice.

It is true that the number of part time faculty vastly eclipses the number of full time positions available, and that this situation will persist until public higher education receives adequate funding. It is also true, however, that nearly all the full time faculty everywhere I have ever worked – six different community colleges and a for-profit, but who’s counting? – were once part time faculty. If part time faculty never get full time positions, how can this be?

Before I got into teaching, I worked in the private sector, where, at least when I entered the workforce, there were two prevailing views: one, employees should get promoted for loyalty and seniority; or two, they should get promoted for doing superior work, regardless of seniority. Well, we all know how that dichotomy played out.

Before I worked in the private sector, I was a student, paying dearly (with the help of my parents) for the privilege of slaving over papers and scribbling notes in overpriced books. Why do we students pay for an education, even though we’re overworked and largely uncompensated? Because we’re not uncompensated. We’re getting an opportunity to learn things, and from people, that we would otherwise not be able to access. A degree itself has no particular value; as I’ve occasionally quipped to whiny students, “If you don’t want to learn, just make a diploma in Photoshop.” It is the ability to apply and synthesize an education that makes it valuable.

I have written in a different blog entry about the incredible mentorship I received from full time faculty during my adjunct years. I was amazingly fortunate to work with colleagues who were so generous with their support. At the same time, though, I was the consummate freeway flyer, routinely zipping to three campuses and teaching five or six classes in a day, sometimes covering 200 miles from the time I left in the morning until the time I returned at night. I kept a file box behind my front seat, since I usually didn’t have file space; at one campus, I held office hours in the copy room because I didn’t have an office or a desk. It was only when I became full time that I realized how much time had been swallowed up by sorting and moving files from campus to campus. Several times I had classes taken away at the last minute. More often, I accepted more classes than I could handle because I was afraid that if I said no I would never be hired again at that college. In a given year, I worked 166% of a full time load to make less money than a full-timer.

But I still refused to call it “plight.” In colonial times, fourteen-year-old boys signed up for seven-year stints with master craftsmen, their only compensation the skill they would take with them when they completed their apprenticeships. Early child labor laws limited workdays to twelve hours, at least on paper, the implication being that the actual work hours were probably longer.

I found it helpful for both my professional growth and my morale to think of adjunct teaching in similar terms: apprenticeship, not servitude. As in college and graduate school, I was paying to gain knowledge, experience, professional development, and familiarity with the discipline. I could use the time passively, teaching my classes and then going home; or I could use my scattered existence as a pretext for learning how different colleges solved common problems. Teachers constantly talk about “stealing” ideas from colleagues. How amazing that I could steal from colleagues at six different campuses! I might have had limited opportunities at any given college, but by borrowing from all of them, I was able to piece together a fairly rich professional existence.

It is the time of year when a lucky few are notified that they have crossed the threshold into full time employment, while the much greater numbers of the not-as-lucky sink into a seasonal rite of discouragement. This discouragement motivated me, finally, to broaden my search to out-of-state colleges. In the private sector, it was a given that just doing my job well was not enough. In the summer months, when disappointment flowed and work opportunities ebbed, I found it helpful to remember how much I was learning from my itinerant existence; and to remind myself that by learning a little bit more, my chances would improve in the next hiring season.

Many of the benefits of my approach weren’t evident until I started working full time and using what I’d learned. Because there are many aspects of contingent employment that are truly exploitative, I can understand why some of my colleagues might find my attitude controversial. I can say, however, that (to paraphrase Simon & Garfunkel) I’d rather be a hammer than a nail…or, in other words, that my apprenticeship had far more dignity than my plight.

Ink Well Mag’s “Milestones” Edition

One of my stories appears in Ink Well, and I would love it if you visited!

Here’s the corrected link. I have had intermittent problems getting the page to load, but the link does work. Thanks for reading, if you do.

Ink Well supports emerging creative writers, photographers, and artists. Hope you’ll support them.