David Kordahl

Archive for the ‘Lit’ Category

Higgs for the Masses

In Lit, Physics on 2013/05/13 at 5:04 pm

At the beginning of the last century—so it goes in the popular science books—there were two great revolutions in physics. You can skip down a few paragraphs if you’ve heard this one before. Because in such volumes, the story’s well-rehearsed: Planck explains the blackbody curve, Einstein explains light’s constant speed, and from these two discoveries arrive two of the great cornerstones of 20th century physics: quantum mechanics and relativity. Further explications follow. A decade after relativity’s original unveiling, Einstein forges ahead and applies relativistic thinking to gravity, thus inventing general relativity. In the next twenty years, quantum physics, too, skips from success to success, from Bohr’s atom to de Broglie’s matter-waves to Schrödinger’s wave equation.

Tragic versions of this story weave these strands together with the research of Szilárd, who understands that the quantum decay of the nucleus coupled to the relativistic equivalence of matter and energy could create a chain reaction—an insight that, with the additional research of Fermi, will lead to the atomic bomb.

But triumphant versions can’t stop there. After all, the greatest marriage of relativity and quantum mechanics won’t occur until later, with the rise of quantum field theory, the fully relativistic treatment of those old quantum waves. In post-war years, particle smashers grow ever larger, ever more unwieldy, and along with these impressive machines come large, unwieldy methods. Physicists divide infinity by infinity to find their answers—which match the experiments, nevertheless, up to the 11th digit. So they press onward, trying new theories against new experiments, one after the other, until successful quantum field theories have been constructed for electromagnetism and the weak forces that make particles decay (electroweak theory, spontaneously broken via the Higgs mechanism), and for the strong forces that make them stick together (quantum chromodynamics).
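
For a concrete sense of what that sort of agreement looks like, the usual showcase (my example, not one drawn from the books discussed below) is the electron’s anomalous magnetic moment, whose renormalized series in the fine-structure constant α ≈ 1/137 begins:

```latex
\[
  a_e \;=\; \frac{g-2}{2}
      \;=\; \frac{\alpha}{2\pi}
      \;-\; 0.3285\left(\frac{\alpha}{\pi}\right)^{2}
      \;+\; \cdots
      \;\approx\; 0.001159652\ldots
\]
```

Each term is finite only after the infinities have been canceled against one another, yet the sum tracks the measured value out to roughly the precision just quoted. (The coefficients above are rounded versions of commonly cited values.)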

Flash forward, then, to 4 July 2012, to a conference at the Large Hadron Collider. The LHC is the world’s biggest machine, a proton-smashing ring whose 27-km circumference stretches from France to Switzerland and back, and on this day its two major experiments, CMS (the Compact Muon Solenoid, hailing from France) and ATLAS (A Toroidal LHC Apparatus, from Switzerland) have gathered to announce their latest results. They’re still testing those quantum field theories—theories that have, over the years, blandly become “the standard model”—of whose parts only one remains unverified: the Higgs field, with a boson for its evidence.

Interest is high. For the week prior, #HiggsRumors has trended on Twitter. And why shouldn’t it? On how many non-military $10 billion projects can we expect world governments to collaborate? Hasn’t it been a while since the last “fundamental particles” were discovered—2000 for the tau neutrino, 1995 for the top quark? And with the standard model itself around since 1974, isn’t it about time we know, one way or the other? It’s tense around 10AM (4AM EST), when news comes. But yes! Cue cheering, followed by dashed-off reports. “This is the day,” liveblogs Jester, a Parisian physicist. “The most important day for particle physics in this century, and probably ever.” Definitive evidence of the Higgs boson—or (better to err with caution) of a “Higgs-like particle”—has finally arrived.

Except…now what?

CERN, the LHC’s funder and HQ, has had its share of high-profile cameos (remember that anti-matter briefcase in Angels and Demons?), but readers wanting an inside scoop might eventually track down a few of the slew of books detailing the machine’s promised glories. Or better, just two: Lisa Randall’s Knocking on Heaven’s Door, published a few months before the Higgs discovery, and Sean Carroll’s The Particle at the End of the Universe, published a few after. Before digging in, I may as well admit that although Randall’s book is long and somewhat boring, it’s accessible to the layperson and contains a lot of good info, and that although Carroll’s book is short and more consistently interesting, it contains much of the same good info and may be a tough slog for those readers whose trips to the science section of the bookstore are less frequent. But instead of jumping directly to a thumbs-up or thumbs-down just from considerations of brevity or ease, it’s worth considering how these superficial similarities give way to rhetorical differences—differences that reveal tough problems of the genre, and maybe even of the subject itself.

Randall and Carroll are both active scientists, both theorists, with Randall focused on high-energy physics à la the LHC, and Carroll focused on theoretical cosmology. Both, too, are sophomore popular science writers, each with a previous book nearer their areas of research than these outings. Warped Passages: Unraveling the Mysteries of the Universe’s Hidden Dimensions positioned Randall’s extra-dimensional theories within the framework of modern physics; From Eternity to Here: The Quest for the Ultimate Theory of Time allowed Carroll to discuss time and thermodynamics from a cosmological perspective. Of course, their positions as working scientists-cum-science popularizers are nothing new. The case could be made that the tradition stretches back to Galileo, who framed his Dialogue Concerning the Two Chief World Systems as a conversation between two natural philosophers arguing their cases before an intelligent layman.

Galileo, however, had certain advantages. He was able, more or less, to bring his readers to the current state of knowledge within a few pages, whereas modern popular science writers have to judiciously select highlights from the past 500+ years of scientific backstory, lest bored readers quit before they reach the new stuff. It’s a tricky balance. Randall includes more background than Carroll—enough that readers with any scientific training will have a hard time not branding her a pedantic bore. Carroll, on the other hand, moves briskly from one narrative checkpoint to the next, but this very briskness may outpace readers better suited to Randall’s slow burn.

Randall, after all, has crafted her book with novices explicitly in mind. From the intro: “Modern physics might appear to some to be too far removed from our daily lives to be relevant or even readily comprehensible, but an appreciation of the philosophical and methodological underpinnings that guide our thinking should clarify both the science and the relevance of scientific thinking—as we’ll see in many examples. Conversely, one will only fully grasp the basic elements of scientific thinking with some actual science to ground the ideas. Readers with a greater taste for one or the other might choose to skim or skip one of the courses, but the two together make for a well-balanced meal.”

Reader preferences are, of course, personal, but that bit about our needing a “well-balanced meal” made at least one reader feel that this might not be a trip he wanted to take. But, longtime student of physics that I am, my tolerance for condescension is high, and through the wordy marshes I continued my tramp. In the first seventy pages, Randall takes great pains not to go too fast (we slow learners could not be expected to keep up) and spends entire pages dismantling the doctrines of The Secret, gently explaining why the precepts of scientific materialism can’t sync with the Oprah-endorsed gospel of wealth through positive thinking. We’re forced to wade through discussions of science and religion, risk and expertise, and other topics that practically beg for bullet-point summary.

Even when we get to chapters on the LHC (“One Ring to Rule Them All” is followed by—wait for it—”The Return of the Ring”), the delivery is hampered by pomp. “I am not one prone to overstatement, since I usually find that great events or achievements speak for themselves,” Randall declares. “The reluctance to embellish can get me into trouble in America, where people overuse superlatives so much that mere praise without an ‘est’ at the end is sometimes misinterpreted as slander by faint praise. I’m frequently encouraged to add a few buzzwords or adverbs to my statements of support to avoid any misunderstanding. But in the case of the LHC I’ll go out on a limb and say there is no question that it’s a stupendous achievement.”

Stupendous! Though it may be a courtesy that she goes “out on a limb” in this way, on the next page, prior to the LHC vitals, she includes a clotted recitation of authorities, just so we’re sure what to think. “The actor and science enthusiast Alan Alda, when moderating a panel about the LHC, likened it to one of the wonders of the ancient world. The physicist David Gross compared it to the pyramids. The engineer and entrepreneur Elon Musk—who cofounded PayPal, runs Tesla (the company that makes electric cars), and developed and operates SpaceX (which constructs rockets that will deliver machinery and products to the International Space Station)—said about the LHC, ‘Definitely one of humanity’s greatest achievements.'”

One has to wonder why these advertisements remain a part of the text. After enough celebrity cameos (a few paragraphs on Nate Silver in the risk chapter, a meeting with Bill Clinton in which he agrees, of the 1993 Superconducting Super Collider cancellation, that “humanity had forfeited a valuable opportunity”), I began to wonder if Randall’s odd stodginess might have something to do with her veneration of authority—a suspicion that was strengthened in reading “The Role of Experts,” a subsection where Randall forwards a number of dubious claims.

It starts reasonably enough. She explains why, despite global alarm, there never was a credible threat of a black hole apocalypse (N.B. the cosmic rays of higher energy than any possible LHC production), and uses this springboard to lecture on why, while “experts” may have let us down in the last financial brouhaha, such disappointments aren’t likely to emerge from the hallowed halls of science. “Science is not democratic in the sense that we all get together and vote on the right answer. But if anyone has a valid scientific point, it will ultimately be heard.” She cites the example of a young theorist, Lubos Motl, who solved a problem in the Czech Republic and was immediately recognized by a more prominent scientist, Tom Banks, of Rutgers. “Not everyone is so receptive,” she admits, “but so long as a few people pay attention, an idea, if good and correct, will ultimately enter scientific discourse.”

This is an amazing level of optimism. It implies that members of established scientific institutions are aware not only of many good and interesting ideas, but indeed of all the thoughts worth thinking, all the good ideas bouncing through this world of seven billion souls—a magical feeling! Perhaps, as Randall suggests, the Internet may change everything, but one can’t help but recall science history’s retrospectively important obscurities (the Teslas, the Mendels) whose contributions are recognized only in far hindsight.

Carroll has learned from the missteps of Knocking on Heaven’s Door (the book appears in his “Further Reading”), and in The Particle at the End of the Universe, from the title’s Douglas Adams nod on, he speaks directly to the nerds, to those readers ready to paw directly into the thickets of detail. On the first page of Chapter 1, he notes how particle physics is a “curious activity” of “essentially no impact on the daily lives of anyone who is not a particle physicist.” This refusal to justify the LHC in terms of anything but itself seems, however, a natural way to cue self-selecting readers as to whether they’d be interested in this voyage. One shouldn’t pick such a volume because it’s important. “Particle physics arises directly from our restless desire to understand our world; it’s not the particles that motivate us, it’s our human desire to figure out what we don’t understand.”

And we’re off! Within the first 20 pages, we’ve already visited the LHC and know how the Higgs boson fits into the Standard Model of particle physics; in another ten, we’re deep inside a discussion of how the Higgs boson is significant only as evidence of the Higgs field, and how calling it the “God Particle” will invoke most physicists’ wrath.

If this sounds daunting, maybe it is. Carroll’s assumption of audience foreknowledge should delight techies, but it comes at the risk of alienating everyone else. The clear benefit, though, is in the enormous amount of material that’s covered in a very brief span, and with a touch light enough to allow for the occasional wit. (Sample geek joke: “SLAC originally stood for ‘Stanford Linear Accelerator Center,’ but in 2008, the Department of Energy officially changed it to ‘SLAC Linear Accelerator Center,’ perhaps because someone in a position of power is fond of infinite recursion.”) Where Randall’s book is discursive, Carroll’s is direct, following a half-logical, half-historical route toward enlightenment. But rather than expecting that he’ll need to coax us into being interested, he cuts straight to the chase.

E.g., after some of the best chapters on quantum fields I’ve seen in a book at this level, here’s the lede for Carroll’s Higgs Discovery How-To:

Let’s take a step back and think about what it takes to discover the Higgs boson, or even find tantalizing evidence for its existence. To dramatically oversimplify things, we can boil it down to a three-step process:
1. Make Higgs bosons.
2. Detect the particles they decay into.
3. Convince yourself that the particles really came from the Higgs, and not something else.
We can examine each step in turn.

Remarkably enough, he then proceeds with a section on each step of the process—exactly as promised. These pages alone are worth the price of admission.
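
For flavor, here is a toy version of step 3, the “bump hunt,” one that appears in neither book: take pairs of photons seen in the detector, compute each pair’s invariant mass, histogram many such pairs, and look for events piling up where a new particle sits. The 125 GeV “signal” and the smooth “background” below are invented numbers; only the mass formula for massless photons is the real thing.

```python
import math
import random

def diphoton_mass(e1, e2, angle):
    """Invariant mass (GeV) of two massless photons with energies e1, e2 (GeV)
    separated by `angle` radians: m^2 = 2 * e1 * e2 * (1 - cos(angle))."""
    return math.sqrt(2 * e1 * e2 * (1 - math.cos(angle)))

# Two back-to-back 62.5 GeV photons reconstruct to a 125 GeV parent:
print(diphoton_mass(62.5, 62.5, math.pi))  # -> 125.0

random.seed(4)
# Invented data: a smooth background plus a small resonance near 125 GeV.
masses = [random.uniform(100, 160) for _ in range(10000)]   # background
masses += [random.gauss(125, 2) for _ in range(300)]        # "signal"

# Histogram in 5 GeV bins, then flag any bin sitting well above the average.
counts = {edge: 0 for edge in range(100, 160, 5)}
for m in masses:
    if 100 <= m < 160:
        counts[100 + 5 * int((m - 100) // 5)] += 1

avg = sum(counts.values()) / len(counts)
for edge, n in sorted(counts.items()):
    flag = "  <-- bump?" if n > avg + 3 * math.sqrt(avg) else ""
    print(f"{edge}-{edge + 5} GeV: {n}{flag}")
```

Real analyses replace the crude “three sigma above average” flag with careful background fits and look-elsewhere corrections, but the shape of the argument is the same.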

However, the précis above may suggest a bigger separation between the books than actually exists. With such close overlap of purpose, there’s bound to be a lot of shared material, even down to the anecdotes. Both relate how the CMS experiment was delayed six months when the site was found to contain Gallo-Roman ruins. Both give ample space to why the particle detectors are layered like cylindrical onions, with the inner tracker surrounded by the electromagnetic calorimeter surrounded by the hadron calorimeter surrounded by the muon detector, and why it matters they’re in that order. And both remind us of how, despite the LHC’s boasts of high-tech swag, it nonetheless won’t match the onetime projections for the Superconducting Super Collider, that American project whose 1993 cancellation followed our sinking $2 billion into an empty tunnel outside Waxahachie, Texas.
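
The logic of that ordering is simple enough to mock up in code. What follows is my cartoon, not a sketch from either book: each species of particle registers in a characteristic set of layers, so reading the layers from the inside out amounts to a rough particle ID.

```python
# Toy particle ID from detector-layer signatures. Layers, inside out:
# tracker -> electromagnetic calorimeter (ecal) -> hadron calorimeter (hcal)
# -> muon chambers. Real event reconstruction is statistical and far messier.

def classify(hits):
    """Guess a particle species from the set of layers it registered in."""
    if not hits:
        # Neutrinos sail through everything; they get inferred afterward,
        # as missing energy.
        return "neutrino (missing energy)"
    if "muon_chambers" in hits:
        # Only muons punch through both calorimeters to the outermost layer.
        return "muon"
    if "hcal" in hits:
        # Hadrons shower in the hadron calorimeter; the tracker tells us
        # whether the hadron was charged (a pion, say) or neutral (a neutron).
        return "charged hadron" if "tracker" in hits else "neutral hadron"
    if "ecal" in hits:
        # Electrons and photons both stop in the electromagnetic calorimeter;
        # only the electron, being charged, leaves a track on the way in.
        return "electron" if "tracker" in hits else "photon"
    return "unclassified"

print(classify({"tracker", "ecal"}))                   # electron
print(classify({"ecal"}))                              # photon
print(classify({"tracker", "hcal"}))                   # charged hadron
print(classify({"tracker", "muon_chambers"}))          # muon
```

Shuffle the order of the layers and these signatures stop being unique, which is, more or less, the point both books make.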

The main thing that separates these projects is their distinct attitudes toward their own scientific authority. Whenever I’m reading a popular physics book, I instinctively place it somewhere on a self-invented scale: the Hawking-Penrose Continuum, after Stephen Hawking and Roger Penrose, those two old lions of the genre. Hawking’s approach to writing for the public—an approach, given A Brief History of Time‘s 10 million copies sold, that’s sort of hard to dismiss—is to skip explanations in favor of answers…which is to say, in favor of whatever conclusions smart guys like himself have managed to conclude. Penrose, in contrast, is an explainer. His books—The Emperor’s New Mind, The Road to Reality—are tough enough to attract few imitators, but their taut expositions turn readers into informed critics of their arguments. Paired together, Hawking and Penrose illuminate pop science’s basic double-bind: Hawking seems egalitarian (anyone can read him), but his refusal to explain his answers raises scientists to the level of prophets; whereas Penrose omits no explanation, but the density of his arguments excludes all readers but the most committed.

On this scale, Randall edges toward Hawking, Carroll toward Penrose. When Randall tells us about a project she undertook to address the LHC’s black hole production, she remarks without elaboration, “With a more careful calculation, we found that the number was much less than physicists had originally optimistically predicted.” How this calculation might have been accomplished, we’re not expected to ask. While Carroll avoids any equations, it’s hard to imagine his including that remark without at least a vague sketch of the process. He manages to avoid such obfuscation through the whole main text, only breaking down in Appendix One (pg. 289), when he finally begs pardon. “Trust me here. It’s hard to think of a sensible explanation that doesn’t amount to going through all the math.”

But in the end, publication dates may turn out to be just as fundamental a difference between these two books as their rhetorical orientations. Randall, recall, has a horse in the race—her extra-dimensional models are being tested at the LHC even now—and, again recall, her book was released pre-Higgs Announcement. She has reason for hope. This hopefulness can’t mask, though, that the preemptive suggestions of what the LHC might find are all pretty darn tentative. “The first, and perhaps most worrisome [reason to doubt supersymmetry], is that we have not yet seen any experimental evidence.” “Not yet having seen any evidence poses a significant constraint on technicolor models.” “Clearly, since we don’t see them, these new dimensions of space must be hidden.” Regardless of such small quibbles as exactly zero evidence, she gives this confident summary: “The wait is a little anxiety provoking, but the results will be mind-blowing.”

The reasons for hope and fear aren’t so far separated. And in Carroll’s case—post-Announcement—I wonder if the tone of unrealized promise has been replaced by one of forced cheer. Sure, Chapter 12 covers the same proposals as Randall, but his titular “end of the universe” could just as easily refer to the uncertainty that accompanies the Higgs discovery. The finale of Particle echoes its earlier coverage of the SSC, where physicists opposing that project objected to its huge $-inputs, arguing that smaller projects would yield better science-outputs. Even with its peppy reminders that past government investments have returned a handsome 28%, the last chapter’s insistence that the fate of a next-gen collider “is for the human race as a whole to decide” seems dicey, especially with estimates for the International Linear Collider that “range anywhere from $7 billion to $25 billion.”

This just masks the obvious question: What if the LHC uncovers nothing? What if, after all the heroic efforts and high hopes, it turns out that the standard model is perfectly adequate all the way through the highest available energies?

By now, evocations of physics entering the death-spiral of its own success have entered the standard set of pop-sci tropes. Neither of the titles here lets such worries rise naked to the surface, but it’s hard to deny some existential nervousness. Back in 1996, John Horgan published The End of Science, a volume whose framing device was to ask science luminaries if, after so many centuries of false starts, maybe the fundamentals are finally in place. An automatic rebuttal—which, to be fair, Horgan addressed—is that we’ve been embarrassingly wrong about such things before (cf. the comments of Albert A. Michelson, 1894: “It seems probable that most of the grand underlying principles have now been firmly established”). But with the discovery of the Higgs boson, support for the last remaining piece of the standard model, such questions may easily be resurrected. Could this time be different?

Could be. Could be, though, that it’s not science as a whole, but only the accelerator tradition, in its current maximalism, that’s reaching the end of its possible permutations. These books arrive at what may be the terminal edge of that font of known unknowns, and it happens just as the standard model passes from the arena that works of popular science are best suited to explore—from the world of the barely discovered, of the still possibly incorrect—well into textbook territory. This isn’t a bad thing; look back far enough, and the same happened to Newtonian gravity, or Maxwellian electromagnetism, or all the rest. Discoveries won’t stop (both books, for instance, bring word of competing experiments that are busy in their attempts to detect ambient dark matter), but it may be that we’re approaching the end of an era when we knew from which direction new discoveries would arrive.

What we now need, and what physics writers now struggle to produce, is a post-revolutionary lit: works that can stop being so impressed by the breakthroughs of the 20th century that they forget we’re in the 21st. The revolution is here, and we’re ready for something new.

Where to start with American Psycho author Bret Easton Ellis

In Lit, Pop Culture on 2012/11/04 at 4:50 pm

This post emulates the “Gateways to Geekery” column over at the A.V. Club. In fact, I tried pitching it to them but met with no response. Turns out that website doesn’t accept pitches. Which I should have researched before writing the piece. In any case, I enjoyed working through some thoughts on BEE’s oeuvre and now post this here, FWIW.

Geek obsession: The novels of Bret Easton Ellis

Why it’s daunting: Bret Easton Ellis is one of the few novelists to suffer the privilege of growing up in public. His first novel, Less Than Zero, became a bestseller when he was just 21, The Rules of Attraction came only two years later, and at age 26, when most writers are still struggling to produce a first novel, his third, American Psycho, became the center of a National Organization for Women boycott that caused Simon & Schuster, his original publisher, to drop the book, citing “aesthetic differences” extreme enough that they declined to demand repayment of the $300,000 advance.

We should all be so unlucky. But while American Psycho cemented Ellis’s reputation as a literary provocateur, the arguments surrounding it often conflated the author with his narrators—a despicable group of typically rich, typically white, male narcissists. (In a famous Vanity Fair take-down, Norman Mailer asked, “Is Bateman the monster or Bret Easton Ellis?” and went on to speculate that the author had used his art as a perverse form of therapy.) It didn’t help, either, that Ellis himself seemed to encourage this conflation, from book-jacket author photos that matched narrator descriptions to, in the case of Lunar Park, a debauched narrator named—what else?—Bret Easton Ellis. For fans, this may all come off as good old postmodern fun, but anti-Ellis critics have their pick of damning details, all the way up to his most recent Twitter-tantrums. Despite the almost exclusive focus that the books put on manners and morals, it’s not always easy to unpack the layers that exist between author, narrator, and page. Add to this the preconceptions built by the movie adaptations (not always negative—Less Than Zero and The Informers may be middling films, but American Psycho and The Rules of Attraction are both excellent), and the summed barrage of mediation makes it hard to approach the novels fresh.

Possible Gateway: The Rules of Attraction

Why: As the follow-up to a celebrated debut, The Rules of Attraction may be the least-read BEE novel, but it’s also probably the most accessible. This is for two main reasons: 1. Due to frequently switched narrators, it’s easier here than anywhere else in the Ellis catalog to distinguish between the POV of the author and his characters, with overlapping descriptions serving to highlight dramatic ironies; and 2. It has something approaching a traditional plot. A campus novel set at Camden College (a barely disguised stand-in for Ellis’s alma mater, Bennington), Rules takes the oldest story around—the love triangle—and infuses it with a particularly Gen-X aura of ennui and dread.

Of course, this being an Ellis novel, the update is anything but straightforward. The Fall 1985 term includes all the trademark permutations of sex, drugs, and violence (actual plot-points: a virgin’s drunken gang-bang, the suicide of the one character who believes in true love, a few scattered abortions), and the love triangle itself is more like a rhombus or pentagon or zigzag, wherein Lauren loves Victor, but Sean loves Lauren, Paul loves Sean, Stewart loves Paul, etc. Just about everyone here is bisexual (at least), and whether or not two characters count each other as “lovers,” they’re likely already somehow connected at the hips. The story is eventually intercut by more than ten voices, but most of it is told by three distinct narrators—Lauren is the most self-critical, Paul the most poetic, Sean the least aware of a world outside his own libido—none of whom is able to overcome the feeling that it’s impossible to connect with anyone else. The bleakness of this theme might suggest that the novel is a dull slog, but its abundance of comic dialogue (Ellis’s greatest strength) keeps things as light as possible. Even as they illuminate a vision of social decay, the characters, like the ghouls of a latter-day Fitzgerald, are buoyed up by their frothy, glamorous doom.

Next steps: There’s something to be said for reading BEE’s novels in order. Recurring characters and in-jokes carry over from one to the next (even in The Rules of Attraction, there’s a cameo section by Clay, the narrator of Less Than Zero), and content from one novel can serve to enrich the next. A case in point: Sean Bateman, the nearest The Rules of Attraction comes to having a main character, makes only a brief appearance in American Psycho, whereas Patrick Bateman, Sean’s older brother, shows up in Rules just to hector Sean for his irresponsibility, only to return as the serial-killing cypher of American Psycho. Likewise, if you watch carefully, you might notice that the elusive “Victor” of Rules shows up again as the narrator of Glamorama, post-surname-change, or that the detective of American Psycho eventually returns to levy charges against Bret Easton Ellis in Lunar Park…and the arcana piles up. Spotting convergences is a part of the fun.

That said, if you’re not gonna down them all, the decision on what to read next mainly hinges on a choice between West Coast and East Coast. Disregarding a few outliers (i.e., The Rules of Attraction and Lunar Park), Ellis has written basically two types of books: Los Angeles books, and New York City books. The L.A. stories—Less Than Zero, The Informers, and Imperial Bedrooms—are heavily influenced by the minimalist style of Joan Didion and tend toward terse, elegant expositions. On the other hand, American Psycho and Glamorama—the N.Y.C. stories—veer toward an opposing maximalism, cramming in every name brand and celebrity appearance, with described surfaces having an information density that recalls the image-fiction of Don DeLillo and Thomas Pynchon.

The N.Y.C. books are Ellis’s longest, his most experimental, the works where, if there is something sui generis and untraceable about Ellis, it can be found. Everyone knows the conceit of American Psycho from the film adaptation (banker Patrick Bateman murders hookers and business associates in Manhattan), but the movie can’t capture the sheer volume of the book’s references—page upon page of descriptions based on nothing but the symbols of consumer society. This is not a book like Silence of the Lambs or even Psycho, where the protagonist’s condition can be reduced to a psychological diagnosis. The uncomfortable implication instead shifts the focus to the systematic effects of late capitalism. A few hundred pages in, Bateman tells us, “There wasn’t a clear, identifiable emotion within me, except for greed and, possibly, total disgust. I had all the characteristics of a human being—flesh, blood, skin, hair—but my depersonalization was so intense, had gone so deep, that the normal ability to feel compassion had been eradicated, the victim of a slow, purposeful erasure. I was simply imitating reality, a rough resemblance of a human being, with only a dim corner of my mind functioning.” It is a testament to the book’s power that, by the time this observation arrives, we believe it. In Glamorama the theme of “erasure” is pressed even further, making narrator Victor Ward, an airhead fashion model whose world is overtaken by terrorists, into a figure who is literally replaced by an image of himself, who throughout cannot tell if something terrible is afoot or if he’s merely starring in a suspense film that he doesn’t quite understand.

Still, though these two books are probably his most interesting achievements, the L.A. stories that bookend his novelistic career bring Ellis back to his home turf, and there is no doubt that Ellis is also expert in his stripped-down evocations of the American West. Less Than Zero (iconic opening line: “People are afraid to merge on freeways in Los Angeles”) is so totally unlike the film adaptation that the sequel, Imperial Bedrooms, includes a metafictional reckoning: in it, we learn that the original book was written by an ambitious friend of the characters, and was later turned into a movie whose premiere they all attended. (This explains why Julian—Robert Downey, Jr., in the movie—isn’t dead, the second time around.) Clay has become a screenwriter, and the horror of the original novel is amplified as we find that, now no longer a passive observer of the evil around him, he has become an adult participant. There may be some dark moral here about the inability of people to change, but it all leads to a closing line that seems to sum up the sadness of the Ellis hero. “The fades, the dissolves, the rewritten scenes, all the things you wipe away—I now want to explain these things to her but I know I never will, the most important one being: I never liked anyone and I’m afraid of people.”

Where not to start: While one potentially could start with The Informers, the collection of linked stories that arrived between American Psycho and Glamorama, the willful difficulty of its storytelling ellipses and its chilly tone aren’t likely to win over many repeat readers. Lunar Park, conversely, might be the most emotionally transparent BEE novel (it’s a father/son story, written after the death of Ellis’s dad), but the weird conceit of a fictional narrator named Bret Easton Ellis who just happens to have written books with the exact titles and content of the previous novels by the real person named Bret Easton Ellis makes it a sort of one-volume Ellis Institute for Advanced Studies: a curio that’s enjoyable enough for the initiated, but which may cause first-time readers to wonder what exactly the big deal is.

The Low Road

In Lit, Pop Culture on 2012/02/20 at 2:59 pm

I read a short book on Horror over the weekend. Its title is Shock Value: How a Few Eccentric Outsiders Gave Us Nightmares, Conquered Hollywood, and Invented Modern Horror, and its author is Jason Zinoman. Here’s what I had to say in an Amazon.com review.

***

Let’s start with the good: this book is consistently entertaining, reads slickly, and is packed with the sorts of anecdotes that we turn to the entertainment pros to deliver. I’d not known before about Dan O’Bannon’s contributions to the genre, either. I’ll also say that Mr. Zinoman has a knack for framing the facts in terms of his thesis narrative, which basically posits that the problem with the “Old Horror” (pre-1970s) was that it was afraid of being nasty, whereas the “New Horror” (post-1970s) was willing to go full-bore for a scare, refusing to respect old conventions of rounded storytelling or good taste.

Unfortunately, there’s a bad side to this, too. Zinoman’s insistence that Michael Myers is a creation of genius because he doesn’t have a back-story reflects the lowest-common-denominator sort of thinking that the film showcases. Though they’re mentioned in passing, there’s not much room in this discussion for directors like Cronenberg or Lynch, who make you think even as they scare you. To Zinoman, this is contrary to the primary ethos of the New Horror, whose “dangerous” qualities are what set it apart.

This cherry-picking attitude also pervades the book’s closing critique of modern horror. While any horror fan knows that the present has spawned its share of terror masterpieces, Zinoman dismisses these works as somehow not counting as much because they’re not “mainstream.” For a piece whose subtitle praises “eccentric outsiders,” this seems like a complaint that’s either inconsistent or willfully reactionary.

All in all, this book isn’t a work of serious criticism but something more akin to an extended series of magazine articles—which isn’t to say that it’s bad. If you want to know more about John Carpenter, George Romero, Brian De Palma, and Wes Craven, this is a key work. If you’re interested in Zinoman’s blinkered overview of how this all connects to the present, on the other hand, you should probably read his Slate.com articles first…but even then, even if Zinoman’s opinions aren’t quite to your liking, any committed horror fan would have to admit after an evening with this book that he’s spent late nights looking at worse.

***

Anyway, if you’d like to read the Slate articles, here you go. And since I had a simple suggestion on how to improve any discussion of horror films (viz., ADD MOAR CRONENBERG), I leave you with a lovely image of the surgically garbed Jeremy Irons, twin gynecologist at work.

Curational Blogging

In Internet, Lit on 2011/11/15 at 11:34 am

In my constant perusal of the Internet, I’ve noticed a trend that the kids over at tumblr seem to have taken up in full force: Curational Blogging. By this, I mean something specific. For many of the active bloggers over at tumblr’s hip enclave, the role of blogger has to a very large degree merged with the role of the curator–one who finds an attractive object, cares for it, and adds a small commentary re why the object is of interest to the curator or to the world at large. Of course, the real upsides to this sort of blogging are that it a) allows one to be tremendously prolific, scaling linearly with time spent web-surfing, b) encourages the sharing of pretty things, even if one might not have the proclivity/skills that allow for pretty-thing-production, and c) is super easy. Lazy, the sneerer might say–but I’m trying to avoid such generalisations. After all, a number of such blogs can be enjoyable to browse and in fact manage, despite their far-flung production mechanisms, to display a unified personal sensibility, which I suppose is a large part of what authorship is all about.

Anyway, it’s point c) (the “it’s easy” one) that’s important to me, now, since I usually get bogged down, when writing blog posts, by such exigencies as research…which we all know is exactly the type of attitude that keeps a person from being a successful blogger. With that in mind, here’s my first Curational Blog–an extract from Freud’s Civilization and Its Discontents. It is posted here only because I was thinking of it today, and (being unemployed) I had the time and interest to look it up. It’s probably demonstrably inaccurate, but that doesn’t matter: as a text that vibrates with my personal sensibility, it’s an addition to the modern oeuvre. Enjoy!

“This brings us very close to the more general problem of conservation in the mind, which has so far hardly been discussed, but is so interesting and important that we may take the opportunity to pay it some attention, even though its relevance is not immediate. Since the time when we recognized the error of supposing that ordinary forgetting signified destruction or annihilation of the memory-trace, we have been inclined to the opposite view that nothing once formed in the mind could ever perish, that everything survives in some way or other, and is capable under certain conditions of being brought to light again, as, for instance, when regression extends back far enough. One might try to picture to oneself what this assumption signifies by a comparison taken from another field. Let us choose the history of the Eternal City as an example. Historians tell us that the oldest Rome of all was the Roma quadrata, a fenced settlement on the Palatine. Then followed the phase of the Septimontium, when the colonies on the different hills united together; then the town which was bounded by the Servian wall; and later still, after all the transformations in the periods of the republic and the early Caesars, the city which the Emperor Aurelian enclosed by his walls. We will not follow the changes the city went through any further, but will ask ourselves what traces of these early stages in its history a visitor to Rome may still find today, if he goes equipped with the most complete historical and topographical knowledge.

“Except for a few gaps, he will see the wall of Aurelian almost unchanged. He can find sections of the Servian rampart at certain points where it has been excavated and brought to light. If he knows enough——more than present-day archaeology——he may perhaps trace out in the structure of the town the whole course of this wall and the outline of Roma quadrata. Of the buildings which once occupied this ancient ground-plan he will find nothing, or but meagre fragments, for they exist no longer. With the best information about Rome of the republican era, the utmost he could achieve would be to indicate the sites where the temples and public buildings of that period stood. These places are now occupied by ruins, but the ruins are not those of the early buildings themselves but of restorations of them in later times after fires and demolitions. It is hardly necessary to mention that all these remains of ancient Rome are found woven into the fabric of a great metropolis which has arisen in the last few centuries since the Renaissance. There is assuredly much that is ancient still buried in the soil or under the modern buildings of the town. This is the way in which we find antiquities surviving in historic cities like Rome.

“Now let us make the fantastic supposition that Rome were not a human dwelling-place, but a mental entity with just as long and varied a past history: that is, in which nothing once constructed had perished, and all the stages of development had survived alongside the latest. This would mean that in Rome the palaces of the Caesars were still standing on the Palatine and the Septizonium of Septimius Severus was still towering to its old height; that the beautiful statues were still standing in the colonnade of the Castle of St. Angelo, as they were up to its siege by the Goths, and so on. But more still: where the Palazzo Caffarelli stands there would also be, without this being removed, the Temple of Jupiter Capitolinus, not merely in its latest form, moreover, as the Romans of the Caesars saw it, but also in its earliest shape, when it still wore an Etruscan design and was adorned with terra-cotta antifixae. Where the Coliseum stands now, we could at the same time admire Nero’s Golden House; on the Piazza of the Pantheon we should find not only the Pantheon of today as bequeathed to us by Hadrian, but on the same site also Agrippa’s original edifice; indeed, the same ground would support the church of Santa Maria sopra Minerva and the old temple over which it was built. And the observer would need merely to shift the focus of his eyes, perhaps, or change his position, in order to call up a view of either the one or the other.

“There is clearly no object in spinning this fantasy further; it leads to the inconceivable, or even to absurdities. If we try to represent historical sequence in spatial terms, it can only be done by juxtaposition in space; the same space will not hold two contents. Our attempt seems like an idle game; it has only one justification; it shows us how far away from mastering the idiosyncrasies of mental life we are by treating them in terms of visual representation.

“There is one objection, though, to which we must pay attention. It questions our choosing in particular the past history of a city to liken to the past of the mind. Even for mental life, our assumption that everything past is preserved holds good only on condition that the organ of the mind remains intact and its structure has not been injured by traumas or inflammation. Destructive influences comparable to these morbid agencies are never lacking in the history of any town, even if it has had a less chequered past than Rome, even if, like London, it has hardly ever been pillaged by an enemy. Demolitions and the erection of new buildings in the place of old occur in cities which have had the most peaceful existence; therefore a town is from the outset unsuited for the comparison I have made of it with a mental organism. We admit this objection; we will abandon our search for a striking effect of contrast and turn to what is after all a closer object of comparison, the body of an animal or human being. But here, too, we find the same thing. The early stages of development are in no sense still extant; they have been absorbed into the later features for which they supplied the material. The embryo cannot be demonstrated in the adult; the thymus gland of childhood is replaced after puberty by connective tissue but no longer exists itself; in the marrowbone of a grown man I can, it is true, trace the outline of the childish bone-structure, but this latter no longer survives in itself——it lengthened and thickened until it reached its final form. The fact is that a survival of all the early stages alongside the final form is only possible in the mind, and that it is impossible for us to represent a phenomenon of this kind in visual terms.

“Perhaps we are going too far with this conclusion. Perhaps we ought to be content with the assertion that what is past in the mind can survive and need not necessarily perish. It is always possible that even in the mind much that is old may be so far obliterated or absorbed——whether normally or by way of exception——that it cannot be restored or reanimated by any means, or that survival of it is always connected with certain favourable conditions. It is possible, but we know nothing about it. We can only be sure that it is more the rule than the exception for the past to survive in the mind.”

Klosterman Agonistes

In Internet, Lit, Pop Culture on 2010/12/12 at 8:18 pm

Would-be writers, in general, harbour perversely intense relationships with people they’ve never met. I’m not speaking of the ‘healthy’ (or at least vocationally beneficial) relationships that the fictionist is expected to nurture with made-up characters. I’m speaking now of the relationship of the Fan with the Author—the Author who, in the case of the would-be writer, is a person said Fan would like to become. There are the famously well documented cases of such mania: Norman Mailer’s adulation of Ernest Hemingway; Nicholson Baker’s stalking of John Updike. I suppose that the best documentation of this syndrome is Fred Exley’s A Fan’s Notes, the (perhaps deliberately) tedious novel/memoir that chronicles his dual obsessions with Frank Gifford and Edmund Wilson in more exhaustive detail than any reader could possibly request. Each of these anecdotes is passed on with a certain measure of self-deprecation and wry candour, due to the lofty reputations of such men as Hemingway, Updike, and Wilson, compared to those of their respective fanboys. So it is with some trepidation that I reveal one of the writers I’ve geekishly followed, at least retroactively, through his entire career: one Chuck Klosterman, a writer of celebrity profiles and heavy-metal tributes, an apparently unserious man whom I’m slightly embarrassed to read, much less write about as a Fan. The reason I’ve been thinking about him again is that last week he published this in the NY Times, and, along with the tell-tale signs I spotted while reading his most recent book, Eating the Dinosaur, I’ve come to a (not so) shocking conclusion about this stalwart media enthusiast, my hero. Chuck Klosterman, I’ve decided, is sad.

But first, some background. Superficially, it might seem like Chuck and I are on two different sides of the rainbow in our approaches to reality: he’s a successful pop-culture analyst and rock critic, a man who has written essays claiming to have watched every episode of MTV’s The Real World at least 3 times; meanwhile, I study physics. But only superficially: we both came from North Dakota as youngsters, and like most people who emerge from that sort of experience as fully functioning adults not directly connected to agriculture or church, we both seem to have the sense that we’re in on some secrets about the Real America that the rest of Debased America has probably missed. (See, e.g., this interview.) Reading C.K., for me, is to discover how I might have turned out without Mennonite school, missionary excursions, violin practice, etc. Klosterman is the embodiment of the slacker obsessive as self-promoter, of the metal-head who has made good. In his books, he’s written his own sort of Fan’s Notes to the mass media. To him, like some other rural mediamaniacs I’ve known, the existence of the crassly commercial mass-culture was less a cause for scorn and anger than it was a constant source of hope. To wit, any photo of Nikki Sixx, Axl Rose, or KISS is proof positive that the entire world doesn’t mirror the normative blandness of North Dakota.

Chuck is madly prolific: he’s written 5 books of essays and a novel to date, and I’ve only read 3 ½ of them [1] (so far). Last summer, after purchasing the aforementioned Eating the Dinosaur, I went back to investigate his early career, and what I found as the unstated meta-narrative (to be pretentious, yes) is a gradual loss of faith in the goodness of the mass media. Not that he started out as a doe-eyed Believer, exactly, but it’s all by steps. His first book, Fargo Rock City, is a spirited defence of Heavy Metal, the cherished music of his youth. It wasn’t until near the end of that book that he delved into the details of how the media portrayal of the Rocker influenced his later life—specifically, that it led to his adult alcoholism (though believe me: his description of this is a lot funnier than my blunt statement of it). That chapter was the strongest section of FRC; I think the reader reaction to it taught him something about effective self-portrayal, because (skipping a book) by the time we get to Sex, Drugs, and Cocoa Puffs, his 3rd, he’d gone through a personalising tonal shift. Here’s the first sentence: “No woman will ever satisfy me.” The reason? Media-induced expectation, of course. The rhetorical trick of writing about himself as a stand-in middlebrow Gen-Xer, as someone who’d been just as influenced by the inundation of the advertised cool as you were, was actually pretty brilliant. It allowed him to write about overplayed topics (Star Wars, Internet porn, Jesus Freaks) not from the perspective of a “media elitist,” but from the perspective of a Real American upon whom such tricks were played.

That’s when, in this weakly constructed narrative, the Sad Thing happened.

The Sad Thing being that Mr. Klosterman, at some point in his unusually successful career, realised that he’d become yet another member of the media, whose role it is to tell the Kids what to think about stuff. It doesn’t take super-deep reading to put this together; Dinosaur makes this more or less explicit. In an essay called “All the Kids are Right”, he doesn’t even attempt a critical analysis of Lady Gaga’s ascendance, except to note that the reason Rock is a perpetually disruptive genre is that adults (like him) can’t control its future. That role falls to the Kids, who instinctively follow what’s fresh and new. What interests Chuck now is different celebrities’ attempts, against the odds, to Be Authentic. C.K., in other words, is your go-to man for a defence of Weezer, who (he claims) since becoming famous have continued to write exactly the same sorts of directly confessional, emotive songs that they did early in their career, except that today frontman Rivers Cuomo, as a result of his fame, has become so different from his original fans that his music, to them, comes off as glib and phony. You can probably see the connection I’m trying to make by referencing that, w/r/t Klosterman’s own double bind. Once Everyman hits the top, he’s no longer Everyman.

If there’s one overriding theme to Eating the Dinosaur, it’s the difficulty of maintaining authenticity in the modern, mediated world. IMHO, the collection’s most striking essay is its last, a depressing little piece called “FAIL” that dissects Industrial Society and Its Future—a.k.a., The Unabomber Manifesto. “FAIL” brings Klosterman back to his time-honoured technique of putting himself in the seat of the One to be Interrogated [2]. It’s an oddly schizoid exercise, since there’s an obvious cultural prerogative not to support a known terrorist (killed: 3; injured: 23), but at the same time Chuck seems to agree with almost everything that Ted Kaczynski had to say. The Manifesto is a longish (~35,000 word) meditation on the connexion between technological progress and human freedom, the basic vision of which is that the human desire to employ new technology will always be more powerful than the desire to exercise human freedom. The modern ‘leftist’ (Kaczynski’s term) is a person whose social conditioning to participate in the society is so strong that he’ll participate in it at any cost to his own well-being, and the ubiquity of this condition is what will cause the gradual enslavement and immense suffering of society once it’s forgotten how to resist technology’s temptations. Hence the Unabomber’s violent campaign against the members of society he held most directly responsible for the technological enslavement: engineers, airline employees, computer science profs.

C.K. makes this into a masterpiece of self-loathing. He identifies himself as the modern leftist and admits readily that technology—i.e., the ever-more-electronic mass-media—has not made him happier. He feels less ensouled and less free than ever before. Yet at the same time, he feels that the Internet is exactly the thing that he loves the most. A paradox, yes, and one that he’s not willing to force to a conclusion. The book’s closing words are at once humorous and chilling:

I love the Internet. And I will probably love whatever technological firebomb comes next. My apologies, Ted. Your thirty-five-thousand word document makes sense to me, but I cannot be saved. You’ll have to blow up my hands.

When I read that, I wondered to myself if, indeed, this sort of tension could be indefinitely sustained. Then came last week’s NYT editorial (already hyper-linked above; here’s another chance if you missed it), and I’m finally convinced that the answer to my question is NO. If I’m right, Klosterman is about to enter a new phase. This new article—”My Zombie, Myself: Why Modern Life Feels Rather Undead”—is on one level ‘Classic Klosterman.’ It takes an overexposed pop-cultural conceit (zombies, à la The Walking Dead), puts a slightly contrarian spin on that phenomenon (zombies aren’t a backlash to the vampire trend [which is in fact all about the Twilight-induced teenage chastity fantasy]; zombies have been growing steadily in popularity for the last 40 yrs.), and then connects the overall topic to some broader ‘philosophical’ issue (the horror of zombies is a lot like modern life: savagely horrific, yet boringly manageable). Connecting this article to “FAIL”, however, this Fan has sensed a trend that seems to be approaching a critical mass. Although it’s logically possible for Klosterman to write critical missives contra electronic culture while participating fully as one of its highly visible members, my theory is that practical tensions will eventually build up enough pressure to ensure some kind of fissure. It’s possible for thoughts and actions to be at odds for a while, but it’s my experience that eventually these things reconnect. No cognitive dissonance can last forever.

So I’m looking for the dawning, soon, of a reformed Klosterman. Chuck, sleep lightly. You know I’ll be watching.

[1] The ½ comes from the first of his books that I tried, Chuck Klosterman IV, a volume of collected magazine pieces. I got it from the public library as a guilty pleasure, thinking myself to be basically above this sort of celebrity tripe (though still greedily curious about the finer details of Billy Joel’s personal life), and after reading about Joel’s deep existential sadness, I lay in bed awake for many more hours, gulping in facts and interpretations regarding the complex virgin/whore dichotomy implicit to Britney Spears, Led Zeppelin cover bands, and Morrissey’s strangely devoted Latino groupies.

[2] Probably the reason I’m especially interested in this article is that, upon reading it, I felt somehow gypped: I’d read the old Manifesto, too, with the goal of writing a similar essay, before I saw Chuck’s version. In other words, I felt I got scooped, even though (objectively speaking) C.K. is the media master and I’m virtually unpublished.

Varieties of Horrific Experience

In Lit, Politics on 2010/12/05 at 6:01 am

Horror writer Stephen King—probably America’s best-loved writer—wrote a study of his chosen genre early in his career, a (relatively) obscure, baggy volume entitled Danse Macabre. In its “Forenote,” King claims that the book is intended as an answer to those repetitious reporters who, following each of his books’ releases, would ask him, Why? Why write such grisly things? With DM available, he could simply tell them, “Go read my book.”

For most of the book, King gives a discursive overview of his turf, but he eventually provides a theoretical response to the reporters’ FAQ. The answer—or, rather, the moral justification (the “reason,” for genre connoisseurs, being simply that it’s fun to be safely frightened)—is that horror, far from being a potential corrupter of the youth, “is as conservative as a Republican in a three-piece suit.” King’s vision of Horror is that it is the group of artistic works that present the dark secrets of humanity in such a way that the very darkness of these things is emphasized, emphatically. When the typical tropes of serial killers, mad scientists, and vampires/zombies are presented as horrific, the subtext is actually a tacit endorsement of the anti-[serial killer/mad scientist/vampire/zombie] status quo. E.g., to hate serial killers is to hate random murder and to love law enforcement; to hate mad scientists is to support gov’t regulation of scientific research; and to loathe vampires and zombies is to affirm the specialness of the human soul (or maybe the power of the Catholic Church against evil—pick an implied moral as you wish).

King’s statements concerning the uses of horror are probably more interesting as a comment on his own approaches to writing than as a guide to the literature in general [1], but they present a particularly simple Straw Man that I’m interested in. Keep them in mind while I present a conflicting POV in the next few paragraphs.

Another horror maverick who has grown more respectable in his dotage is the Canadian auteur David Cronenberg, whose body-horror classics like Rabid, Scanners, and Videodrome plumbed the terror arising from unforeseen interconnections of the mind and body. The one of these I watched most recently was Rabid (Inept American Title: Rage), which was influential in its day because it was the 1st feature film to treat vampirism as a medically explicable infection; today, the “vampire virus” idea having been treated ad nauseam via countless zombie movies, it’s more interesting for those fantastically icky special effects that all of Cronenberg’s horror movies are known for (see: the phallic stinger emerging from Marilyn Chambers’s armpit when she needs to suck men’s blood). It was controversial in its day for more than one reason (like many Canadian films, it was gov’t-financed, and since it starred the hard-core actress Marilyn Chambers [of Behind the Green Door fame], some critics who mattered alleged its porniness), but the criticism that Cronenberg singled out as especially specious was that his movie was reactionary and “conservative.”

In the DVD Director’s Interview in which this was discussed, Cronenberg offered an opinion of the Horror genre that I’m repeating here as a foil to King’s. Certain critics pointed out that Rabid played upon its audience’s fear of sexually aggressive females and, hence, claimed it was a reactionary comment upon the then-recent Sexual Revolution. Cronenberg’s (rather weak) argument to the contrary was that the Horror genre is de facto liberal; the intrinsic appeals of works in this mode to the irrational and unknown elements of the self automatically force one to question the status quo. Since Horror encourages thought about and reactions to actions/events outside of the old societal norms, it’s almost always accurate to categorize Horror as a sneakily subversive influence.

So…who’s right? King or Cronenberg? Is Horror society’s Saviour, or its Destructor? Is it ‘conservative’, or ‘liberal’? Nice, or Naughty?

About here, I’m willing to admit that anyone who’s even a casual fan of Horror will be able to spot the problems with my setup for herself. Fear, by itself, doesn’t denote a distinct ideological bias. It’s the object of the fear that determines the politics of a work.

And this is a sticky point, because for an objet d’art to have the power to induce fear in its reader, the reader must come halfway toward the work and admit some of its basic premises for its magic to work. Some examples. The Turner Diaries, the schlocky thriller that provided Tim McVeigh with his blueprint for the OK City Bombing, doesn’t come off as scary or even plausible to anyone who hasn’t already come halfway toward its gun-totin’ White Supremacism; it is only offensive and has no power to convince. Alternatively, unless you’re already critical of Wall St. and its Reaganite ways, it’s hard to imagine that a book like American Psycho could be viewed as anything other than an over-the-top serial killer/fashion narrative. Yet among receptive audiences, each of these books has been greeted with extreme acclaim, the acclaim reserved for books that, through simplification, clarify what the receptive audience had always suspected was the case all along.

(Of course, certain Horror works play on such quiet and natural premises [e.g., it’s not a good thing for characters to get unexpectedly stabbed in the back, attacked by faceless monsters, buried alive, etc.] that it’s not immediately obvious that these are ideological premises at all. Be aware, however, that just because something is implicit in most everyday social situations doesn’t make it ideologically neutral; the most commonly held beliefs are the sorts of tacit assumptions that Horror can exploit most broadly, to the greatest effect.)

The reason that the world is in general so confusing is that it’s impossible to sort things into separate categories and deal with the complications one by one. Everything comes all at once, without any built-in pause for reflection. I have no doubt that if (by some weird act of Fate) either S. King or D. Cronenberg were to happen upon this blog, neither would recognize his own view of Horror in my gross caricature, but that is by design: to make this seem like a puzzle to be clarified, I have simplified and made dogmatic statements that were probably meant by their speakers as exploratory complaints and vague suggestions. So it is with Horror: only in the rarest cases might you come upon the odd strictly regressive or strictly progressive curio. Most cases will have one shock, one jolt, one idea, each coming right after the other—and, embedded as they are in the flesh of story, the philosophical biases are usually only implied. Then it’s up to hacky blog posters to prepare obsessive discussions that claim otherwise.

  • ¹ Not that this is especially strange for writers; DFW’s essay on John Updike is a perfect case in point.

Kundera Would Hate This

In Internet, Lit on 2010/11/29 at 4:37 am

First, the offending quote: “Once the writer in every individual comes to life (and that time is not far off), we are in for an age of universal deafness and lack of understanding.”

The above impertinence comes from the novel The Book of Laughter and Forgetting, by the Czech émigré Milan Kundera. I came upon this quote in college, during a phase of reading the Great Metafictionalists, and until today, it’s all I remembered of the passage from which it’s lifted. Although I haven’t followed Kundera’s recent work very closely, I remember being so impressed by the book’s ambitious mingling of tropes—a paragraph of history following an imposition of fiction chasing a rumination on philosophy—that I was quick to forgive any disagreements I had with its fundamental tenets. With the start of a project like the Homing Pigeon Experience, however, I decided to reread his objections to self-expression for the Common-Man—a Common-Man who, unfortunately, probably looks at least a little like me.

The argument, if it can be called that, comes in fits and starts, buried as it is in polyphonic prose that uses the situations of individual characters to rub resonances into the bluntly stated conjectures. Since as a Common-Man I owe Kundera (a literary superstar and Nobel-shortlister) little fairness, I’m not even going to mention the frame stories. Cut straight to the didacticism: in Part Four, “Lost Letters,” the narrator—whom I’m perhaps unfairly conflating with the Real Kundera—says that it’s the essence of the Writer to want to define his universe completely, to bar all outside sources from competing with the truth of the created world inside his books. This is why, when he observes a “mass epidemic” of graphomania (which he idiosyncratically defines as an “obsession with writing books,” the obsession “to have a public of unknown readers”), he deems it enormously pernicious:

“If general isolation causes graphomania, mass graphomania itself reinforces and aggravates the feeling of general isolation. The invention of printing originally promoted mutual understanding. In the era of graphomania the writing of books has the opposite effect: everyone surrounds himself with his own writings as with a wall of mirrors cutting off all voices from without.”

So despite his personal graphomania (demonstrated, obviously, by the book’s existence as an object in the reader’s hand), Kundera allows that a mass generalization of his own personality traits might culminate in an apocalyptic conclusion—the future of “universal deafness and lack of understanding” from the opening quote.

Now, the most obvious tack might be to attack Cranky Old Kundera as a guy who doesn’t (didn’t) get it, who circa 1978 couldn’t possibly have understood all the Web 2.0 interactivity that future media would bring, but in terms of mass communications, I’m not sure he’s so wrong. How many times have we all read, in the comments sections below newspaper articles, statements beginning with the disclaimer, “I haven’t read the article yet, BUT…”? Kundera—who, don’t get me wrong, is cranky (see, e.g., “Beauty has long since disappeared. It slipped beneath the surface of the noise—the noise of words, the noise of cars, the noise of music, the noise of signs—we live in constantly. It has sunk as deep as Atlantis,” a quote that I’m not even taking all that far out of context)—might have thought that it was going to be an overpopulation of books that would drown meaningful discussion, but it’s hard to pretend that history hasn’t come up with a viable substitute. I know that I’m not innocent of lazily gazing into my computer monitor, clicking on random hyperlinks in an attempt to find some stimulation, somewhere, that will take me away from the mental strain that work requires of me. If you’re going to tell me that the general tenor of Big Media has grown progressively more elevated since the rise of the Internet, then I’ll be aware that anything could be in store for the rest of the conversation—from black holes in Bermuda to the spiritual sustenance of a Palinite Twitter feed.

And yet…and yet…surely he doth protest too much. When I first read Kundera’s objection, my immediate response was, Geez, what a snob. Why should I read his writing? And that response still makes some sense to me. Perhaps the response to Kundera should be the natural response of the Common-Man to the self-proclaimed Writer: a pat on the head, accompanied by an admission of, “Purty interesting, though.” What I’ve come to know is that one needn’t arrive at the same conclusions as a Writer to find his works both instructive and satisfying. (See Sontag for more on this.)