May 28, 2008

Is Web 2.0 Breeding Blond Beasts of Prey?

By: James Poulos

Put together the latest round of knit-browed studies of Web 2.0’s psychological consequences, and that’s what you’ve got. We now know, for starters, that “Web users are getting more ruthless and selfish when they go online”:

The annual report into web habits by usability guru Jakob Nielsen shows people are becoming much less patient when they go online. Instead of dawdling on websites many users want simply to reach a site quickly, complete a task and leave. Most ignore efforts to make them linger and are suspicious of promotions designed to hold their attention.

From Scientific American, we then learn that, within this population of increasingly efficient and uncompromising information narcissists, “Self-medication may be the reason the blogosphere has taken off”:

Scientists (and writers) have long known about the therapeutic benefits of writing about personal experiences, thoughts and feelings. But besides serving as a stress-coping mechanism, expressive writing produces many physiological benefits. Research shows that it improves memory and sleep, boosts immune cell activity and reduces viral load in AIDS patients, and even speeds healing after surgery. A study in the February issue of the Oncologist reports that cancer patients who engaged in expressive writing just before treatment felt markedly better, mentally and physically, as compared with patients who did not.

Topping the list of greatest bloggers never to blog, in other words, is Susan Sontag.

According to Alice Flaherty, a neuroscientist at Harvard University and Massachusetts General Hospital, the placebo theory of suffering is one window through which to view blogging. As social creatures, humans have a range of pain-related behaviors, such as complaining, which acts as a “placebo for getting satisfied,” Flaherty says. Blogging about stressful experiences might work similarly.

[…] “You know that drives are involved [in blogging] because a lot of people do it compulsively,” Flaherty notes. Also, blogging might trigger dopamine release, similar to stimulants like music, running and looking at art.

That which does not destroy me makes me blog more. Emily Gould has admitted clearly enough that the me-ness at work in the blogosphere makes sense best at the level of the whole culture, not of single persons:

“It’s all about blogging, which is all about individuals,” she says, explaining why her piece is about more than just her.

Already we can see the day approaching when the DIY Oprahs of the world will exhaust tastemakers’ interest in typical therapeutic-confessional fare. Emily, for one, has learned to think before she spills; Sontag’s polymath principle (thou shalt be interested in everything, and nothing else) widens the laser beam of even the most hyperactive infonarcissist, turning it away from the self and toward the wide smithereens of the interverse.

What would a public world increasingly populated by Susan Sontags look like? Perhaps we already know. Hitchens’ eulogy of Queen Polymath is a sort of aspirational capsule biography for any young blogger with delusions of intellectual omnicompetence:

if it seems as if [insert name here] was always somewhere in print—it is because she timed her interventions very deftly. By the middle [insert decade here], someone was surely going to say something worth noticing about the energy and vitality of American popular culture. And it probably wasn’t going to be any of the graying manes of the old [insert highbrow establishment journal here] gang. Sontag’s sprightly, sympathetic essays on the diminishing returns of “high culture” were written by someone who nonetheless had a sense of tradition and who took that high culture seriously (and who was smart enough to be published in [reinsert highbrow establishment journal here]).

But was Sontag really a ‘blond beast of prey’, even in the cheekily downgraded and domesticated neo-Nietzschean meaning of the phrase? Whether or not it was karma, Sontag’s unreconciled reckoning with cancer and dying could be said to have run right over a proud Nietzschean’s endure-and-triumph dogma. Even an Übermensch is mortal; how does a mind long since accustomed to surveying life like a lion or an eagle cope with the non-negotiable unendurability (or is that unendurable non-negotiability?) of death?

Such was the experiment, the last experiment, to which Sontag subjected herself. The experience was not pretty, of course, but, as her son David Rieff grimly chronicles, it wasn’t even noble. Sontag tried to contrast her unvarnished, mature, realist approach to illness with the 19th century’s romanticization of terminal suffering.

But no amount of familiarity could lessen the degree to which the idea of death was unbearable to her. In her eyes, mortality seemed as unjust as murder. Subjectively, there was simply no way she could ever accept it. I do not think this was denial in the ‘psychobabble’, Kübler-Ross sense. My mother was not insane; she knew perfectly well that she was going to die. It was just that she could never reconcile herself to the thought. So to those who knew her well, there was nothing surprising about her decision to go for the transplant. Life, the chance to live some years more, was what she wanted, she told her principal doctor, Stephen Nimer, who had warned her of the physical suffering a bone marrow transplant entailed, not ‘quality of life’.

Unfortunately, we are pulled inexorably toward a time and place that admits of no infonarcissism, or at least gives it no purchase. The dogma of science holds that if what you don’t know can kill you, what you do know can be rendered safe. Through knowledge, we seek, at a minimum, survival. Death remains the stubborn unknowable, foreclosed as such to science, a thing that still ranges wild beyond the comforting enclosures of online group therapy, too. Death turns out to be the real predator, and we infonarcissists the prey. Or so it feels when the worst thing we do is die, when our inability to ‘beat this thing called death’ is taken as an unforgivable outrage to the dignity of the self. David Rieff:

What made my mother’s situation even worse was that even at the most experimentally oriented hospitals, it was rare for such transplants to be performed on any patient over 50. And as far as I could find out, as I surfed the web trying to get up to speed on MDS (an act that can give you a false sense of having understood: information is not knowledge), successes in patients beyond their mid-sixties were rarer still. In other words, my mother’s chances of survival were minuscule.

In still other words, on a long enough timeline, everyone’s life expectancy drops to zero. Philip Rieff, David’s father, dedicated most of his life as a social theorist to showing that knowledge was better than information for the same reason that guilt was better than pride: it was truer. Sontag’s quibble, writes her son, was with Genesis; Rieff the father’s quibble (no wonder they divorced) was that we are all guilty of mortality and had better not forget it, and that remembering knowledge is a far different thing from remembering information.

Why have I taken this extended detour into the territory of Sontag’s death? To suggest now that her infonarcissism and her obsessive fear and hatred of death need not go hand in hand. Central to Nietzsche’s whole idea of living well was reconciling oneself early and often to the insignificance of even the most significant person’s ego. Despite his tendency to title chapters of his autobiography “Why I Am So Clever” and “Why I Write Such Good Books,” in no contemporary sense can Nietzsche be called self-congratulatory. For Nietzsche, whole generations and populations served the ‘purpose’ of coughing up a few exceptional individuals, but, in an important symmetry, the ‘purpose’ of those individuals was not to revel in the experience of how awesome it was to be themselves. It was to push humanity ever onward, ever higher, in the full awareness that even our most incredible standards of human excellence would be set and defined by the mortal span of human lives. (Read his praise of Genoa, his praise of all good endings.) Nietzsche was savvy about the full implications of technological progress, but I do not think he ever envisioned or would have supported radical life extension. At a minimum, he would have looked disapprovingly on the way Sontag’s courage seemed to be a crutch for her fear, because it sacrificed as noble a death as was available to mere ignoble life.

And what does this, in turn, have to do with Web 2.0 again? It’s entirely possible that our infonarcissism can transcend both the petty ruthlessness of compulsive eBay outbidding and the petty mushiness of therapeutic expressivism. We can use the bad, petit-bourgeois examples of ambitious yet trivial selfishness to hone a culture that strives to moderate our voracious appetite for information with a nimble and discriminating refinement. Motivated by something like this ideal, Reihan exhorts us all to save Brijit. Personally, I think we can and should do better with real live human info-aggregators, however much less efficient and less omnivorous we might have to be. Web 2.0 creates whole new realms of information and invites us to dive in. But it also creates new possibilities and priorities concerning knowledge about ourselves, one another, and the productive power of cultures. We can get carried away with ourselves here, but that’s true everywhere. And I confidently hope the younger generations among us can approach Sontag-like levels of intellectual athleticism without seeking childishly to be immortal as well. This is a lesson worth learning, because now, more than ever, we will have to ward off the temptation of the Lawnmower Man (or, as the script was originally called, Cyber God).

(Photo courtesy of Flickr user benjcarson.)