Monthly Archives: November 2004

Morbid self-fascination

I’ve been feeling sick lately, with symptoms that were unfamiliar to me, and ultimately I got so preoccupied with myself that I had to go to the doctor. It feels as though the main symptom of my illness has been a morbid self-fascination, a biofeedback loop gone haywire, which has me constantly monitoring myself to the exclusion of all external stimuli, so that the sheer act of communication feels like an unbelievable nuisance — do you mind? I’m trying to hear the blood coursing through the veins in my ears. Wait — was that a palpitation?

This started me thinking that self-awareness itself is the very definition of illness, the core symptom that underlies all experiences of being sick. Whatever your ailment, you are thrown back upon yourself in a way you normally aren’t, forced to think of yourself and your body first, before you can agree to any course of action, before you can conduct any kind of social exchange. And the degree to which our society encourages self-awareness, self-monitoring (cf. Weber’s argument in The Protestant Ethic and the Spirit of Capitalism), is the degree to which ours is a sick society. There is no greater freedom than the freedom from thinking constantly about yourself; that constant self-regard is basically the definition of insecurity. Our culture, however, is an insecurity-generating machine, with the consumer economy based on selling things meant to assuage the insecurity carefully engineered not only by omnipresent ads but by the very discourses that structure the way we conceive of ourselves (the law, medicine, education, etc.). Sickness is an awareness of a lack — or at least we’ve come to see it that way, because so much of our life experience revolves around perceiving lacks in ourselves and trying to rectify them. We are missing some right, some piece of clothing, some feeling of self-possession. Every moment of self-awareness seems like a moment of owning ourselves, but it’s really a furtive admission that we don’t have self-possession, that our desperate self-inventories are just cataloging the ways in which we are dependent on the structures around us to feel an illusion of completion, of security.

When I went to the doctor, I was trying to shift the burden of my self-awareness onto her, to make her be aware of me instead of it being my sole responsibility. This is what people are constantly doing by trying to get on TV, or by making films of themselves, or by vicariously projecting themselves into reality shows. They are trying to share the burden of their self-regard by becoming aware of how much others are regarding them. This is why the temptation to hypochondria is strong. It’s hard to be honest with the doctor when she asks you to describe your symptoms. I always feel like I’m telling a story, and telling it wrong, badly, and I feel the urge to spruce it up with some more colorful symptoms, with more exciting details. To embellish it. I don’t want to disappoint her. In some ways I was confessing; in other ways I was offering a defense of myself. But I was acutely aware that the me I was describing in laying out my symptoms was not the me I live with in my consciousness, day in, day out, especially the me that has been paranoid about my health. I was talking in this utterly phony voice, a voice I use when I’m talking to bank tellers or barbers. How could this inane chit-chat do anything to resolve my health concerns? Shouldn’t they be sticking me into the CAT scan machine? Drawing blood or something?

But now, having had the doctor’s benediction, I’m able to think about something other than my symptoms. Nothing has changed, and yet everything’s different. I’m aware of different appetites now than I was a few hours ago. All she needed to do was not seem especially alarmed by what I was telling her, and I was suddenly free to move on myself. It’s amazing what a moment of attention can do for you.


Bound galleys

On my way home from work today I saw a man watching a bootleg copy of National Treasure on his portable DVD player. He seemed quite pleased with himself, showing off how up to the minute he could be, but not as pleased as I was to be noticing him and drawing my little conclusions about how the movie’s content itself was entirely insignificant to the viewer; what mattered was that we registered that he was somehow above the law, in having the pirate DVD and in cranking its sound up on the subway car. It was the allure of novelty, the essence of consumer capitalism, pushed to a new extreme, thanks to technology.

Of course, it would have been better for him to have an advance screener to play on the subway before the movie was even released publicly. Then he could advertise his special access to everyone, like I do when I occasionally read a book in bound galleys that comes to the office where I work. When I am reading such a book, its content is secondary to the message I’m using it to send, that I can get things before everybody else can. It’s fun to take such books home on Thanksgiving vacation, say, and see how intrigued people are at the machinery of promotion. Of course, you can always go to the Housing Works bookstore on Crosby Street (among other places) to get bound galleys to impress your own non-NYC friends. You don’t need to be a magazine grunt.

Lately, though, that thrill has worn off, and I just feel entitled to seeing such books for free, ahead of time. At a certain point, it just comes with the territory, and you begin to pretend it’s fundamental to your being able to perform your job, to know what’s happening a few steps ahead of the reading public you’re supposed to serve. At that point, you’ve been hopelessly co-opted, and you’re another cog in the culture-industry machine, facilitating the unnecessary flow of commodities, keeping people preoccupied with trivia. Accepting promotional materials makes one complicit with the whole process; you don’t beat the system, because you are supposed to feel special for getting it for free. There is no scoreboard here.

On Easter Island

I’ve always been freaked out by images of those large stone idols from Easter Island, and it’s not just because they graced the cover of Styx’s Pieces of Eight album. They seemed to be mammoth harbingers of doom, built to a scale indifferent to humans and seeming to portend a time when humans would be superfluous. (I feel this way about gigantic sculptures in general; at Storm King, a sculpture park in upstate New York, I had the same creepy feeling that these big monstrosities were out to eradicate society, partly in the way they claim so much of the space around them to themselves and thereby ruin it for anything other than standing there and feeling how insignificant we are as a species. Some people consider this feeling “sublime.”)

An article in the November 19, 2004 TLS confirmed my suspicions that these were emblems of great evil. As its author points out, the megaliths “defy common sense” and do it on such an extravagant scale that it’s terrifying: you realize how tenuous common sense is, how easily it can be replaced with something so obviously absurd to outsiders. The article offers a quick rundown of the history of Easter Island, of how its society and ecology were utterly ruined by the insane potlatch-gone-mad need to build ever larger idols than competing clans. The islanders killed all the trees to make scaffolding for their great idols’ erection, blithely believing that the gods they so honored would bring trees back to the island. They believed these same gods would see that their idols were stood up when no more trees remained to build the necessary platforms. With no trees, nothing remained to build the shelters or the boats necessary for their seafood-based diet. The topsoil was carried into the sea by the wind. Warfare became endless, cannibalism rampant. The parallels to the disastrous ecological course we’re on as a civilization are obvious — we are so attached to our fetishes (our consumer goods) that we don’t think twice about depleting nonrenewable resources, and we won’t hesitate to “fell the last tree” if it means another wooden doodad for someone to entertain themselves with. According to the article, anthropologists call this “ideological pathology,” a kind of path dependence of the collective imagination that prevents individuals from meaningfully conceiving of alternate modes for society.

Now, social theorists — the Theory of literary studies — are often dumped on for their gnomic, inscrutable texts and their hostile, “nihilistic” attitudes toward the status quo. But they can’t be accused of ideological pathology, despite claims that schools of thought like Marxism and Freudianism are moribund, disproven. The pathology is not a matter of clinging to a false ideology; it’s a matter of refusing to question the prevailing one, and it might just be better to confront the existing hegemony with the blunted tools of discarded thinkers than to refuse to confront it at all, or worse, to celebrate it.

Cross of gold

James Surowiecki has an interesting article in the latest New Yorker about gold investors, and about what cranks they are in a post-gold-standard world. “If you invest in gold, you’re basically betting that someday a greater fool will come along, who thinks gold is worth more than you do.” That pretty much sums up the fate of a “commodity” that has virtually no utility, whose historical function was to embody an entirely hypothetical value, to encapsulate the process of exchange itself. Like words to a post-structuralist, gold has no stable meaning; it only acquires one after the fact; it is always in the process of becoming what it is supposed to be worth. But investing in gold was a way to purchase the notion of exchange, entirely independent of use value, while thinking you were doing the precise opposite, getting something of ineffable worth that transcends market fluctuations. Gold investors want pure theoretical value, value so perfect it can’t be exchanged for anything; it absorbs all possible exchanges within itself and negates them. Gold is literally god, the transcendental signifier, the touchstone on which all other values are based.

Of course, various efforts have been made to establish the source of gold’s value. When classical economists hit upon labor as the source of value, they argued that the difficulty of mining gold, the amount of labor required, gave gold its value. Smith suggested metals made for effective currency because they weren’t perishable and could be split up and rejoined as one pleased. The physiocrats at least had a kind of logic for fetishizing land: it turns one seed into much fruit. But gold is presumably placed on Earth in finite supply by God, which makes the ultimate value of things a zero-sum game. You need to acquire as much of it as you can, because then you have a greater percentage of the total possible wealth. Of course this is absurd, but it’s easier to grasp than the drive for perpetual growth, which constantly undermines the value of what you’ve worked to acquire.

Gold allows for the fantasy of value without labor, of usefulness without effort, of a truly beneficent God, a pre-fallen world where no labor was required to make things valuable. It allows us to indulge the dream that value inheres in things themselves, and not in what we think of them or do with them. It allows us to indulge a splendiferous passivity.

And let’s not miss an opportunity to quote William Jennings Bryan’s immortal words: “Having behind us the producing masses of this nation and the world, supported by the commercial interests, the laboring interests, and the toilers everywhere, we will answer their demand for a gold standard by saying to them: You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold.”

Uniform nostalgia

American men once wore hats, and then they stopped rather abruptly somewhere around 1960. When Kennedy neglected to wear one to his inauguration, it’s said, everyone knew it was all over for hats. I’m sure milliners everywhere mourn the death of the hat, and those invested in the men’s fashion industry probably try periodically to instigate its return, but in truth, it is the fashion for public individuation, as expressed through statement-making clothes, that ended the hat. That and the rise of youth culture — the hat seemed to mark entrance into adult culture, an adult way of life. Now everyone repudiates that, no matter how old they are. Out west, people even wear their short pants and their golf shirts to work. (Is that too elitist?)

The hat (and its corollaries, the overcoat and the suit) accomplishes one thing above all else, and that is anonymity. It makes men in general look the same in public. The prevalence of the hat enforced certain boundaries: it ensured that your personality was reserved for truly intimate moments and allowed one to adopt a kind of generic public persona suitable for conducting civic business. You weren’t expected to look young, or anally fastidious. You were expected simply to look like everyone else, so that level of interaction could be swiftly set aside. Of course, in the sixties, as Thomas Frank documented in The Conquest of Cool, conformity was made anathema by ad campaigns making it a personal duty to evade conformity and publicize how effective you are at “becoming who you are,” which is, of course, absolutely unique in every possible way, from the cereal you eat to the car you drive to the slogans you choose to emblazon on your clothing. With conformity out, it was only a matter of time before the hat, which disguises identity, would be out as well, and the theatricalization of public space, in which everyone is perpetually acting themselves out, would be moving ahead full steam.

Now, hats likely connoted their own subtle shades of meaning, and people of those times likely could have read a man by his hat the way we might read a man by his haircut. But the hat always functioned as an overt sign; it never pretended to be a mark of one’s authentic being. I’m guessing it always projected a discrete role in the public sphere. Today, as Richard Sennett argues in The Fall of Public Man (to which I owe much of this line of reasoning), one is obliged to seem authentic in public, to be one’s most intimate self at all times, eroding the boundary that defines public and private space. The result is that privacy disappears, as surely as the hat did. We don’t mind this, because we are conditioned to think that the more we are on display, the more our authentic being is getting validated — which explains the mania for self-exposure (i.e., this blog). Privacy invasion becomes the most austere form of self-validation.

I’m probably not alone in my nostalgia for the uniform, but it’s a phony nostalgia, because I’ve never been asked to wear one. I’ve tried to adopt a voluntary uniform, wearing essentially the same outfit to work every day, but it’s tough to do this without becoming even more self-conscious, and the beauty of the uniform was that it freed you from self-consciousness; it allowed you to reserve your “self” for home. You certainly can’t go back to wearing hats. Hats now are always affectations, unless they are explicitly keeping your ears warm.

A new toy

I’m still coping with having been furnished with a new work computer, one of those new Apple iMacs that rests its entirety on a wee gray easel. It’s loaded with OS X, which is new to me as well, and it all feels overdesigned and vaguely emasculating. I’m not sure why I should experience subjection to sleek, hyperconscious design as humiliating or unmanning; I wonder if I’m alone in this. At any rate, I don’t feel like I should work with this machine; I feel like I should pose with it, or simply marvel at it the way we are expected to marvel at the breakthroughs of industrial design at the Cooper-Hewitt. I look around my cubicle for a placard explaining who the artists were and what innovations they are known for.

OS X is especially full of animations and features whose primary purpose seems to be to encourage the user to stop and think, Wow, neat. Its presumed performance enhancements are all cloaked behind genie effects and Ken Burns-like pan-and-scan screen savers and icons with more animated effects than an anime film, so that rather than being more productive, I’ve spent most of my time trying to shut these features and effects off. Perhaps I’m perverse in finding no delight in these doodads, but I always feel faintly infantilized whenever an application icon begins to do a little dance, as if I’m in a crib and my monitor is the mobile dangling overhead.

The design calls so much attention to itself that it begins to impede utility, to preempt it as the device’s main purpose. It seems more important that I’m working on an iMac than that I’m doing whatever work I’m trying to do. Am I right in thinking that design usurps utility, that form and function have become antagonists in a zero-sum game rather than happy helpmates complementing each other?

One might defend these machines as bringing more of an atmosphere of play to office work, which is of course notoriously numbing and spirit-crushing. But one might also argue that such playfulness strips office work of what little dignity remained in it. The rounded corners and smoothness of the machine are analogous to how it’s supposed to smooth out the workday, offering booby-prize consolations for having to be chained to it all day, processing information rather than using it or generating it (or exiting that dataworld altogether for the realm of the senses). The user-friendliness eases the way for you to integrate yourself with the machine, to meld with it. Is this what we want? Don’t we want the machines to remain alien, a tool, not a part of us, but something that we apply? Is it a good thing to feel more at home in virtual space, which is ultimately a prison in your own head?

Having a new computer was extremely — and surprisingly — disorienting. It was unusually stressful, which caught me off guard. So much work time is spent in the virtual space of the computer that having a new one made me feel as if I had suddenly moved to a new country or bought a new house. Most troubling of all was having to face how important that virtual space has become to me, more important all the time than the physical space I occupy. It makes me feel more and more like a sentient machine already.

The mediated personality

Thomas de Zengotita has an excellent essay in the December 2004 Harper’s about the “mediated personality,” his term for what happens to us from spending so much of our lives as the fawned-over center of attention of so many elaborate media outlets. Television, radio, film, tourist accoutrements, museums, retail outlets — all these pitch to us as individuals and offer us representations of reality that far surpass anything we could experience independent of them, because they offer us reality as if it were designed for us, with us in mind, with us at the literal center of the universe. Writes de Zengotita: “The alchemy that fuses reality and representation gets carried into our psyches by the irresistible flattery that goes with being constantly addressed in such fabulous ways.” He illustrates this by pointing out how much effort it would require for you to place yourself in a scenario where you’re not at the center of it all — you’d have to make yourself essentially inaccessible, without cell phones and radios, in the midst of a wilderness without billboards or stores or paths or benches. Only then would you be somewhere where things aren’t “designed to affect you.”

Social theorists have discussed what he’s talking about from different angles, of course: Foucault approached the ways we are situated as selves by the various discourses of our society, be it that of law and order (Discipline and Punish) or medicine (The Birth of the Clinic) or mental health (Madness and Civilization) or what have you. Perhaps his most famous metaphor for this, which de Zengotita almost offhandedly references, is the panopticon, adapted from Bentham’s design for a prison in which a central, unseen observer could keep every isolated cell under surveillance at all times. The isolation forces an illusion of individuality based on personal responsibility; the surveillance suggests a significance to one’s most inauspicious, inarticulate, and inconsequential acts. So we are impelled to self-consciousness, granted a self from outside ourselves which we nonetheless must maintain, which opens up a rich vein for commercial manipulation. And in this situation, no one is really to blame — you can’t pin it on the “man,” because we are all the “man” in this scenario, executing the whims of a decentralized and dispersed power structure merely by observing others, by giving them their stage. We are oppressed by our very self-consciousness, by something we have been trained to see as so integral and essential to our humanity, our presupposed uniqueness as individuals. Foucault explains it this way: “He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he spontaneously plays both roles; he becomes the principle of his own subjection” (Discipline and Punish, 202). It sounds something like the Freudian concept of the superego externalized, its incubation occurring not merely within the family dynamic but within the larger arrangements of society.
But instead of an internalized authority figure, we live haunted by the notion that others are literally seeing what we are doing, that we must live as if we are observable from every possible angle.

A key to understanding why we consent to this, which de Zengotita rightly points out, lies in the flattery of being so studied and observed, which we often experience as being catered to. Althusser writes of this in “Ideology and Ideological State Apparatuses,” the experience of being “hailed” by the culture and thereby being brought to understand that we exist as “unique” individuals. We consent to subject ourselves to the ideology embodied in these hailing devices (ads, historical markers, salespeople, films, newspaper accounts, etc.) in exchange for the security of knowing we have a discrete existence as an individual, that we are irreplaceable, and thus can never die.

De Zengotita draws on this in his take on celebrities. Celebrities, according to de Zengotita, have “instilled and reinforced the values and conditioned people’s life choices, especially style, the attitude that gets you through the day. These star types posit and reflect the selves their fans have chosen to be. . . . These performer-heroes are all about us. . . . That’s big-time flattery — and a pivot point in the dialectic of mediation. . . . Is it not ultimately the spectators, in their hiddenness, who hold sway? All the gratifications of voyeurism accrue to a judge nobody knows.” Celebrities call attention to how they are surveilled to give us the pleasure of having the power of the transcendent observer, the man in the panopticon’s tower, seeing all while remaining unseen.

But de Zengotita insists we are not content with that transcendence. He argues that the entitlement implicit in the media’s attentions makes us yearn to have more overt attention lavished on us. “Celebrities held a monopoly on the most scarce and precious resource in a mediated society: attention,” de Zengotita explains. Thus when technology (camcorders, video phones, cable TV, the Internet) permitted, people rushed to make celebrities of themselves: teenagers are suddenly desperate to become Real World cast members, and people like me are suddenly blogging their most mundane thoughts. For de Zengotita, this proves that everyone lives with a level of self-consciousness that makes Method actors of us all, performing worked-up responses to life rather than actually experiencing it in some more straightforward, unmediated fashion. He suggests that we have come to regard the condition of anonymity as a kind of trauma, akin to those suffered by people who appear on daytime TV talk shows, who recoup their losses at being betrayed by fate by earning a modicum of public recognition. Implicit in this is Richard Sennett’s thesis in The Fall of Public Man: that the collapsing of public and private selves has brought on a “tyranny of intimacy,” wherein, since all experiences, no matter how trivial, are supposed to reveal your authentic self, we lose all control over our public selves and all capacity for the civilized, impersonal public discourse necessary for civic life. We can’t take any activity seriously that’s not in some way self-aggrandizing. (Perhaps this is why political discourse is what it is now: positions are always taken personally, and politicians are always fixated on their image rather than the quality of their positions.)

So there’s a dialectic between the joys of anonymity and public recognition that mirrors the ways in which mass-produced, standardized objects can seem so perfect for us specifically, after we’ve bought them. We want to be somebody and nobody all at once, and our chronic discontent with what we are makes it that much easier for us to ignore what’s really there, if we’re even capable of noticing it anymore.