In his book Sour Grapes, Jon Elster has a chapter about “willing what cannot be willed,” or what he also calls “states that are essentially by-products.” He offers the example of spontaneity: you cannot try to be spontaneous; you can only recognize after the fact that you were acting spontaneously.
“When we observe that some such state is in fact present,” Elster notes, “it is tempting to explain it as the result of action designed to bring it about — even though it is rather a sign that no such action was undertaken.” This Elster calls the “intellectual fallacy of by-products,” which presumably leads to a belief that we can reverse-engineer the pleasure we take in certain conditions that can’t otherwise be pursued directly.
Reading about ASMR, as in this article about Buzzfeed’s Facebook Live show ASMR News Now, made me think of this fallacy, and how ASMR seems to hinge on defying the idea that you can’t manufacture inexplicable pleasures. ASMR is usually explained as a kind of brain tingle brought on by sounds that conjure intimacy and monotony in equal measure: “soft voices, kind words, a conceit of caregiving,” as Nitin Ahuja explains it in this essay. The sensation seems to steal upon those who experience it, yet it apparently can be triggered reliably by ASMR practitioners who can slur their sibilants in the right rhythm while performing some mundane activity chosen for its unobtrusiveness, its lack of capacity to bear deeper meaning. The ASMR practitioner often performs concentration — through such routines as folding towels, say — so that listeners can let their own need to concentrate dissolve.
The typical ASMR scenario thus seems to stage meditative conundrums of concentrating on not concentrating, dramatizing how the care we often yearn for must be both an expression of special attention and of being taken for granted. It’s about using technological mediation to will an unwillable state, to make our approach to a desirable “by-product” state suitably indirect. The frisson of ASMR lies in thwarting the principle that you can’t tickle yourself, that you can’t plan to give yourself goosebumps. ASMR says you can.
ASMR suggests there is a way out of the contortions of self-consciousness that come from trying to be natural. Elster cites Stendhal’s diary on this recursive desire to act natural and claims Stendhal “turned to fiction” as a “way of enacting his desire by proxy.”
I wonder if we sometimes hope that our social-media profiles could function in a similar way, allowing us to experience what happens to those profiles as a kind of radical passivity that passes for “naturalness.” Our data gets processed, and what we really want to know or how we really want to be is presented to us not as an artifact of our consciousness, of our deliberate consideration, but as somehow implicit in our past activities.
This desire to have our “real selves” captured behind our backs and revealed to us becomes an alibi for permitting extensive surveillance of the self, for embracing the “inevitability” of surveillance as a prerequisite to self-knowledge. Finally, surveillance will let us chart the path to “being natural” without immediately feeling unnatural about it. Inherent in this is the assumption that “naturalness” is less a state of being than a commodity, one that, like other emotional commodities, is available on demand by consuming the appropriate goods. When I want to feel “authentic,” I can look at a list of books Amazon recommends for me and simultaneously delight in how well my data pegs me and in how much of me escapes Amazon’s understanding.
Stendhal, Elster notes, didn’t try to “make an impression on others by faking qualities that he does not have.” Rather he wanted to become “a person who could not care less about making an impression.” One of the seductive things about surveillance is that you know you are making an impression — as so much data — regardless of whatever effort you make. You can trick yourself into thinking that the effort to be natural has become superfluous, and your “naturalness” will be constructed for you from that data for your later consumption.