Algorithmic gender, etc.

As more information about ourselves is captured within “Big Data” platforms and systems alongside the data about other individuals, algorithms will be used to assign identity markers to us and place us in categories, regardless of whether these correspond to how we think of ourselves. The system will infer our identity, according to categories it defines or invents, and use these to shape our environments, guide our behavior further, sharpen the way we have been classified, and make the data about us denser and deeper. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control” (gated link boo-hiss) by John Cheney-Lippold examines some of the ramifications of this.

1. Our self-identifications will just be one data point contributing to how the system identifies us. Within the data platform, in other words, we won’t have the last word on what our identity is; the platform will ascribe it on the basis of statistically determined parameters derived from all the users’ data in aggregate. 

We might see ourselves as “male,” for example, but if our generated data doesn’t fit the markers that the system uses to identify maleness, or if it corresponds to patterns that nonmales have established, this won’t matter. The system will not regard us internally as male, and it won’t treat us the way it treats other “males.” (The same process holds for religious affiliation, where self-identifying Christians may not be regarded and treated as such based on their behavior within a platform.)

As Cheney-Lippold argues, “How a variable like X comes to be defined, then, is not the result of objective fact but is rather a technologically-mediated and culturally-situated consequence of statistics and computer science.” What is “male” will not be a biological or voluntaristic condition within Big Data, but will instead depend on how the algorithm for “maleness” is programmed, what variables it takes into account and what patterns it searches for and imposes on data and populations. These algorithms will then shape users’ interactions with any environments the system touches. 
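To make the mechanism concrete: a minimal sketch, with invented data and a hypothetical set of behavioral features, of how a platform might ascribe “maleness” statistically. The category is whatever a model trained on aggregate behavior says it is; a user’s self-report never enters into it.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical aggregate data: each row is a user, each column a
# tracked behavior (pages visited, purchase types, query terms, etc.).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
# Labels the platform already trusts (e.g., bought from market research),
# not the users' self-identifications.
y = (X @ np.array([1.5, -0.5, 0.0, 2.0, 0.3]) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# A user who self-identifies as male but whose behavioral trace
# resembles the aggregate "nonmale" pattern:
user = np.array([[-1.0, 0.8, 0.2, -1.2, 0.0]])
print(model.predict_proba(user)[0, 1])  # the platform's "maleness," near 0
```

The self-identification could be added as a sixth feature, but it would carry only as much weight as the aggregate statistics assign it: one data point among many.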

2. The definition of a particular identity marker may be much more fluid within a data-based system. It can adapt to the emergence of new patterns or be reprogrammed to accommodate certain data points. The categories themselves can be algorithmically dictated, so that their meaning evolves with the data collected and the processes used to organize and analyze it all. So “maleness” may change from moment to moment, depending on changes in the data that feeds into its algorithm as well as on direct code changes to the algorithm itself. Traditional categories may seem static, but they are dynamically redefined moment by moment, context by context, by algorithms. And algorithms also throw up new categories and subcategories that contingently inflect the traditional ones.
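One way to picture this fluidity, as a hedged sketch with fabricated batches of data: an online classifier whose weights are, in effect, the current definition of the category, and which drift with every update.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
clf = SGDClassifier()  # a linear classifier updated incrementally

for day in range(3):
    # Each day's fresh platform data silently redefines the category:
    # the weights after each partial_fit ARE its current definition.
    X_batch = rng.normal(size=(200, 4))
    y_batch = rng.integers(0, 2, size=200)  # stand-in labels
    clf.partial_fit(X_batch, y_batch, classes=[0, 1])
    print(f"day {day}: definition of 'male' =", clf.coef_.round(2))
```

No one has to decide to change what “male” means; the retraining schedule decides.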

3. Because the definitions of categories are constantly changing, individual users’ identities are also perpetually in flux within the system. You may be male one minute, not male the next, male again tomorrow, and so on. Cheney-Lippold writes:

a user’s ascribed gender can and may change as new user information arrives into the cybernetic system ... algorithms allow a shift to a more flexible and functional definition of the category, one that de-essentializes gender from its corporeal and societal forms and determinations while it also re-essentializes gender as a statistically-related, largely market research-driven category. Gender becomes a vector, a completely digital and math-based association that defines the meaning of maleness, femaleness, or whatever other gender (or category) a marketer requires.
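The “vector” here is close to literal in machine-learning terms. A sketch with invented numbers: “maleness” as a learned direction in a behavioral feature space, and a user’s gender as a projection onto that direction.

```python
import numpy as np

# An invented user embedding derived from behavior (purchases, clicks, queries).
user = np.array([0.2, -1.1, 0.7, 0.4])

# "Maleness" as a marketer-defined direction in the same space:
# nothing corporeal about it, just a vector of learned weights.
maleness = np.array([0.9, -0.3, 0.1, 0.6])

# Gender as a continuous cosine-similarity score, recomputed
# whenever either vector changes.
score = user @ maleness / (np.linalg.norm(user) * np.linalg.norm(maleness))
print(score)  # somewhat male today; possibly a different number tomorrow
```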

4. This changes the nature of stereotyping. “The capacity for cybernetic categorization to regulate certain categories’ meaning according to algorithm marks a move away from offline stereotypes and into a form of statistical stereotyping.” The stereotypes may be seen as quasi-objective if they are derived from a data set (and then fed back into a system to regulate users’ behavior). Data may mirror pre-existing social biases and then launder them into objectivity by letting those biases shape the definition of derived categories. 

The way biases are programmed into algorithms — what data is collected, what patterns they are trained to look for, etc. — may be hidden and laundered this way as well. Sexism can be reified into code and disappeared, if code is seen as some sort of neutral processing of data.

As Cheney-Lippold notes, drawing on Foucault, “ontologies are embedded within power relations” — what defines any particular social category is a matter of politics, but that politics can be hidden, circumvented, superseded, by code, automation, and algorithms.
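A sketch of how that laundering can happen mechanically, with fabricated data standing in for a biased historical record: train a model on labels that encode a human prejudice, and the prejudice resurfaces as a neutral-looking coefficient.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
# Feature 0 is a proxy for membership in some social group; the
# historical labels encode a bias against that group.
X = rng.normal(size=(5000, 3))
biased_labels = ((X[:, 1] + X[:, 2] > 0) & (X[:, 0] < 0.5)).astype(int)

model = LogisticRegression().fit(X, biased_labels)

# The bias now lives in a learned weight, where it reads as pure math:
print("weight on the group-proxy feature:", model.coef_[0][0].round(2))
```

Nothing in the code announces a prejudice; it is visible only if someone thinks to audit the labels and the proxy.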

5. The platforms may not ascribe gender (or any other classification) as an either-or, but as a place within a matrix, an intersecting point on many different continuums. Any category may be endlessly nuanced into subclassifications. Within the system “gender” can be expanded to admit any number of possibilities beyond “male” and “female,” each with its distinct set of data patterns. It becomes a matter of human users deciding to recognize those refined, expanded ascriptions as “gender.” If we let algorithms deduce our gender, we may be open to letting gender be something beyond binary. 

More likely, platforms could report our deviation from what they regard as “true maleness” or “femaleness” and invite us to contribute more data or perform more actions to try to “correct” this. Gender can be gamified, and we can be pushed to achieve 100% maleness, or what have you, according to the shifting determinations of what “maleness” is.
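What such gamification might look like under the hood, as a purely hypothetical sketch (the function name, the centroid, and the decay curve are all invented here):

```python
import numpy as np

def maleness_percent(user_vec: np.ndarray, male_centroid: np.ndarray) -> float:
    """Hypothetical gamified score: 100% means the user's behavioral
    vector sits exactly on the platform's current 'male' centroid."""
    dist = np.linalg.norm(user_vec - male_centroid)
    return 100.0 * np.exp(-dist)  # arbitrary decay; the platform picks the curve

male_centroid = np.array([1.0, 0.2, -0.5])  # shifts as the aggregate data shifts
user = np.array([0.6, 0.5, -0.1])
print(f"You are {maleness_percent(user, male_centroid):.0f}% male. Add more data to improve!")
```

Note that the centroid moves as the aggregate moves, so the target can never be finally reached, which is precisely what keeps the user interacting.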

The platforms and algorithms can reify certain identity markers that were once defined more nebulously and indeterminately in social interaction, but then black-box those concretized definitions, so that individuals are forced to police themselves and continually modify their behavior in an effort to meet their specifications. The algorithm can thereby implement social control by holding the secret truth of a socially vaunted categorization. We may strive to perfect our maleness according to what a platform’s copious data “proves” it should be, but we will then become dependent on the platform that collects that data and analyzes it to produce an “objective” definition of what “male” is. We will have to continually interact with that system, give it more information, submit to its tests.

6. The way identity markers are defined within data platforms will always serve the extension of the platform, not the personal preferences of the individual users. Platforms will impute an identity that is conducive to perpetuating their influence over users and sustaining their interaction. (This is the lesson one can derive from the way gambling machines are engineered to prolong the time players spend on them.)

7. One can’t hack the algorithms that control us and shape the social categories that govern us (or how we are individually categorized). The categories are inferred from populations, not from individual behaviors; platforms implement control at the level of populations, and these are constituted ad hoc, depending on what sort of control is desired. 
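A sketch of what population-level categorization looks like, assuming a clustering approach such as k-means (one plausible technique among many): the categories fall out of everyone’s data at once, and no single individual’s behavior meaningfully moves them.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
population = rng.normal(size=(10_000, 4))  # everyone's behavioral data

# The categories are constituted ad hoc from the whole population;
# even the number of them (k) depends on what kind of control is wanted.
segments = KMeans(n_clusters=5, n_init=10).fit(population)

# One individual changing their behavior barely perturbs the centroids:
me = population[0] + rng.normal(scale=0.1, size=4)
print("my assigned category:", segments.predict(me.reshape(1, -1))[0])
```

There is no lever here for the individual: the cluster boundaries are a property of the population, and “hacking” one’s own data just refits the model slightly.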

As Cheney-Lippold explains:

Control then works at levels far past the purview of liberal individualism, situating subjects within networks of power that govern indirectly and without proximity. The individual user is incapable of really experiencing the effect that algorithms have in determining one’s life as algorithms rarely, if ever, speak to the individual. Rather, individuals are seen by algorithm and surveillance networks as members of categories. … The identifications that make us as subjects online are becoming more opaque and buried, away from our individual vantage points and removed from most forms of critical participation. They are increasingly finding mediation outside the realm of traditional political intervention and inside the black boxes of search engines and algorithmic inference systems (Becker and Stalder, 2009).

We don’t even know what our identity within the platform really is, let alone the levers by which to change it — and that’s assuming there is a static “it” that could be changed. There is no use complaining that an algorithm imposes gender discrimination if it is impossible to know how that algorithm defines gender at any moment. The definition is always being modulated on a case-by-case basis, depending on the other factors in play. I might be “male” to an algorithm being run for law enforcement and “female” to one being run by a food company, and so on.
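A toy illustration of that context dependence, with invented weights: the same behavioral record, run through two different clients’ models, yields two different genders.

```python
import numpy as np

user = np.array([0.3, -0.8, 1.1])  # one person's behavioral trace

# Each client context carries its own learned weights; there is no
# single "gender" in the system, only per-context inferences.
contexts = {
    "law_enforcement": np.array([1.2, -0.4, 0.5]),
    "food_marketer": np.array([-0.9, 0.7, -1.0]),
}
for name, weights in contexts.items():
    label = "male" if user @ weights > 0 else "not-male"
    print(f"{name}: {label}")  # male for one client, not-male for the other
```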

The full range of our identity markers can be algorithmically reconstituted depending on our context; our imputed identity, even in terms of any of the familiar and seemingly fixed categories (gender, class, race, religious affiliation, etc.), can change from web page to web page. Our data is reprocessed from moment to moment, positing a different self for us to inhabit and imposing a different set of culturally inflected prejudices on us. Trying to wrest control of these from within the system only refines the data by which the process is implemented. “We are effectively losing control in defining who we are online, or more specifically we are losing ownership over the meaning of the categories that constitute our identities.”

A fatalistic response to this is to embrace the way identity is imposed on us, to consume our “selves” as perpetually novel, ultra-personalized consumer goods. (Baudrillard prescribed this as a kind of “hyperconformity.”) One can accept the ready pleasure of consumerism rather than pursue the freedom of autonomy, which is always imperfect and requires boundless innovation in our techniques of resistance.
