From Sianne Ngai, Ugly Feelings

This makes me think of Facebook Live videos, and how they will be used to substantiate ideological propositions (e.g. “what race looks like”; the racial coding of what appears as “unmediated behavior”). Liveness’s value is its supposed uneditedness, its directness. But only the most popular streams are viewed, effectively editing the others out of existence algorithmically. The surviving videos then get to seem like representative slices of the real, shorn of the need to examine how they represent an edited vision of what “reality” is supposed to look like, how only certain videos confirm a consensus or popular understanding of what is true about society.

Trending

One must be enculturated into journalistic practices. Journalism — recognizing what is supposedly in the “public interest” — is a habitus that is acquired and to some extent taught; it is not self-evident in events that certain ones constitute news.

No journalism comes without a set of biased values built into it, pertaining to what should be considered in the public interest. The recourse to “trending” is a fantasy about getting to do away with that enculturation of journalists; instead you use your platform’s size to guide engagement toward whatever is already being engaged with. You get to make news on demand by generating an ever-changing list of “important” topics, without ever having to define what rises to the level of “important.”

The only “important” thing in that world is the network that is big enough to be capable of tracking what is “trending.” Facebook is the only newsmaker.

News products produced commercially reproduce the ideological conditions that allow them to be profitable products. They are made to reconstitute the demand for them. “Trending” is one flavor of that product. “Serious and in the public interest” is another flavor. 

But these flavors are shaped by genre conventions, not fidelity to some real conditions of newsworthiness. Trending stories are essentially sponsored stories for Facebook.

Facebook is invested in the idea that truth depends on scale, and the size of their network gives them privileged access to the truth: no one else has as many users, so no one else knows what is trending, and trending is “newsworthy” by definition — it is almost tautological. 

Facebook isn’t trying to correct biases or appear objective; it is saying that sheer scale ultimately cannot be biased, and that any lingering bias is users’ fault (they made their filter bubble) or the result of insufficient automation to remove curatorial bias and editorial judgment.

Facebook wants to promote the idea that scale is the only “real” form of editorial judgment, because no one currently can compete with them in those terms.

Looking at Facebook for some sort of overarching left-right political bias misses the point of how it works altogether. Its newsfeed is designed only to keep you consuming, not to shape your views. It wants only to persuade you to stay on Facebook, by whatever combination of content performs that trick.

Facebook only cares about offering Trending insofar as it gets you to not close the tab. And that is a literal sideshow to the main event on Facebook, the newsfeed. The newsfeed algorithms are built to do one thing: manufacture demand for more Facebook, which takes the form of more of whatever content Facebook can seize upon and redistribute to you.

If the algorithmic brainwashing is working, the newsfeed doesn’t make users want more news of some ideological stripe; it makes us want more of Facebook making choices for us. That is the product enjoyed, not the content of whatever stories are chosen. The stories are just the fuel powering the fun mill that is the newsfeed’s continual turning.

Objectivity is a product

Objectivity is best understood not as an empirical matter, but as a product that is desired and designed and sold to individual consumers. News consumers want to consume the feeling of objectivity, but actual objectivity (even if it were possible) is beside the point. We occasionally want to believe that we are exceeding our personal biases and getting in touch with some sort of shared reality, but we want that on our terms.

Facebook is set up to serve us that “objectivity” in highly personalized terms. It blurs the degree to which it shapes what users see and obfuscates the process with machine-driven curation, which seems to stand in contrast to the “editorial judgment” used traditionally by news media.

Facebook’s curation model is meant to give people what makes them feel good and stay on the site while mystifying the process by which their information has been curated. It can seem as if the feed is stuff your friends are sharing and is therefore connecting you to that community. Or it can seem as if machine learning produced your feed and it is therefore “objective,” with any human biases purged from it. Or it can seem that you yourself have curated your own feed by liking things in the past.

The point is that the decision-making about what counts as news for you individually is constructively ambiguous. You can choose to experience it as being as objective or as collective or as narcissistic as you want, depending on what your needs are. 

But the idea that any presentation of news could somehow avoid being “manipulated” is fantasy. The process of selection and transmission implies an interested position on what should be considered newsworthy; there is no universal agreement about that, and all news broadcasts participate in the negotiation of what will be understood as news. 

Complaining about “manipulation” reinforces the idea that we could have direct access to truth that is not distorted by human agency and somehow transcends politics. It is a fantasy about escaping politics.

Facebook addresses this fantasy of escaping politics on a number of different fronts. It provides users with filtered information, which suppresses the experience of ideological difference, and it stages popularity or “trending” as a mode of verification, as if popularity confirmed significance or informational truth value.

Even if Facebook’s measures of popularity weren’t biased (the Gizmodo report suggests they are), the highlighting of “trending” stories still would have nothing to do with objective truth or some universally correct interpretation of historical facts. It merely replicates the distortions of attention that already exist in a given society and reinforces them. Issues that were invisible remain so, with the added insult of that disinterest being represented as “objective” rather than as a product of more overt power relations.