Objectivity is best understood not as an empirical matter, but as a product that is desired, designed, and sold to individual consumers. News consumers want to consume the feeling of objectivity, but actual objectivity (even if it were possible) is beside the point. We occasionally want to believe that we are exceeding our personal biases and getting in touch with some sort of shared reality, but we want that on our terms.
Facebook is set up to serve us that “objectivity” in highly personalized terms. It blurs the degree to which it shapes what users see and obfuscates the process with machine-driven curation, which seems to stand in contrast to the “editorial judgment” used traditionally by news media.
Facebook’s curation model is meant to give people what makes them feel good and stay on the site while mystifying the process by which their information has been curated. It can seem as if your feed is stuff your friends are sharing and is therefore connecting you to that community. Or it can seem as if machine learning produced your feed and it is therefore “objective,” with any human biases purged from it. Or it can seem that you yourself have curated your own feed by liking things in the past.
The point is that the decision-making about what counts as news for you individually is constructively ambiguous. You can choose to experience it as being as objective or as collective or as narcissistic as you want, depending on what your needs are.
But the idea that any presentation of news could somehow avoid being “manipulated” is fantasy. The process of selection and transmission implies an interested position on what should be considered newsworthy; there is no universal agreement about that, and all news broadcasts participate in the negotiation of what will be understood as news.
Complaining about “manipulation” reinforces the idea that we could have direct access to truth that is not distorted by human agency and somehow transcends politics. It is a fantasy about escaping politics.
Facebook addresses this fantasy of escaping politics on a number of different fronts. It provides users with filtered information, which suppresses the experience of ideological difference, and it stages popularity or “trending” as a mode of verification, as if popularity confirmed significance or informational truth value.
Even if Facebook’s measures of popularity weren’t biased, as the Gizmodo report suggests they are, the highlighting of “trending” stories still would have nothing to do with objective truth or some universally correct interpretation of historical facts. It merely replicates the distortions of attention that already exist in a given society and reinforces them. Issues that were invisible remain so, with the added insult of that disinterest being represented as “objective” rather than as a product of more overt power relations.