Big Data is Watching You. Part One

Introduction.

In her 2019 book The Age of Surveillance Capitalism, S. Zuboff defines the phenomenon in ten different ways. Indeed, it cannot – and should not – be encapsulated within one static, fixed conceptualization. It can be observed from different angles (economic, social, philosophical, legal, anthropological, psychological) and against the backdrop of different socio-cultural paradigms (totalitarianism, neoliberalism, and colonial, postcolonial, or decolonial studies, just to name a few). In short, surveillance capitalism is a complex process that, through stealthy and unnoticeable mechanisms, (a) extracts data from every aspect of human life; (b) normalizes and legitimizes these extraction procedures; (c) uses the extracted data to predict future human behavior; and (d) exploits the resulting knowledge to manipulate human behavior for economic purposes. Even this schematization, however, inevitably simplifies the phenomenon.

This is the first of a two-part article with a two-fold aim. First, it offers a general overview of how the economic apparatus of surveillance capitalism works. Second, it lays the groundwork for more detailed developments in the future. For now, I will divide the initial assessment into four sections, following the points listed above (extraction, normalization, prediction, and behavior manipulation). In each section I describe the mechanisms at play, explain the key academic definitions, and survey the ideological stances at stake. In this part of the article, I focus on the extraction of data and its normalization, while in the forthcoming second part, I will address how the data supply is monetized through the prediction and manipulation of human behavior.

1) Extraction.

Data is constantly extracted from our lives, in both wholly digital and physical scenarios. In the first case (including, but not limited to, websites, social networks, apps, and the metaverse), cookies - small pieces of data stored in the user's browser that enable the re-identification of the user on the same website - are used to collect our (meta)data. Originally, they were only used to perform so-called stateful interactions (like keeping items in a virtual shopping cart). But third parties soon understood their potential to track the totality of users' activities on the internet, so the same user can now be identified across different websites, for instance through social media IDs (1).
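The re-identification mechanism described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (the name `visitor_id` and the function are inventions for this sketch, not any real tracker's code): on a first request the server assigns a random ID via a Set-Cookie header; every subsequent request carries that ID back, re-identifying the visitor.

```python
import uuid
from http.cookies import SimpleCookie  # stdlib cookie parser
from typing import Optional, Tuple

def handle_request(cookie_header: Optional[str]) -> Tuple[str, Optional[str]]:
    """Return (visitor_id, Set-Cookie header to send, or None if already known)."""
    if cookie_header:
        jar = SimpleCookie(cookie_header)
        if "visitor_id" in jar:
            # Returning visitor: the browser echoed the cookie back.
            return jar["visitor_id"].value, None
    # First visit: assign a random ID. A third-party tracker sets this cookie
    # from its own domain, so the same ID comes back from every site that
    # embeds its content.
    new_id = uuid.uuid4().hex
    return new_id, f"visitor_id={new_id}; Path=/; SameSite=None; Secure"

vid, set_cookie = handle_request(None)                    # first visit: ID assigned
vid2, set_cookie2 = handle_request(f"visitor_id={vid}")   # next visit: same ID
```

The point of the sketch is that nothing about the user is asked: a single opaque identifier, silently stored and silently returned, is enough to stitch together a browsing history.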

By contrast, (meta)data extraction in physical contexts is performed by sensors and actuators embedded in everyday objects connected to the internet, so that most of the time we are not even aware of being surveilled (2). Cookies, sensors, and actuators capture the behavioral surplus (3). The expression requires an explanation. The adjective behavioral refers to the traces, or breadcrumbs, involuntarily scattered in every interaction with internet-enabled devices, such as our voices, emotions, and personalities. Not only does Amazon's Alexa constantly listen to our conversations; social networks also infer our political orientation. Additionally, sophisticated AI systems are currently used to detect the emotions of applicants during recruitment sessions, or citizens' states of mind at a metro stop. The second term of the expression (surplus) suggests that these extractive systems capture much more data than is required to provide their services (4). A social network, indeed, could work without knowing - or wanting to know - your electoral preferences. Despite the problem of the interoperability of standards - the incapacity of different websites and/or devices to communicate information to one another due to the different protocols they use to store and manage data (e.g. the mutual interactions between a smartwatch and a health monitoring app) (5) - information coming from cyber and physical sources can now be merged and linked together. The result is a pervasive and relentless practice of surveillance that penetrates every aspect of our daily life, turning it into a mere input for capitalism. I will develop this last point in the forthcoming second part of the article.
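How can data from cyber and physical sources be "merged and linked together"? A common technique is record linkage on a shared identifier. The sketch below is a hypothetical illustration (the data, the e-mail address, and the `key` helper are all inventions for this example): two unrelated data streams, web browsing and a wearable device, collapse into a single profile once they are keyed on the same hashed e-mail address.

```python
import hashlib

def key(email: str) -> str:
    """Derive a stable pseudonymous key from an e-mail address."""
    return hashlib.sha256(email.lower().encode()).hexdigest()

# "Cyber" source: pages visited on a website.
web_events = {key("alice@example.com"): {"pages": ["news", "shopping"]}}

# "Physical" source: readings from a wearable device.
wearable_events = {key("Alice@example.com"): {"avg_heart_rate": 72}}

# The two records share the same key (hashing normalizes the e-mail),
# so the profiles merge into one combined behavioral profile.
k = key("alice@example.com")
profile = {**web_events[k], **wearable_events[k]}
```

The linkage needs no cooperation between the original services: any broker holding both datasets and a common identifier can perform it after the fact.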

These extractive practices are based on the "datafication ideology", as brilliantly depicted by Van Dijck (6). Datafication rests on two questionable ontological-epistemological underpinnings: (a) data objectively mirrors human life processes, and (b) data is construed as "raw material" that bears the potential for economic return (that is, it can lawfully be used to make money). The second point may actually be de facto true ("data is the new oil", surveillance capitalists claim, and the World Economic Forum disputes this parallelism only insofar as the supply of data is virtually unlimited and its uses potentially endless) and only de iure questionable (do surveillance capitalists have the right to extract data?).


Nonetheless, the first statement is definitely more problematic. First, neutral data does not exist. Data always reflects the inequalities and injustices of the social contexts from which it is collected (7). Biases and prejudices affect (and jeopardize) certain demographic groups or minorities, while extraction processes proceed as if no bias shaped the reality modeled by the data. In other words, data collection practices incorporate society's biases, which seriously undermines their alleged objectivity. Second, the "extraction"/"collection" metaphor is imprecise and misleading. (Meta)data does not exist before its collection; it is generated when the flux of human existence is classified and labeled according to categories that the extractive system receives from its architect or develops on its own (in the case of extractive systems that employ unsupervised machine learning). It is a form of "interpretation" (8) or, to use a more colorful expression, "an act of power" (9). This is particularly relevant for understanding that, strictly speaking, we are not expropriated of our personal data. From a theoretical point of view, it is human life itself, rather than the (meta)data derived from it, that is construed by surveillance capitalists as "a repository of raw material" or "a biopolitical public domain" open to harvesting and appropriation (10).

2) Normalization.

In this context, “consent is sublimated into the coded environment” (11). The erasure of consent does not really spring from surveillance capitalists' infringement of the requirements of the EU General Data Protection Regulation (GDPR). Even though some infringements may occur, the exceptionally high fines that companies face in those circumstances are a solid incentive for compliance. The sublimation of consent is subtler. It has been estimated that 76 working days would be required to read all the privacy policies – the fundamental instrument for securing consent under the GDPR – that a user encounters in a year. Consequently, the pragmatic conditions for exercising informed consent simply do not exist.
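Estimates of this kind are simple arithmetic: number of policies encountered, average policy length, and average reading speed. The back-of-the-envelope sketch below uses hypothetical placeholder figures (not the data behind the 76-day estimate) purely to show how such a number is produced.

```python
# All figures below are hypothetical placeholders for illustration only.
policies_per_year = 1500   # distinct privacy policies a user encounters yearly
words_per_policy = 2500    # assumed average policy length, in words
reading_speed_wpm = 250    # assumed average reading speed, words per minute

total_minutes = policies_per_year * words_per_policy / reading_speed_wpm
working_days = total_minutes / 60 / 8   # converted into 8-hour working days

print(round(working_days, 2))  # → 31.25 with these placeholder inputs
```

Even with these deliberately conservative placeholders, the result is measured in weeks of full-time work, which is the point of the argument: the cost of genuinely informed consent is prohibitive by design.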

In truth, the real issue is that extraction practices are publicly acknowledged, normalized, and, in the extreme case of the Quantified Self movement (12), even desired. Surveillance happens in the light of day, but the real scope of the phenomenon is not adequately grasped by common sense. This is why the car manufacturer Volvo could recently exploit this lack of awareness to boost its revenues. Starting in June 2022, the company ran the “Live unpredictably” ad to promote one of its SUV models. In the ad, a voiceover calmly introduces itself as “our algorithm” and proclaims its ability to predict every aspect of our lives. The questionable point is that, through the purchase of the car, we would be able – in some unspecified way – to defy the algorithm’s predictions. This says much about the minimization that the phenomenon of surveillance capitalism undergoes through processes of habituation and adaptation (13).

However, the normalization of surveillance was not originally introduced by surveillance capitalism. Human societies have always displayed periodic forms of control, such as the Christian sacrament of confession, fiscal taxation, and perhaps also population censuses. But the shift from periodic to constant forms of surveillance was attained, according to Foucault, only with the disciplinary turn of social governance that occurred between the end of the 18th and the beginning of the 19th centuries. Through several strategies, prominently symbolized by J. Bentham’s Panopticon (14), citizens started conducting their lives as if they were constantly surveilled by sovereign watchmen they could not see. Yet, in spite of its similarities with these antecedents, contemporary surveillance presents a significant difference:

“If the panopticon effect relied on our inability to tell exactly when we were being watched, so that we behaved all the time as though we were, the ‘inverse panopticon effect’ of pervasive data surveillance relies on us knowing that we are being watched all the time but lapsing into behaving as though we are not, thus naturalizing acceptance of a world in which surveillance and continuous tracking operate unnoted in the background” (15). 

In other words, we have ceased to crave an escape from the pervasive gaze of surveillance. 1984’s main character, Winston Smith, felt the urge to crawl into the alcove in the wall of his apartment to remain outside the range of the telescreen through which Big Brother watched him (16). We, instead, are quite complicit with the logic of surveillance capitalism and tend to agree with the stance that the expropriation of personal (meta)data is “a regular currency for citizens to pay for their communication services and security” (17). We have a vague sentiment of the pervasive presence of surveillance, but this confused notion is not powerful enough to trigger a rejection of the system. The invisibility of the techniques of surveillance is instrumental to the preservation of this form of power: what cannot be seen cannot be questioned.

3) Connection to the second part of the article.

This is how surveillance capitalism penetrates every aspect of our lives, converts them into data, and normalizes these extraction procedures. Once data is collected, it undergoes complex forms of processing aimed “to produce virtual representations” (data doubles) of “identifiable, flesh-and-blood human beings”, “optimized for modulating human behavior systematically” (18). The next part of this article will focus on these topics. We will see how, “based on algorithmic reasoning, it is a real person who gets sent to jail, refused health insurance, or offered a favorable price in the supermarket or an opportunity for social housing” (19). We will also investigate how the quantification of every aspect of human life shapes how reality appears to us. As a result, surveillance capitalism nudges us towards existential choices (like the identification of a career path or of a romantic partner) that we have the illusion of making on our own. With the complete picture of the working mechanisms of surveillance capitalism laid out, we will devote specific articles to each of the aspects broadly covered in this general introduction.


Sources

  1. Cf. J. E. Cohen, Between truth and power, Oxford University Press, Oxford, 2019, pp. 54-57; N. Couldry and U. A. Mejias, The cost of connection, Stanford University Press, Stanford, 2019, p. 21.

  2. L. DeNardis, The internet in everything, Yale University Press, New Haven-London, 2020, p. 5.

  3. The expression comes from S. Zuboff, The age of surveillance capitalism, Public Affairs, New York, 2019.

  4. S. Zuboff, The age, cit., pp. 65-97.

  5. See on this topic L. DeNardis, The internet, cit., pp. 132-159.

  6. J. Van Dijck, Datafication, dataism and dataveillance, in Surveillance & Society 12(2), 2014, pp. 197-208.

  7. N. Couldry and U. A. Mejias, The cost, cit., p. 147.

  8. J. Van Dijck, Datafication, cit.

  9. K. Crawford, Atlas of AI, Yale University Press, New Haven-London, 2021, p. 127.

  10. J. E. Cohen, Between truth, cit., p. 50.

  11. Ivi, p. 59.

  12. N. Couldry and U. A. Mejias, The cost, cit., pp. 168-173.

  13. S. Zuboff, The age, cit., p. 138.

  14. M. Foucault, Sorvegliare e punire, Einaudi, Torino, 2014, pp. 213-247.

  15. N. Couldry and U. A. Mejias, The cost, cit., p. 100.

  16. G. Orwell, 1984, Mondadori, Milano, 2013, p. 9.

  17. J. Van Dijck, Datafication, cit.

  18. J. E. Cohen, Between truth, cit., p. 67.

  19. N. Couldry and U. A. Mejias, The cost, cit., p. 131.
