Surveillance Capitalism

The tech giants use our data not only to predict our behaviour but to change it

Shoshana Zuboff

This article makes clear what is wrong with tech companies mining your data.

Corporations and Regulation

Tech companies’ innovation rhetoric effectively blinded users and lawmakers for many years.

“Gilded Age” barons in the late-19th century United States insisted that there was no need for law when one had the “law of evolution”, the “laws of capital” and the “laws of industrial society”.

The logic of surveillance capitalism begins with unilaterally claiming private human experience as free raw material for production and sales. It wants your walk in the park, online browsing and communications, hunt for a parking space, voice at the breakfast table …

These experiences are translated into behavioural data. Some of this data may be applied to product or service improvements, and the rest is valued for its predictive power. These flows of predictive data are fed into computational products that predict human behaviour.

Markets in human futures compete on the quality of predictions. This competition to sell certainty produces the economic imperatives that drive business practices. Ultimately, it has become clear that the most predictive data comes from intervening in our lives to tune and herd our behaviour towards the most profitable outcomes. Data scientists describe this as a shift from monitoring to actuation. The idea is not only to know our behaviour but also to shape it in ways that can turn predictions into guarantees. It is no longer enough to automate information flows about us; the goal now is to automate us. As one data scientist explained to me: “We can engineer the context around a particular behaviour and force change that way … We are learning how to write the music, and then we let the music make them dance.”

Surveillance capitalists know everything about us, but we know little about them. Their knowledge is used for others’ interests, not our own.

One example is privacy law’s call for “data ownership”. It’s a misleading notion because it legitimates the unilateral taking of human experience – your face, your phone, your refrigerator, your emotions – for translation into data in the first place. Even if we achieve “ownership” of the data we have provided to a company like Facebook, we will not achieve “ownership” of the predictions gleaned from it, or of the fate of those products in its prediction markets. Data ownership is an individual solution when collective solutions are required. We will never own those 6m predictions produced each second. Surveillance capitalists know this. Facebook’s Nick Clegg knows this. That is why they can tolerate discussions of “data ownership” and publicly invite privacy regulation.