Data as consideration and the quiet commodification of the self

“If you are not paying, you are the product” has become a cliché of the digital age. But European law is now doing something more precise, and more unsettling, than that slogan suggests. It is starting to treat our personal data as a kind of payment, a contractual “counter-performance”, while simultaneously insisting that such data are not a commodity at all. That legal tension goes to the heart of what is happening to the modern self in data-driven markets.

From “free” services to data as consideration

For years, economists and lawyers have described personal data as a de facto currency of the digital ecosystem. “Free” online services are funded not by charity but by the collection, aggregation and monetisation of users’ information through targeted advertising, dynamic pricing and data brokerage.

EU consumer law has now caught up with this reality: the rules governing digital content and digital services explicitly cover contracts in which a consumer does not pay money but instead provides personal data to access apps, platforms or cloud services. In those cases, the law treats the provision of data as a form of consideration, triggering the same remedies for non-conformity as if a price had been paid.

The law’s examples are very familiar. A user opens a social-media account, hands over a name and email address, then uploads photographs and posts that can be used for marketing. A consumer installs a “free” location-tracking app that shares movement data with third parties. In each case, the service is supplied in exchange for access to personal data rather than euros.

Scholars describe this as “data as counter-performance” or “personal data as consideration” – a contractual quid pro quo, rather than a gift.

“Not a commodity” – on paper

Yet the same law also insists that, because data protection is a fundamental right under EU law, “personal data cannot be considered as a commodity”.

This produces an odd legal double vision. On one side, consumer law tells us that personal data may function like a price, with all the contractual consequences that follow. On the other, data-protection law and fundamental-rights doctrine insist that privacy and data protection cannot simply be traded away like property, even by the individuals concerned.

The European Data Protection Supervisor and many commentators worry that overtly “pricing” personal data risks undermining this rights-based framework. If privacy becomes something you sell to access basic digital infrastructure, then the poorest and most vulnerable will bear the highest burden, accepting deeper surveillance in exchange for essential services.

So we end up with a compromise: EU law recognises that people do in practice “pay with data”, and therefore deserve contractual remedies when digital services fail, but it simultaneously refuses to bless a fully commodified data market.

Surveillance capitalism and the sliced-up self

For the individual, however, the key question is not whether personal data count as “money”. It is what kind of self we become when our everyday behaviour is continually translated into tradable information.

Shoshana Zuboff famously describes “surveillance capitalism” as the unilateral claiming of private human experience as raw material for translation into behavioural data, which are then packaged as prediction products and sold into “behavioural futures markets”.

What is being commodified here is not just isolated facts (a postcode, a purchase) but the patterns of attention, habit and vulnerability that define a person’s way of being in the world. Location history reveals intimate routines and relationships; engagement metrics reveal moods and triggers; search histories expose fears, illnesses and hopes.

As legal scholars have pointed out, when personal data operate as contract consideration, the bargaining chips are often these deeply embedded aspects of personality, offered up under conditions of radical information asymmetry. Platforms know far more about the economic value and long-term uses of our data than we ever can.

The risk is that we move from “data as currency” to “self as currency”: a world in which our identities are progressively disassembled into data points, evaluated for their market value and traded over our heads.

Self-commodification in the age of personal branding

This process is not purely imposed from above. Cultural norms in the platform economy actively encourage people to treat themselves as products. Sociologists describe “self-commodification” as the reorganisation of personal life and relationships on the model of market relations, visible in the rise of personal branding and the monetisation of identity.

On social-media platforms, this becomes a three-way trade:

  • People curate their online selves to maximise visibility and engagement.

  • Platforms translate that activity into granular data and predictive profiles.

  • Advertisers buy access to those profiles in order to capture attention and influence behaviour.

Legally, it may look like a series of contracts funded by data as consideration. Psychologically, it encourages us to internalise a market logic in which every aspect of selfhood is something to be optimised for sale: time, emotions, appearance, opinions. The line between using a platform and working for it begins to blur.

Autonomy, consent and the illusion of choice

Contract law traditionally relies on the idea that parties understand the bargain they are striking. That assumption is strained in data-driven markets. Terms of service and consent dialogues are lengthy, opaque and often bundled. Individuals rarely grasp the downstream uses of their data, the inferential power of AI models or the ways in which the information they “pay” today will shape the offers, news and opportunities that reach them tomorrow.

From the perspective of autonomy, this matters more than the metaphor of currency. If the very data used as consideration are later fed back into systems that nudge, segment and manipulate our choices, then the transaction is recursive. We are not simply exchanging data for services. We are trading a measure of future self-determination: our attention, our susceptibility to targeted influence, our place within algorithmic hierarchies of visibility and value.

Zuboff calls this an “assault on human autonomy” rather than a mere privacy problem, because what is at stake is the practical capacity to act according to one’s own reasons rather than those inferred and shaped by hidden systems.

Beyond the “data as currency” metaphor

What follows from this is not that every digital contract involving personal data is illegitimate. Rather, we should be cautious about metaphors that naturalise commodification and obscure power.

If we speak only of “data as currency”, it sounds as if we are merely updating familiar markets. In fact, we are gradually building a regime in which the self is decomposed into tradable signals within infrastructures that are extraordinarily difficult to opt out of.

Law and policy can push back in at least four ways:

  • Re-centre data as an extension of personhood, not pure property, by maintaining strong, non-waivable rights even where data function as contractual consideration.

  • Constrain “take it or leave it” data bargains, especially for services that are effectively essential for social participation.

  • Increase transparency and intelligibility, for example through meaningful “data receipts” that show what is being exchanged, on what terms, and with what downstream uses.

  • Encourage alternative business models, including subscription and public-interest platforms, that do not depend on deep behavioural extraction.

European law has already taken a first step by explicitly recognising data-for-services contracts while reiterating that personal data are tied to fundamental rights and cannot simply be treated as ordinary commodities. The next step is cultural as much as legal: refusing to see ourselves as bundles of tradable metrics, and insisting that any digital bargain preserve, rather than erode, the conditions of genuine autonomy.
