Your Privacy Is Important to Us! – Restoring Human Dignity in Data-Driven Marketing

Foreword by Eric K. Clemons

Preface, Acknowledgements and Abbreviations

Bibliography


PART I – INTRODUCTION


1. Why this book?
    (#methodology #delimitations #structure)


2. Data-Driven Business Models
    (#surveillancecapitalism #valueextraction #harm)

PART II – LAW


3. Regulating Markets
    (#law #markets #architecture #consumerprotection)


4. Data Protection Law
    (#gdpr #personaldata #lawfulprocessing #legitimatebasis)


5. Marketing Law
    (#ucpd #professionaldiligence #averageconsumer)

PART III – PSYCHOLOGY AND TECHNOLOGY  


6. Human Decision-Making
    (#agency #psychology #boundedrationality #willpower)


7. Persuasive Technology
    (#technology #choicearchitecture #friction #prompts)


8. Manipulation
    (#coercion #deception #subliminalmarketing #paternalism)


9. Transparency
    (#information #communication #complexity #asymmetry)

PART IV – SOCIETY


10. Human Dignity and Democracy
      (#humanwellbeing #privacy #discrimination #proportionality)


PART V – CONCLUSIONS AND BEYOND


11. Conclusions
      (#humandignity #datadrivenmarketing #beinghuman)


12. Next Steps
      (#action #conversations #future)

CHAPTER TEN

Human Dignity and Democracy

#humanwellbeing  #privacy  #discrimination  #proportionality

The purpose of this book is to discuss the application of data protection law and marketing law to value extraction in the guise of data-driven marketing. In particular, we have addressed how business models create new forms of information asymmetry, placing consumers at greater disadvantage vis-à-vis traders. Empowerment and informed decisions require agency, transparency and the absence of manipulation. This may be difficult to ensure in a context where technology is used both to observe and to shape behaviour1—at scale and in real time as well as across platforms.

1. Fundamental (human) rights

In addition to affecting the economic interests of consumers, data-driven marketing poses a risk for fundamental rights that protect human beings and democracy as such. We have dealt with the fundamental right to the protection of personal data, which is a subset of the broader concept of privacy. In this chapter, we add a constitutional perspective by including reference to guaranteed fundamental rights and freedoms that go beyond market regulation but are challenged by activities in markets.

Law is not an end in itself, but a means to protect certain values. It follows from Article 2 TEU that

‘The Union is founded on the values of respect for human dignity, freedom, democracy, equality, the rule of law and respect for human rights […].’

In essence, law is a matter of balancing interests, which becomes more complex with this broader perspective. This complexity can be illustrated by the aims pursued under the Treaty on European Union (TEU):


Article 3


‘1.  The Union’s aim is to promote peace, its values and the well-being of its peoples.

[…]

3.  The Union shall establish an internal market. It shall work for the sustainable development of Europe based on balanced economic growth and price stability, a highly competitive social market economy, aiming at full employment and social progress, and a high level of protection and improvement of the quality of the environment. It shall promote scientific and technological advance.

It shall combat social exclusion and discrimination, and shall promote social justice and protection, equality between women and men, solidarity between generations and protection of the rights of the child.

It shall promote economic, social and territorial cohesion, and solidarity among Member States.

It shall respect its rich cultural and linguistic diversity, and shall ensure that Europe’s cultural heritage is safeguarded and enhanced.

[…].’

As mentioned in Chapter 3 (regulating markets), teleological interpretation necessitates that ‘every provision of [EU] law must be placed in its context and interpreted in the light of the provisions of [EU] law as a whole, regard being had to the objectives thereof and to its state of evolution at the date on which the provision in question is to be applied’.2 In this vein, primary law can be perceived as a lens through which secondary law should be observed. Thus, secondary law serves as an important vehicle for realising the objectives stated in the Charter, which must be interpreted with due regard to the accompanying explanations.3

As discussed in previous chapters, there is already a legal framework to ensure consumer empowerment in the context of data-driven business models. Here, the hypothesis is that the need for agency, transparency and absence of manipulation may be corroborated by fundamental rights laid out in the Charter. We focus on human dignity, privacy and discrimination, which must be balanced against the right to conduct a business. The argument is that this broader range of fundamental rights may play a significant role in consumer protection law, including the protection of personal data, in the context of data-driven business models.

If we analyse the application of individual legal fields (bottom-up) without considering the bigger picture, not only do we fail to place EU law in its context and interpret it in light of EU law as a whole; we may also fail to understand the potential for personal, social and societal harm that some aspects of data-driven business models pose. It makes a significant difference whether we balance data-driven marketing only against market interests, including competition and consumers’ economic interests, or consider other interests, such as human dignity and democracy.

The importance of fundamental rights in the context of AI has also been recognised by the United Nations, which has suggested a moratorium on the sale and use of AI systems until adequate safeguards are put in place.4 As discussed in the previous chapters, consumer protection law, including the protection of personal data, does provide safeguards for the use of AI to personalise marketing.

In his analysis of a possible constitutionalisation of consumer law, Hans-Wolfgang Micklitz asks whether there is a ‘fundamental and human right to be treated as a consumer’, which may ‘open up a new round of debate about the relationship between the Internal Market and Fundamental Rights’. He adds that:5

‘Academics will have the task of investigating whether the constitutionalised consumer is just another variant in the overall history of consumer protection or whether this concept will turn into a new page in the history of private law—of constitutionalised private law beyond the state through the Charter of Fundamental Rights and not forgetting the European Convention on Human Rights.’

We argue that this constitutionalisation must be real. Firstly, this is so because consumers are also citizens, and their engagements in markets—when they are on the consumer side of data-driven business models—have significant effects not only on the individual user but also on social and societal relationships. Secondly, fundamental rights ‘form an integral part of the general principles of law the observance of which the Court ensures’,6 and ignoring the Charter in market law would render the protection of citizens and democracy illusory. Thirdly, if we accept that data protection law is part of consumer law, this in itself requires a constitutional perspective.

Thus, we argue that the protection of human dignity, privacy and non-discrimination is likely to corroborate consumer protection law, including data protection law.

1.1. Fundamental rights and business

A free market, together with individualism, human rights and democracy, is part of what can be described as the ‘liberal package’.7 From the preamble to the Charter, it follows that its aims include balanced and sustainable development and ensuring free movement of persons, services, goods and capital, and the freedom of establishment.

The Charter is not aimed at traders and markets, as it follows from Article 51(1) of the Charter that:

‘The provisions of this Charter are addressed to the institutions, bodies, offices and agencies of the Union with due regard for the principle of subsidiarity and to the Member States only when they are implementing Union law. They shall therefore respect the rights, observe the principles and promote the application thereof in accordance with their respective powers and respecting the limits of the powers of the Union as conferred on it in the Treaties.’

Rights in the Charter that correspond to rights guaranteed by the ECHR have the same meaning and scope as those ECHR rights, as interpreted by the ECtHR, although EU law may provide more extensive protection.8 Interpretation of human rights is dynamic, and ‘it is of crucial importance that the [ECHR] is interpreted and applied in a manner which renders its rights practical and effective, not theoretical and illusory.’9 It could be argued that practical and effective application of the Charter must entail that individuals and democracy as such should not be less protected when infringement of the envisaged fundamental rights and freedoms results from traders’ exploitation of human frailties, including by persuasive technology.

Even though the Charter is directed at states, fundamental rights may be directly applicable to businesses or impose an obligation for Member States to ensure that the aims are pursued,10 and it remains an important question how fundamental rights apply to non-state actors.11 In the context of privacy, the ECtHR has ruled that in some situations states have positive obligations to secure respect for private life, by adopting measures—beyond an opportunity to claim compensation for damages—designed to secure this right; even in ‘the sphere of the relations of individuals between themselves’.12

The CJEU has established that the Charter’s prohibition of discrimination is ‘sufficient in itself to confer on individuals a right which they may rely on as such in disputes between them in a field covered by EU law’.13 The protection of human dignity and respect for privacy are not equally precise, but are also far from devoid of meaning.

A principle of sincere cooperation is expressed in Article 4(3) TEU, under which Member States must (a) ensure fulfilment of the obligations arising out of the Treaties, (b) facilitate the achievement of the Union’s tasks and (c) refrain from any measure which could jeopardise the attainment of the Union’s objectives.14

1.2. Commercial practices and fundamental rights

Both the GDPR and the UCPD are secondary law, and as such they derive their authority from a higher level of the legal hierarchy, i.e. primary law. Because privacy, including the protection of personal data, is secured in the Charter, it is a straightforward approach to take a fundamental rights perspective in the interpretation of the ePrivacy Directive (privacy) and the GDPR (personal data).

When it comes to marketing law, the Charter (only) provides for ensuring a high level of consumer protection in ‘Union policies’ (Article 38, which is based on Article 169 TFEU), a provision that is ‘more political aspiration than independently enforceable legal norm’.15 Article 38 must be ‘observed’ but does not provide subjective rights that must be ‘respected’.16

The UCPD applies to the protection of the economic interests of consumers, and does not address ‘legal requirements related to taste and decency’ to the extent that the economic interests of consumers are not affected. Thus, Member States may apply national rules to protect human dignity and prevent sexual, racial and religious discrimination and the depiction of nudity, violence and antisocial behaviour.17

It is clear that the UCPD applies in situations where a commercial practice infringes on both the economic interests of consumers and taste and decency, which may include human dignity, privacy and non-discrimination. Doorstep selling is an example of a commercial practice that may affect both economic interests and privacy. In a case (predating the UCPD) concerning the free movement of goods, the CJEU found that consumer protection could justify restrictions on doorstep selling of silver jewellery. The court mentioned that in the assessment of proportionality, it must be considered that such practices have a

‘potentially higher risk of the consumer being cheated due to a lack of information, the impossibility of comparing prices or the provision of insufficient safeguards as regards the authenticity of that jewellery and the greater psychological pressure to buy where the sale is organised in a private setting.’18

The court focused on both the psychological pressure and the lack of information, including by means of price comparison. Both aspects are relevant to data-driven marketing and fall within the scope of the UCPD. Even though the internet allows for easy price comparison of offers from different traders, personalised marketing means that consumers may find it difficult to compare their offers with offers provided (by the same trader) to other consumers. This may be a similarly important comparison, as we discuss below in the context of non-discrimination.

As there is no doubt that the UCPD applies in such situations, the real question concerns the extent to which ‘taste and decency’ may be included in the interpretation and assessment of professional diligence:

‘Professional diligence’ means ‘the standard of special skill and care which a trader may reasonably be expected to exercise towards consumers, commensurate with honest market practice and/or the general principle of good faith in the trader’s field of activity.’

The wording is sufficiently broad to include interests pertaining to, for instance, human dignity, privacy and discrimination, provided that the commercial practice in question is likely to affect (also) the economic interests of consumers. The question has not been settled by the CJEU, but including taste and decency in the determination of professional diligence seems well aligned with the aims of the Charter and the need for an effective interpretation of the UCPD.

The opposite result—because of the full harmonisation nature of the directive—would mean that consumers would not be protected from the taste and decency aspects of marketing whenever the commercial practice in question is also relevant to the economic interests of consumers.

2. Human dignity

Human dignity is the mother of all human rights, as it ‘constitutes the real basis of fundamental rights’.19 Article 1 of the Charter provides that ‘human dignity is inviolable’ and ‘must be respected and protected’. The concept is both important and vague, and has been explained in the following manner:

‘All in all, human dignity has its roots deep in the origins of a conception of mankind in European culture that regards man as an entity capable of spontaneity and self-determination. Because of his ability to forge his own free will, he is a person (subject) and must not be downgraded to a thing or object.’20

Thus, human dignity can be said to protect ‘the authority of human beings to govern their own lives’.21 In the context of data-driven business models, human beings may be observed as programmable and hackable algorithms, as we react predictably (in probabilistic terms) to external stimuli that may be delivered through the technology we use.

Data-driven business models rely on the commercialisation and commodification of human attention and the human experience.22 When users’ attention and agency are being conditioned for behaviour modification, consumers become more valuable to traders.

‘Machine becomes manlike and man becomes machinelike, a convergence that may augment our capabilities yet reduce our humanity, however we define it.’23 People may ‘no longer see themselves as autonomous beings running their lives according to their wishes, but instead will become accustomed to seeing themselves as a collection of biochemical mechanisms that is constantly monitored and guided by a network of electronic algorithms.’24

Both persuasive and addictive technology and the commercialisation of all individual behaviour and experience—public and private—may be said to undermine human dignity to some degree. This is especially true to the extent that they negatively impact personal, social and societal integrity and cohesion, as we discuss below. Manipulation can be seen as an insult to both autonomy and dignity, the latter by being humiliating.25 However, human dignity is a double-edged sword, as we should also be free to err and learn, recognising the antifragile nature of human beings.

Antifragility can be defined as ‘anything that has more upside than downside from random events (or certain shocks) […]; the reverse is fragile.’26 To ensure the antifragile nature of human beings, the random events must be recognisable and not devastating in order to support healthy development (‘what does not kill me makes me stronger’27) rather than resignation and apathy. Relying too heavily on technology may lead to ‘digital amnesia’.28

Freedom in a market perspective entails the agency to make rational choices that are informed; reflect the consumer’s goals, values and preferences; and are constrained by available choices, bounded rationality and choice architecture. In order to cater to human antifragility in markets (and elsewhere), transparency and the absence of manipulation are important.

2.1. Paying with human dignity

Human dignity is found in Title I of the Charter, which consists of Articles 1–5 under the heading ‘dignity’. Article 3(1) provides that ‘everyone has the right to respect for his or her physical and mental integrity.’ As predatory business models can intrude upon the integrity of persons, including by means of invading privacy and agency, this provision could serve as inspiration for understanding human dignity in the context of data-driven marketing. This could include situations where Artificial Intelligence is used to increase engagement, with addiction, outrage and polarisation as side effects.

Article 3(2)(c) further provides for ‘the prohibition on making the human body and its parts as such a source of financial gain’ in the fields of medicine and biology. This provision is obviously not applicable to data-driven marketing, even though surveillance capitalism relies on body parts, including the brain,29 for financial gain, and even though at least some data can be said to fall within the field of biology, including by proxies.

In the context of patents, the CJEU has established—with reference to safeguards concerning human dignity—that the human body may not be appropriated in its natural environment, but it may be part of a product which is patentable.30

It could be argued that the design of behaviour modification experiments that are carried out without democratic oversight or scientific approval is contrary to human dignity, especially considering that their purpose is financial gain. In essence, the question is whether payment for products should include a part of oneself.31

2.2. Human well-being

It follows from Article 3(1) TEU that the aims of the European Union include ‘Union values and the well-being of its peoples’.32 In the early stages of the development of EU consumer policy, it was mentioned that ‘the improvement of the quality of life is one of the tasks of the Community and as such implies protecting the health, safety and economic interests of the consumer’.33 U.S. President John F. Kennedy also stated that:34

‘[…] If consumers are offered inferior products, if prices are exorbitant, if drugs are unsafe or worthless, if the consumer is unable to choose on an informed basis, then his dollar is wasted, his health and safety may be threatened, and the national interest suffers. On the other hand, increased efforts to make the best possible use of their incomes can contribute more to the well-being of most people than equivalent efforts to raise their incomes.’

In his presentation of this ‘special message’ he also stated that the aims of consumer protection were important to ‘the well-being of every American family’.

Attention is a scarce resource—ultimately a zero-sum game35—that, along with memory, lies at the core of our identities36 and is important for how we live and experience our lives. As observed by William James:

‘When we reach the end of our days, our life experience will equal what we have paid attention to, whether by choice or default. We are at risk, without quite fully realizing it, of living lives that are less our own than we imagine.’37

Human well-being is a complex concept that may seem more philosophical than legal, and material wealth is primarily important for the well-being of poorer people, who, for instance, may be susceptible to payday loans.

If one were to point at a default aim for human well-being, it could be to live longer and richer lives with deep and meaningful human interactions. Having a purpose,38 being understood and having agency are important for well-being;39 manipulation is not. The quality of human life also depends on our capacity for presence and boredom,40 as well as our relationship to the world.41

Psychologists seem to agree that well-being depends primarily on the quality of social relations42 and meaningful work43 once certain ‘human needs’ are fulfilled, as described by Abraham Maslow.44 Sigmund Freud emphasised the important role of ‘taking responsibility’,45 and it remains safe to assume that he did not have in mind the consumer reading through extensive terms and conditions.

Unsolicited distractions may undermine human well-being, as they may erode ‘our capacity for deep, sustained, perceptive attention’, which forms the building blocks of ‘intimacy, wisdom, and cultural progress’.46 The revelations by The Wall Street Journal in the ‘Facebook Files’ include the observations that Facebook (now Meta) had been conducting studies over several years to understand how Instagram affects its millions of young users; that ‘32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse’; and that ‘among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram’.47

The UCPD is without prejudice to rules relating to the health and safety aspects of products (Article 3(3)), which means that the fairness of a commercial practice, according to the directive, does not affect such rules, but that a failure to comply with such a rule may be considered an unfair commercial practice.48 It is clear from Article 3(4) UCPD that if there is a conflict between the directive and other community rules regulating specific aspects of unfair commercial practices, the latter is to prevail and will apply to those specific aspects.49

In a case concerning a national provision restricting the marketing of oral and dental care services, the CJEU found that the UCPD did not preclude the provision, as it concerned the protection of public health and the dignity of the profession of dentist. However, the provision—which imposes a general and absolute prohibition of any advertising relating to the provision of oral and dental care services, inasmuch as it prohibits any form of electronic commercial communications, including by means of a website created by a dentist—was precluded by Article 8(1) of the E-Commerce Directive.50

Data-driven business models at the predatory end of a benign–predatory continuum may infringe not only on the economic interests of consumers but also on the interests relating to health and safety. In addition, the more or less unintended and/or unanticipated consequences of these business models include negative effects on social coherence and the democratic debate. Article 38 of the Charter (consumer protection) explicitly mentions ‘the health, safety and economic interests of consumers’. However, the Product Liability Directive51 does not apply to services or situations where information which, by its nature, constitutes a service, is incorporated into a physical item.52

2.3. Loneliness and social relations

Being alone is stressful, and ‘lonely people tend to have more of just about every kind of mental and physical illness than people who live in rich social networks’.53 In terms of health, loneliness is worse than not exercising, as harmful as being an alcoholic and twice as harmful as being obese.54 Human beings need close relationships to thrive, but without time, attention and empathy, relationships will not thrive.55 Facebook’s own research shows that about 12.5% of its users report ‘engaging in compulsive use of social media that impacts their sleep, work, parenting or relationships’.56

Advertisements work best on lonely individuals.57 Thus, optimising for profits may entail real-life consequences for human health, and may amplify the trend of recent decades towards more loneliness and shallower interactions between human beings.58

Social influence and competitive behaviour ‘stem directly from the drive for self evaluation and the necessity for such evaluation being based on comparison with other persons’.59 In terms of herding, behaviours seem ‘normal’ because other people are doing them,60 which is a good driver for behaviour modification. In this vein, it could be argued that it is an unfair commercial practice to exploit loneliness and friendship in marketing (cf. also Cialdini’s liking principle). As expressed by Shmuel I. Becher and Sarah Dadush:

‘By encouraging consumers to behave emotionally, relationship as product lowers consumers’ defenses. It encourages consumers to overlook their self-interest and invest more money, attention, and time in buying products and services and interacting with firms. At a societal level, relationship as product can damage trust and decrease well-being.’61

The idea of ‘relationship as product’ could be extended to ‘vulnerability as product’ in behaviour modification, especially when AI is used to optimise for the most profitable influence.

Even though smartphones and social media allow for connecting people, their design and our use of these technologies add to the shallowness of social interaction,62 which in turn allows for outrage and polarisation in society, as discussed below.

2.4. Dopamine and addiction

In neuroscience, drugs, gambling, shopping and other addictions are linked to the neurotransmitter dopamine,63 which plays an important role in relation to behaviour, cognition and learning. As discussed in Chapter 7 (persuasive technology), human–computer interaction can be designed to affect moods and produce rewards in the form of dopamine released in the user’s brain.64

Our urge for instant gratification coupled with ‘bounded willpower’65 has provided fertile ground for a dopamine industry, in which behaviour modification thrives. The addictiveness of a product is directly proportional to the amount and speed of the dopamine release. The development of tolerance is also important for addiction, and the scary part is that tolerance does not seem to decline when the product is not used, i.e. the brain is altered permanently.66

The significance of addictions to, e.g., tobacco, alcohol, drugs and gambling is that the user’s ‘real tastes have been taken over by a special monkey on his shoulder’,67 and extensive regulation is generally in place for such addictive products. Persuasive technology can be designed to dispense dopamine automatically, at a distance and at scale, and thus create addictions.
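To make this mechanism concrete, the following minimal sketch illustrates a variable-ratio reward schedule, the intermittent reinforcement pattern that persuasive design can borrow from slot machines. All names and numbers are invented for illustration; this is a sketch of the principle, not a description of any actual platform’s code.

```python
import random

# Hypothetical sketch: engagement events (likes, comments) are buffered and
# released on an unpredictable schedule. Intermittent reinforcement, not the
# reward itself, is what drives compulsive checking.

class NotificationScheduler:
    def __init__(self, reward_probability=0.3):
        self.reward_probability = reward_probability  # chance a check is 'rewarded'
        self.withheld_rewards = []  # events not yet shown to the user

    def on_engagement(self, event):
        # Buffer each event instead of delivering it immediately.
        self.withheld_rewards.append(event)

    def on_app_open(self):
        # Only sometimes release the buffer; an empty result nudges the user
        # to check again soon.
        if self.withheld_rewards and random.random() < self.reward_probability:
            batch, self.withheld_rewards = self.withheld_rewards, []
            return batch
        return []
```

The design choice illustrated here, withholding and batching rewards so that their timing becomes unpredictable, is what turns an ordinary notification system into a dopamine dispenser of the kind described above.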

Self-binding strategies—such as alcoholics not having alcohol in the house—are important to cope with addictions. However, smartphones and social media are important for social and societal interactions, and it may thus be difficult to develop and implement effective self-binding strategies and tactics for digital addictions.68

The Zeigarnik effect—which describes the fact that incomplete experiences occupy our minds far more than completed ones69—is another way of explaining addictive behaviour. For instance, a post on a social media service is likely to keep the activity and the service more readily present in your brain, and thus draws on cognitive resources. The tension is relieved when a task is completed, but many tasks online are open-ended.

The ping from or presence of a smartphone may prompt the user to check in with technology.70 The term in neuroscience is ‘cue-dependent learning’ or ‘classical (Pavlovian) conditioning’.71 It has been found that anxious and fearful users check in more frequently.72 Finally, Cialdini’s reciprocity principle is also likely to create a drive to repay likes and accept invitations to connect.73

To the extent that persuasive design is used to create addiction-like behaviour, it may lead to anxiety and stress, which even further deplete willpower and diminish the consumer’s ability to control emotions74—an ability that is important for effective decision-making. This creates a vicious cycle in which the value extraction becomes even more efficient.

We may rationalise addictive behaviour by means of preference adaptation and herding to avoid cognitive dissonance. Our appetite for distractions seems infinite,75 and some economists may say that our ‘revealed preferences’ signal that we are in love with our technologies, considering the amount of screen time and how often users check their phones. Or as Tim Wu asks:76

‘What are the costs to a society of an entire population conditioned to spend so much of their waking lives not in concentration and focus but rather in fragmentary awareness and subject to constant interruption?’

2.5. Cognitive overload

Despite their efficiency, our brains have limitations. It is suggested that our working memory can typically hold seven (±two) pieces of information.77 And the ‘Dunbar number’ suggests—based on a correlation between brain size of primates and average social group size—that human beings cannot maintain stable social relationships with more than 150 people.78 For closer friends and family, the number is around 15.79

The idea of a stressful world is not new: just consider Sigmund Freud’s account from the early twentieth century:

‘The enormous expansion of communications, due to the world-wide telegraph and telephone networks, has entirely transformed the conditions of trade and commerce. Everything is done in haste, at fever pitch. The night is used for travel, the day for business; even “holiday trips” put a strain on the nervous system. Great political, industrial and financial crises carry this excitement into far wider areas of the population than ever before. […] people are forced to engage in constant mental activity and robbed of the time they need for relaxation, sleep and rest […] Big-city life has become increasingly sophisticated and restless. The exhausted nerves seek recuperation in increased stimulation, in highly-spiced pleasures, and the result is even greater exhaustion.’80

With complexity comes a broader range of probabilities,81 and technology augments our abilities to manage information and engage with people. Computers make things look easy, so we don’t have to think.82 And that may in fact threaten our antifragile nature and challenge both agency and human dignity. Complexity may lead to ego depletion and even resignation,83 which makes defaults and nudges more effective—and important.

In a complex and stressful world with widely scattered attention,84 we may lose ‘our means and ability to go beneath the surface, to think deeply’.85 The ability to skim text may be as important as the ability to read deeply, but it seems ‘that skimming is becoming our dominant mode of reading’.86 When we outsource our memory and thinking,87 we also outsource important parts of our intellect and identity, and there is the risk that the internet becomes ‘a replacement for, rather than just a supplement to, personal memory’.88 As expressed by Robert B. Cialdini:

‘I have become impressed by evidence indicating that the form and pace of modern life is not allowing us to make fully thoughtful decisions, even on many personally relevant topics. Sometimes the issues may be so complicated, the time so tight, the distractions so intrusive, the emotional arousal so strong, or the mental fatigue so deep that we are in no cognitive condition to operate mindfully. Important topic or not, we have to take the shortcut.’89

A study shows that the mere presence of smartphones reduces available cognitive capacity, and that there is a direct correlation between these cognitive costs and smartphone dependence.90

3. Privacy

We introduced privacy in Chapter 4 (data protection law) and emphasised that privacy plays an important role in both democracy and individual autonomy, where a balance must be struck between solitude and companionship:91 both are necessary for developing democracy and individual agency.

The right to privacy includes protection from surveillance and is not limited to the processing of personal data. It does not matter whether the information is sensitive or whether the persons surveilled have been inconvenienced in any way.92 In our context, this is relevant to the surveillance carried out in the context of data-driven business models. Under the heading ‘respect for private and family life’, Article 7 of the Charter provides that:

‘Everyone has the right to respect for his or her private and family life, home and communications.’

In contrast to the protection of personal data (Article 8), Article 7 does not explicitly mention consent as a means of interfering with this right. Thus, interference must be justified under Article 52(1) of the Charter, as we discuss below. The provision corresponds to Article 8 ECHR,93 which in subsection 2 provides for the following legitimate limitations:

‘There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.’

In that vein, the ePrivacy Directive is of particular interest, as it also pursues this broader concept of privacy.94 The protection of privacy is also likely to be a relevant factor in the interpretation of both the GDPR95 and the UCPD, especially if we perceive technology in a broader perspective, including as a social, cultural and political phenomenon.96

The rights enshrined in both Articles 7 and 8 of the Charter ‘must be considered in relation to their function in society’.97 In this vein, surveillance may also be understood as infringing on human dignity. The UCPD, for its part, applies only if consumers’ economic interests are at stake, which is likely to be the case in the context of data-driven business models.

The protection is clearly not absolute but subject to a proportionality test that requires interferences to be legitimate (in accordance with the law) and proportionate (necessary) to the aim pursued (interest). Measures must ‘remain a genuine derogation to the principle of confidentiality’.98 The ‘economic well-being of the country’ could also include the protection of the internal market (European single market) as envisaged in Article 3(3) TEU, mentioned above.

3.1. The ePrivacy Directive

According to Article 1(1), the ePrivacy Directive harmonises national law to ensure ‘an equivalent level of protection of fundamental rights and freedoms, and in particular the right to privacy, with respect to the processing of personal data in the electronic communication sector’. Thus, the Directive does not apply to offline activities but may be relevant to, e.g., a rewards programme that involves the use of a smartphone. Recitals 6 and 7 of this 2002 Directive state:

‘(6)  The Internet is overturning traditional market structures by providing a common, global infrastructure for the delivery of a wide range of electronic communications services. Publicly available electronic communications services over the Internet open new possibilities for users but also new risks for their personal data and privacy.

(7)  In the case of public communications networks, specific legal, regulatory and technical provisions should be made in order to protect fundamental rights and freedoms of natural persons and legitimate interests of legal persons, in particular with regard to the increasing capacity for automated storage and processing of data relating to subscribers and users.’

Member States must ensure the confidentiality of ‘communications and the related traffic data’, in particular by prohibiting third parties’ ‘listening, tapping, storage or other kinds of interception or surveillance of communications and the related traffic data […]’ (Article 5(1)). The provision covers ‘any operation enabling third parties to become aware of communications and data relating thereto for purposes other than the conveyance of a communication.’99

Interference may be justified by means of the consent (as in the GDPR) of the users concerned. Another exception is found in Article 15(1), which allows for restrictions, including ‘data retention’, that constitute ‘a necessary, appropriate and proportionate measure within a democratic society’ for, inter alia, the prevention, investigation, detection and prosecution of criminal offences.

The latter exception is relevant because there is a string of case law that may elucidate the implication of the protection of privacy. The cases concerned data retention laws requiring telecom operators to store and give access to metadata (traffic and location data) for the purposes of fighting serious crime. This traffic data is lawfully processed for the purpose of subscriber billing and interconnection payments.

The judgments—Digital Rights Ireland100 (April 2014), Tele2 Sverige101 (December 2016), Ministerio Fiscal (October 2018),102 Privacy International103 (October 2020) and La Quadrature du Net and Others104 (October 2020)—were delivered by the Grand Chamber (15 judges) of the CJEU, which is usually a signal of complexity and/or importance.

Firstly, the CJEU established that the transmission of traffic data and location data is a particularly serious privacy interference, and that the information that can be derived from such data, including by profiling, may be ‘no less sensitive than the actual content of communications’.105 Secondly, ‘only the objective of fighting serious crime is capable of justifying such a measure’,106 and safeguarding national security is capable of justifying ‘measures entailing more serious interferences’ than other objectives.107 Thirdly, ‘general and indiscriminate transmission’ exceeds the limits of what is strictly necessary and cannot be considered to be justified within a democratic society.108

3.2. Consent and transparency in the context of marketing

The above-mentioned cases concerned state interference with privacy, and it is not clear to what extent the findings apply to non-state actors, such as traders relying on data-driven marketing. However, it may seem reasonable to assume that fighting serious crime is more important from a societal perspective than marketing.

The ePrivacy Directive allows for limited use of traffic data for marketing electronic communications services and the provision of value-added services (Article 6(3)), as well as for using location data other than traffic data (Article 9). In both cases, such use is dependent on (a) consent, which may be withdrawn at any time, and (b) proportionality, i.e. use only to the extent necessary for the purposes.

The CJEU has emphasised that users of electronic communications services should be able to expect, at least in principle, anonymity and privacy in electronic communication, ‘unless they have agreed otherwise’.109 However, communication systems must be ‘designed to limit the amount of personal data necessary to a strict minimum’,110 and interference may not be the rule.111

Online, it may seem as though surveillance is the default option, i.e. it requires effort (friction) to stay incognito. Danah Boyd observes that ‘there’s a big difference between being in public and being public’, and that the notion of ‘digital natives’ is ‘an effort to force the global elite to recognize the significance of an emergent mediated society’.112 Similarly, Tim Wu suggests that

‘The most pressing question in our times is not how the attention merchant should conduct business, but where and when.’113

Further, the CJEU has observed that retaining data without informing the user is ‘likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance’.114 This may, at least for access by ‘public authorities’, have a chilling effect on the freedom of expression.115

Despite consent and transparency, the processing of personal data in the context of data-driven business models may entail the risk of leaving the data subjects with a feeling of ‘constant surveillance’. In addition, the accumulation of large amounts of data is likely to increase the risk of abuse and unlawful access. Thus, it is not unlikely that the protection of privacy will limit the amount of data that can be lawfully processed under the GDPR and affect certain privacy-infringing commercial practices under the UCPD.116

The right to privacy corroborates the argument for keeping personal data from the primary product separate from ancillary uses, such as data-driven marketing. As a matter of transparency, contextual marketing is easier to understand than personalised marketing. Some people, for instance, may believe that anything appearing at the top of a Google search must be true, because ‘why else would it appear at the top?’117

Surveillance may include online behaviour, facial expressions (visual) and oral utterances (audio), as well as proxies for emotions. One of the reasons that information from primary products, such as search, social media and health devices, should not be used for marketing is that we need to be honest to engage with these services;118 the product simply does not work without the user’s being honest. For example, you cannot search for ‘cancer treatment’ on Google without revealing your interest in the subject.

As observed by BJ Fogg, computers have the distinct advantage that they can go where (a) ‘humans cannot go’ and/or (b) ‘may not be welcome’. The latter, in particular, suggests a privacy focus. Computers’ ability to evoke feelings and be more persistent may also play a role in understanding computer use in a marketing context.

Despite information and consent, it may seem counter-intuitive that we are under pervasive surveillance, and the framing of the internet as ‘a predominantly commercial enterprise seriously limits the privacy’.119 In connection with social media, it has been noted by the sociologist Zygmunt Bauman that ‘our fear of disclosure has been stifled by the joy of being noticed’.120

Tracking, including by means of GPS location, cookies, fingerprinting and spy pixels, must in general be considered more of an infringement on privacy than the use of personal data for personalising advertising. The same is true for monitoring the contents of private conversations in individual texts, e-mail messages, voice calls, etc.

For the use of personal data processed in these contexts, the question is, in essence, whether traders have ‘the right to judge our fitness for their services based on abstract but statistically predictive criteria not directly related to those services?’121

4. Non-discrimination

The Charter’s Article 21 concerning non-discrimination is found under the title ‘equality’.122 Discrimination is closely linked to human dignity, and in our context also to privacy, as surveillance may provide the data that can be used for discrimination.123

Discrimination may be a natural consequence of personalisation—i.e. treating individuals differentially—but ‘personalisation’ is a nicer framing. Discrimination constitutes another example of possible cross-fertilisation between data protection law and consumer protection law. Discrimination can be assessed by considering (1) the parameters, including personal data, used for personalisation, (2) the manifestation of personalisation and (3) the effect that this has on the individual. Article 21(1) of the Charter prohibits

‘any discrimination’ based on ‘any ground such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation’ (emphasis added).

The risks to the rights and freedoms of natural persons under the GDPR comprise ‘physical, material [and] non-material damage, in particular: where the processing may give rise to discrimination […] or any other significant economic or social disadvantage’ (recital 75 GDPR, emphasis added).

The UCPD focuses on whether the personalisation is likely to materially distort the economic behaviour of the average consumer, regardless of whether personal data are processed. This means, in particular, that material information must not be omitted and that the discrimination must not amount to an aggressive commercial practice, including by means of the trader’s exploiting a position of power vis-à-vis the consumer.

Data-driven business models allow for personalisation and segmentation at unprecedented scope and scale. Artificial Intelligence can ultimately be designed to tell different people different things in order to win all of them, and to identify consumers as ‘targets or waste’.124 In that vein, it could be noted that poor people are easier to manipulate, constitute a new market opportunity,125 and are particularly vulnerable, including with regard to their economic interests.126

Personalised marketing may include surveillance of behaviour, targeted advertising, the creation and use of a persuasion profile, and adaptation of terms. The latter includes, in particular, prices that can be adjusted by means of price indications, offers and discounts.
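As an illustration of how such adaptation could work, the following hypothetical sketch shows a rudimentary ‘persuasion profile’ being used to reframe an otherwise identical offer according to the influence principle (cf. Cialdini) that the individual consumer is estimated to respond to best. All names, scores and framings are invented; real systems are far more elaborate, but the logic is the same.

```python
from dataclasses import dataclass, field

@dataclass
class PersuasionProfile:
    user_id: str
    # Estimated susceptibility per influence principle, learned from behaviour.
    susceptibility: dict = field(default_factory=dict)

def adapt_offer(profile, base_offer):
    # Pick the principle this user has historically responded to best.
    principle = max(profile.susceptibility, key=profile.susceptibility.get)
    framings = {
        "scarcity": f"Only 2 left in stock! {base_offer}",
        "social_proof": f"1,203 people bought this today. {base_offer}",
        "authority": f"Recommended by experts. {base_offer}",
    }
    return framings.get(principle, base_offer)

profile = PersuasionProfile("user-42", {"scarcity": 0.8, "social_proof": 0.3})
print(adapt_offer(profile, "Wireless headphones, EUR 99"))
# -> "Only 2 left in stock! Wireless headphones, EUR 99"
```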

The prohibition concerning discrimination on grounds of nationality is a general principle of European Union law and is, e.g., elaborated on in the Geo-Blocking Regulation.127 The Regulation prohibits unjustified geo-blocking and other forms of discrimination based, directly or indirectly, on the user’s (1) nationality, (2) place of residence or (3) place of establishment.

4.1. Price discrimination

The Charter’s prohibition on discrimination must also apply to (unjustified) price discrimination, but there is no specific legal prohibition of price differentiation,128 which seems widely accepted, e.g. in the airline industry,129 where travellers pay different prices for the exact same service delivered at the exact same time. ‘Personalised discount’ is another and more benign framing of price discrimination.130

We must accept that some people are better negotiators or have more time to pursue bargains. However, we may also have intuitive expectations that individuals—at least in principle—all have access to the same information and terms.

It could be argued that—given the less complicated reality when it was adopted—the 1998 Price Indications Directive131 was intended to establish equality through ‘transparent operation of the market’ (recital 1). In recital 6, the directive states that the obligation to provide price indications

‘contributes substantially to improving consumer information, as this is the easiest way to enable consumers to evaluate and compare the price of products in an optimum manner and hence to make informed choices on the basis of simple comparisons.’

When stores provide the same price information to all visitors, price indications realise the potential for ‘simple comparisons’, a comparison that becomes easier when price information is available online. However, when prices are individualised, an obligation to indicate ‘a’ price does not realise the goal of ‘simple comparisons’ and may hinder ‘transparent operation of the market’.

In this vein, one could argue that—at least under certain circumstances—price differentiation could be contrary to professional diligence with regard to the overarching aims of the Price Indications Directive and the general aim of transparent markets. To the extent that such differentiation undermines the user’s price comparison in the market, such a practice would likely amount to the distortion of the consumer’s economic behaviour. Thus, price differentiation could meet the requirements for being an unfair commercial practice.

One could similarly argue that price discrimination may amount to an aggressive commercial practice; this is corroborated by the fact that enforcement and online comparison become more difficult and that being barred from structured market information affects the economic behaviour of the consumer.

When personal data are used for price differentiation, the GDPR applies. This entails the trader’s consideration of the general principles as well as the legitimate basis for processing. In that vein, the reasonable expectations of the data subject play an important part. In the context of automated individual decision-making, including profiling, it is expressly mentioned that differential pricing based on personal data may have a significant effect if, e.g., ‘prohibitively high prices effectively bar someone from certain goods or services’.132 This is a clear example of privacy interferences that affect the economic behaviour of consumers.
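A toy example may clarify why individualised prices defeat the ‘simple comparisons’ that price indications are meant to enable. In the following hypothetical sketch (attributes, weights and prices are all invented), two consumers receive different price indications for the same product, and neither can observe the price shown to the other:

```python
def personalised_price(list_price, profile):
    # Toy estimate of willingness to pay from profiled attributes.
    factor = 1.0
    if profile.get("premium_device"):
        factor += 0.10  # proxy for purchasing power
    if profile.get("price_comparison_visits", 0) > 3:
        factor -= 0.15  # likely to shop around, so offer a 'discount'
    return round(list_price * factor, 2)

alice = {"premium_device": True, "price_comparison_visits": 0}
bob = {"premium_device": False, "price_comparison_visits": 5}

print(personalised_price(100.0, alice))  # 110.0
print(personalised_price(100.0, bob))    # 85.0
```

Each consumer sees one formally correct price indication, but the indicated price is ‘a’ price rather than ‘the’ price, and comparison with what the same trader shows other consumers is impossible without pooling information.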

Article 20 of the Services Directive133 provides for non-discrimination based on the recipient’s nationality or place of residence ‘without precluding the possibility of providing for differences in the conditions of access where those differences are directly justified by objective criteria’.

The Geo-Blocking Regulation134 clarifies certain situations where differential treatment cannot be justified. It is explicitly stated in Article 4(2) that the prohibition ‘shall not prevent traders from offering general conditions of access, including net sale prices, which differ between Member States or within a Member State and which are offered to customers on a specific territory or to specific groups of customers on a non-discriminatory basis’ (emphasis added).

With the New Consumer Deal Directive, an obligation to inform consumers when a price has been personalised on the basis of automated decision-making is inserted in the Consumer Rights Directive.135 It follows from recital 45 that the provision is without prejudice to the GDPR, and that:

‘Traders may personalise the price of their offers for specific consumers or specific categories of consumer based on automated decision-making and profiling of consumer behaviour allowing traders to assess the consumer’s purchasing power. Consumers should therefore be clearly informed when the price presented to them is personalised on the basis of automated decision-making, so that they can take into account the potential risks in their purchasing decision. […]’

It is obvious that this information requirement cannot in itself justify price discrimination, which must be assessed under the GDPR, including the provisions on automated individual decision-making, including profiling; the UCPD (professional diligence); and the Charter. The provision provides only shallow transparency, as it will not allow the user to understand the logic involved, its significance and the envisaged consequences, as discussed in Chapter 9 (transparency).

4.2. Targeted advertising

John Wanamaker (1838–1922) is credited with saying that ‘Half the money I spend on advertising is wasted; the trouble is I don’t know which half.’ Thus, from the trader’s perspective, it may make a lot of sense to target advertising and measure its effects.

It is unclear to what extent the rules on automated individual decision-making, including profiling, apply to targeted advertising. Personalised marketing could in principle amount to discrimination, depending on the qualifying criteria, but it seems intuitively justified to, e.g., direct marketing for dresses towards women rather than men.136 In a joint opinion, the EDPB and EDPS have suggested that the future AI Regulation should prohibit any type of social scoring.137

Using individual profiles to target advertising may—to some extent—be considered an aggressive commercial practice, as it inherently involves some kind of discrimination.138 It could similarly be considered an exercise of undue influence—exploiting a position of power, as dealt with in Chapter 8 (manipulation)—where account should be taken of, inter alia, timing, location, nature and persistence (Article 9(1)(a) UCPD).

One way to create transparency in the context of targeted advertising is for the trader to be explicit about why a particular advertisement has been served. Facebook has adopted such a tool, which in a particular case made it clear that advertising for payday loans was targeted at people with an interest in gambling. The Danish Consumer Ombudsman found it to be an unfair commercial practice,139 and it is difficult to see that serving such advertising to this particular (vulnerable) group should be necessary for the purposes of marketing and the performance of the contract entered into between Facebook and the user.
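The following sketch illustrates both elements discussed here: attaching an explanation to every advertisement served (‘why am I seeing this ad?’) and blocking pairings, such as payday loans targeted at people interested in gambling, that exploit vulnerable groups. Field names and categories are invented for illustration.

```python
# Pairings of (product category, interest segment) deemed unfair to target.
BLOCKED_PAIRINGS = {("payday_loans", "gambling")}

def serve_ad(product_category, user_segments):
    for segment in user_segments:
        if (product_category, segment) in BLOCKED_PAIRINGS:
            raise ValueError(
                f"Refusing to target '{product_category}' at segment '{segment}'"
            )
    return {
        "ad": product_category,
        # Explanation surfaced to the user alongside the advertisement.
        "explanation": "Shown because your profile matches: "
                       + ", ".join(user_segments),
    }

print(serve_ad("running_shoes", ["jogging", "fitness"]))
# serve_ad("payday_loans", ["gambling"]) would raise an error, reflecting the
# Danish Consumer Ombudsman's finding that such targeting is unfair.
```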

5. The freedom to conduct a business

Article 16 of the Charter recognises ‘the freedom to conduct a business in accordance with Union law and national laws and practices’, which entails the freedom to engage in an economic or commercial activity without ‘disproportionate restraints’.140 The freedom to conduct a business is expressly mentioned in the proposed Digital Services Act,141 which aims, inter alia, to ‘mitigate discriminatory risks [… and contribute to …] the right to human dignity online’ and observes that ‘the costs incurred on businesses are offset by reducing fragmentation across the internal market’.

5.1. Proportionality

The freedom to conduct a business is not an absolute right, as it must (‘of course’) be exercised with respect for Community law and national legislation.142 Thus, this right must be considered in relation to its social function and restrictions, which must (a) correspond to objectives of general interest pursued by the European Union and (b) not constitute, with regard to the aim pursued, a ‘disproportionate and intolerable interference, impairing the very substance of those rights’.143

As follows from Article 52(1) of the Charter, limitations to rights and freedoms must (1) be provided for by law, (2) respect the essence of the rights and freedoms and (3) be necessary (subject to proportionality) and genuinely meet objectives of general interest, including the rights and freedoms of others.144

The right does not imply ‘any right to engage in any activity or to perform any act aimed at the destruction of any of the rights and freedoms recognised in this Charter or at their limitation to a greater extent than is provided for herein’.145 The freedom to conduct a business must be reconciled with the requisites for protecting other fundamental rights, including aspects of consumer protection;146 privacy, including personal data;147 and health,148 in order to strike a ‘fair balance between the various fundamental rights protected by the Community legal order’.149 Thus, when claiming the right to conduct a business, this right must be understood with respect to EU and national law and be weighed against possible infringement of, inter alia, human dignity, privacy and non-discrimination.

When transposing and interpreting directives, such as the UCPD, Member States must allow for ‘a fair balance to be struck between the various fundamental rights’ and interpret their national law in a manner consistent with those directives, fundamental rights and other general principles, such as the principle of proportionality.150

The CJEU has recognised that funding requirements go hand in hand with the freedom to conduct a business, but also that this freedom must be balanced against, inter alia, the interest in protecting consumers (as television viewers) against ‘excessive advertising’.151 In that vein, it must be considered whether the restraint affects the actual substance of the freedom to choose an occupation, for instance, by going beyond merely controlling marketing to prohibit the offering and marketing of products.152

The freedom of expression and information—envisaged in Article 11 of the Charter—applies to commercial communication153 and may reinforce the right to conduct a business in this context, but the trader’s freedom of commercial expression is not likely to carry much weight compared to the protection of human dignity and democracy.

The freedom to conduct a business does not extend to protecting ‘mere commercial interests or opportunities, the uncertainties of which are part of the very essence of economic activity’,154 but it does recognise the freedom of contract,155 which in this context is closely related to the consumers’ agency and right to self-determination.

In assessing ethical issues raised by persuasive technology, BJ Fogg has identified three focal points: (a) the intentions of the trader, (b) the methods used to persuade and (c) the outcomes of using the technology. As a first step, he suggests ‘to take technology out of the picture’ and ‘simply ask yourself, “If a human were using this strategy to persuade me would it be ethical?”’156 This is similar to applying the IT law dogma of functional equivalence (non-discrimination between online and offline activities).

As a ‘reality check’, it could be useful to imagine sales personnel in the real world following and observing consumers every hour of the day—across stores, at work, in schools, in bars, at home, etc.—in order to meticulously observe, analyse and record behaviours and emotions. This could include occasionally interacting with the consumer to observe and record responses. Based on these observations and interactions, a detailed profile could be developed with a view to offering the right products and terms, at the right time and in the most persuasive manner.

To justify interference with the right to conduct a business, it is obviously necessary to consider other interests, including those pertaining to human dignity and democracy, and to consider whether those objectives can be attained by less restrictive measures.157 In this equation, both the possibility of consumer empowerment and the existence of meaningful, believable alternatives to the business model carry weight.

5.2. Agency and the right to self-determination

We have discussed above a concept of freedom in the context of consumers’ having a right to self-determination and being free from paternalism (in both narrow and broad senses).

It is easy to assume that more choices equal more freedom and well-being, but the opposite may be true, as providing people with more choice may paralyse rather than liberate them.158 This also relates to the scarcity of attention159 and the cost of choices.160 These constraints on decision-making do not refute Adam Smith’s idea that individual freedom of choice provides for the most efficient production and distribution of society’s goods,161 but they may be an argument for market intervention that empowers consumers to comply with their obligations in the market.

The argument is not to limit the number of products offered, but to make it easier for the consumer to actually make informed decisions by enhancing transparency. This can be achieved both by disclosing information and by removing practices that add unnecessary friction to the decision process. As observed by Yuval Noah Harari, censorship in the twenty-first century ‘works by flooding people with irrelevant information’; having power today therefore means ‘knowing what to ignore’.162 Or as Seth Godin puts it:

‘Marketing is now so powerful that caveat emptor is no longer a valid defence.’163

From the dawn of EU consumer policy, the focus has been on enabling ‘consumers, as far as possible, to make better use of their resources, to have a freer choice between the various products or services offered’. One of the main priorities has been to ensure protection against ‘forms of advertising which encroach on the individual freedom of consumers’.164 Similarly, the European Commission emphasised in its 2020 consumer agenda the need to tackle ‘commercial practices that disregard consumers’ right to make an informed choice, abuse their behavioural biases, or distort their decision-making processes’.165

Freedom of contract and the free price system are essential elements of the market economy . . . and so are privacy and agency. In essence, the regulation of data-driven business models from the perspective of marketing law and data protection law is a matter of (a) how much the individual/average user is expected to understand and (b) the limit for what users should be legally allowed to accept.

The user’s ability to understand the deal is a function of, inter alia, (1) the complexity, (2) the envisaged consequences (impact), (3) the user’s reasonable expectations166 and (4) the trader’s effort to establish genuine transparency, including (5) sufficiently engaging or appealing to the user’s capacity for reflection and deliberation. Or as asked (rhetorically) by Eric K. Clemons:

‘Can consumers really judge the benefits and rewards, costs and risks, of using a platform? Would warnings on Internet software sites be sufficient, the way warning labels are supposed to be sufficient in limiting tobacco smoking?’167

In that vein, the CJEU has recognised that the imbalance of information and expertise between consumer and trader may be more pronounced in ‘a sector as technical as the telecommunications services sector’.168

In the context of consent, information about consent and the ability to refuse or withdraw it without detriment are an important part of understanding the deal. In that vein, it must be clear that, from a fundamental rights perspective, paying in money is profoundly different from ‘paying’ with human dignity, privacy and agency.

Both marketing law and privacy law (including the processing of personal data) establish a threshold for legitimate interference with privacy for the purposes of marketing. The principle of data minimisation entails that data must be ‘adequate, relevant and limited to what is necessary in relation to the purposes’;169 and the accuracy principle could be used as an argument for limiting the inference and use of data of a probabilistic nature.170
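
In engineering terms, the data minimisation principle translates into ‘data protection by design’. The following minimal sketch (in Python, with purely hypothetical field and purpose names) illustrates the idea: only fields declared necessary for a stated purpose survive collection.

# A minimal sketch of data minimisation (Article 5(1)(c) GDPR):
# only fields declared necessary for a stated purpose are retained.
# Field and purpose names are hypothetical illustrations.

NECESSARY_FIELDS = {
    "order_fulfilment": {"name", "address", "email"},
    "age_verification": {"birth_year"},
}

def minimise(record, purpose):
    """Return only the fields necessary for the given purpose."""
    allowed = NECESSARY_FIELDS.get(purpose, set())
    return {key: value for key, value in record.items() if key in allowed}

raw = {"name": "A. Consumer", "address": "1 Main St", "email": "a@example.com",
       "browsing_history": ["shop", "news"], "inferred_interests": ["gambling"]}

print(minimise(raw, "order_fulfilment"))
# {'name': 'A. Consumer', 'address': '1 Main St', 'email': 'a@example.com'}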

An important question to ask concerns the extent to which consumers—in the name of their right to self-determination—should be free to choose to relinquish their rights to human dignity, privacy and non-discrimination as consideration for using a commercial service.

5.3. Meaningful, believable alternatives

Charging admission is the obvious alternative to the re-selling of human attention,171 which includes (a) contextual marketing and (b) personalised marketing, the latter being the more complex and intrusive version. DuckDuckGo is a privacy-focused alternative to Google Search that has been profitable since 2014.172 Similar competitors may be envisaged in the area of social media, but due to network effects, market penetration is much more difficult: for search, you only need good search results; for social media, you need your connections.173

Contextual advertising might work better for search than for social media, as the search queries in themselves indicate an interest in finding something and provide a good idea of what the user is looking for. Similarly, social media could be designed to let the user look up information, which would provide a better base for contextual advertising than the frictionless algorithmically mediated newsfeed.
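
The difference can be made concrete with a minimal sketch (all names and data are hypothetical): contextual selection uses only the query at hand, whereas behavioural targeting draws on an accumulated profile of the person.

# Minimal sketch contrasting contextual and behavioural ad selection.
# All identifiers and data are hypothetical illustrations.

ADS = {
    "running shoes": ["ad_running_shoes"],
    "mortgage": ["ad_bank_loan"],
}

def contextual_ads(query):
    # Selection uses only the words the user just typed; no profile is kept.
    return [ad for keyword, ads in ADS.items()
            if keyword in query.lower() for ad in ads]

def behavioural_ads(profile):
    # Selection uses an accumulated surveillance profile instead of context.
    return [ad for keyword, ads in ADS.items()
            if keyword in profile.get("interests", []) for ad in ads]

print(contextual_ads("best running shoes for winter"))  # ['ad_running_shoes']
print(behavioural_ads({"interests": ["mortgage"]}))     # ['ad_bank_loan']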

5.4. The cost of ‘free’

There is good reason to believe that ‘free’ is an attractive—and even inevitable—price point,174 but if this low price is reached by undermining human dignity and democracy,175 ‘free’ may in fact be very expensive when externalities are also included in the calculations. Harm from these externalities may be felt on personal, social and societal levels. As expressed by Jamie Bartlett:

‘We need to be aware that cheap or free services have invisible costs: whether that’s your own rights or those of the workers who are employed by the companies that run them.’176

From a consumer perspective, the cost of ‘free’ is much more difficult to calculate and compare when the consumer must assess the impact of behaviour modification on the pursuit of his goals, values and preferences. As mentioned by George A. Akerlof and Robert J. Shiller:177

‘Markets with externalities and manipulations do not work perfectly, even though people are free to choose, and there is a potential role for government.’

When we pay with attention, the price is ‘all the things you could have attended to, but didn’t: all the goals you didn’t pursue, all the actions you didn’t take, and all the possible yous you could have been, had you attended to those other things’.178 When we are exposed to manipulation, we pay with agency (human dignity), which is linked to the opportunity cost of attention, but is more intrusive.

When products are offered free of charge, friction is removed, which means less appeal to our capacity for reflection and deliberation than if the consumer were to make a payment, as discussed in Chapter 8 (manipulation). Payment also has the advantage of sending an implicit signal that the user is operating in the financial/commercial domain, not the social domain.179

If the price point is ‘free’, competitors may have a hard time competing at a higher price point.180 The competition may be more likely to revolve around the effectiveness of extracting value from surveillance and manipulation, while keeping users in an apathetic state with regard to their privacy. As mentioned above, it is possible to imagine business models relying on fees and contextual marketing, but this would require that consumers stop trading privacy for convenience and for being noticed,181 and the real question may be whether we have the time to wait yet another decade for a possible market solution.182

Add to this the argument that there is no solid documentation of the effectiveness of targeted advertising (compared with, for instance, contextual advertising) and that there may be an economic bubble in the re-selling of attention, which poses a potential threat to traders, including media, gig-workers, artists and others who rely on advertising (targeted or not) to fund their activities.183

It may be impossible to reframe ‘free’ to mean ‘expensive’, so maybe it would make more sense to require that reality fit the current frame.

5.5. Aligned goals

From a market perspective, competition is supposed to benefit consumers. The idea of transparency is supposed to ensure that consumers can pursue their goals, values and preferences. In essence, the market failure pertaining to data-driven marketing lies in the divergent objectives of trader and consumer, and there may be a need for ‘working with the machines, rather than against them and certainly not for them’.184

For instance, when an online bookstore suggests books based on what the user and others have bought, it is better aligned with the user’s likely preferences than when the same data are used to personalise the marketing of the suggested books to the user’s persuasion profile. Here, the distinction is between what is advertised and how and when it is advertised.
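
As a minimal sketch of the benign half of this distinction, assuming hypothetical purchase data, a simple co-occurrence recommender decides only what to suggest; a persuasion profile would be a separate layer deciding how and when to present it, and is deliberately left out here.

# Minimal sketch of 'people who bought X also bought Y' (what is advertised).
# Data are hypothetical; a persuasion profile (how and when) is omitted.

from collections import Counter

PURCHASES = [  # each set is one customer's basket
    {"book_a", "book_b"},
    {"book_a", "book_c"},
    {"book_a", "book_b", "book_d"},
]

def also_bought(item, top=2):
    """Suggest items that co-occur with 'item' in other baskets."""
    counts = Counter()
    for basket in PURCHASES:
        if item in basket:
            counts.update(basket - {item})
    return [candidate for candidate, _ in counts.most_common(top)]

print(also_bought("book_a"))  # ['book_b', ...]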

The aim of legislation and of the interpretation of existing law must be to secure a reasonable balance between these divergent goals, possibly aiming for ‘a high level of consumer protection’ by maximising our capacity for reflection and deliberation, which may require less information and fewer decisions. As expressed by James Williams:

‘We can, however, describe the broad outline of our goal: it’s to bring the technologies of our attention onto our side. This means aligning their goals and values with our own.’185

Where consumer law generally focuses on the (internal) market, the GDPR is ‘intended to contribute to the accomplishment of an area of freedom, security and justice and of an economic union, to economic and social progress, to the strengthening and the convergence of the economies within the internal market, and to the well-being of natural persons’.186 Therefore, the GDPR may be our best hope for restoring human dignity in data-driven marketing. It actually follows from recital 4 GDPR that ‘the processing of personal data should be designed to serve mankind’.

6. Democracy

We also need to bring these traders onto the side of democracy, to keep the externalities of data-driven business models from harming democratic institutions and citizens.

Attention and agency, including our capacity for reflection, deliberation and empathy, are also important in a democratic perspective. The same is true for the infrastructures on which data-driven business models rely, including social media, search and the devices we use to dispense our attention. This is an additional argument for keeping digital media safe with regard to democratic values.187

From a fundamental rights perspective, we would not accept unjustified infringement upon human dignity, privacy and non-discrimination from a state actor, and there are no compelling arguments for accepting similar interference from a non-state actor. For the user, it does not matter whether infringements come from a state or a trader.

That behaviour modification is also readily available for political gains became particularly clear with the Facebook–Cambridge Analytica scandal, affecting the 2016 U.S. presidential election and the 2016 U.K. ‘Brexit’ referendum; and again in January 2021, when ‘big tech’—without any democratic oversight—decided to ‘cancel’ the U.S. president, who had previously benefited from products of theirs that were used in an attempt to ‘cancel’ the presidential election.

Even though technology provides significant benefits to society, it falls within the obligations of democratic oversight to protect citizens from activities that undermine human dignity and democracy. As expressed by Peter Hulsrøj and Marco Aliberti:

‘It is important to stress, however, that choice manipulation is not only commercial. With access to big data, political parties will customise their message to you, and might do so with scant regard to truth. Facebook is showing the way. Religion has not started yet, but big data in the hands of the evangelical movement or the Catholic Church opens fantastic possibilities (together with customized priest avatars).’188

Autonomy is an important feature of AI systems, and unintended outcomes may cause harm to individuals and societies.189 Unintended is not the same as unanticipated, and traders have the capacity to test and monitor their AI systems and an obligation to do so under the GDPR. It could be emphasised that ‘risk is in the future, not in the past’.190 It is in the interest of democracy that market failures do not become personal, social and societal failures. And there are good reasons why we would not like the content of Wikipedia to be personalised191 or leave voting to algorithms,192 even when they know our goals, values and preferences better than we do ourselves.
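
What ‘test and monitor’ could mean in practice may be illustrated with a minimal, hypothetical sketch: an automated check comparing how often a sensitive ad category is served to different user groups, flagging disparities for human review. The groups and the threshold are illustrative assumptions, not a legal compliance test.

# Hypothetical sketch of monitoring an ad-targeting system for disparate
# outcomes across user groups; groups and threshold are illustrative only.

def disparity_ratio(serving_rates):
    """serving_rates maps group -> share of that group shown the ad."""
    rates = list(serving_rates.values())
    return max(rates) / max(min(rates), 1e-9)

serving_rates = {"group_x": 0.12, "group_y": 0.02}  # hypothetical monitoring data

if disparity_ratio(serving_rates) > 2.0:  # illustrative threshold, not a legal test
    print("Flag for human review: possible discriminatory targeting")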

There is an overarching growth agenda in the EU, as made clear in the Digital Content Directive,193 which in its first recital states that ‘the growth potential of e-commerce in the Union has not yet been fully exploited’, and that ‘[…] making it easier for businesses to supply digital content and digital services, can contribute to boosting the Union’s digital economy and stimulating overall growth’. Growth that is based on undermining the values and aims of the EU Treaties and the Charter is not desirable. As expressed by Jamie Bartlett:

‘It’s clear that these technologies have, on balance, made us more informed, wealthier and, in some ways, happier. After all, technology tends to expand human capabilities, produce new opportunities, and increase productivity. But that doesn’t necessarily mean that they’re good for democracy.’194

Consumer protection law exists in the context of markets where traders are offering (competing) products that are bought and consumed by individuals. In this part of European economic law, consumer protection is secondary to the goal of efficient markets.195 Thus, consumer protection should be understood against this backdrop, and legislation is generally limited to situations where it can be justified by a market failure, i.e. situations in which the allocation of goods and services is inefficient, for instance because traders’ pursuit of pure self-interest leads to inefficient outcomes.

In modern societies, time and agency are some of the resources most precious to human beings, and therefore the freedom to dispose of our time has great importance. At the current stage of humanity, time for the individual is a finite resource. Consumers need time to comply with their obligations in the market, i.e. to inform themselves and make rational choices. Adding information and complexity (‘friction’) in markets will deplete consumers’ mental capacity and divert it from being used for other purposes, including engaging in democracy.

Trading on consumers’ time and attention has implications for the functioning of both democracy and markets. Even though not everyone should participate equally in politics,196 democracy must be discussed and challenged in order to stay relevant,197 and in that sense democracy is ‘antifragile’, as it grows stronger with volatility.198

Empowerment must ensure that users have the capacity to take ownership of attention management in order to pursue their individual goals, values and preferences in both markets and democracy. The internet is an important common good that has been seized by commercial interests in a manner whereby surveillance and behaviour modification are the default.

Algorithmic mediation of information—in a broader sense than commercial marketing—will also affect democratic public debate and political processes,199 especially when user attention and engagement are optimised for profit rather than informing citizens and public debate. Algorithmic mediation of information is less transparent than traditional curation due to opaque and proprietary (‘black boxed’) algorithms,200 and fragmented abated realities may undermine informed debate201—bearing in mind that ‘surveillance is obviously a fundamental means of social control’.202

Because marketing is generally perceived as relatively harmless, ‘the legal system usually does not attempt to prevent it’.203 Data-driven marketing is far from harmless, however. For example, as a result of the 2018 changes to Facebook’s algorithms, political parties shifted their content to be more negative to increase the reach of their postings, because the new algorithm rewarded outrage.204
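
A stylised sketch, which does not purport to reproduce Facebook’s actual formula, shows the mechanism: when reactions signalling outrage are weighted more heavily than ordinary likes, content that provokes anger rises in the ranking, and publishers adapt their content accordingly.

# Stylised sketch of engagement-weighted ranking; the weights are invented
# for illustration and do not reproduce any platform's actual algorithm.

WEIGHTS = {"like": 1, "comment": 4, "angry": 5}

def engagement_score(post):
    return sum(WEIGHTS[reaction] * count
               for reaction, count in post["reactions"].items())

calm = {"id": "policy_explainer",
        "reactions": {"like": 100, "comment": 5, "angry": 0}}
outrage = {"id": "attack_post",
           "reactions": {"like": 20, "comment": 30, "angry": 40}}

ranked = sorted([calm, outrage], key=engagement_score, reverse=True)
print([post["id"] for post in ranked])  # the outrage-driven post ranks first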

The proposed Artificial Intelligence Act focuses on physical and psychological harm, but not on the economic harm regulated in the UCPD. Given the wide range of interconnected potential harms from data-driven business models, there may be a need to rethink consumer protection law in order to ensure both horizontal coherence (between legal disciplines and different harms) and vertical coherence (with fundamental rights).

There is undoubtedly a huge societal potential for AI, and it could be argued that society is deprived of important growth possibilities when AI experts use their knowledge and skills to optimise marketing205 in abated realities rather than pursuing personal, social and societal goals, including addressing challenges relating to social cohesion and climate.

In Chapter 3 (regulating markets) we used a computer’s operating system as a metaphor for constitutions. The reason we did not use hardware is that operating systems can be updated and replaced. Territory, humans and culture are important parts of the societal hardware on which our democracies run.


1. See also Karen Yeung, ‘“Hypernudge”: Big Data as a mode of regulation by design’, Information, Communication & Society, 2017, pp. 118–136.

2. Case C‑283/81, CILFIT v Ministero della Sanità, ECLI:EU:C:1982:335, paragraph 20.

3. Explanations Relating to the Charter of Fundamental Rights, Official Journal, 2007, C 303/02. Cf. Article 52(7) of the Charter.

4. United Nations, ‘Urgent action needed over artificial intelligence risks to human rights’, 15 September 2021, <https://news.un.org/en/story/2021/09/1099972>.

5. Hans-W Micklitz, ‘The Consumer: Marketised, Fragmentised, Constitutionalised’, in Dorota Leczykiewicz & Stephen Weatherill (eds), The Images of the Consumer in EU Law (Hart 2016), pp. 21–41, pp. 40–41.

6. Case C‑112/00, Eugen Schmidberger, ECLI:EU:C:2003:333, paragraph 71 with references.

7. Yuval Noah Harari, Homo Deus (Harper 2017), p. 269.

8. Article 52(3) of the Charter.

9. ECtHR, Christine Goodwin v. the United Kingdom, no. 28957/95, 11 July 2002, paragraph 74.

10. See, e.g., Case C‑112/00, Eugen Schmidberger, ECLI:EU:C:2003:333.

11. See in general Rikke Frank Jørgensen (ed.), Human Rights in the Age of Platforms (The MIT Press 2019).

12. ECtHR, K.U. v. Finland, no. 2872/02, 2 December 2008, paragraphs 42–43; ECtHR, I v. Finland, no. 20511/03, 17 July 2008, paragraphs 36 and 47; ECtHR, Odièvre v. France, no. 42326/98, 13 February 2003, paragraph 40; ECtHR, X and Y v. the Netherlands, no. 8978/80, 26 March 1985, paragraph 23; ECtHR, Airey v. Ireland, no. 6289/73, 9 October 1979, paragraph 32; and ECtHR, Marckx v. Belgium, no. 6833/74, 13 June 1979, paragraph 31.

13. Case C‑414/16, Egenberger, ECLI:EU:C:2018:257, paragraph 76, with reference to Case C‑176/12, Association de médiation sociale, ECLI:EU:C:2014:2, paragraph 47. See also Case C‑569/16, Bauer, ECLI:EU:C:2018:871; and Dorota Leczykiewicz, ‘The Judgment in Bauer and the Effect of the EU Charter of Fundamental Rights in Horizontal Situations’, European Review of Contract Law, 2020, pp. 323–333.

14. See to that effect Case C‑73/16, Puškár, ECLI:EU:C:2017:725, paragraphs 57–59 with references, concerning Article 47 of the Charter (Right to an effective remedy and to a fair trial).

15. Stephen Weatherill, EU Consumer Law and Policy (2nd edition, Elgar 2013), p. 72.

16. See Article 52(5) of the Charter, that—according to the Explanations Relating to the Charter—clarifies the distinction between ‘rights’ and ‘principles’.

17. Commission Staff Working Document, p. 9.

18. Case C‑441/04, A-Punkt Schmuckhandel, ECLI:EU:C:2006:141, paragraph 29.

19. Explanations Relating to the Charter of Fundamental Rights, Official Journal, 2007, C 303/02.

20. Advocate General Stix-Hackl in Case C‑36/02, Omega, ECLI:EU:C:2004:162, paragraph 78.

21. Marcus Düwell, ‘Human Dignity and the Ethics and Regulation of Technology’, in Roger Brownsword, Eloise Scotford & Karen Yeung (eds), The Oxford Handbook of Law, Regulation and Technology (Oxford University Press 2018), pp. 178–196.

22. Jaron Lanier, Who Owns the Future? (Simon & Schuster 2013) and Shoshana Zuboff, Surveillance Capitalism (Profile Books 2019).

23. Maggie Jackson, Distracted (Prometheus Books 2008), p. 186.

24. Yuval Noah Harari, Homo Deus (Harper 2017), p. 334.

25. Cass R. Sunstein, ‘Fifty Shades of Manipulation’, Journal of Marketing Behavior, 2016, pp. 213–244, p. 226.

26. Nassim Nicholas Taleb, Antifragile (Random House 2012), p. 5.

27. Friedrich Nietzsche, Twilight of the Idols (1889). See also Cass R. Sunstein, Simpler (Simon & Schuster 2013), p. 196, on the value of making errors.

28. Adam Alter, Irresistible (Penguin Press 2017), p. 242.

29. If one believes that this is where attention and decisions rest.

30. Case C‑377/98, Netherlands v Parliament and Council, ECLI:EU:C:2001:523, paragraphs 72 and 77.

31. Joseph Turow, The Aisles Have Eyes (Yale University Press 2017), p. 272.

32. See similarly the phrase ‘Life, Liberty and the pursuit of Happiness’ in the United States Declaration of Independence and the United States Supreme Court decision in Olmstead v. United States, 277 U.S. 438 (1928): ‘The makers of our Constitution undertook to secure conditions favorable to the pursuit of happiness. They recognized the significance of man’s spiritual nature, of his feelings, and of his intellect.’

33. Council Resolution of 14 April 1975 on a preliminary programme of the European Economic Community for a consumer protection and information policy and Preliminary programme of the European Economic Community for a consumer protection and information policy, Official Journal, C 92, 25 April 1975, pp. 1–16, recital 2.

34. J. F. Kennedy, ‘Special Message to the Congress on Protecting the Consumer Interest’, 15 March 1962.

35. Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), p. 27.

36. Maggie Jackson, Distracted (Prometheus Books 2008), p. 202.

37. Cited in Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), p. 7.

38. Viktor E. Frankl, Man’s Search for Meaning (Beacon Press 2014, first published 1959). See also David Runciman, How Democracy Ends (Profile Books 2018), pp. 86–87.

39. See, for instance, Noreena Hertz, The Lonely Century (Sceptre 2020), p. 9.

40. Bertrand Russell, The Conquest of Happiness (Liveright 2013, first published 1930).

41. Hartmut Rosa, Resonance (translated by James C. Wagner, Polity Press 2019, first published 2016), p. 26.

42. See, for instance, Yuval Noah Harari, Sapiens (Harper 2015), p. 382.

43. See e.g. Barry Schwartz & Kenneth Sharpe, Practical Wisdom (Riverhead 2010), p. 279: ‘Well-being depends critically on being part of a network of close connections to others. And well-being is enhanced when we are engaged in our work and find meaning in it.’

44. See Abraham H. Maslow, ‘A theory of human motivation’, Psychological Review, 1943, pp. 370–396.

45. See, for instance, Sigmund Freud, Civilization and Its Discontents (Penguin books 2014, first published 1930).

46. Maggie Jackson, Distracted (Prometheus Books 2008), p. 13.

47. Georgia Wells, Jeff Horwitz & Deepa Seetharaman, ‘Facebook Knows Instagram Is Toxic for Many Teen Girls, Company Documents Show’, Wall Street Journal, 14 September 2021.

48. See also Case C‑356/16, Wamo and Van Mol, ECLI:EU:C:2017:809.

49. See also Joined Cases C‑544/13 and C‑545/13, Abcur, ECLI:EU:C:2015:481.

50. Case C‑339/15, Vanderborght, ECLI:EU:C:2017:335. Article 8(1) of the E-Commerce Directive (2000/31/EC) provides that ‘Member States shall ensure that the use of commercial communications which are part of, or constitute, an information society service provided by a member of a regulated profession is permitted subject to compliance with the professional rules regarding, in particular, the independence, dignity and honour of the profession, professional secrecy and fairness towards clients and other members of the profession.’

51. Directive 85/374/EEC of 25 July 1985 on the approximation of the laws, regulations and administrative provisions of the Member States concerning liability for defective products.

52. Case C‑65/20, KRONE – Verlag, ECLI:EU:C:2021:471, paragraph 32 (concerning health advice in a printed newspaper).

53. Roy F. Baumeister & John Tierney, Willpower (Penguin 2011), p. 175 with references.

54. Noreena Hertz, The Lonely Century (Sceptre 2020), p. 6 with references.

55. Tony Crabbe, Busy (Grand Central Publishing 2015), p. 182.

56. Georgia Wells, Deepa Seetharaman & Jeff Horwitz, ‘Is Facebook Bad for You? It Is for About 360 Million Users, Company Surveys Suggest’, The Wall Street Journal, 5 November 2021.

57. Douglas Rushkoff, Throwing Rocks at the Google Bus (Portfolio 2016), p. 21.

58. Noreena Hertz, The Lonely Century (Sceptre 2020), p. 94; and about the decline in ‘social capital’ in the U.S., see Robert D. Putnam, Bowling Alone (Simon & Schuster 2000).

59. Leon Festinger, ‘A Theory of Social Comparison Processes’, Human Relations, 1954, pp. 117–140.

60. Anna Lembke, Dopamine Nation (Dutton 2021), p. 27.

61. Shmuel I. Becher & Sarah Dadush, ‘Relationship as Product: Transacting in the Age of Loneliness’, University of Illinois Law Review, 2021, pp. 1547–1604.

62. Noreena Hertz, The Lonely Century (Sceptre 2020), p. 10. See also Jaron Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now (Henry Holt 2018), p. 65: ‘Speaking through social media isn’t really speaking at all.’

63. The dopamine systems are located in the midbrain. In collaboration with the prefrontal cortex, striatum and other areas, dopamine participates in the computation and storage of values attached to ideas which allow us to simulate experiments, among other things. See Read Montague, Your Brain is (Almost) Perfect (Plume 2007), pp. 240 and 78 et seq.

64. Anna Lembke, Dopamine Nation (Dutton 2021), p. 23. See also Adam Alter, Irresistible (Penguin Press 2017).

65. Roy F. Baumeister & John Tierney, Willpower (Penguin 2011) and Walter Mischel, The Marshmallow Test (Little, Brown 2014).

66. Anna Lembke, Dopamine Nation (Dutton 2021), pp. 49, 53 and 63.

67. George A. Akerlof & Robert J. Shiller, Phishing for Phools (Princeton University Press 2015), p. 103.

68. See, e.g., Jen Wasserstein, ‘My life without a smartphone is getting harder and harder’, The Guardian, 4 November 2021.

69. Adam Alter, Irresistible (Penguin Press 2017), p. 194. See also Roy F. Baumeister & John Tierney, Willpower (Penguin Books 2011), pp. 81–83; and Tony Crabbe, Busy (Grand Central Publishing 2015), pp. 44 and 46.

70. See about the check-in impulse Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), pp. 186–187.

71. Anna Lembke, Dopamine Nation (Dutton 2021), p. 58.

72. Roger McNamee, Zucked (Penguin 2019), p. 88.

73. Ibid., p. 98.

74. Roy F. Baumeister & John Tierney, Willpower (Penguin 2011), p. 33

75. James Williams, Stand Out of Our Light (Cambridge University Press 2018), p. 89.

76. Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), p. 344. See also Ryan Holiday, Ego Is the Enemy (Portfolio 2016), p. 109.

77. George A. Miller, ‘The magical number seven, plus or minus two: Some limits on our capacity for processing information’, Psychological Review, 1956, pp. 81–97. See also Nicholas Carr, The Shallows (W. W. Norton & Company 2010), p. 124.

78. Robin I. Dunbar, ‘Neocortex size as a constraint on group size in primates’, Journal of Human Evolution, 1992, pp. 469–493. See also Malcolm Gladwell, The Tipping Point (Little, Brown and Company 2000), pp. 177–181 and 185–186.

79. Tony Crabbe, Busy (Grand Central Publishing 2015), pp. 185–186.

80. Sigmund Freud, Civilization and Its Discontents (Penguin books 2014, first published 1930), p. 126.

81. Nassim Nicholas Taleb, Fooled by Randomness (2nd edition, Random House 2004), p. 199.

82. Douglas Rushkoff, Program or Be Programmed (Soft Skull Press 2010), p. 68.

83. Joseph Turow, The Aisles Have Eyes (Yale University Press 2017), p. 254.

84. Nicholas Carr, The Shallows (W.W. Norton & Company 2010), p. 113.

85. Maggie Jackson, Distracted (Prometheus Books 2008), p. 155.

86. Nicholas Carr, The Shallows (W.W. Norton & Company 2010), p. 138.

87. Douglas Rushkoff, Program or Be Programmed (Soft Skull Press 2010), p. 40: ‘Thinking is not like a book you can pick up when you want to, in your own time.’

88. Nicholas Carr, The Shallows (W.W. Norton & Company 2010), pp. 138, 180, 192 and 195.

89. Robert B. Cialdini, Influence, New and Expanded (Harper Collins 2021, first published 1984), p. 10.

90. Adrian F. Ward, Kristen Duke, Ayelet Gneezy & Maarten W. Bos, ‘Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity’, Journal of the Association for Consumer Research, 2017, pp. 140–154.

91. Alan F. Westin, Privacy and Freedom (Atheneum 1967), p. 39. See also Sarah E. Igo, The Known Citizen (Harvard 2018), p. 15.

92. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 70 with references.

93. Article 52(3) of the Charter and Explanations Relating to the Charter of Fundamental Rights, Official Journal, 2007, C 303/02.

94. Recital 2 provides that ‘in particular, this Directive seeks to ensure full respect for the rights set out in Articles 7 and 8 of that Charter.’

95. See Case C‑362/14, Schrems, ECLI:EU:C:2015:650, paragraph 94 with references, concerning public authorities’ access on a generalised basis to the content of electronic communications.

96. Helen Nissenbaum, Privacy in Context (Stanford University Press 2010).

97. Case C‑311/18, Facebook Ireland and Schrems, ECLI:EU:C:2020:559, paragraph 172 with references.

98. See also EDPS, ‘Joint hearing in cases C‑793/19, C‑794/19 Spacenet and C‑140/20, Garda’, 17 September 2021.

99. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 55.

100. Joined Cases C‑293/12 and C‑594/12, Digital Rights Ireland, ECLI:EU:C:2014:238.

101. Joined Cases C‑203/15 and C‑698/15, Tele2 Sverige, ECLI:EU:C:2016:970.

102. Case C‑207/16, Ministerio Fiscal, ECLI:EU:C:2018:788.

103. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790.

104. Joined Cases C‑511/18, C‑512/18 and C‑520/18, La Quadrature du Net and Others, ECLI:EU:C:2020:791.

105. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 71 with references.

106. Joined Cases C‑203/15 and C‑698/15, Tele2 Sverige, ECLI:EU:C:2016:970, paragraph 102 (emphasis added). The word ‘only’ appears only in this judgment.

107. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 75.

108. Joined Cases C‑203/15 and C‑698/15, Tele2 Sverige, ECLI:EU:C:2016:970, paragraph 107; and Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 81.

109. Joined Cases C‑511/18, C‑512/18 and C‑520/18, La Quadrature du Net and Others, ECLI:EU:C:2020:791, paragraph 109.

110. Recital 30 and Joined Cases C‑203/15 and C‑698/15, Tele2 Sverige, ECLI:EU:C:2016:970, paragraph 87.

111. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 59 with references.

112. Danah Boyd, It’s Complicated (Yale 2014), pp. 57 and 177.

113. Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), pp. 341–342. He suggests drawing inspiration from zoning laws that apply in real reality.

114. Joined Cases C‑203/15 and C‑698/15, Tele2 Sverige, ECLI:EU:C:2016:970, paragraph 100.

115. Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 72 with references. See also Case C‑274/99 P, Connolly v Commission, ECLI:EU:C:2001:127, paragraph 39.

116. See also Anja Møller Pedersen, Henrik Udsen & Søren Sandfeld Jakobsen, ‘Data retention in Europe—the Tele 2 case and beyond’, International Data Privacy Law, 2018, pp. 160–174, pp. 170–172.

117. Danah Boyd, It’s Complicated (Yale 2014), p. 183.

118. See also Seth Stephens-Davidowitz, Everybody Lies (Bloomsbury 2017), p. 21.

119. Helen Nissenbaum, ‘A Contextual Approach to Privacy Online’, Daedalus, 2011, pp. 32–48.

120. Zygmunt Bauman & David Lyon, Liquid Surveillance (Polity Press 2013), p. 23.

121. Seth Stephens-Davidowitz, Everybody Lies (Bloomsbury 2017), p. 261.

122. See similarly Article 14 of the ECHR concerning the prohibition of discrimination. See also Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin, Directive 2004/113/EC of 13 December 2004 implementing the principle of equal treatment between men and women in the access to and supply of goods and services, and proposal of 2 July 2008 for a Directive on implementing the principle of equal treatment between persons irrespective of religion or belief, disability, age or sexual orientation, COM(2008) 426 final, 2008/0140 (CNS).

123. See also EDPB, ‘Guidelines 8/2020 on the targeting of social media users’ (version 2.0), paragraph 11.

124. See also Joseph Turow, The Daily You (Yale University Press 2011), pp. 88 et seq.; and Oscar H. Gandy, The Panoptic Sort (2nd edition, Oxford University Press 2021, first published 1993).

125. David Caplovitz, The Poor Pay More (The Free Press 1963). See also Philip Kotler, Hermawan Kartajaya & Iwan Setiawan, Marketing 3.0 (Wiley 2010), p. 105.

126. See, for instance, Cass R. Sunstein, Simpler (Simon & Schuster 2013), p. 48.

127. Regulation (EU) 2018/302 of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market. See also Article 20 of Directive 2006/123/EC on services in the internal market.

128. See also Frederik Zuiderveen Borgesius & Joost Poort, ‘Online Price Discrimination and EU Data Privacy Law’, Journal of Consumer Policy, 2017, pp. 347–366, suggesting that data protection law could ‘play a significant role in mitigating any adverse effects of personalized pricing’.

129. See, for instance, Joseph Turow, The Aisles Have Eyes (Yale University Press 2017), pp. 239–240; and Zygmunt Bauman & David Lyon, Liquid Surveillance (Polity Press 2013), p. 128.

130. See also Jonathan Zittrain, The Future of the Internet (Yale 2008), p. 204.

131. Directive 98/6/EC of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers.

132. Article 29 Data Protection Working Party, ‘Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679’, WP251rev.01 (adopted on 3 October 2017 and revised 6 February 2018), p. 22.

133. Directive 2006/123/EC of 12 December 2006 on services in the internal market.

134. Regulation (EU) 2018/302 of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market.

135. Directive 2011/83/EU of 25 October 2011 on consumer rights, Article 6(1)(ea).

136. Not insinuating that there is anything wrong in men wearing dresses; just assuming that more women than men wear dresses. In that vein it should also be noted that ‘gender’ is not sensitive data within the meaning of Article 9 GDPR.

137. EDPB-EDPS Joint Opinion 5/2021 of 18 June 2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act).

138. See also Frederik J.Z. Borgesius, Improving Privacy Protection in the Area of Behavioural Targeting (Wolters Kluwer 2015).

139. <https://www.forbrugerombudsmanden.dk/nyheder/forbrugerombudsmanden/pressemeddelelser/2020/markedsfoering-af-laan-til-hasardspillere-var-ulovlig/>.

140. Case C‑4/73, Nold v. Commission, ECLI:EU:C:1974:51.

141. Proposal for a regulation on a Single Market For Digital Services and amending Directive 2000/31/EC, 15 December 2020, COM(2020) 825 final, 2020/0361(COD).

142. Explanations Relating to the Charter of Fundamental Rights, Official Journal, 2007, C 303/02.

143. Case C‑544/10, Deutsches Weintor, ECLI:EU:C:2012:526, paragraph 54 with references. See also Case C‑292/97, Karlsson and Others, ECLI:EU:C:2000:202, paragraph 45, with reference to Case C‑5/88, Wachauf v Bundesamt für Ernährung und Forstwirtschaft, ECLI:EU:C:1989:321, paragraph 18.

144. See, e.g., Case C‑12/11, McDonagh, ECLI:EU:C:2013:43, paragraph 61.

145. Article 54 of the Charter.

146. Case C‑12/11, McDonagh, ECLI:EU:C:2013:43, paragraph 63. An obligation to take care of air passengers in the event of cancellation of a flight due to ‘extraordinary circumstances’ (in Regulation (EC) No 261/2004, recognising Article 38 of the Charter and Article 169 TFEU) was not a disproportionate interference with the right to conduct a business. See also Case C‑281/09, Commission v Spain, ECLI:EU:C:2011:767, paragraph 33; and Case C‑157/14, Neptune Distribution, ECLI:EU:C:2015:823, paragraph 73.

147. Case C‑275/06, Promusicae, ECLI:EU:C:2008:54, paragraphs 63 and 65. Concerning the communication of personal data (infringing Article 8 of the Charter) to enable copyright holders to bring civil proceedings to ensure the protection of copyright (envisaged in Article 17 of the Charter).

148. Case C‑544/10, Deutsches Weintor, ECLI:EU:C:2012:526, paragraphs 44 and 45. The right to conduct a business (Article 16 of the Charter) was not disproportionately limited by the prohibition of the use of certain health claims (recognising that health is protected under Article 35 of the Charter).

149. Case C‑275/06, Promusicae, ECLI:EU:C:2008:54, paragraph 68; and Case C‑544/10, Deutsches Weintor, ECLI:EU:C:2012:526, paragraphs 47 and 59 with references.

150. Case C‑275/06, Promusicae, ECLI:EU:C:2008:54, paragraph 68. See also Case C‑623/17, Privacy International, ECLI:EU:C:2020:790, paragraph 67 with references.

151. Case C‑281/09, Commission v Spain, ECLI:EU:C:2011:767, paragraphs 33 and 49.

152. See Case C‑544/10, Deutsches Weintor, ECLI:EU:C:2012:526, paragraphs 57–58; and Case C‑157/14, Neptune Distribution, ECLI:EU:C:2015:823, paragraph 71.

153. Case C‑157/14, Neptune Distribution, ECLI:EU:C:2015:823, paragraphs 64–65 with references.

154. Case C‑4/73, Nold v. Commission, ECLI:EU:C:1974:51.

155. Explanations Relating to the Charter of Fundamental Rights, Official Journal, 2007, C 303/02. See also Case C‑426/11, Alemo-Herron and Others, ECLI:EU:C:2013:521, paragraph 32 with references.

156. BJ Fogg, Persuasive Technology (Morgan Kaufmann 2003), pp. 220–221. Similarly, Fernando Bermejo, ‘Online Advertising as a Shaper of Public Communication’, in Rikke Frank Jørgensen (ed.), Human Rights in the Age of Platforms (The MIT Press 2019), chapter 5, pp. 129–130, suggests focusing on ‘companies, content, and users’ ‘to explore some of the consequences of the advertising model prevalent on the social web’.

157. Case C‑36/02, Omega, ECLI:EU:C:2004:614, paragraph 36.

158. See in general Barry Schwartz, The Paradox of Choice (Ecco 2016, first published 2004) with reference to studies by Sheena Iyengar & Mark Lepper. See, for instance, Sheena S. Iyengar & Mark R. Lepper, ‘Rethinking the value of choice: A cultural perspective on intrinsic motivation’, Journal of Personality and Social Psychology, 1999, pp. 349–366; and Sheena S. Iyengar & Mark R. Lepper, ‘When Choice Is Demotivating: Can One Desire Too Much of a Good Thing?’, Journal of Personality and Social Psychology, 2000, pp. 995–1006.

159. Herbert A. Simon, ‘Designing Organizations For An Information-Rich World’, in Martin Greenberger (ed.), Computers, communications, and the public interest (The Johns Hopkins Press 1971).

160. Douglas Rushkoff, Program or Be Programmed (Soft Skull Press 2010), p. 58.

161. Adam Smith, An Inquiry into the Nature and Causes of the Wealth of Nations (1776). See also Adam Smith, The Theory of Moral Sentiments (1759).

162. Yuval Noah Harari, Homo Deus (Harper 2017), p. 402.

163. Seth Godin, All Marketers are Liars (Portfolio 2009, first published 2005), p. 123.

164. Council Resolution of 14 April 1975 on a preliminary programme of the European Economic Community for a consumer protection and information policy and Preliminary programme of the European Economic Community for a consumer protection and information policy, Official Journal, C 92, 25 April 1975, pp. 1–16, paragraphs 8 and 30.

165. European Commission, ‘New Consumer Agenda’, 13 November 2020, COM(2020) 696 final, point 3.2 on ‘Digital Transformation’.

166. See, e.g., recitals 47 and 50 GDPR.

167. Eric K. Clemons, New Patterns of Power and Profit (Springer 2019), p. 233.

168. Case C‑628/17, Orange Polska, ECLI:EU:C:2019:480, paragraph 36 with reference to Case C‑54/17, Wind Tre, EU:C:2018:710, paragraph 54.

169. EDPB, ‘Guidelines 05/2020 on consent under Regulation 2016/679 (version 1.1)’, paragraph 5: ‘Even if the processing of personal data is based on consent of the data subject, this would not legitimise collection of data, which is not necessary in relation to a specified purpose of processing and be [sic] fundamentally unfair’.

170. Articles 5(1)(c) and (d) GDPR, respectively.

171. Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), p. 258.

172. See <https://spreadprivacy.com/duckduckgrowing/>.

173. Article 20 GDPR provides for a right to data portability, which for data relating to social connections may be limited by the requirement that data portability may not ‘adversely affect the rights and freedoms of others’.

174. Chris Anderson, Free (Hyperion 2009).

175. See, e.g., European Commission’s communication ‘On the European democracy action plan’, 3 December 2020, COM(2020) 790 final.

176. Jamie Bartlett, The People Vs Tech (Ebury Press 2018), p. 220.

177. George A. Akerlof & Robert J. Shiller, Phishing for Phools (Princeton University Press 2015), p. 152.

178. James Williams, Stand Out of Our Light (Cambridge University Press 2018), p. 45.

179. Barry Schwartz & Kenneth Sharpe, Practical Wisdom (Riverhead 2010), p. 195.

180. See, e.g., Eric K. Clemons, New Patterns of Power and Profit (Springer 2019).

181. See also Pinar Akman, ‘A Web of Paradoxes: Empirical Evidence on Online Platform Users and Implications for Competition and Regulation in Digital Markets’, Virginia Law & Business Review, 2022 (forthcoming).

182. See also Tristan Harris, ‘Written Statement of Tristan Harris (Center for Humane Technology) to United States Senate’, 2021: ‘We face genuine existential threats that require urgent attention’.

183. Tim Hwang, Subprime Attention Crisis (FSG Originals 2020).

184. Roger Brownsword, ‘From Erewhon to AlphaGo: for the sake of human dignity, should we destroy the machines?’, Law, Innovation and Technology, 2017, pp. 117–153. See also Roger Brownsword, Law 3.0 (Taylor & Francis 2020).

185. James Williams, Stand Out of Our Light (Cambridge University Press 2018), p. 102.

186. Recital 2 GDPR. Emphasis added.

187. Similarly, the prohibition of unsolicited commercial e-mail is also intended to secure the stability of ‘electronic communications networks and terminal equipment’, cf. recital 40 of the ePrivacy Directive.

188. Peter Hulsrøj & Marco Aliberti, Essays on the Optional Society (Ex Tuto 2021), p. 285.

189. European Commission, ‘Report on the safety and liability implications of Artificial Intelligence, the Internet of Things and robotics’, 19 February 2020, COM(2020) 64 final; Council of Europe, Declaration on the manipulative capabilities of algorithmic processes, 13 February 2019; and EDPB, ‘Guidelines 8/2020 on the targeting of social media users’ (version 2.0), paragraph 13.

190. Nassim Nicholas Taleb, Antifragile (Random House 2012), p. 98.

191. Jaron Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now (Henry Holt 2018), p. 75.

192. Jamie Bartlett, The People Vs Tech (Ebury Press 2018), p. 37.

193. Directive (EU) 2019/770 of 20 May 2019 on certain aspects concerning contracts for the supply of digital content and digital services.

194. Jamie Bartlett, The People Vs Tech (Ebury Press 2018), p. 2.

195. Norbert Reich et al., European Consumer Law (2nd edition, Intersentia 2014), p. 7.

196. Jon Elster, Sour Grapes (Cambridge University Press 2016, first published 1983), pp. 91 et seq.

197. Neil Postman, The End of Education (Vintage 1995), p. 71, describing the American Constitution as a ‘hypothesis’.

198. See Nassim Nicholas Taleb, Antifragile (Random House 2012).

199. See, e.g., Eli Pariser, The Filter Bubble (Penguin 2011) and Roger McNamee, Zucked (Penguin 2019). See also Eric K. Clemons, New Patterns of Power and Profit (Springer 2019), p. 196: ‘The major democracies will figure out that they cannot allow search engines to plunder from mainstream journalism or destroy investigative journalism.’

200. Frank Pasquale, The Black Box Society (Harvard University Press 2015) and Danah Boyd, It’s Complicated (Yale 2014).

201. Natascha Just & Michael Latzer, ‘Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet’, Media, Culture & Society, 2016, pp. 238–258.

202. Alan F. Westin, Privacy and Freedom (Atheneum 1967), p. 57. See also Bruce Schneier, Data and Goliath (W. W. Norton & Company 2015).

203. Cass R. Sunstein, ‘Fifty Shades of Manipulation’, Journal of Marketing Behavior, 2016, pp. 213–244, p. 219.

204. Keach Hagey & Jeff Horwitz, ‘Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead’, The Wall Street Journal, 15 September 2021.

205. As famously said by Facebook’s first research scientist, Jeff Hammerbacher: ‘The best minds of my generation are thinking about how to make people click ads […] That sucks.’ Bloomberg Businessweek, April 2011. See also James Williams, Stand Out of Our Light (Cambridge University Press 2018), p. 30; and Shoshana Zuboff, Surveillance Capitalism (Profile Books 2019), p. 189, mentioning ‘a “missing generation” of data scientists’.