Your Privacy Is Important to Us! – Restoring Human Dignity in Data-Driven Marketing

Foreword by Eric K. Clemons

Preface, Acknowledgements and Abbreviations

Bibliography


PART I – INTRODUCTION


1. Why this book?
    (#methodology #delimitations #structure)


2. Data-Driven Business Models
    (#surveillancecapitalism #valueextraction #harm)

PART II – LAW


3. Regulating Markets
    (#law #markets #architecture #consumerprotection)


4. Data Protection Law
    (#gdpr #personaldata #lawfulprocessing #legitimatebasis)


5. Marketing Law
    (#ucpd #professionaldiligence #averageconsumer)

PART III – PSYCHOLOGY AND TECHNOLOGY  


6. Human Decision-Making
    (#agency #psychology #boundedrationality #willpower)


7. Persuasive Technology
    (#technology #choicearchitecture #friction #prompts)


8. Manipulation
    (#coercion #deception #subliminalmarketing #paternalism)


9. Transparency
    (#information #communication #complexity #asymmetry)

PART IV – SOCIETY


10. Human Dignity and Democracy
      (#humanwellbeing #privacy #discrimination #proportionality)


PART V – CONCLUSIONS AND BEYOND


11. Conclusions
      (#humandignity #datadrivenmarketing #beinghuman)


12. Next Steps
      (#action #conversations #future)

CHAPTER TWELVE

Next Steps

#action  #conversations  #future

The aim of this chapter is to sketch out some ideas for actions and conversations that this book may give rise to. The legal framework is likely to play an important role, but technology, education and conscious use of technology may also be key to ensuring empowerment, transparency and human dignity, including human well-being.

1. Effective enforcement

Yogi Berra supposedly said that ‘in theory there is no difference between theory and practice—in practice there is.’1 This is also true for law, as it rests on the assumptions that people will feel compelled to comply with the law and that there is effective enforcement against offenders.

Personalised content may have important implications for regulatory supervision when, for example, a website is dynamically designed for each visit and each visitor. Information about personalisation can be obtained by analysing the tools that traders offer to third parties, such as Facebook’s Detailed Targeting, introduced in Chapter 2 (data-driven business models).2 Detailed Targeting allows traders to refine the group of people advertising is shown to, based on, among other things:3

  • Ads they click.

  • Pages they engage with.

  • Activities they engage in on Facebook related to things like their device usage and travel preferences.

  • Their demographic profile, including such things as age, gender and location.

  • The mobile device they use and the speed of their network connection.

Connections Targeting allows for the targeting of not only the trader’s connections on Facebook, but also people who are ‘friends’ with someone who has engaged with the trader’s Page, app and/or event. For traders’ personalisation that is not offered to third parties, law enforcers may have to rely on whistleblowers.4
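
To make the supervisory task concrete, the sketch below shows one way an enforcement tool might record and review a trader’s targeting parameters for sensitive categories. It is purely illustrative: the field names, categories and values are invented for this example and do not reproduce Facebook’s actual advertising interface.

```python
# Illustrative sketch (not Facebook's actual advertising interface) of how an
# enforcement tool might record and review the targeting parameters a trader
# has used. All field names and values are invented for this example.

SENSITIVE = {"health", "religion", "political_views", "sexual_orientation"}

campaign = {
    "demographics": {"age_range": (18, 24), "gender": "female", "location": "Dublin"},
    "behaviours": ["engaged_with_page", "clicked_travel_ads", "slow_connection"],
    "interests": ["fitness", "health"],          # 'health' is flagged below
    "connections": "friends_of_page_followers",  # cf. Connections Targeting
}

def flag_sensitive(spec: dict) -> list:
    """Collect every targeting value that falls within a sensitive category."""
    hits = []
    for section, values in spec.items():
        items = values if isinstance(values, (list, tuple)) else [values]
        hits += [f"{section}: {v}" for v in items if str(v) in SENSITIVE]
    return hits

print(flag_sensitive(campaign))  # -> ['interests: health']
```

A tool of this kind would only see what traders expose to third parties; as noted above, personalisation that is not offered to third parties may still require whistleblowers to come to light.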

Even though the GDPR has applied since 25 May 2018, major issues—such as whether Article 22 is a prohibition or a right and the extent to which data-driven marketing, including targeted advertising, is lawful—have not been resolved before the CJEU.

The draft decision of 6 October 2021 from the Irish Data Protection Commission (DPC), mentioned in Chapter 4 (data protection law), followed an inquiry begun more than three years earlier (20 August 2018). The case was handled by the DPC, as ‘lead supervisory authority’, because Facebook is established in Ireland.5

According to Article 65 GDPR, the EDPB must adopt a ‘binding decision’ in case of relevant and reasoned objections to the lead supervisory authority’s draft decision from other supervisory authorities. Subsequently, the DPC must adopt its final decision on the basis of the EDPB’s binding decision, which is likely to be adopted in early 2022. If the case goes to court, it is not unlikely that the process will take around five years in total. It may only add insult to injury if a fine is eventually paid to the Irish state for harm felt across the entire European single market.

It has been pointed out that the Irish Data Protection Commission6 constitutes a bottleneck to efficient enforcement.7 This Irish authority is the dominant ‘lead supervisory authority’ of the EU because large companies are established in Ireland. From this Member State, the companies have access to the European single market and can rely on its principles, including the free movement of services.

Companies established in Ireland include Adobe, Alphabet (including Google and YouTube), Apple, Dropbox, eBay, Meta (including Facebook, Instagram, Oculus and WhatsApp), Microsoft (including LinkedIn), Oracle, Salesforce, Shopify, TikTok and Twitter. Other Member States where large companies are established include the Netherlands (Cisco, Netflix, Tesla, Uber and Zoom), Luxembourg (Amazon and PayPal) and France (IBM).

1.1. Effective, proportionate and dissuasive fines

Fines under EU law must be effective, proportionate and dissuasive, and the GDPR provides for administrative fines ‘up to’ €20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher. A similar approach to penalties was introduced into the UCPD, but notably only for ‘enforcement measures in coordinated actions’.8
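
The mechanics of this ‘whichever is higher’ ceiling can be shown with a short calculation. The sketch below is a minimal illustration of the Article 83(5) GDPR cap; the turnover figures are invented for the example and are not actual company data.

```python
# Article 83(5) GDPR: administrative fines of up to EUR 20 million or, for an
# undertaking, up to 4% of total worldwide annual turnover of the preceding
# financial year, whichever is higher. Turnover figures below are illustrative.

def max_gdpr_fine(worldwide_turnover_eur: float) -> float:
    """Upper bound of an Article 83(5) fine for an undertaking, in euros."""
    return max(20_000_000, 0.04 * worldwide_turnover_eur)

for turnover in (100e6, 1e9, 100e9):  # small, mid-size and 'big tech' scale
    cap = max_gdpr_fine(turnover)
    print(f"turnover EUR {turnover:>15,.0f} -> maximum fine EUR {cap:>14,.0f}")
```

For undertakings with turnover below €500 million, the flat €20 million is the binding ceiling; beyond that point the cap grows linearly with turnover, which is where the concerns about operating leverage discussed below come in.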

In this book we have mentioned Facebook (now Meta) being ordered to pay a $5 billion penalty (for the violation of a settlement with the U.S. Federal Trade Commission); and in September 2021 (inquiry sent on 10 December 2018), WhatsApp (owned by Meta, then known as Facebook) was fined €225 million by the DPC for lack of transparency in the processing of personal data.9 In its draft decision, the DPC had suggested a fine of €30–50 million.10 Fines are often announced and perceived in absolute terms, and it may be difficult for our linear-thinking brains to grasp the enormous difference between millions and billions: one million seconds is the equivalent of 11.57 days; one billion seconds is the equivalent of 31.71 years.
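
The figures can be checked with two lines of arithmetic (assuming 365-day years):

```python
# Converting seconds to more graspable units (365-day years assumed).
SECONDS_PER_DAY = 24 * 60 * 60                  # 86,400 seconds
print(1_000_000 / SECONDS_PER_DAY)              # ~11.57 days
print(1_000_000_000 / SECONDS_PER_DAY / 365)    # ~31.71 years
```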

As the fine provision is drafted, it provides both an absolute and a relative anchor, which makes €225 million appear to be a significant fine. However, if we compare this amount to the estimated $8–12 billion that Google pays to Apple every year for being the default option on Apple products, these fines may seem less extreme. By focusing on turnover in the calculation of fines, there may be a risk of providing a discount to companies with a high ‘degree of operating leverage’, i.e. a cost base that grows little as revenue grows. Due to automation and scalability, data-driven business models are particularly suitable for leveraging both revenue and impact with relatively little in the way of personnel and technology.

The 4% cap is advantageous to companies with high profit margins, for whom the fines may not be effective, proportionate and dissuasive. Also, the fine level does not directly account for the time aspect of enforcement, and the lack of case law on data-driven business models may serve as a testament to how slowly law can work in practice. Turnover and profits for ‘big tech’ can easily be researched by using some of the otherwise excellent products they offer.

In a classic study, a monetary fine was introduced to parents arriving late to collect their children from some Israeli day-care centres.11 The counter-intuitive result was that the number of tardy parents increased significantly, which may be explained by the moral obligation being replaced with a price, i.e. a straightforward financial calculation.12 Subsequently removing the fine did not reduce the number of late-comers from the new high level. Traders are usually less emotional than human beings and better at understanding risks.

2. Is there hope in marketing?

Astrid Lindgren expressed in 1947 through the voice of Pippi Longstocking that ‘if you are very strong, you must also be very kind’.13 Kindness may hold value in a business context. Cialdini suggests using his levers of influence in accordance with ‘the truth’; and as expressed by marketing guru Seth Godin:14

‘For me, marketing works for society when the marketer and consumer are both aware of what’s happening and are both satisfied with the ultimate outcome.’15

This may, however, be challenged by traders’ legitimate pursuit of profits16 and the notorious ambiguity of ethical standards. As Yuval Noah Harari writes:

‘It is dangerous to trust our future to market forces, because these forces do what’s good for the market rather than what’s good for humankind or for the world. The hand of the market is blind as well as invisible, and left to its own devices it may fail to do anything at all about the threat of global warming or the dangerous potential of artificial intelligence.’17

This concern may be understood by looking at the tobacco and oil industries, including initiatives to make their products (appear) safer and healthier. The Irish Council for Civil Liberties has taken steps to sue the online advertising industry to stop online tracking for marketing purposes, calling it ‘the world’s biggest data breach’.18

Just as this book was going to press (November 2021), Facebook announced that they would shut down their Face Recognition system and delete more than a billion people’s individual facial recognition templates.19 This decision was taken in the wake of the ‘Facebook Files’ leak, and apparently was prompted not by a felt moral obligation but by ‘growing societal concerns’ and a lack of ‘clear rules’.

As observed by Theodore Roosevelt in the context of consumer protection:

‘The “let the buyer beware” maxim, when translated into actual practice, whether in law or business, tends to translate itself further into the seller making his profit at the expense of the buyer, instead of by a bargain which shall be to the profit of both.’20

There is a lot of important cultural value for society in creative marketing, i.e. ‘creative’ as in art, and not as in accounting. We use the word ‘con man’ for a person using confidence tricks for fraudulent purposes, and in a modern context ‘con’ may also refer to consent.

3. Humane technology and human welfare computing

There are basically two ways in which computers may add to human well-being: indirectly, by increasing our ability to produce other goods and services, and directly, by making us smarter or even happier.21 As we have discussed, technology can also be used in ways that entail personal, social and societal harm, which is likely to diminish human well-being.

To combat superficiality and strengthen social cohesion, computers are likely to be a better tool than law.22 Knowledge about human decision-making and persuasive technology can be utilised to design technology that works for the user and protects human dignity. This may include technology that uses prompts and friction to ensure transparency, empowerment and human well-being, including by augmenting human capabilities, supporting deeper social connections and nudging users to take breaks from the use of technology.23
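
To make this less abstract, the sketch below shows one way such a break-nudge could work: after a chosen period of continuous use, the application pauses and asks for a deliberate decision. It is a hypothetical illustration; the threshold and the wording are assumptions, not recommendations drawn from the literature cited in this chapter.

```python
# Hypothetical sketch of user-protective friction: after a chosen threshold of
# continuous use, the application interrupts and asks for a deliberate choice.
# The 20-minute threshold and the prompt wording are assumptions.
import time

BREAK_AFTER_SECONDS = 20 * 60  # assumed threshold: 20 minutes

class BreakPrompt:
    def __init__(self) -> None:
        self.session_start = time.monotonic()

    def maybe_interrupt(self) -> bool:
        """Return True if the user chooses to take a break."""
        if time.monotonic() - self.session_start < BREAK_AFTER_SECONDS:
            return False  # below the threshold: no friction yet
        answer = input("You have been scrolling for 20 minutes. Take a break? [y/N] ")
        self.session_start = time.monotonic()  # restart the clock either way
        return answer.strip().lower() == "y"
```

The point of the design is that the default remains the user’s own choice: the prompt adds a moment of reflection without coercing the outcome.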

This is well aligned with the European Commission’s overall digital strategy, which seeks to promote technology that works for people.24 As discussed in this book, the aim must be to ensure digital architectures that preserve the user’s goals, values and preferences and provide for respectful defaults and nudges.

Users have access to various tools that may enhance their privacy online, and the first step must be to create awareness not only about surveillance and behaviour modification capabilities, but also, in particular, about how storytelling and digital architectures are likely to affect human dignity, including agency and human well-being.

Black box algorithms challenge democratic oversight as well as individual consumers’ ability to compare market conditions. Yet technology may also be used to enhance consumers’ ability to unite, and ‘victory’ may go to those who cooperate better with one another.25 One could imagine technology that would gather and compare algorithmic output signalled to individual users.26 The ability to compare personalised virtual realities would likely empower users, provided friction is overcome.
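
One way to imagine such a tool is a pooled ‘donation’ of algorithmic output: volunteers report what an algorithm shows them, and the aggregate reveals differences no individual user could see alone. The sketch below is hypothetical; the users, products and prices are invented for illustration.

```python
# Hypothetical sketch: volunteers donate the offers an algorithm shows them,
# and pooling the reports makes personalisation visible. All data is invented.
from collections import defaultdict
from statistics import mean

reports = [
    {"user": "A", "product": "hotel_room_42", "price": 119.0},
    {"user": "B", "product": "hotel_room_42", "price": 149.0},
    {"user": "C", "product": "hotel_room_42", "price": 119.0},
]

by_product = defaultdict(list)
for report in reports:
    by_product[report["product"]].append(report["price"])

for product, prices in by_product.items():
    spread = max(prices) - min(prices)
    if spread > 0:
        print(f"{product}: prices differ by EUR {spread:.2f} "
              f"(mean EUR {mean(prices):.2f}) -- possible personalisation")
```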

A central question is the extent to which it should be the users’ obligation to defend themselves against unlawful and unethical commercial practices in the ‘coercive arms race’27 between traders and consumers.

4. Education

So long as we do not understand or care about the societal implications and personal consequences of data-driven business models, there is less economic incentive to abandon predatory business models. Information and education are included in the five basic consumers’ rights introduced in Chapter 3 (regulating markets). Like democracies, our brains are antifragile, and education, in the sense of the formation of character, personality, and acquisition of true knowledge, likes disorder.28

To support critical thinking we must reduce superficiality and shallowness, and also be aware that too-strict regulation may have a negative impact on the development of skills.29 And a central question is to what extent it is fair—taking overall expectations of citizens in a modern society into account—to assume that these citizens, in their role as consumers, exercise great care in extracting meaningful information from marketing; especially in cases where the trader has not exercised great care to convey meaningful information in a way that appeals to our capacity for reflection and deliberation.

Consumers may learn through experience about commercial practices and how they affect them,30 but learning to choose is not free, and traders may still benefit from those who have their first experience with a particular commercial practice.

The internet—amplified by social media—has spurred an unprecedented realisation of the freedom of expression. Everyone can publish their opinion regardless of underlying intentions. In contrast to the time when information in society was mediated by relatively few media outlets, the receiving end now has to do the filtering. This requires a similarly unprecedented capacity for critical thinking.

Critical thinking is an indispensable skill that all people must learn. As with consumer choice, it is important to identify one’s own goals, values and preferences and develop strategies and tactics to pursue them. This concerns inter alia how to use which technology, and why. This also concerns how to obtain information, how to be critical and how to challenge your own beliefs, as well as developing the skill to decide who should mediate one’s realities.

For schools, educators and students, it may be good to consider how the presence of screens affects cognitive capacity,31 bearing in mind that taking notes on laptops may result in shallower processing that impairs learning.32

As a parent, it may be difficult to know how to dispense technology exposure on a busy day when screen time may give relief. In deciding what to do on screens, it could make sense to focus on activities that challenge children without a constant drip of exaggerated rewards and without having to pay with privacy. For children, the most important learning objective may be acquiring the skills to be idle, present and creative without the guidance of a computer. And it may be helpful to notice that children, like all human beings, copy the behaviour of people they admire, and that we are all more susceptible to addictions when we are less fulfilled.

Tighter rules and regulations may be necessary but are ‘pale substitutes for wisdom’,33 which ‘is not fed upon a diet of distraction’.34

4.1. Stories

A good introduction to data-driven business models and surveillance capitalism is the documentary The Social Dilemma (2020).35 George Orwell’s Nineteen Eighty-Four (1949)36 and Aldous Huxley’s Brave New World (1932)37 also illustrate aspects of surveillance and addictions in dysfunctional societies. In Huxley’s world, ‘Soma’ is a mind-numbing narcotic that keeps citizens peaceful, and which could translate into ‘SoMe’ (as in ‘Social Media’) in our modern times. In the words of Edward Luce:38

‘There is no need to ban books if people are not reading them.’

The role of frames and storytelling is one of the most important points in this book. Compelling narratives do not have to be ‘true’ in a scientific sense to do their work,39 as illustrated in Dave Eggers’ novel The Circle (2013).40 In 1928 Edward Bernays expressed the following concern:

‘The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.’41

Everything we say is framed in some way, and ‘if we manage to have a different frame accepted into the discourse, everything you say is just common sense’.42 As Yuval Noah Harari writes:

‘The difficulty lies not in telling the story, but in convincing everyone else to believe it.’43

We need to understand the role of the stories we hear and tell ourselves and others. Stories may be challenged, developed and rejected, but this requires a capacity for critical thinking, which in itself is at risk in an age of attention, addiction, dopamine, loneliness and surveillance capitalism.

4.2. Conversations

Human beings are social animals. We learn from interactions with each other, and collaboration is important for community and civilisation.44 We may need to start conversations about a ‘human reclamation project’,45 while bearing in mind that ‘despite the amazing things that the web can do, it can’t provide leadership.’46 We could assume that our use of technology should be an informed, deliberate and rewarding choice. As famously expressed by Vinton Cerf, one of the fathers of the internet:47

‘The internet is a reflection of our society and that mirror is going to be reflecting what we see. If we do not like what we see in that mirror the problem is not to fix the mirror, we have to fix society.’

Conversations in real reality are easier to understand and less likely to be observed and manipulated by third parties, and they allow us to experience the breadth of communication and develop our capacity for empathy.48 However, when the degradation of social processes is experienced as a form of personal empowerment, it may feel like a restraint to be ‘truly social’.49 Timothy Snyder suggests that making eye contact and small talk is ‘part of being a citizen and a responsible member of society’.50 Clicking a ‘like button’ is not likely to change the world, and as observed by James Williams:

‘Future generations will judge us not only for our stewardship of the outer environment, but of the inner environment as well. Our current crisis does not only come in the form of rising global temperatures, but also in our injured capacities of attention.’51

4.3. Conscious and deliberate use of technology

There was a time when it was normal to smoke tobacco in restaurants, meeting rooms, television studios, auditoriums, etc. Smartphones, apps and pervasive internet access have become a standard ingredient in daily life for many people, and there are usually good reasons to be reachable. Given the distractive and addictive nature of many technologies, however, it could be helpful to consider and discuss how we use technology in public and private spaces, including in workplaces. As suggested by Barry Schwartz:

‘In circumstances like this, we should learn to view limits on the possibilities we face as liberating, not constraining.’52

For government and business, there is a huge efficiency potential in current digital transformations, but sometimes these advantages just shift costs to the user, who then becomes even more dependent on these technologies, including apps and devices. And to begin with, government and public institutions should—as the media industry has been forced to—consider their use of platforms that rely on surveillance and manipulation.

As one example, the Norwegian Data Protection Authority has decided not to use Facebook for their own communication work, arguing that ‘the risks to the data subjects’ rights and freedoms associated with the processing of personal data through a Page on Facebook are too high’, and that:53

‘By using the most popular tools available, free of charge, from the large technology companies, public agencies invite commercial actors to collect and use data about Norwegian citizens. At the same time, a relationship of dependency is created which it can be difficult to break free from as there are few alternative service providers.’

When citizens in reality are forced to use certain technologies, including apps and devices, it becomes more difficult to apply self-binding tactics against digital addiction. As Yuval Noah Harari explains, it was humans who were domesticated by crops, and not vice versa,54 and the same can possibly be said about humans and technology. And as Douglas Rushkoff asks:

‘The more we accept the screen as a window on reality, the more likely we are to accept the choices it offers. But what of the choices that it doesn’t offer? Do they really not exist?’55

The politicians asking for stricter regulation of big tech after each outrageous revelation of current business ethics could start by considering the role their efforts for re-election play in supporting certain technologies.

5. Who ‘owns’ your attention and agency?

Everyone who has lost a loved one will have felt the importance and scarcity of time and attention, and one of the most important tasks for modern human beings may be to take control over how attention is dispensed. This includes the recognition and development of our goals, values and preferences, as well as strategies and tactics to pursue them. If we do not take ownership of our attention, decisions and behaviour, someone else might. As Seneca the Younger observed some 2,000 years ago:56

‘There’s only one way to be happy and that’s to make the most of life.’



1. This is widely attributed to Berra online, but in fact, he never said it, <https://www.snopes.com/fact-check/practice-and-theory/>.

2. See also Julia Angwin, Surya Mattu & Terry Parris Jr., ‘Facebook Doesn’t Tell Users Everything It Really Knows About Them’, ProPublica, 27 December 2016.

3. Meta for Business (formerly Facebook for Business), <https://www.facebook.com/business/>.

4. See also Directive (EU) 2019/1937 of the European Parliament and of the Council of 23 October 2019 on the protection of persons who report breaches of Union law and the proposed Artificial Intelligence Act.

5. Article 56(1) GDPR provides that ‘[…] the supervisory authority of the main establishment or of the single establishment of the controller or processor shall be competent to act as lead supervisory authority for the cross-border processing carried out by that controller or processor […]’.

6. <https://www.dataprotection.ie/>.

7. Irish Council for Civil Liberties’ report on the enforcement capacity of data protection authorities: ‘Europe’s enforcement paralysis’ (September 2021), <https://www.iccl.ie/digital-data/2021-gdpr-report/>. The report suggests that almost all (98%) major GDPR cases referred to Ireland remain unresolved.

8. See the New Consumer Deal Directive and Regulation (EU) 2017/2394 of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws.

9. Following EDPB’s ‘binding decision 1/2021 on the dispute arisen on the draft decision of the Irish Supervisory Authority regarding WhatsApp Ireland under Article 65(1)(a) GDPR’, adopted 28 July 2021. See also <https://www.dataprotection.ie/en/news-media/press-releases/data-protection-commission-announces-decision-whatsapp-inquiry>.

10. See also <https://www.cnil.fr/en/cookies-financial-penalties-60-million-euros-against-company-google-llc-and-40-million-euros-google-ireland>, and <https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-penalty-50-million-euros-against-google-llc>.

11. Uri Gneezy & Aldo Rustichini, ‘A Fine Is a Price’, Journal of Legal Studies, 2000, pp. 1–17.

12. Barry Schwartz & Kenneth Sharpe, Practical Wisdom (Riverhead 2010), p. 191.

13. Astrid Lindgren, Do You Know Pippi Longstocking? (Oxford University Press 1947), quote from <https://www.astridlindgren.com/en/quotes> (visited December 2019).

14. See, e.g., Yoram Wind & Catharine Findiesen Hays, Beyond Advertising (Wiley 2016) and Seth Godin, This Is Marketing (Portfolio, Penguin 2018).

15. Seth Godin, This Is Marketing (Portfolio 2018), p. 247.

16. George A. Akerlof & Robert J. Shiller, Phishing for Phools (Princeton University Press 2015), p. 9: ‘It lies in the market idea that business will take the opportunities available.’

17. Yuval Noah Harari, Homo Deus (Harper 2017), p. 382.

18. <https://www.iccl.ie/news/press-announcement-rtb-lawsuit/>.

19. <https://about.fb.com/news/2021/11/update-on-use-of-face-recognition/>.

20. Theodore Roosevelt, An Autobiography (1913), Chapter III.

21. Jim Holt, When Einstein Walked with Gödel (FSG 2018), p. 203.

22. See also <https://www.humanetech.com/>.

23. Maggie Jackson, Distracted (Prometheus Books 2008), p. 91: ‘We need computers that sense when we are busy and then decide when and how to interrupt us.’

24. Communication from the Commission, ‘Shaping Europe’s Digital Future’, COM(2020) 67 final.

25. Yuval Noah Harari, Homo Deus (Harper 2017), p. 132.

26. See for inspiration: <https://www.propublica.org/article/breaking-the-black-box-what-facebook-knows-about-you>.

27. Douglas Rushkoff, Coercion (Riverhead 1999), p. 3.

28. Nassim Nicholas Taleb, Antifragile (Random House 2012), p. 422.

29. Barry Schwartz & Kenneth Sharpe, Practical Wisdom (Riverhead 2010), p. 12.

30. See about the role of education to enhance consumer confidence in Christian Twigg-Flesner, ‘The Importance of Law and Harmonisation’, in Dorota Leczykiewicz & Stephen Weatherill (eds), The Images of the Consumer in EU Law (Hart 2016), pp. 183–202, p. 201.

31. See, for instance, Adrian F. Ward, Kristen Duke, Ayelet Gneezy & Maarten W. Bos, ‘Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity’, Journal of the Association for Consumer Research, 2017, pp. 140–154.

32. See Pam A. Mueller & Daniel M. Oppenheimer, ‘The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking’, Psychological Science, 2014, pp. 1159–1168: ‘We show that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning.’

33. Barry Schwartz & Kenneth Sharpe, Practical Wisdom (Riverhead 2010), pp. 9 et seq.

34. Maggie Jackson, Distracted (Prometheus Books 2008), p. 260.

35. <https://www.thesocialdilemma.com/>.

36. George Orwell, Nineteen Eighty-Four (Secker & Warburg 1949).

37. Aldous Huxley, Brave New World (Everyman’s Library 2013, first published 1932).

38. Edward Luce, The Retreat of Western Liberalism (Little, Brown 2017), p. 128. See also Cal Newport, Deep Work (Grand Central 2016), p. 69.

39. Neil Postman, The End of Education (Vintage 1995).

40. Dave Eggers, The Circle (Knopf 2013).

41. Edward Bernays, Propaganda (Horace Liveright 1955, first published 1928), p. 37.

42. George Lakoff, [The All New] Don’t Think of an Elephant (Chelsea Green Publishing 2014, first published 2004), p. 160.

43. Yuval Noah Harari, Sapiens (Harper 2015), p. 31.

44. Sigmund Freud, Civilization and Its Discontents (Penguin Books 2014, first published 1930), p. 46: ‘The replacement of the power of the individual by that of the community is the decisive step towards civilization.’

45. Tim Wu, The Attention Merchants (Alfred A. Knopf 2017), p. 343. See also <https://www.thesocialdilemma.com/take-action/>.

46. Seth Godin, Tribes (Portfolio 2008).

47. <https://icannwiki.org/Vinton_Cerf>.

48. See also Michael J. Sandel, The Tyranny of Merit (Allen Lane 2020).

49. Douglas Rushkoff, Team Human (Norton 2019), p. 39.

50. Timothy Snyder, On Tyranny (The Bodley Head 2017), p. 81.

51. James Williams, Stand Out of Our Light (Cambridge University Press 2018), p. 127. See also David Wallace-Wells, The Uninhabitable Earth (Tim Duggan Books 2019).

52. Barry Schwartz, The Paradox of Choice (Ecco 2016, first published 2004), p. 239.

53. Datatilsynet (Norway), ‘Norwegian Data Protection Authority choose not to use Facebook’, 22 September 2021, <https://www.datatilsynet.no/en/news/2021/norwegian-data-protection-authority-choose-not-to-use-facebook/> with link to their risk assessment report. See also Article 26 GDPR on ‘joint controllers’ and Case C‑40/17, Fashion ID, ECLI:EU:C:2019:629; and Case C‑210/16, Wirtschaftsakademie Schleswig-Holstein, ECLI:EU:C:2018:388.

54. Yuval Noah Harari, Sapiens (Harper 2015), p. 79.

55. Douglas Rushkoff, Team Human (Norton 2019), p. 64.

56. Seneca, Letters from a Stoic (Penguin Classics 2014, translation from 1969), p. 274.