Fear Has Misled Us To The GDPR

In case you’re unfamiliar with GDPR, it stands for General Data Protection Regulation, and it officially goes into effect today, May 25th, 2018, in the EU. Read this to get up to speed on GDPR before reading the rest of this post.

Happy GDPR Day! With all the news and notifications on GDPR lately, I’ve been thinking – where did all of this start? What is the root problem that has brought about this regulation, and what will GDPR do?

What is “the problem”?

In a nutshell, fear. We, as consumers, are scared of companies collecting our data and using it for purposes we don’t understand. This fear is driven by what we see happening to our online experiences (e.g., product-focused retailer ads following us around everywhere) and what we hear in the news (e.g., the Equifax data breach, the Cambridge Analytica scandal). In a vacuum, most people aren’t all that bothered by targeted advertising, especially once they understand that it’s mostly based on anonymized data and is relatively harmless (of course, for those who don’t understand, there is a higher level of paranoia). If users were truly concerned about targeted ads, ad blockers would have much greater adoption than they do.

So let’s dig deeper. If targeted ads don’t really bother us, what does? How did we get here? I believe it’s our deep-seated fear of what we think data collection can lead to. On a rudimentary level, it’s hacking, data breaches, and the sale of our personal information to annoying telemarketers. But on a deeper level, it’s fear of what AI may be capable of. Think HAL from 2001: A Space Odyssey, Big Brother from 1984, Ava from Ex Machina (although that guy had it coming). The fear has been instilled in us through pop culture, and we’re starting to see ominous signs in the real world (Google Duplex demo, anyone?). Is Alexa going to understand us so well that one day she’ll start subliminally controlling us? Is a self-driving Uber going to override where we want to go? Will our crypto accounts get hacked and all of our savings get cyber-robbed? Are we making ourselves vulnerable mentally, physically, and financially by allowing ourselves to be tracked, and then allowing rapidly improving AI technology to have at our data? For most people, buried deep down, this is really “the problem”. Targeted ads are just an easy-to-see, easy-to-explain, easy-to-blame scapegoat.

What will happen when the GDPR goes into effect?

  1. A majority will opt out: Under the GDPR, if users do not opt in to tracking, they will be opted out by default. Many users who choose to ignore the tracking prompts will fall into this bucket. And when faced with the binary choice of being tracked versus not being tracked, many, likely a majority, will choose not to be tracked. It’s just the easier thing to do, as there’s no way of truly knowing what happens with their data behind the scenes, and there’s fear of “the problem”. But in doing so, they’re not thinking about how…
  2. The rich will get richer: Many companies that rely heavily on user data for monetization, such as publishers and smaller adtech providers, will get hit hard, and some may go out of business (not good for the end consumer, as this will further reduce online choice and drive up prices). As a result, more dollars will go to the bigger companies (e.g., Facebook, Google, Amazon) that are better equipped, in many ways, to adapt to the regulation.
  3. Nothing will change: The entities that are able to stick it out will get a sense of how the regulation is being enforced and will start to uncover gray areas and workarounds in order to, once again, monetize and exploit user data. They will be the big, powerful companies that can pay their way through to the other side of the storm. Some will be bold enough to outright ignore the regulation, and they will not be penalized.
  4. Consumers will be worse off: With disingenuous, uninformed legislation that fails to recognize the downstream negative impact it will have, we will likely be worse off as consumers and still have no real protection against what we fear most.

So opting in and taking control is a better option, right?

  1. You won’t have much control: …even though the spirit of the law is to give users control over their own data and privacy. The GDPR is pretty much binary: either you opt in to being tracked, or you’re out, and the next thing you know your favorite blog is going bankrupt. What if you want your favorite blog to keep showing you targeted ads so it can make enough money to stay in business, but you don’t want it to be able to figure out where you live, because that’s where you draw the line? In most cases, you won’t be able to manage your opt-in settings at that level of detail, and even if you can, you likely won’t know the implications of your granular choices.
  2. What does control even mean?: At its core, the GDPR aims to protect us from “the problem” by giving control of data back to consumers, but consumers don’t know how their data can be used to help or hurt them, so giving them “control” feels like lip service. It creates the illusion of control by presenting the user with a few check boxes. Big tech knows this and will exploit it (just look at how Mark Zuckerberg ran circles around the US Senate).
  3. You’ll only be asked once: Once consent has been obtained, companies won’t come back to get it again every time they improve their data modeling or AI capabilities. What you give your data for today won’t be what it’s used for down the road.
  4. It’ll be too late: By the time AI has spiraled out of control to the point that we’re all living “the problem”, it will be too late. It’s not like the people who check the “yes, I am okay with being tracked” box will see the takeover of the robots coming 30 days out, promptly log into their privacy settings on all of their favorite sites, turn all tracking off, and be spared.

So sure, opting in might be a better option, to preserve competition in publishing and tech and hopefully keep the big companies honest, but it’s also just the status quo and doesn’t solve “the problem”.

So what?

If I haven’t made it clear already, I don’t think the GDPR is the solution to “the problem”. If anything, the GDPR hurts the tech industry, hurts consumers, diverts attention from the real issue at hand, and gives people a false sense of security. We need something better. I realize I’m not presenting a solution here (I will do so in a later post) and am only outlining the problem, but before offering up any answers it’s important to fully recognize what’s at stake. Not doing so is where I believe the GDPR tripped up. The first step is educating both consumers and governments about what technology is capable of doing with data, today and in the future. Equipped with that knowledge, consumers and governments need to be a lot more honest and explicit about what we’re trying to protect ourselves from.

One thing is for sure: it’s not targeted advertising.
