GDPR has started a privacy resurrection, but you have to claim it

Taylor Armerding
8 min read · Jun 8, 2020
Photo by Dayne Topkin on Unsplash

Perhaps you have heard: The resurrection of personal privacy is now two years old.

The second anniversary, May 25, of the European Union’s (EU) General Data Protection Regulation (GDPR) taking effect was marked with a small eruption of stories that might have been greater were it not for a global pandemic.

But the general theme was that while progress toward improved privacy is still incremental, it is indeed progress.

A joint statement issued by the EU framed it thus: “Within two years, these rules have not only shaped the way we deal with our personal data in Europe, but [have] also become a reference point at global level on privacy,” adding that the GDPR “ensures that citizens have more control over their personal data and sets at the same time a framework for trustworthy innovation.”

Well, perhaps. For the average user and observer, the progress might not be quite so apparent.

For average users, the most regular reminder of the GDPR is a pop-up every time they go to a website that tells them “This site uses cookies” (just to give them “the best experience,” of course) and that by continuing to use the site, they are consenting to the collection of their data.

Or it may have an “OK” button to consent and a “Learn More” button as another option, but rarely will they see a “Decline” button that has no effect on their use of the site.

Not what anyone would describe as “privacy by default.”

Perhaps part of what is in play here is right in the title of the law. It refers to data “protection,” not privacy.

Of course, you can’t have one without the other. If data isn’t protected, then it won’t be private for long. Still, while it may be worse to have your data stolen, it also isn’t private if it is being collected, stored and “shared” — as in, sold.

Privacy requirements

And there are specific privacy provisions within the GDPR. Among the things it requires of companies that collect personal data:

- Explain how they process data in “a concise, transparent, intelligible and easily accessible form, using clear and plain language.”

- Make it easy for people to make requests about the collection, storage and use of their data and “respond to those requests quickly and adequately.”

- State the purpose of processing data, and the length of time it will be held.

- Allow users to “correct inaccurate or incomplete personal data” that a company is processing.

- Delete any information about users at their request — commonly known as “the right to be forgotten.” There are five exemptions to this right.

So far, based on consumer response, there are many, many violations of these provisions. The EU reported in March 2019 that in less than a year it had logged 144,000 complaints.

About 37% were still pending at the time.

And what about bringing down the hammer on violators, especially the tech giants whose business models depend on vacuuming up user data and selling it? The threat was that penalties for violations could be as much as 4% of annual revenue.

The GDPR “enforcement tracker” has logged 282 fines in the past two years, although it says the list will never be complete since not all fines are made public. Many of them are in the $5,000 to $10,000 range.

The collective fines levied over the past two years amounted to about $520 million. The largest individual fine was $227 million, against British Airways. The second largest was $122.5 million, against Marriott International, and the third was $57 million, against Google.

Those are staggering amounts to most of us. But to corporate giants, not so much. The British Airways fine was about 1.5% of annual revenue. For Marriott, it was about half a percent.

And for Google, with estimated annual revenue of $160.7 billion, it doesn’t even qualify as a minor annoyance, at far less than a tenth of a percent. For context, that would be like a fine of $35 for somebody making $100,000 a year — in the range of a parking ticket.

Toothless enforcement?

Which is why at least some critics are calling the law toothless. The maximum 4% against Google would be about $6.4 billion — 112 times $57 million, and enough to start causing some pain.
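The arithmetic behind these comparisons is easy to check. A quick sketch, using the figures quoted above; note that the British Airways and Marriott revenue numbers are back-calculated from the article’s own percentages, not official filings:

```python
# Compare GDPR fines to annual revenue, using the article's figures (all USD).
# BA and Marriott revenues are rough back-calculations from the quoted
# percentages (~1.5% and ~0.5%); Google's revenue is as cited ($160.7B).
fines = {
    "British Airways": (227e6, 15e9),
    "Marriott": (122.5e6, 24.5e9),
    "Google": (57e6, 160.7e9),
}

for company, (fine, revenue) in fines.items():
    print(f"{company}: fine is {fine / revenue:.3%} of annual revenue")

# The GDPR allows penalties up to 4% of annual revenue.
# For Google, that ceiling dwarfs the actual fine.
max_fine = 0.04 * 160.7e9
print(f"4% cap for Google: ${max_fine / 1e9:.1f}B, "
      f"vs. the ${57e6 / 1e6:.0f}M actually levied")
```

Run as written, this reproduces the article’s ratios: roughly 1.5% for British Airways, 0.5% for Marriott, under 0.04% for Google, and a 4% cap of about $6.4 billion.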

But so far, it is apparently up to users to demand damages in that range. Reuters reported last week that Google has been sued in a proposed class action seeking at least $5 billion for allegedly “illegally invading the privacy of millions of users by pervasively tracking their internet use through browsers set in ‘private’ mode.”

The search giant’s response? Spokesman Jose Castaneda didn’t deny the data collection, but said it gave users notice.

“As we clearly state each time you open a new incognito tab, websites might be able to collect information about your browsing activity,” he told Reuters.

However that battle shakes out, legal experts say what may look like a light touch on fines may be in part because enforcement is still being sorted out.

As Patrick Van Eecke, chair of DLA Piper’s international data protection practice, put it in a quote on the company blog, “Ask two different regulators how GDPR fines should be calculated and you will get two different answers. We are years away from having legal certainty on this crucial question, but one thing is for certain, we can expect to see many more fines and appeals over the coming years.”

Also, some privacy experts say the effectiveness and impact of the GDPR should not be judged on a lack of mega-fines or what users see on websites.

Progress in the background

“That oversimplifies what anyone should have expected out of the GDPR,” said Joseph Jerome, a privacy attorney and director of multistate policy for Common Sense Media. “It’s probably fair to say that the most public manifestation of the GDPR has been a proliferation of consent pop-ups, but that ignores the back-end work that companies put into meeting the law.”

He added that the GDPR was never intended to “destroy adtech and put big tech platforms out of business overnight. It is also incorrect to say that GDPR always requires consent for (data) processing.”

But he said the GDPR and the California Consumer Privacy Act (CCPA), which took effect at the beginning of this year, have prompted companies to curb their data collection and storage.

“I know firsthand that a number of companies decided to reduce their data stores, discarding data they weren’t actually using, to avoid the costs of complying with access and deletion requests down the road. All of that is stuff that people don’t see but are real benefits from the GDPR,” he said.

Pam Dixon, executive director of the World Privacy Forum, said it is important to note that the GDPR is actually an update of the EU’s Data Protection Directive, which was 25 years old when the GDPR took effect.

“Ultimately, it was a modification of what already existed,” she said, adding that, as Jerome noted, much of the progress is behind the scenes. “While we may not see it from the outside, many businesses have had to tag and trace their data and data flows, some for the first time. That has improved data governance overall,” she said.

Dixon said the influence of the GDPR can also be seen in the proliferation of other privacy laws. “At last count, there were at least 142 jurisdictions that have passed meaningful national privacy laws that had some or many features of the GDPR. Even though not all of these statutes are GDPR-aware, many are. That is a lot of influence,” she said.

Not that the GDPR has spawned imitators everywhere. Dixon said there are many differences between the CCPA and GDPR, “and the CCPA has many difficulties, including its irregular legislative process.”

Slow going in the U.S.

And she acknowledged that in the U.S., it is slow going for privacy initiatives. “The Washington state privacy bill has failed twice now,” she said. “Of the many general privacy bills that were attempted in 2019, including at the federal level, impasses around pre-emption and enforcement, among other issues, prevented most bills from moving.”

Jody Westby, CEO of Global Cyber Risk, agrees that there are substantive differences between the GDPR and CCPA, but said both have “changed the privacy landscape for average people within the U.S.”

And she said while only Nevada and Maine have enacted state privacy laws, “about 12–15 state legislatures have similar privacy legislation.” She said consumers are now much more aware of their privacy rights and that “they can now demand companies implement the required privacy protections. The U.S. population is behind in this regard but will quickly catch up; there have been five CCPA class action lawsuits filed already.”

Regarding current enforcement of the GDPR, Jerome said it has been “slower than many anticipated. But in terms of effects, it’s not always the size of the fine that’s relevant. When DPAs [data protection authorities] fine local businesses a few thousand Euros for GDPR violations, that’s both appropriate and sends signals to businesses large and small.”

Dixon said she is in “wait-and-see” mode. She said the UK Information Commissioner’s Office (ICO) “is doing a great job creating codes of conduct under the auspices of the GDPR, and now UK law post Brexit.” And she said enforcement in France has been good as well.

“But generally, movement here has been quite slow. I’m not yet sure what to fully make of it,” she said.

Westby said the current level of penalties “doesn’t mean that it is a trend line to be counted on. It more likely reflects a gradual ramping up after GDPR implementation. The EU is fully capable of handing out stiff fines.”

Tim Mackey, principal security strategist at the Synopsys Cybersecurity Research Center (CyRC), said the GDPR does give power to consumers to control their privacy, but they need to take the initiative to reclaim it.

In a recent statement to eWeek, he said consumers should “go on the offensive and hold the people collecting data on us more accountable. Regulations like GDPR provide individuals the ability to request what data a company already has collected, but the fight to control data actually starts with its collection and not reviewing what is already out in the wild.”

“If more people asked their vendors or providers of services they’re subscribed to what data they collect, how it’s secured, how long it’s retained, precisely who it’s shared with, who has access to it and under what conditions, and how they would detect that someone accessed your data without proper authorization — then we’d start having consumers driving the agenda for data protection rather than being passive recipients of breach notifications containing offers of credit monitoring.”



Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.