
“Are these data and privacy protection regulations serious, or are they just for show?” I’ve been hearing that question lately from the tech reporters and journalists who’ve been contacting me. Even after I point them to extensive case files and other documented incidents on government and legal sites, I’m still left with the feeling that it’s just not proof enough for them.

Fate has finally intervened.

With the EU regulators’ complaint against Google’s privacy policies reaching a conclusion, I now have a teachable moment to convince the naysayers that this stuff is serious business.

When Google changed its privacy terms in early 2012, EU regulators were also looking at the fine print. Google may have thought it was making life easier for consumers with a single policy covering all its web services, but others felt a bit differently. The Article 29 Working Party, which advises the EU Commission on its data security and privacy rules (contained in the Data Protection Directive, or DPD), filed a complaint against Google in late 2012 and addressed a letter to Mr. Page.

In so many words, the Article 29 folks said the search engine company had not done enough to follow DPD rules on consumer privacy.

Security experts, compliance gurus, CIOs, and other interested players would normally have to get the real story about this intersection of legal and tech from niche publications, the back pages of certain business sections, or perhaps the blog of a major data governance player. But since this is Google, and the EU appears willing to go to the mat on this one (in other words, there will be fines), the story is moving up in importance and appearing more prominently in the business sections of mainstream publications.

You can read the regulators’ report to learn about the long list of Google’s privacy shortcomings, which are conveniently bold-faced. I offer a few of their choice phrases: “no valid consent”, “incomplete or approximate information”, and “retention periods must be appropriate in regards to the purpose.”

Whoa! The EU (technically the individual national data protection authorities, led by France’s CNIL) will fine a major American online service provider over its … data retention policy?

Of course, having data retention policies and procedures in place (what to keep, what to archive) is just IT common sense. But you’re probably thinking that just because an organization doesn’t have explicit data retention or migration plans doesn’t mean it has broken the law.

Actually, it’s not only the EU that takes this IT procedure seriously. Data retention limits also show up in the US’s HIPAA rules for personal health data and in some financial data security regulations. But those limits, usually measured in years, set the minimum amount of time an electronic document must be kept.

The EU, though, views data collection and retention with a goal of “data minimization” in mind: companies should store the minimum amount of personal data and limit how long they keep it to what “must be appropriate in regards to the purpose”. That’s essentially the language of the DPD. In other words, you can’t keep personal consumer data unless there’s a legitimate business reason; you have to say what that reason is, and you have to say how long you’re going to keep it.
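To make that concrete, here’s a minimal sketch in Python of what declaring a purpose and a retention period for each category of personal data might look like. The categories, purposes, and retention periods below are entirely hypothetical illustrations, not anything taken from Google’s policies or the regulators’ report.

```python
from datetime import datetime, timedelta

# Hypothetical retention schedule: every category of personal data gets a
# stated purpose and an explicit retention period -- the "say why, and say
# how long" that the data-minimization principle asks for.
RETENTION_POLICY = {
    "search_history":  {"purpose": "improve search relevance", "keep_days": 180},
    "account_profile": {"purpose": "maintain the user's account", "keep_days": 365},
    "ad_click_logs":   {"purpose": "bill advertisers", "keep_days": 90},
}

def is_past_retention(category, collected_on, now=None):
    """Return True if a record in this category should be deleted or anonymized."""
    now = now or datetime.utcnow()
    policy = RETENTION_POLICY.get(category)
    if policy is None:
        # No declared purpose: under data minimization there is no basis to keep it.
        return True
    return now - collected_on > timedelta(days=policy["keep_days"])

# A search-history record collected a year ago is past its hypothetical 180-day limit.
print(is_past_retention("search_history", datetime(2012, 1, 15), now=datetime(2013, 1, 15)))
```

The point of the sketch is simply that the policy has to be explicit and machine-checkable: every category has a stated purpose and a clock, and anything without one has no basis for being kept.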

According to France’s CNIL, Google has to date refused to provide any information about its data retention policies, despite being asked to do so.

And the EU Commission has been very clear that there will be consequences for not following its rules. How bad could the fines be for willfully or negligently violating the DPD? The head of the Commission has suggested they could run as high as 2% of global sales.

Last year Google earned revenues of over $45 billion. You do the math on what not taking data compliance regulations seriously could cost.
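Or, as a rough back-of-the-envelope sketch (assuming, purely for illustration, that the 2% cap were applied to that full revenue figure):

```python
# Purely illustrative worst-case math: 2% of roughly $45 billion in annual revenue.
annual_revenue = 45_000_000_000   # ~$45B, the figure cited above
max_fine_rate = 0.02              # the 2% of global sales being floated
print(f"${annual_revenue * max_fine_rate:,.0f}")   # prints $900,000,000
```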

Image credit: Dschwen
