Category Archives: Privacy

[Podcast] Dr. Zinaida Benenson and the Human Urge to Click


Leave a review for our podcast & we'll send you a pack of infosec cards.

Dr. Zinaida Benenson is a researcher at the University of Erlangen-Nuremberg, where she heads the “Human Factors in Security and Privacy” group. She and her colleagues conducted a fascinating study into our spam clicking habits. Those of you who attended Black Hat last year may have heard her presentation on How to Make People Click on a Dangerous Link Despite their Security Awareness.

As we’ve already pointed out on the IOS blog, phishing is a topic worthy of serious research. Her own clever study adds valuable new insights. Benenson conducted an experiment in which she phished college students (ethically, but without their direct knowledge) and then asked them why they clicked.

In the first part of our interview with Benenson, we discuss how she collected her results, and why curiosity seems to override security concerns when dealing with phishing email. We learned from Benenson that hackers take advantage of our inherent curiosity, and this curiosity about others can override the analytic, security-aware part of our brain when we’re in a good mood!

So feel free to (safely) click on the above podcast to hear the interview.

[Podcast] Adam Tanner on the Dark Market in Medical Data, Transcript


This article is part of the series "[Podcast] Adam Tanner on the Dark Market in Medical Data".

Adam Tanner, author of Our Bodies, Our Data, has shed light on the dark market in medical data. In my interview with Adam, I learned that our medical records, principally drug transactions, are sold to medical data brokers who then resell this information to drug companies. How can this be legal under HIPAA without patient consent?

Adam explains that if the data is anonymized then it no longer falls under HIPAA’s rules. However, the prescribing doctor’s name is still left on the record that is sold to brokers.

As readers of this blog know, bits of information related to location, like the doctor’s name, don’t truly anonymize a record and can act as quasi-identifiers when associated with other data.

My paranoia was certainly in the red zone during this interview, and we explored what would happen if hackers or others could connect the dots. Some of the possibilities were a little unsettling.

Adam believes that by writing this book, he can raise awareness about this hidden medical data market. He also believes that consumers should be given a choice — since it’s really their data — about whether to release the “anonymized” HIPAA records to third parties.


Inside Out Security: Today, I’d like to welcome Adam Tanner. Adam is a writer-in-residence at Harvard University’s Institute for Quantitative Social Science. He’s written extensively on data privacy. He’s the author of What Stays In Vegas: The World of Personal Data and the End of Privacy As We Know It. His articles on data privacy have appeared in Scientific American, Forbes, Fortune, and Slate. And he has a new book out, titled “Our Bodies, Our Data,” which focuses on the hidden market in medical data. Welcome, Adam.

Adam Tanner: Well, I’m glad to be with you.

IOS: We’ve also been writing about medical data privacy for our Inside Out Security blog. And we’re familiar with how, for example, hospital discharge records can be legally sold to the private sector.

But in your new book, and this is a bit of a shock to me, you describe how pharmacies and others sell prescription drug records to data brokers. Can you tell us more about the story you’ve uncovered?

AT: Basically, throughout your journey as a patient into the healthcare system, information about you is sold. It has nothing to do with your direct treatment. It has to do with commercial businesses wanting to gain insight about you and your doctor, largely, for sales and marketing.

So, take the first step. You go to your doctor’s office. The door is shut. You tell your doctor your intimate medical problems. The information that is entered into the doctor’s electronic health system may be sold, commercially, as may the prescription that you pick up at the pharmacy or the blood tests that you take or the urine tests at the testing lab. The insurance company that pays for all of this or subsidizes part of this, may also sell the information.

That information about you is anonymized. That means that your record contains your medical condition, your date of birth, your doctor’s name, your gender, and all or part of your postal zip code, but it doesn’t have your name on it.

All of that trade is allowed, under U.S. rules.

IOS: You mean under HIPAA?

AT: That’s right. Now this may be surprising to many people who would ask this question, “How can this be legal under current rules?” Well, HIPAA says that if you take out the name and anonymize according to certain standards, it’s no longer your data. You will no longer have any say over what happens to it. You don’t have to consent to the trade of it. Outsiders can do whatever they want with that.

I think a lot of people would be surprised to learn that. Very few patients know about it. Even doctors and pharmacists and others who are in the system don’t know that there’s this multi-billion-dollar trade.
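As a rough sketch of the de-identification Adam describes (not from the interview; the field names and values below are invented for illustration), the name and Social Security number come off the record, while the quasi-identifiers stay:

```python
# Hypothetical sketch of HIPAA-style de-identification as described above.
# All field names and values are invented.

def deidentify(record: dict) -> dict:
    # Drop direct identifiers such as name and Social Security number...
    stripped = {k: v for k, v in record.items() if k not in {"name", "ssn"}}
    # ...but keep date of birth, doctor, gender, and part of the zip code.
    stripped["zip"] = stripped["zip"][:3]  # truncate to a 3-digit prefix
    return stripped

patient = {
    "name": "Jane Doe", "ssn": "000-00-0000", "dob": "1960-04-02",
    "doctor": "Dr. Smith", "gender": "F", "zip": "02139",
    "condition": "diabetes",
}

print(deidentify(patient))
# The name is gone, yet dob + doctor + zip prefix remain on the record.
```

Once the record looks like this, under the rules Adam describes it is no longer “your” data, even though plenty of distinguishing detail survives.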

IOS: Right … we’ve written about the de-identification process, which seems like the right thing to do, in a way, because you’re removing all the identifiers, including zip code and other geo information. It seems that for research purposes that would be okay. Do you agree with that, or not?

AT: So, these commercial companies, and some of the names may be well-known to us, companies such as IBM Watson Health, GE, LexisNexis, and the largest of them all may not be well-known to the general public, which is Quintiles and IMS. These companies have dossiers on hundreds of millions of patients worldwide. That means that they have medical information about you that extends over time, different procedures you’ve had done, different visits, different tests and so on, put together in a file that goes back for years.

Now, when you have that much information, even if it only has your date of birth, your doctor’s name, your zip code, but not your name, not your Social Security number, not things like that, it’s increasingly possible to identify people from that. Let me give you an example.

I’m talking to you now from Fairbanks, Alaska, where I’m teaching for a year at the university here. I lived, before that, in Boston, Massachusetts, and before that, in Belgrade, Serbia. I may be the only man of my age who meets that specific profile!

So, if you knew those three pieces of information about me and had medical information from those years, I might be identifiable, even in a haystack of millions of different other people.
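Adam’s point can be illustrated with a small sketch (all data below is invented; nothing comes from a real dataset): an attacker who knows a few quasi-identifiers about a person can join them against “anonymized” records, and a unique match re-identifies the record.

```python
# Hypothetical linkage attack on "anonymized" records via quasi-identifiers.
# All names and records are invented for illustration.

anonymized_records = [
    {"birth_year": 1960, "doctor": "Dr. Smith", "zip3": "021", "condition": "diabetes"},
    {"birth_year": 1960, "doctor": "Dr. Smith", "zip3": "021", "condition": "hypertension"},
    {"birth_year": 1975, "doctor": "Dr. Jones", "zip3": "997", "condition": "asthma"},
]

# Auxiliary knowledge an attacker might already hold (e.g. a public profile).
known_person = {"name": "A. Example", "birth_year": 1975,
                "doctor": "Dr. Jones", "zip3": "997"}

quasi_identifiers = ("birth_year", "doctor", "zip3")

matches = [
    r for r in anonymized_records
    if all(r[q] == known_person[q] for q in quasi_identifiers)
]

# If exactly one record matches, the "anonymous" record is re-identified.
if len(matches) == 1:
    print(f"{known_person['name']} likely has: {matches[0]['condition']}")
```

The first two records share the same quasi-identifiers and so stay ambiguous; the third is unique, which is exactly the “only man of my age who meets that profile” situation Adam describes.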

IOS: Yeah … We have written about that as well on the blog. We call these quasi-identifiers. They’re not the traditional kind of identifiers, but they’re other bits of information, as you pointed out, that can be used to re-identify someone. Usually it’s a small subset, but not always. And it would seem this information should also be protected in some way. So, do you think the laws are keeping up with this?

AT: HIPAA was written 20 years ago, and the HIPAA rules say that you can freely trade our patient information if it is anonymized to a certain standard. Now, the technology has gone forward, dramatically, since then.

So, the ability to store things very cheaply and the ability to scroll through them is much more sophisticated today than it was when those rules came into effect. For that reason, I think it’s a worthwhile time to have a discussion now. Is this the best system? Is this what we want to do?

Interestingly, the system of the free trade in our patient information has evolved because commercial companies have decided this is what they’d want to do. There has not been an open public discussion of what is best for society, what is best for patients, what is best for science, and so on. This is just a system that evolved.

I’m saying, in writing this book, “Our Bodies, Our Data,” that it is maybe worthwhile that we re-examine where we’re at right now and say, “Do we want to have better privacy protection? Do we want to have a different system of contributing to science than we do now?”

IOS: I guess what also surprised me was that you say that pharmacies, for example, can sell the drug records, as long as it’s anonymized. You would think that the drug companies would be against that. It’s sort of leaking out their information to their competitors, in some way. In other words, information goes to the data brokers and then gets resold to the drug companies.

AT: Well, but you have to understand that everybody in what I call this big-data health bazaar is making money off of it. So, a large pharmacy chain, such as CVS or Walgreen’s, they may make tens of millions of dollars in selling copies of these prescriptions to data miners.

Drug companies are particularly interested in buying this information because this information is doctor-identified. It says that Dr. Jones in Pittsburgh prescribes drug A almost all the time, rather than drug B. So, the company that makes drug B may send a sales rep to the doctor and say, “Doctor, here’s some free samples. Let’s go out to lunch. Let me tell you about how great drug B is.”

AT: So, this is because there exist these doctor profiles on individual doctors across the country, which are used for sales and marketing, for a very sophisticated kind of targeting.

IOS: So, in an indirect way, the drug companies can learn about the other drug companies’ sales patterns, and then say, “Oh, let me go in there and see if I can take that business away.” Is that sort of the way it’s working?

AT: In essence, yes. The origins of this trade date back to the 1950s. In its first form, these data companies, such as IMS Health, what they did was just telling companies what drugs sold in what market. Company A has 87% of the market. Their rival has 13% of the market. When medical information began to become digitized in the 1960s and ’70s and evermore since then, there was a new opportunity to trade this data.

So, all of a sudden, insurance companies and middle-men connecting up these companies, and electronic health records providers and others, had a product that they could sell easily, without a lot of work, and data miners were eager to buy this and produce new products for mostly the pharmaceutical companies, but there are other buyers as well.

IOS: I wanted to get back to another point you mentioned: even with anonymized medical records, given all the other information that’s out there, you can re-identify people, or at least narrow down the pool of people that data could apply to.

What’s even more frightening now is that hackers have been stealing health records like crazy over the last couple of years. So, there’s a whole dark market of hacked medical data that, I guess, if they got into this IMS database, they would have the keys to the kingdom, in a way.

Am I being too paranoid here?

AT: Well, no, you correctly point out that there has been a sharp upswing in hacking into medical records. That can happen into a small, individual practice, or it could happen into a large insurance company.

And in fact, the largest hacking attack of medical records in the last couple of years has been into Anthem Health, which is the Blue Cross Blue Shield company. Almost 80 million records were hacked in that.

AT: So even people that didn’t… I was hacked in that breach, even though I was not, at the time, a customer of theirs and had never been a customer of theirs. One company that I dealt with outsourced to someone else, who outsourced to them. So, all of a sudden, this information can be in circulation.

There’s a government website people can look at, and you’ll see, every day or two, there are new hackings. Sometimes it involves a few thousand names and an obscure local clinic. Sometimes it’ll be a major company, such as a lab test company, and millions of names could be impacted.

So, this is something definitely to be concerned about. Yes, you could take these hacked records and match them with anonymized records to try to figure out who people are, but I should point out that there is no recorded instance of hackers getting into these anonymized dossiers by the big data miners.

IOS: Right. We hope so!

AT: I say recorded or acknowledged instance.

IOS: Right. Right. But there’s now been sort of an awareness of cyber gangs and cyber terrorism and then the use of, let’s say, records for blackmail purposes.

I don’t want to get too paranoid here, but it seems like there’s just a potential for just a lot of bad possibilities. Almost frightening possibilities with all this potential data out there.

AT: Well, we have heard recently about rumors of an alleged dossier involving Donald Trump and Russia.

IOS: Exactly.

AT: And information that… If you think about what kind of information could be most damaging or harmful to someone, it could be financial information. It could be sexual information, or it could be health information.

IOS: Yeah, or someone using… or has a prescription to a certain drug of some sort. I’m not suggesting anything, but that… All that information together could have sort of lots of implications, just, you know, political implications, let’s say.

AT: I mean if you know that someone takes a drug that’s commonly used for a mental health problem, that could be information used against someone. It could be used to deny them life insurance. It could be used to deny them a promotion or a job offer. It could be used by rivals in different ways to humiliate people. So, this medical information is quite powerful.

One person who has experienced this and spoken publicly about it is the actor, Charlie Sheen. He tested positive for HIV. Others somehow learned of it and blackmailed him. He said he paid millions of dollars to keep that information from going public, before he decided finally that he would stop paying it, and he’d have to tell the world about his medical condition.

IOS: Actually I was not aware of the payments he was making. That’s just astonishing. So, is there any hope here? Do you see some remedies, through maybe regulations or enforcement of existing laws? Or perhaps we need new laws?

AT: As I mentioned, the current rules, HIPAA, allow for the free trade of your data if it’s anonymized. Now, I think, given the growth of sophistication in computing, we should change the rule and define our medical data as any medical information about us, whether or not it’s anonymized.

So, if a doctor is writing in the electronic health record, you should have a say as to whether or not that information is going to be used elsewhere.

A little side point I should mention. There are a lot of good scientists and researchers who want data to see if they can gain insights into disease and new medications. I think people should have the choice whether or not they want to contribute to those efforts.

So, you know, there’s a lot of good efforts. There’s a government effort under way now to gather a million DNA samples from people to make available to science. So, if people want to participate in that, and they think that’s good work, they should definitely be encouraged to do so, but I think they should have the say and decide for themselves.

And so far, we don’t really have that system. So, by redefining what patient data is, to say, “Medical information about a patient, whether or not it’s anonymized,” I think that would give us the power to do that.

IOS: So effectively, you’re saying the patient owns the data, is the owner, and then would have to give consent for the data to be used. Is that, about right?

AT: I think so. But on the other hand, as I mentioned, I’ve written this book to encourage this discussion. The problem we have right now is that the trade is so opaque.

Companies are extremely reluctant to talk about this commercial trade. So, they do occasionally say that, “Oh, this is great for science and for medicine, and all of these great things will happen.” Well, if that is so fantastic, let’s have this discussion where everyone will say, “All right. Here’s how we use the data. Here’s how we share it. Here’s how we sell it.”

Then let people in on it and decide whether they really want that system or not. But it’s hard to have that intelligent policy discussion, what’s best for the whole country, if industry has decided for itself how to proceed without involving others.

IOS: Well, I’m so glad you’ve written this book. This will, I’m hoping, will promote the discussion that you’re talking about. Well, this has been great. I want to thank you for the interview. So, by the way, where can our listeners reach out to you on social media? Do you have a handle on Twitter? Or Facebook?

AT: Well, I’m @datacurtain and I have a webpage, which is

IOS: Wonderful. Thank you very much, Adam.

[Podcast] Adam Tanner on the Dark Market in Medical Data, Part II




More Adam Tanner! In this second part of my interview with the author of Our Bodies, Our Data, we start exploring the implications of having massive amounts of online medical data. There’s much to worry about.

With hackers already good at stealing health insurance records, is it only a matter of time before they get into the databases of the drug prescription data brokers?

My data privacy paranoia about all this came out in full force during the interview. Thankfully, Adam was able to calm me down, but there’s still potential for frightening possibilities, including political blackmail.

Is the answer more regulations for drug data? Listen to the rest of the interview below to find out, and follow Adam on Twitter, @datacurtain, to keep up to date.


[Podcast] Adam Tanner on the Dark Market in Medical Data, Part I




In our writing about HIPAA and medical data, we’ve also covered a few of the gray areas of medical privacy, including wearables, Facebook, and hospital discharge records. I thought both Cindy and I knew all the loopholes. And then I talked to writer Adam Tanner about his new book Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records.

In the first part of my interview with Tanner, I learned how pharmacies sell our prescription drug transactions to medical data brokers, who then resell it to pharmaceutical companies and others. This is a billion dollar market that remains unknown to the public.

How can this be legal under HIPAA, and why doesn’t it require patient consent?

It turns out that after the data record is anonymized (though with the doctor’s name still attached), it’s no longer yours! Listen in as we learn more from Tanner in this first podcast.


[Podcast] More Dr. Ann Cavoukian: GDPR and Access Control



We continue our discussion with Dr. Ann Cavoukian. She is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).

In this segment, Cavoukian tells us that once you’ve involved your customers in the decision making process, “You won’t believe the buy-in you will get under those conditions because then you’ve established trust and that you’re serious about their privacy.”

We also made time to cover GDPR as well as three things organizations can do to demonstrate that they are serious about privacy.

Learn more about Dr. Cavoukian:


Cindy Ng: Dr. Cavoukian, besides data minimization, de-identification, and user access control, what are some other concrete steps that businesses can take to benefit from protecting privacy?

Dr. Cavoukian: I think one of the things businesses don’t do very well is involve their customers in the decisions that they make, and I’ll give you an example. Years ago I read something called “Permission Based Marketing” by Seth Godin, and he’s amazing. And I read it, and I thought, “Oh this guy must have a privacy background,” because it was all about enlisting the support of your customers, gaining their permission and getting them to, as Godin said, “Put their hand up and say ‘count me in.'” So I called him, he was based in California at the time, and I said, “Oh Mr. Godin, you must have a privacy background?” And he said something like, “No, lady, I’m a marketer through and through, but I can see the writing on the wall. We’ve gotta engage customers, get them involved, get them to wanna participate in the things we’re doing.”

So, I always tell businesses that are serious about privacy, “First of all, don’t be quiet about it. Shout it from the rooftops, the lengths you’re going to, to protect your customer’s privacy. How much you respect it, how user-centric your programs are, and you’re focused on their needs in delivering.” And, then, once they understand this is the background you’re bringing, and you have great respect for privacy, in that context you say, “We would like you to consider giving us permission to allow it for these additional secondary uses. Here’s how we think it might benefit you, but we won’t do it without your positive consent.” You wouldn’t believe the buy-in you will get under those conditions because then you have established a trusted business relationship. They can see that you’re serious about privacy, and then they say, “Well by all means, if this will help me, in some way, use my information for this additional purpose.” You’ve gotta engage the customers in an active dialog.

Cindy Ng: So ask, and you might receive.

Dr. Cavoukian: Definitely, and you will most likely receive.

Cindy Ng: In sales processes they’re implementing that as well, “Is it okay if I continue to call you, or when can I call you next?” So they’re constantly feeling they’re engaged and part of the process, and it’s so effective.

Dr. Cavoukian: And I love that. Myself, as a customer… I belong to this air miles program, and I love it, because they don’t do anything without my positive consent. And, yet, I benefit because they send me targeted ads and things I’m interested in. And I’m happy to do that, and then I get more points and then it just continues to be a win-win.

Cindy Ng: Did you write anything about user access controls? What are your thoughts on that?

Dr. Cavoukian: We wrote about it in the context that you’ve gotta have restricted access to those who have… I was gonna say, “right to know.” Meaning there is some business purpose for which they’re accessing the data. And when I say “business purpose,” I mean that broadly. In a hospital, people who are taking care of a patient, in whatever context: it can be in the lab, where they go for testing, then they go for an MRI, and so on. So there could be a number of different arms that have legitimate access to the data, because they’ve gotta process it in a variety of different ways. That’s all legitimate, but those people who aren’t taking care of the patient, in some broad manner, should have completely restricted access to the data. Because that’s when the snooping and the rogue employee…

Cindy Ng: Curiosity.

Dr. Cavoukian: …enter the picture. The curiosity takes over, and it completely distorts the entire process in terms of the legitimacy of those people who should have access to it, especially in a hospital or patient context. You wanna enable easy access for those who have a right to know because they’re treating patients, and then the walls should go up for those who are not treating them in any manner. It’d be difficult to do, but it is eminently doable, and you have to do it because that’s what patients expect. Patients have no idea that someone might be looking at their file just out of curiosity. You’ve had a breast removed, you had… I mean, horrible things happen.
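The need-to-know access she describes can be sketched in a few lines. This is a rough illustration only, not from the interview; the roles, patient IDs, and function names below are invented.

```python
# Minimal sketch of "need-to-know" access control in a hospital setting.
# Roles and rules are invented for illustration only.

CARE_TEAM_ROLES = {"attending", "nurse", "lab_tech", "radiologist"}

def can_view_record(user_role: str, user_patients: set, patient_id: str) -> bool:
    """Allow access only to staff in a treatment role who are assigned
    to this specific patient; deny everyone else by default."""
    return user_role in CARE_TEAM_ROLES and patient_id in user_patients

# A nurse on the patient's care team: allowed.
print(can_view_record("nurse", {"p-001"}, "p-001"))
# A clerk browsing out of curiosity: denied.
print(can_view_record("clerk", {"p-001"}, "p-001"))
# A doctor not treating this patient: denied.
print(can_view_record("attending", {"p-002"}, "p-001"))
```

The key design choice is deny-by-default: access requires both a treatment role and an assignment to that patient, so the curious bystander is walled off automatically.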

Cindy Ng: Tell us about GDPR, and it’s implications on Privacy by Design.

Dr. Cavoukian: For the first time, the EU now has the General Data Protection Regulation, which has passed for the first time ever. It has the words, the actual words, “Privacy by Design” and “Privacy as the default,” in the statute.

Cindy Ng: That’s great.

Dr. Cavoukian: It’s a first, and it’s really huge. What that means is that it will strengthen those laws far beyond the U.S. laws. We talked about privacy as the default. It’s the model of positive consent. It’s not just looking for the opt-out box. It’s gonna really raise the bar, and that might present some problems in dealing with laws in the States.

Cindy Ng: Then there’s also their right to be forgotten, and we live in such a globalized world, people both doing business in the states and in Europe, it’s been complicated.

Dr. Cavoukian: It does get very complicated. What I tell people everywhere that I go to speak is that if you follow the principles of Privacy by Design, which in itself raises the bar dramatically above most legislation, you will virtually be assured of complying with your regulations, whatever jurisdiction you’re in, because you’re following the highest level of protection. That’s another attractive feature of Privacy by Design: it offers such a high level of protection that you’re virtually assured of regulatory compliance, whatever jurisdiction you’re in.

And in the U.S., I should say, the FTC, the Federal Trade Commission, a number of years ago, under Jon Leibowitz, when he was Chair, made Privacy by Design the first of three best practices that the FTC recommended. And since he’s left, Chairwoman Edith Ramirez has also followed Privacy by Design and Security by Design, which are absolutely, interchangeably critical, and they are holding this high bar. So, I urge companies always to follow this to the extent that they can, because it will elevate their standing, both with regulatory bodies, like the FTC, and with commissioners and jurisdictions in the EU, Australia, South America, and South Africa. There’s something called GPN, the Global Privacy Network, and a lot of the people who participate in it follow these discussions.

Cindy Ng: What are three things that organizations can do in terms of protecting their consumers’ privacy?

Dr. Cavoukian: So, when I go to a company, I speak to the board of directors, their CEO, and their senior executive. And I give them this messaging about, “You’ve gotta be inclusive. You have to have a holistic approach to protecting privacy in your company, and it’s gotta be top down.” If you give the messaging to your frontline folks that you care deeply about your customer’s privacy, you want them to take it seriously, that message will emanate. And, then what happens from there, the more specific messaging is, what you say to people, is you wanna make sure that customers understand their privacy is highly respected by this company. “We go to great lengths to protect your privacy.” You wanna communicate that to them, and then you have to follow up on it. Meaning, “We use your information for the purpose intended that we tell you we’re gonna use it for. We collect it for that purpose. We use it for that purpose.” And then, “Privacy is the default setting. We won’t use it for anything else without your positive consent after that, for secondary uses.”
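As a rough sketch (not from the interview; the field and function names are invented), “privacy as the default setting” means every secondary use starts switched off, and only a positive opt-in turns it on:

```python
# Hypothetical sketch of "privacy as the default": secondary uses are off
# unless the customer positively opts in. Field names are invented.

from dataclasses import dataclass

@dataclass
class CustomerConsent:
    # Every secondary use defaults to False (opt-in, not opt-out).
    marketing: bool = False
    analytics: bool = False
    third_party_sharing: bool = False

def may_use(consent: CustomerConsent, purpose: str) -> bool:
    # Unknown purposes are denied, not silently allowed.
    return getattr(consent, purpose, False)

consent = CustomerConsent()
print(may_use(consent, "marketing"))   # default: no secondary use
consent.marketing = True               # explicit positive consent
print(may_use(consent, "marketing"))   # now permitted
```

This is the inverse of the opt-out model she criticizes: the burden is on the organization to earn the “count me in,” not on the customer to find the checkbox.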

So that’s the first thing I would do. Second thing I would do is I would have at least quarterly meetings with staff. You need to reinforce this message. It’s gotta be spread across the entire organization. It can’t just be the chief privacy officer who’s communicating this to a few people. You gotta get everyone to buy into this, because you… I was gonna say the lowest. I don’t mean low in terms of category, but the frontline clerk might be low on the totem pole, but they may have the greatest power to breach privacy. So they have to understand, just like the highest senior manager has to understand, how important privacy is and why and how you can protect it. So have these quarterly meetings with your staff. Drive the message home, and it can be as simple as them understanding that this is… You’re gonna get what I call, “privacy payoff.” By protecting your customer’s privacy, it’s gonna yield big returns for your company. It will increase customer confidence and enhance customer trust, and that will increase our bottom line.

And the third thing, I know this is gonna sound a little pompous, but I would invite a speaker in, and only because this happened to me: I’ve been invited in to speak to a company, like, once a year. And you invite everybody, from top to bottom. You open it up and… People need to have these ideas reinforced. It has to be made real. “Is this really a problem?” So, you bring in a speaker. I’m using myself as an example because I’ve done it, but it can be anybody who can speak to what happens when you don’t protect your customer’s privacy. It really helps for people inside a company, especially those doing a good job, to understand what can happen when you don’t do it right and what the consequences are to both the company and to employees. They’re huge. You can lose your jobs. The company could go under. You could be facing class action lawsuits.

And I find that it’s not all a bad news story. I give the bad news, what’s happening out there and what can happen, and then I applaud the behavior of the companies. And what they get is this dual message of, “Oh my God, this is real. This has real consequences when we fail to protect customer’s privacy, but look at the gains we have, look at the payoff in doing so.” And it makes them feel really good about themselves and the job that they’re doing, and it underscores the importance of protecting customer’s privacy.

[Podcast] Dr. Ann Cavoukian on Privacy By Design



I recently had the chance to speak with former Ontario Information and Privacy Commissioner Dr. Ann Cavoukian about big data and privacy. Dr. Cavoukian is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).

What’s more, she came up with PbD language that made its way into the GDPR, which will go into effect in 2018. First developed in the 1990s, PbD addresses the growing privacy concerns brought upon by big data and IoT devices.

Many worry about PbD’s interference with innovation and businesses, but that’s not the case.

When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this: you can simultaneously collect data and protect customer privacy.


Cindy Ng
With Privacy by Design principles codified in the new General Data Protection Regulation, which will go into effect in 2018, it might help to understand the intent and origins of it. And that’s why I called former Ontario Information and Privacy Commissioner, Dr. Ann Cavoukian. She is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design. When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this, you can simultaneously collect data and protect customer privacy.

Thank you, Dr. Cavoukian, for joining us today. I was wondering, as Information and Privacy Commissioner of Ontario, what did you find effective in convincing organizations and government agencies to treat people’s private data carefully?

Dr. Cavoukian
The approach I took…I always think that the carrot is better than the stick, and I did have order-making power as Commissioner. So I had the authority to order government organizations, for example, who were in breach of the Privacy Act to do something, to change what they were doing and tell them what to do. But the problem…whenever you have to order someone to do something, they will do it because they are required to by law, but they’re not gonna be happy about it, and it is unlikely to change their behavior after that particular change that you’ve ordered. So, I always led with the carrot in terms of meeting with them, trying to explain why it was in both their best interest, in citizens’ best interest, in customers’ best interest, when I’m talking to businesses. Why it’s very, very important to make it…I always talk about positive sum, not zero sum, make it a win-win proposition. It’s gotta be a win for both the organization who’s doing the data collection and the data use and the customers or citizens that they’re serving. It’s gotta be a win for both parties, and when you can present it that way, it gives you a seat at the table every time. And let me explain what I mean by that. Many years ago I was asked to join the board of the European Biometrics Forum, and I was honored, of course, but I was surprised because in Europe they have more privacy commissioners than anywhere else in the world. Hundreds of them, they’re brilliant. They’re wonderful, and I said, “Why are you coming to me as opposed to one of your own?” And they said, “It’s simple.” They said, “You don’t say ‘no’ to biometrics. You say ‘yes’ to biometrics, and ‘Here are the privacy protective measures that I insist you put on them.'” They said, “We may not like how much you want us to do, but we can try to accommodate that. But what we can’t accommodate is if someone says, ‘We don’t like your industry.'” You know, basically to say “no” to the entire industry is untenable. 
So, when you go in with an “and” instead of a “versus,” it’s not me versus your interests. It’s my interests in privacy and your interests in the business or the government, whatever you’re doing. So, zero sum paradigms are one interest versus another. You can only have security at the expense of privacy, for example. In my world, that doesn’t cut it.
Cindy Ng
Dr. Cavoukian, can you tell us a little bit more about Privacy by Design?
Dr. Cavoukian
I really crystallized Privacy by Design really after 9/11, because at 9/11 it became crystal clear that everybody was talking about the vital need for public safety and security, of course. But it was always construed as at the expense of privacy, so if you have to give up your privacy, so be it. Public safety’s more important. Well, of course public safety is extremely important, and we did a position piece at that point for our national newspaper, “The Globe and Mail,” and the position I took was public safety is paramount with privacy embedded into the process. You have to have both. There’s no point in just having public safety without privacy. Privacy forms the basis of our freedoms. You wanna live in free democratic society, you have to be able to have moments of reserve and reflection and intimacy and solitude. You have to be able to do that.
Cindy Ng
Data minimalization is important, but what do you think about companies that do collect everything with hopes that they might use it in the future?
Dr. Cavoukian
See, what they’re asking for is trouble, because I can bet you dollars to doughnuts that’s gonna come back to bite you. Especially with data that you’re not clear about what you’re gonna do with, you’ve got data just sitting there. What data in identifiable form does is attract hackers. It attracts rogue employees on the inside who will make inappropriate use of the data, sell the data, do something with the data. You’re asking for trouble, because keeping data in identifiable form, once the uses have been addressed, just begs trouble. I always tell people, if you wanna keep the data, keep the data, but de-identify it. Strip the personal identifiers, make sure you have the data aggregated, de-identified, encrypted, something that protects it from this kind of rogue activity. And you’ve been reading lately all about the hackers who got in, I think they were in the IRS for God’s sakes, and they’re getting in everywhere here in my country. They’re getting into so many databases, and it’s not only appalling in terms of the data loss, it’s embarrassing for the government departments who are supposed to be protecting this data. And it fuels even additional distrust on the part of the public. So I would say to companies, “Do yourself a huge favor. If you don’t need the data, don’t keep it in identifiable form. You can keep it in aggregate form. You can encrypt it. You can do lots of things. Do not keep it in identifiable form where it can be accessed in an unauthorized manner, especially if it’s sensitive data.” Oh my God, health data…Rogue employees, we have a rash of it here, and it’s just curiosity, it’s ridiculous. The damage is huge for patients, and I can tell you, I’ve been a patient in hospitals many times. The thought that anyone else is accessing my data…it’s so personal and so sensitive. So when I speak this way to boards of directors and senior executives, they get it.
They don’t want the trouble. And I haven’t even talked about costs. Once these data breaches happen these days, it’s not just lawsuits, it’s class action lawsuits that are initiated. It’s huge, and then the damage to your reputation, the damage to your brand, can be irreparable.
Cindy Ng
Right. Yeah, I remember Meg Whitman said something about how it takes years and years to build your brand and reputation, and seconds to ruin it.
Dr. Cavoukian
Yeah, yes. That is so true. There’s a great book called “The Reputation Economy” by Michael Fertik. He’s the CEO of Reputation.com. It’s fabulous. You’d love it. It’s all about exactly how long it takes to build your reputation, how dear it is, and how you should cherish it and go to great lengths to protect it.
Cindy Ng
Can you speak about data ownership?
Dr. Cavoukian
You may have custody and control over a lot of data, your customer’s data, but you don’t own that data. And with that custody and control comes an enormous duty of care. You gotta protect that data, restrict your use of the data to what you’ve identified to the customer, and then if you wanna use it for additional purposes, then you’ve gotta go back to the customer and get their consent for secondary uses of the data. Now, that rarely happens, I know that. In Privacy by Design, one of the principles talks about privacy as the default setting. The reason you want privacy to be the default setting…what that means is if a company has privacy as the default setting, it means that they can say to their customers, “We can give you privacy assurance from the get-go. We’re collecting your information for this purpose,” so they identify the purpose of the data collection. “We’re only gonna use it for that purpose, and unless you give us specific consent to use it for additional purposes, the default is we won’t be able to use it for anything else.” It’s a model of positive consent, it gives privacy assurance, and it gives enormous, enormous trust and consumer confidence in terms of companies that do this. I would say to companies, “Do this, because it’ll give you a competitive advantage over the other guys.”
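As an aside for developers, the “privacy as the default setting” principle Dr. Cavoukian describes can be sketched in code: a consent record where every secondary use starts opted out and turns on only with explicit consent. This is a minimal illustration; the class and field names are invented here, not part of any PbD specification.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    # The identified purpose of collection is always recorded up front.
    primary_purpose: str
    # Privacy as the default: every secondary use starts opted out,
    # and flips on only with explicit, affirmative consent.
    secondary_uses: dict = field(default_factory=dict)

    def allow(self, use: str) -> None:
        """Record the customer's explicit consent for a secondary use."""
        self.secondary_uses[use] = True

    def may_use_for(self, use: str) -> bool:
        """Only the stated purpose, or an explicitly consented use, is allowed."""
        return use == self.primary_purpose or self.secondary_uses.get(use, False)
```

The point of the sketch is the default: absent any action by the customer, `may_use_for` answers no for everything except the purpose identified at collection time.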

As you know, because you sent it to me, the Pew Research Center, their latest study on Americans’ attitudes, you can see how high the numbers are, in the 90 percents. People have had it. They want control. This is not a single study. There have been multiple surveys that have come out in the last few months like this. Ninety percent of the public, they don’t trust the government or businesses or anyone. They feel they don’t have control. They want privacy. They don’t have it, so you have, ever since, actually, Edward Snowden, you have the highest level of distrust on the part of the public and the lowest levels of consumer confidence. So, how do we change that? So, when I talk to businesses, I say, “You change that by telling your customers you are giving them privacy. They don’t even have to ask for it. You are embedding it as the default setting which means it comes part and parcel of the system.” They’re getting it. I do what I call my neighbors test. I explain these terms to my neighbors who are very bright people, but they’re not in the privacy field. So, when I was explaining this to my neighbor across the street, Pat, she said, “You mean, if privacy’s the default, I get privacy for free? I don’t have to figure out how to ask for it?” And I said, “Yes.” She said, “That’s what I want. Sign me up!”

See, people want to be given privacy assurance without having to go to the lengths they have to go to now to find the privacy policy, search through the terms of service, find the checkout box. I mean, it’s so full of legalese. It’s impossible for people to do this. They wanna be given privacy assurance as the default. That’s your biggest bet if you’re a private-sector company. You will gain such a competitive advantage. You will build the trust of your customers, and you will have enormous loyalty, and you will attract new opportunity.

Cindy Ng
What are your Privacy by Design recommendations for wearables and IoT innovators and developers?
Dr. Cavoukian
The internet of things, wearable devices, and new app developers and startups…they are clueless about privacy, and I’m not trying to be disrespectful. They’re working hard. Take an app developer: they’re working hard to build their app. They’re focused on the app. That’s all they’re thinking about, how to deliver what the app’s supposed to deliver on. And then you say, “What about privacy?” And they say, “Oh, don’t worry about it. We’ve got it taken care of. You know, the third-party security vendor’s gonna do it. We got that covered.” They don’t have it covered, and what they don’t realize is they don’t know they don’t have it covered. “Give it to the security guys and they’re gonna take care of it,” and that’s the problem. When I speak to app developers…I was at Tim O’Reilly’s Web 2.0 last year or the year before, and there were 800 people in the room. I was talking about Privacy by Design, and I said, “Look, do yourself a favor. Build in privacy. Right now you’re just starting your app development, build it in right now at the front end, and then you’re gonna be golden. This is the time to do it, and it’s easy if you do it up front.” I had dozens of people come up to me afterwards because they didn’t even know they were supposed to. It had never appeared on their radar. It’s not resistance to it. They hadn’t thought of it. So our biggest job is educating, especially the young people, the app developers, the brilliant minds. In my experience, it’s not that they resist the messaging; they haven’t been exposed to the messaging. Oh, I should just tell you, we started a Privacy by Design certification. We’ve partnered with Deloitte, and I’ll send you the link. We, Ryerson University, where I am housed, are offering this certification for Privacy by Design. But my assessment arm, my audit arm, my partner, is Deloitte, and we’ve had a real, just a deluge of interest.
Cindy Ng
So, do you think that’s also why people are hiring Chief Privacy Officers?
Dr. Cavoukian
Cindy Ng
What are some qualities that are required in a Chief Privacy Officer? Is it just a law background?
Dr. Cavoukian
No, in fact, I’m gonna say the opposite, and this is gonna sound like heresy to most people. I love lawyers. Some of my best friends are lawyers. Don’t just restrict your hiring of Chief Privacy Officers to lawyers. The problem with hiring a lawyer is they’re understandably going to bring a legal regulatory compliance approach to it, which, of course, you want that covered. I’m not saying…You have to be in compliance with whatever legislation is in your jurisdiction. But if that’s all you do, it’s not enough. I want you to go farther. When I ask you to do Privacy by Design, it’s all about raising the bar. Doing technical measures such as embedding privacy into the design that you’re offering into the data architecture, embedding privacy as a default setting. That’s not a legalistic term. It’s a policy term. It’s computer science. It’s a… You need a much broader skill set than law alone. So, for example, I’m not a lawyer, and I managed to be Commissioner for three terms. And I certainly valued my legal department, but I didn’t rely on it exclusively. I always went farther, and if you’re a lawyer, the tendency is just to stick to the law. I want you to do more than that. You have to have an understanding of computer science, technology, encryption, how can you… De-identification protocols are critical, combined with the risk of re-identification framework. When you look at the big data world, the internet of things, they’re going to do amazing things with data. Let’s make sure it’s strongly de-identified and resist re-identification attacks.
Cindy Ng
There have been reports that people can re-identify individuals from de-identified data.
Dr. Cavoukian
That’s right, but if you examine those reports carefully, Cindy, a lot of them are based on studies where the initial de-identification was very weak. They didn’t use strong de-identification protocols. So, like anything, if you start with bad encryption, you’re gonna have easy decryption. It’s all about doing it properly at the outset using proper standards. There are now four standards of de-identification that have come out that are risk-based, and they’re excellent.
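As one small illustration of weak versus strong de-identification: a plain, unkeyed hash of an identifier can be reversed by simply hashing a dictionary of known names and matching tokens, while a keyed (HMAC) hash resists that attack. This sketch shows only that one ingredient; the risk-based protocols Dr. Cavoukian mentions also deal with quasi-identifiers, aggregation, and measuring re-identification risk. The key and names here are invented for the example.

```python
import hashlib
import hmac

# Illustrative only: in practice the key would be managed, rotated, and
# stored entirely separately from the de-identified dataset.
SECRET_KEY = b"store-this-key-outside-the-dataset"


def weak_token(identifier: str) -> str:
    # Unkeyed hash: an attacker can hash a list of known names offline
    # and match the resulting tokens against the dataset.
    return hashlib.sha256(identifier.encode()).hexdigest()[:16]


def strong_token(identifier: str) -> str:
    # Keyed hash (HMAC): producing a matching token requires the secret key,
    # so a dictionary attack on the dataset alone fails.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]
```

Both functions map the same input to the same token, which preserves the ability to aggregate records; only the keyed version does so without handing re-identification to anyone with a name list.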
Cindy Ng
Are you a fan of possibly replacing privacy policies with something simpler, like a nutrition label?
Dr. Cavoukian
It’s a very clever idea. They have tried to do that in the past. It’s hard to do, and I think your simplest path to that nutrition kind of label would be if you did embed privacy as the default setting. Because then you could have a nutrition label that said, “Privacy built in.” You know how, I think, Intel had something years ago where you had security built in or something. You could say, “Privacy embedded in the system.”

[Podcast] Data Privacy Attorney Sheila FitzPatrick on GDPR



We had a unique opportunity in talking with data privacy attorney Sheila FitzPatrick. She lives and breathes data security and is a recognized expert on EU and other international data protection laws. FitzPatrick has direct experience in representing companies in front of EU data protection authorities (DPAs). She also sits on various governmental data privacy advisory boards.

During this first part of the interview with her, we focused on the new General Data Protection Regulation (GDPR), which she says is the biggest overhaul in EU security and privacy rules in twenty years.

One important point FitzPatrick makes is that the GDPR is not only more restrictive than the existing Data Protection Directive — adding breach notification and impact assessment rules — but also has far broader coverage.

Cloud computing companies, no matter where they are located, will fall under the GDPR if they are asked to process the personal data of EU citizens by their corporate customers. The same goes for companies (or controllers in GDPR-speak) outside the EU that directly collect personal data — think of any US-based e-commerce or social networking company on the web.

Keep all this in mind as you listen to our in-depth discussion with this data privacy and security law professional.


Cindy Ng
Sheila FitzPatrick has over 20 years of experience running her own firm as a data protection attorney. She also serves as outside counsel for NetApp as their chief privacy officer, where she provides expertise in global data protection compliance, cybersecurity regulations, and legal issues associated with cloud computing and big data. In this series, Sheila will be sharing her expertise on GDPR, PCI compliance, and the data security landscape.
Andy Green
Yeah, Sheila. I’m very impressed by your bio and the fact that you’ve actually dealt with some of these DPAs, the EU data protection authorities that we’ve been writing about. So the GDPR will go into effect in 2018, and I’m just wondering what the biggest change is for companies, I guess they’re calling them data controllers, in dealing with DPAs under the law. Is there something that comes to mind first?
Sheila FitzPatrick
And thank you for the compliment, by the way. I live and breathe data privacy. This is the stuff I love. GDPR…I mean, it is certainly the biggest overhaul in 20 years when it comes to the implementation of new data privacy regulations. Much more restrictive than what we’ve seen in the past. And most companies are struggling because they thought what was previously in place was strict.

There are a couple of things that stick out when it comes to GDPR. One is when you look at the roles of the data controller versus the data processor. In the past, many of the data processors, especially when you talk about third-party outsourcing companies and in particular cloud providers, have pushed sole liability for data compliance down to their customers. Basically saying, you decide what you’re going to put in our environment, you have responsibility for the privacy and security aspects; we basically accept minimal responsibility. Usually, it’s around physical security.

The GDPR now is going to put very comprehensive and very well-defined regulations and obligations in place for data processors as well, saying that they can no longer flow responsibility for privacy compliance down to their customers. Oftentimes, cloud providers will say, “We will comply with the laws in countries where we have our processing centers.” And that’s not sufficient under the new laws. Because if they have a data processing center, say, in the UK, but they’re processing the data of a German citizen or a Canadian citizen or someone from Asia Pacific, Australia, New Zealand, they’re now going to have to comply with the laws in those countries as well. They can’t just push it down to their customers.

The other part of GDPR that is quite different, and it’s one of the first times it’s really going to be put into place, is that it doesn’t just apply to companies that have operations within the EU. Basically, any company, regardless of where it’s located and regardless of whether or not it has a presence in the EU, will have to comply with the regulations under the GDPR if it has access to the personal data of any EU citizen. And that’s a significant change. And then the third one is the sanctions. The sanctions can be 20,000,000 euros or 4% of your global annual revenue, whichever is higher. That’s a substantial change as well.

Andy Green
Right. So those are some big, big changes. You’re referring to, I think, what they call “territorial scope”? They don’t necessarily have to have an office or an establishment in the EU as long as they are collecting data? I mean, we’re really referring to social media and web commerce, or e-commerce.
Sheila FitzPatrick
Absolutely, but it’s going to apply to any company. So even if, for instance, you say, “Well, we’re just a US domestic company,” if you have employees in your environment that hold EU citizenship, you will have to protect their data in accordance with GDPR. You can’t say, well, they’re working in the US, therefore US law applies. That’s not going to be the case if they know that the individual holds citizenship in the EU.
Andy Green
We’re talking about employees, or…?
Sheila FitzPatrick
Could be employees, absolutely. Employees…
Andy Green
Isn’t that interesting? I mean, one question about this expanded territorial scope is: how are they going to enforce this against US companies? Or not just US, but any company that is doing business there but doesn’t necessarily have an office or an establishment?
Sheila FitzPatrick
Well, it can be… see, what happens under GDPR is any individual can file a complaint with the courts in basically any jurisdiction. They can file it at the EU level. They can file it within the countries where they hold their citizenship. They can file it now with US courts, although the US courts… and part of that is tied to the new Privacy Shield, which is a joke. I mean, I think that will be invalidated fairly quickly. With the whole Redress Act, it does allow EU citizens to file complaints with the US courts to protect their personal data in accordance with EU laws.
Andy Green
So, just to follow through, if I came from the UK into the US and was doing transactions, credit card transactions, my data would be protected under EU law?
Sheila FitzPatrick
Well, if the company knows you’re an EU citizen. They’re not necessarily going to know. So, in some cases, if they don’t know, they’re not going to be held accountable. But if they absolutely do know, then they will have to protect that data in accordance with UK or EU law. Well, not the UK… if Brexit goes through, EU law won’t matter. The UK Data Protection Act will take precedence.
Andy Green
Wow. You know, it’s just really fascinating how data protection and privacy are now so important, right, with the new GDPR? For everybody, not just the EU companies.
Sheila FitzPatrick
Yeah, and it’s always been important; it’s just that the US has a totally different attitude. I mean, the US has the least restrictive privacy laws in the world. So for individuals that have really never worked or lived outside of the US, the mindset is very much the US mindset, which is that business takes precedence. Whereas everywhere else in the world, the fundamental right to privacy takes precedence over everything.
Andy Green
We’re getting a lot of questions from our customers about the new Breach Notification rule…
Sheila FitzPatrick
Ask me.
Andy Green
…in the GDPR. I was wondering if you could talk about… what are the most important things you would do when you discover a breach? If you could prioritize them in any way. How would you advise a customer to set up a breach response program in a GDPR context?
Sheila FitzPatrick
Yeah. Well, first and foremost, you need to have in place, before a breach even occurs, an incident response team that’s not made up of just IT. Because normally organizations have an IT focus. You need to have a response team that includes IT and your chief privacy officer. Normally a CPO would sit in legal; if they don’t sit in legal, you want a legal representative in there as well. You need someone from PR or communications who can actually be the public-facing voice for the company. And you need to have someone from finance and risk management who sits on there.

So the first thing to do is to make sure you have that group in place that goes into action immediately. Secondly, you need to determine what data has potentially been breached, even if it hasn’t. Because under GDPR, it’s not… previously, it’s been whether there’s definitely been a breach that can harm an individual. The definition now is whether it’s likely to affect an individual. That’s totally different from whether the individual could be harmed. So you need to determine, okay, what data has been breached, and does it impact an individual?

It’s different if company-related information was breached; there’s a separate process you go through. If individual employee or customer data has been breached, the question is: is it likely to affect the individual? And that’s pretty much anything. That’s a very broad definition. If someone gets hold of their email address, yes, that could affect them. Someone could email them who is not authorized to email them.

So, you have to launch into that investigation right away and then classify the data that has been subject to the intrusion, what that data is classified as.

Is it personal data?

Is it personal sensitive data?

And then rank it based on: is it likely to affect an individual?

Is it likely to impact an individual? Is it likely to harm an individual?

So there could be three levels.

Based on that, what kind of notification do you make? If it’s likely to affect or impact an individual, you would have to let them know. If it’s likely to harm an individual, you absolutely have to let them know and let the data protection authorities know.
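For teams building breach-response tooling, Sheila’s triage can be sketched as a tiny decision rule. This is an illustrative reading of her description, not GDPR legal text; the level names and the notification mapping are invented for the sketch.

```python
from enum import IntEnum


class Likelihood(IntEnum):
    """The three levels Sheila describes, in increasing severity."""
    AFFECTS = 1   # likely to affect the individual
    IMPACTS = 2   # likely to impact the individual
    HARMS = 3     # likely to harm the individual


def who_to_notify(level: Likelihood) -> list:
    # Likely to affect or impact: the individual must be told.
    targets = ["individual"]
    # Likely to harm: the data protection authority must also be notified.
    if level == Likelihood.HARMS:
        targets.append("data protection authority")
    return targets
```

The key property the sketch captures is that notification is never optional once the lowest threshold is met; the only question is whether the regulator is also on the list.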

Andy Green
And the DPA, right? So, if I’m a consumer, the threshold is… in other words, if the company’s holding my data, I’m not an employee, the threshold is likely to harm or likely to affect?
Sheila FitzPatrick
Likely to affect.
Andy Green
Affect. Okay. That’s a little more generous in terms of…
Sheila FitzPatrick
Right. Right. And that has changed, so it puts more accountability on a company, because you know that a lot of companies have probably had breaches and have never reported them. Because they go, oh well, there was no Social Security number, national identification number, or financial data. It was just their name and their address and their home phone number or their cell phone. And the definition previously has been, well, it can’t really harm them, we don’t need to let them know.

And then all of a sudden people’s names show up on these mailing lists. And they’re starting to get this unsolicited marketing. And they can’t determine whether or not… how did they get that? Was it based on a breach or is it based on trolling the Internet and gathering information and a broker selling that information? That’s the other thing. Brokers are going to be impacted by the new GDPR, because in order to sell their lists they have to have explicit consent of the individual to include their name on a list that they’re going to sell to companies.

Andy Green
Alright. Okay. So, it’s quite consumer friendly compared to what we have in the US.
Sheila FitzPatrick
Andy Green
Are there new rules about what they call sensitive data? If you’re going to process certain classes of sensitive data, you need approval from the… I think at some point you might need approval from the DPA? You know what I’m referring to? I think it’s the…
Sheila FitzPatrick
Yes. Absolutely. I mean, that’s always been in place in most of the member states. So, if you look at the member states that have the more restrictive data privacy laws like Germany, France, Italy, Spain, Netherlands, they’ve always had the requirement that you have to register the data with the data protection authorities. And in order to collect and transfer outside of the country of origination any sensitive data, it did require approval.

The difference now is that for any personal data that you collect on an individual, whether it’s an employee, a customer, or a supplier, you have to obtain unambiguous and freely given explicit consent. Now, this is any kind of data, and that includes sensitive data. The one difference with the new law is that there are just a few categories which are truly defined as sensitive data. That’s not what we usually think of as sensitive data. We think of, like, birth date. Maybe gender. That information is certainly considered sensitive under… that’s personal data under EU law and everywhere else in the world, so it has to be treated with a high degree of privacy. But the categories of political/religious affiliation, medical history, criminal convictions, social issues and trade union membership: that’s a subset. It’s considered highly sensitive information in Europe. To collect and transfer that information is now going to require explicit approval not only from the individual but from the DPA, separate from the registrations you have done.

Andy Green
So, I think what I’m referring to is what they call the Impact Assessment.
Sheila FitzPatrick
Privacy Impact Assessments have to be conducted now anytime… and we’ve always… Anytime I’ve worked with any company, I’ve implemented Privacy Impact Assessments. They’re now required under the new GDPR for any collection of any personal data.
Andy Green
But sensitive data… I think they talked about a DNA data or bio-related data.
Sheila FitzPatrick
Oh no. What happened under GDPR is they have expanded the definition of personal data. That’s not about the sensitive categories; it’s expanding the definition of personal data to include biometric information, genetic information, and location data. That data was never included under the definition of personal data, because the belief was, well, you can’t really tie that back to an individual. They have found out since the original laws were put in place that yes, you can indeed tie that back to an individual. So, that is now included in the definition.
Andy Green
So it’s sort of catching up a little bit with the technology?
Sheila FitzPatrick
Yeah. Exactly. But part of what GDPR did was go from being a law around the processing of personal data to a law that really moves you into the digital age. It’s anything about tracking or monitoring or tying different aspects or elements of data together to be able to identify a person. So, it’s really entering the digital age; it’s trying to catch up with new technology.
Andy Green
I have one more question on the GDPR subject. There’s some mention in the law about outside bodies that can certify…?
Sheila FitzPatrick
Well, they’re talking about having private certifications and privacy codes. Right now, those are not in place. The highest standard you have right now for privacy law is what’s called Binding Corporate Rules. Of the companies that have Binding Corporate Rules in place, there are fewer than a hundred worldwide. And actually, I’ve written them for a number of companies; NetApp has Binding Corporate Rules in place. That is the gold standard. If you have BCRs, you are 90% compliant with GDPR. But the additional certifications that they’re talking about aren’t in place yet.
Andy Green
So, it may be possible to get a certification from some outside body, and that would somehow help prove your… I mean, if an incident happens and the DPA looks into it, having that certification should help a little bit in terms of any kind of enforcement action?
Sheila FitzPatrick
Yes, it certainly will, once they come up with what those are. Unless you have Binding Corporate Rules. But right now… I mean, if you’re thinking of something like TRUSTe, no, there is no TRUSTe certification for this. TRUSTe is a US certification for privacy, but it’s not a certification for GDPR.
Andy Green
Alright. Well, thank you so much. It’s great to talk to an expert and get some more perspective on these questions.

Are Wikileaks and ransomware the precursors to mass extortion?


Despite Julian Assange’s promise not to let Wikileaks’ “radical transparency” hurt innocent people, an investigation found that the whistleblowing site has published hundreds of sensitive records belonging to ordinary citizens, including medical files of rape victims and sick children.

The idea of having all your secrets exposed, as an individual or a business, can be terrifying. Whether you agree with Wikileaks or not, the world will be a very different place when nothing is safe. Imagine all your emails, health records, texts, and finances open for the world to see. Unfortunately, we may be closer to this than we think.

If ransomware has taught us one thing, it’s that an overwhelming amount of important business and personal data isn’t sufficiently protected. Researcher Kevin Beaumont says he’s seeing around 4,000 new ransomware infections per hour. If it’s so easy for an intruder to encrypt data, what’s stopping cybercriminals from publishing it on the open web?

There are still a few hurdles for extortionware, but none of them are insurmountable:

1. Attackers would have to exfiltrate the data in order to expose it

Ransomware encrypts data in place without actually stealing it. Extortionware has to bypass traditional network monitoring tools that are built to detect unusual amounts of data leaving a network quickly. Of course, files could be siphoned off slowly, disguised as benign web or DNS traffic.
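To make that trade-off concrete, here is a minimal, purely illustrative sketch of the kind of heuristic a network monitor might apply to spot DNS tunneling: flag queries whose subdomain labels are unusually long or high-entropy, since encoded payloads tend to look random. The thresholds and the naive domain parsing here are assumptions for illustration, not a production detector.

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character; encoded payloads score high."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_like_dns_exfiltration(query: str,
                                max_label_len: int = 40,
                                entropy_threshold: float = 3.5) -> bool:
    """Flag DNS queries whose subdomain labels are unusually long or
    high-entropy (hypothetical thresholds, naive TLD handling)."""
    labels = query.rstrip(".").split(".")
    subdomain = labels[:-2]  # crudely drop registered domain + TLD
    for label in subdomain:
        if len(label) > max_label_len:
            return True
        if len(label) >= 16 and shannon_entropy(label) > entropy_threshold:
            return True
    return False

# A base32-style encoded chunk of data smuggled as a subdomain label:
print(looks_like_dns_exfiltration(
    "mzxw6ytboi2gs3thn5xgk4tuojsxgcq33aamfrgg.evil-cdn.example.com"))
print(looks_like_dns_exfiltration("www.example.com"))
```

The point of the sketch is the attacker’s counter-move: chunk the payload into short, low-volume queries spread over hours and a simple per-query heuristic like this never fires.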

2. There is no central “wall of shame” repository like Wikileaks

If attackers teamed up to build a searchable public repository for extorted data, it’d make the threat of exposure feel more real and create a greater sense of urgency. Wikileaks is very persistent about reminding the public that the DNC and Sony emails are out in the open, and they make it simple for journalists and others to search the breached data and make noise about it.

3. Maybe ransomware pays better

Some suggest that the economics of ransomware are better than extortionware’s, which is why we haven’t seen the latter take off. On the other hand, how do you recover when copies of your files and emails are made public? Can the DNC truly recover? Payment might be the only option, and one big score could be worth hundreds of ransomware payments.

So what’s preventing ransomware authors from doing both? Unfortunately, not much. They could first encrypt the data, then try to exfiltrate it. If they get caught during exfiltration, it’s no big deal: just pop up the ransom notification and claim the BTC.

Ransomware has proven that organizations are definitely behind the curve when it comes to catching abnormal behavior inside their perimeters, particularly on file systems. I think the biggest lesson to take away from Wikileaks, ransomware, and extortionware is that we’re on the cusp of a world where unprotected files and emails will regularly hurt businesses, destroy privacy, and even jeopardize lives (I’m talking about hospitals that have suffered from cyberattacks like ransomware).
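One way defenders close that gap is rate-based anomaly detection on file activity: ransomware and bulk exfiltration both touch far more files in a short window than normal work does. Here’s a minimal sketch of that idea; the `FileActivityMonitor` class, its thresholds, and the simulated event stream are all hypothetical, chosen only to show the sliding-window technique.

```python
import time
from collections import deque

class FileActivityMonitor:
    """Toy burst detector: alert when one user touches more files in a
    sliding time window than a (hypothetical) baseline allows."""

    def __init__(self, max_events=100, window_seconds=60.0):
        self.max_events = max_events
        self.window = window_seconds
        self.events = {}  # user -> deque of event timestamps

    def record(self, user, path, now=None):
        """Record a file modification; return True if the user's recent
        activity exceeds the threshold (i.e. raise an alert)."""
        now = time.time() if now is None else now
        q = self.events.setdefault(user, deque())
        q.append(now)
        # Drop events that have fallen out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_events

monitor = FileActivityMonitor(max_events=100, window_seconds=60.0)
# Simulate ransomware-style behavior: 150 file writes in ~1.5 seconds.
alerts = [monitor.record("alice", f"/share/doc{i}.txt", now=1000.0 + i * 0.01)
          for i in range(150)]
print(any(alerts))  # the burst trips the threshold
```

Real products do this with kernel-level file event feeds and per-user baselines rather than a fixed threshold, but the core signal, an abnormal burst of file modifications, is the same.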

If it’s trivially easy for noisy cybercriminals that advertise their presence with ransom notes to penetrate and encrypt thousands of files at will, the only reasonable conclusion is that more subtle threats are secretly succeeding in a huge way.  We just haven’t realized it yet…except for the U.S. Office of Personnel Management. And Sony Pictures. And Mossack Fonseca. And the DNC. And…

[Podcast] Attorney and Data Scientist Bennett Borden, Part I: Data Analysis...

[Podcast] Attorney and Data Scientist Bennett Borden, Part I: Data Analysis Techniques

This article is part of the series "[Podcast] Attorney and Data Scientist Bennett Borden". Check out the rest:

Leave a review for our podcast & we'll send you a pack of infosec cards.

Once we heard Bennett Borden, a partner at the Washington law firm of DrinkerBiddle, speak at the CDO Summit about data science, privacy, and metadata, we knew we had to reengage him to continue the conversation.

His bio is quite interesting: in addition to being a litigator, he’s also a data scientist. He’s a sought-after speaker on legal tech issues. Bennett has written law journal articles about the application of machine learning and document analysis to ediscovery and other legal transactions.

In this first part in a series of podcasts, Bennett discusses the discovery process and how data analysis techniques came to be used by the legal world. His unique insights on the value of the file system as a knowledge asset as well as his perspective as an attorney made for a really interesting discussion.

Continue reading the next post in "[Podcast] Attorney and Data Scientist Bennett Borden"

Let’s Get More Serious About AR and Privacy

Let’s Get More Serious About AR and Privacy

Augmented Reality (AR) is the technology of the moment. While some of us have already experienced the thrill of catching a Dragonite in Pokemon Go, AR is not just all fun and games. In fact, depending on how an AR gadget is used, it can have significant privacy implications.

Privacy in Public

Augmented reality enhances real images with digital special effects — it’s reality assisted by coding.  These gadgets generally let you record a scene, and then they give you the option of sharing on social media.

In the public space, you don’t have an expectation of privacy. As an amateur photographer myself, I was always told to be polite and ask permission of a stranger before taking a picture. If you’re curious, there’s a professional code of ethics that spells this out.

But doctors, bankers, lawyers, and some others are under real legal obligations when it comes to taking pictures of people and personal information.

Privacy at the Doctor’s

Suppose a doctor armed with an AR device (or a video recorder) films a waiting room filled with people. The doctor may not necessarily need consent in this case, but some states and hospital associations have their own laws and guidelines in this area.

If the doctor photographs a patient’s face for clinical purposes, usually the general HIPAA consent form would be sufficient.

But if the doctor were to use the video of the waiting room or clinical pictures for marketing purposes, HIPAA requires additional authorization.

In general, hospital employees and visitors (except when recording family members) need consent before photographing or videoing people in a hospital setting.

Mark my words, but at some point a HIPAA case will be brought against hospital workers fooling around with Pokemon Go as they wander the medical corridors hunting for Vaporeons.

By the way, photos or videos showing faces are considered protected health information (PHI).

If they were then stored, they would have to be protected in the same way as HIPAA text identifiers. And an unauthorized exposure of this type of PHI would be considered a breach.

Outside the Hospital Setting

These AR gadgets can also be a privacy problem in business and legal settings. If an outsider or unauthorized person with AR glasses were recording confidential data, trade secrets, or PII on someone’s desk or on their screen, then that would be considered a security leak.

And relevant laws such as Gramm-Leach-Bliley and Sarbanes-Oxley would kick in.

A judge recently banned Pokemon Go in the courtroom, but this seems to be more a case of legal etiquette.  Another judge was somewhat upset — and tweeted about it — that a defense counsel was using AR glasses, but apparently nothing illegal was done.

It’s a little premature to become too worried about the privacy and security issues of AR gadgetry with so many more pressing security problems.

However, it’s not a bad idea for your company to come up with initial guidelines and policies on AR device usage by employees and visitors.

Top Minds in PCI Compliance

Top Minds in PCI Compliance

With countless data breaches hitting the front page, many are turning to the Payment Card Industry Data Security Standard (PCI DSS), an invaluable list of controls that guides, influences, and promotes security.

However, some merchants argue that these controls provide too much security, while security professionals think they provide too little.

So what do the experts think about PCI DSS? Here are five worth listening to:

1. Laura Johnson


As Director of Communications for PCI Security Standards Council (SSC), Laura Johnson is responsible for creating communication strategies that inform, educate, and help PCI SSC global stakeholders to participate in PCI programs and initiatives.

If you want to learn about PCI fundamentals, check out her blog. There, you’ll also find the latest and greatest on PCI DSS 3.2.

2. Anton Chuvakin / @anton_chuvakin


Not only is Anton Chuvakin an infosec expert, but he’s also super knowledgeable about PCI DSS compliance, offering the best dos and don’ts to keep everyone’s payment cards safe. Currently, Dr. Anton Chuvakin is a Research Vice President on Gartner’s Technical Professionals (GTP) Security and Risk Management Strategies team.

According to Mr. Chuvakin, many make the mistake of adhering to the specific PCI DSS tasks only right before a compliance assessment. In reality, you need to adhere to the standards at all times, since security doesn’t start and end with PCI compliance.

By the way, get his book on PCI Compliance! You won’t regret it!

3. Nancy Rodriguez


Nancy Rodriguez is currently the Enterprise PCI Governance Leader at Wells Fargo, responsible for coordinating and conducting PCI risk assessments.

Her contributions to the industry are wide and varied and started over 25 years ago. She has been a trusted advisor at Citi for all global PCI programs, a member of the Board of Advisors of PCI SSC, and a PCI Compliance Director at Philips.

See what others have to say about Rodriguez, here.

4. Troy Leach / @TroyLeach


Troy Leach is the Chief Technology Officer for the PCI Security Standards Council (SSC). He partners with industry leaders to develop standards and strategies to ensure that payment card data and infrastructure are secure.

If you want to hear more from Mr. Leach and Mr. Chuvakin on what they have to say about the balance between PCI DSS compliance and security, check out this insightful interview. Also Mr. Leach regularly tweets out links to stories on bank hackers, robberies, and ATM thieves – it’ll feel like you’re watching an episode of Law and Order!

5. John Kindervag / @kindervag


Mr. Kindervag is a leading expert on wireless security, network security, security information management, and PCI data security. Currently he is Forrester’s Vice President and Principal Analyst serving security and risk professionals.

In this TechTarget article, Mr. Kindervag dispels the five biggest misunderstandings about PCI DSS.

Want to learn more?