Since we’ve been working on the blog, Cindy and I have chatted with security professionals across many different areas — pen testers, attorneys, CDOs, privacy advocates, computer scientists, and even a guru. With 2016 coming to an end and the state of security looking more unsettled than ever, we decided it was a good time to take stock of the collective wisdom we’ve absorbed from these pros.
The Theory of Everything
A good place to begin our wisdom journey is the Internet of Things (IoT). It’s where the public directly experiences all the issues related to privacy and security.
We had a chance to talk to IoT pen tester Ken Munro earlier this year, and his comments on everything from wireless coffee pots and doorbells to cameras really resonated with us:
“You’re making a big step there, which is assuming that the manufacturer gave any thought to an attack from a hacker at all. I think that’s one of the biggest issues right now is there are a lot of manufacturers here and they’re rushing new product to market …”
IoT consumer devices are not, cough, based on Privacy by Design (PbD) principles.
And over the last few months, consumers learned the hard way that these gadgets were susceptible to simple attacks that exploited backdoors, default passwords, and even non-existent authentication.
Additional help to the hackers was provided by public-facing router ports left open during device installation, without any warning to the poor user, and unsigned firmware that left their devices open to complete takeover.
As a result, IoT is where everything wrong with data security seems to show up. However, there are easy-to-implement lessons that we can all put into practice.
Is security always about passwords? No, of course not, but poor passwords or password defaults that were never reset seem to show up as a root cause in many breaches.
The security experts we’ve spoken to have often brought up, without prompting from us, the sorry state of passwords. One of them, Per Thorsheim, who is in fact a password expert himself, reminded us that one answer to our bad password habits is two-factor authentication (TFA):
“From a security perspective, adding this two-factor authentication, is everything. It increases security in such a way that in some cases even if I told you my password for my Facebook account, as an example, well because I have two-factor authentication, you won’t be able to log in. As soon as you type in my user name and password, I will be receiving a code by SMS from Facebook on my phone, which you don’t have access to. This is really good.”
We agree with Thorsheim that humans are generally not good at this password thing, and so TFA and biometric authentication will certainly be a part of our password future.
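SMS codes like the ones Thorsheim describes are generated server-side, but the authenticator-app flavor of TFA follows a published standard, TOTP (RFC 6238): both sides share a secret and derive a short-lived code from the current time. Here is a minimal sketch in Python using only the standard library; the secret below is the RFC’s own test key, not anything you’d deploy:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, digits=6, interval=30, now=None):
    """Derive a time-based one-time code per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Both parties compute the same counter: seconds since epoch / interval
    counter = int((time.time() if now is None else now) // interval)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low 4 bits of the last byte pick an offset
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890", T=59s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, digits=8, now=59))  # → 94287082
```

The security comes from the attacker holding neither the shared secret nor the phone; even a stolen password only gets them to the second prompt.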
In the meantime, for those of us who still cling to plain old passwords, Professor Justin Cappos told us a while back that there’s a simple way to generate better ones:
“If you’re trying to generate passwords as a human, there are tricks you can do where you pick four dictionary words at random and then create a story where the words interrelate. It’s called the “correct horse battery staple” method! “
Correct-horse-battery-staple is just a way of using a story as a memory trick, or mnemonic. It’s an old technique, but one that helps create hard-to-crack passwords.
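The trick is easy to mechanize. Below is a minimal sketch in Python; the short word list is just for illustration, since a real generator would draw from a large dictionary, such as the EFF’s 7,776-word diceware list, where four random words give roughly 51 bits of entropy:

```python
import math
import secrets

# Illustrative word list; a real generator would use a large
# dictionary (e.g. the EFF diceware list of 7,776 words).
WORDS = ["correct", "horse", "battery", "staple", "orbit", "velvet",
         "glacier", "pencil", "lantern", "monsoon", "copper", "trellis"]

def passphrase(n_words=4, wordlist=WORDS):
    """Pick n words uniformly at random with a cryptographic RNG."""
    return "-".join(secrets.choice(wordlist) for _ in range(n_words))

def entropy_bits(n_words, wordlist_size):
    """Bits of entropy for n words drawn uniformly from a list."""
    return n_words * math.log2(wordlist_size)

print(passphrase())                      # e.g. "lantern-orbit-copper-horse"
print(round(entropy_bits(4, 7776), 1))   # → 51.7
```

The `secrets` module matters here: ordinary `random.choice` is predictable and fine for games, but password material should come from a cryptographically secure source.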
One takeaway from these experts: change your home router admin passwords now (and use horse-battery-staple). Corporate IT admins should also take a good, hard look at their own passwords and avoid aiding and abetting hackers.
Cultivate (Privacy and Security) Awareness
Enabling TFA on your online accounts and generating better passwords goes a very long way to improving your security profile.
But we also learned that you need to step back and cultivate a certain level of privacy awareness in your online transactions.
We learned from attorney and privacy guru Alexandra Ross about the benefits of data minimization, both for the companies that collect data and the consumers who reveal it:
“One key thing is to stop, take a moment, and be mindful of what’s going on. What data am I being asked to submit when I sign up for a social media service? And question why it’s being asked.
It’s worth the effort to try to read the privacy policies, or read consumer reviews of the app or online service.”
“If you’re speaking to the marketing team at a technology company—yeah, the default often is let’s collect everything. In other words, let’s have this very expansive user profile so that every base is covered and we have all these great data points.
But if you explain, or ask questions … then you can drill down to learn what’s really necessary for the data collection.”
In a similar vein, data scientist Kaiser Fung pointed out that often there isn’t much of a reason behind some of the data collection in the first place:
“It’s not just the volume of data, but the fact that the data today is collected without any design or plan in mind. Oftentimes, people collecting the data are really divorced from any kind of business problem.”
Listen up, IT and marketing people: think about what you’re doing before you roll out your next contact form!
Ross and other PbD advocates preach the doctrine of data minimization: the less data you have, the lower your security risk is when there’s an attack.
As our privacy guru, Ross reminded us that there’s still a lot of data about us spread across corporate data systems. Scott “Hacked Again” Schober, another security pro we chatted with, made the same point based on his personal experiences:
“I was at an event speaking … and was asked if I’d be willing to see how easy it is to perform identity theft and compromise information on myself. I was a little reluctant but I said ok, everything else is out there already, and I know how easy it is to get somebody’s information. So I was the guinea pig. It was Kevin Mitnick, the world’s most famous hacker, who performed the theft. Within 30 seconds and at the cost of $1, he pulled up my social security number.”
There’s nothing inherently wrong with companies storing personal information about us. The larger point is to be savvy about what you’re being asked to provide and take into account that corporate data breaches are a fact of life.
Credit cards can be replaced and passwords changed, but details about our personal preferences (food, movies, reading habits) and our social security numbers are forever, and a great source of raw material for hackers to use in social engineering attacks.
Data is Valuable
We’ve talked to attorneys and data scientists, but we had the chance to talk to both in the form of Bennett Borden. His bio is quite interesting: in addition to being a litigator at Drinker Biddle, he’s also a data scientist. Borden has written law journal articles about the application of machine learning and document analysis to e-discovery and other legal transactions.
Borden explained how as employees we all leave a digital trail in the form of emails and documents, which can be quite revealing. He pointed out that this information can be useful when lawyers are trying to work out a fair value for a company that’s being purchased.
He was called in to do a data analysis for a client and was able to show that internal discussions indicated the asking price for the company was too high:
“We got millions of dollars back on that purchase price, and we’ve been able to do that over and over again now because we are able to get at these answers much more quickly in electronic data.”
So information is valuable in a strictly business sense. At Varonis, this is not news to us, but it’s still powerful to hear it from someone who is immersed in corporate content as part of his job.
To summarize: as consumers and as corporate citizens, we should all treat this valuable substance with more care. Don’t give it away easily, and protect it when it’s in your possession.
More Than Just a Good Idea
Privacy by Design came up in a few of our discussions with experts, and one of its principles, privacy as the default setting, is a hard one for companies to accept. But PbD insists that privacy is not a zero-sum game: you can have tough privacy controls and profits.
In any case, for companies that do business in the EU, PbD is not just a good idea; starting in 2018, it will be the law. The concept is explicitly spelled out in Article 25 of the General Data Protection Regulation (GDPR), “Data protection by design and by default”.
We’ve been writing about the GDPR and its many implications for the last two years. But one somewhat overlooked consequence is that the GDPR will also apply to companies outside of the EU.
We spoke with data security compliance expert Sheila FitzPatrick, who really emphasized this point:
“The other part of GDPR that is quite different–and it’s one of the first times that this idea will be put in place– is that it doesn’t just apply to companies that have operations within the EU. Any company regardless of where they are located and regardless of whether or not they have a presence in the EU, if they have access to the personal data of any EU citizen, they will have to comply with the regulations under the GDPR. That’s a significant change.”
This legal idea is sometimes referred to as extraterritoriality. And US e-commerce and web service companies in particular will find themselves under the GDPR when EU citizens interact with them. IT best practices that experts like to talk about as things you should do are becoming legal requirements for them. It’s not just a good idea!
Our final advice for 2016: read the writing on the wall and position yourself to align with PbD ideas on data minimization, consumer consent, and data protection.