Troy Hunt is a web security guru, Microsoft Regional Director, and author whose security work has appeared in Forbes, Time Magazine and Mashable. He’s also the creator of “Have I been pwned?”, the free online service for breach monitoring and notifications.
In this podcast, we discuss the challenges of the industry, learn about his perspective on privacy, and revisit his RSA talk, “Lessons from a Billion Breached Data Records,” as well as a more recent talk, “The Responsibility of Disclosure: Playing Nice and Staying Out of Prison.”
After the podcast, you might want to check out the free 7-part video course we developed with Troy on the new European General Data Protection Regulation, which takes effect on May 25, 2018. It will change the landscape of regulated data protection law and the way that companies collect personal data. Pro tip: GDPR will also impact companies outside the EU.
Cindy Ng: Troy Hunt is a web security guru, Australian Microsoft Regional Director and author whose security writing has appeared in Forbes, Time Magazine, and Mashable. In this podcast, we talk about his popular website, Have I Been Pwned, the morals and ethics in the work we’re involved in, and one thing everyone should get when it comes to security.
I’d like to try to capture on a podcast things that we can’t do in writing or in visual format and I think there’s an emotional aspect in audio. It really helps people get to know more of who you are.
Troy Hunt: You know, those were the exact words that just came to mind as you were saying it because there’s a lot of feeling and sentiment that gets lost when you just throw things out, isn’t there?
Cindy Ng: Mm-hmm. Definitely. And you have a site, Have I Been Pwned, that notifies people when there’s a data breach. And I was listening to your recording that you did at RSA, “Lessons from a Billion Breached Records.” I thought it was really interesting that you were making the case that kids, they’re 18, 19, 20 years old that are hackers, then you’re mediating conversation with them. Do you talk to their parents?
Troy Hunt: No, I just tell them to go to their room and think about what they’ve done. And we…no, I can’t do that. I feel like doing that at times because you get the sense, and to be clear, when we say, “talk,” it is all text, right? This is not what you and I are doing. And to the earlier point, that, yeah, this doesn’t sort of convey emotion and sentiment and maturity in the same way as a voice discussion does. This is all sort of text-based chat. And you sort of get the impression from the style of chat, the words that are used, the references that are made, you build up this mental image of who you’re talking to, right? And time and time again, it’s like this is a young male, it’s either legally a child, you know, normally 15, 16, 17, or a very young adult, maybe sort of early 20s at the eldest. And time and time again, we see that that plays out to be the case. And particularly when we look at historical incidents of the likes of “hacktivists” being arrested and charged, and that’s a little bit of a liberally-used term, I suspect, hacktivist. Very often we see people that have been breaking into systems and causing havoc not necessarily for sort of monetary gain or personal advancement, but just because it was there, just for the lulz. We see this pattern time and time again. Look, I mean, certainly, at that age, people are independent enough that I’m not going to end up in conversations with their parents. That would be condescending for me to go, “Hey, is your mom or dad there?” You know, “Can I have a chat to them?” So, we don’t normally end up in that direction.
Cindy Ng: It’s just funny that you’re engaging with them in a very human way to verify a breach or in the process of.
Troy Hunt: You know, they are human.
Cindy Ng: Well, we really don’t know what hackers look like. We have a certain kind of image of them.
Troy Hunt: Well, I mean, yes and no. So we have put faces to them insofar as we have seen many previous incidents where we’ve seen these people, this, you know, class of person, charged and turn up publicly. I mean, some of very sort of high-profile ones have been the likes of some of the individuals from the LulzSec hacktivist group that were very active around 2011. So we know of people like Jake Davis who was about 18 at the time who was charged. We know it’s Jake because he was up there in the news as a sort of a high-profile catch, if you like, for the authorities. And we’ve also seen him and others as well in that group actually go on to do some really cool stuff in very productive ways. So, you know, I guess there is this part of us which knows in an evidence-based way who these individuals tend to be and the demographic they fit into. And then the point I make, particularly in the introduction of that talk about the billions of breached records, is that there’s this other side which is how hackers are portrayed online. Look, I mean, there’s lots of recordings of this talk, “Lessons from Billions of Breached Records,” that people can go and have a look at and see what I show, but when you go to, say, Google Images Search, and you search for “hacker,” it’s like hoodies and green screens and binary and stuff everywhere. And it’s all scary imagery. And we’ve got probably the media to blame for a lot of that, we’ve got security companies to blame for a lot of that, because they like to make this stuff scary because the more scared you are, the more security stuff you buy. So we get this sort of portrayal which is very out of step with the individuals themselves. Now, part of that as well is blown up by those individuals, whilst they are anonymous, and they’re feeling invincible, and they feel that they sort of, you know, rule the world. Having a lot of sort of bravado in the way they present themselves, the way they talk. 
If we have a look at when we see things like attacks where data is held for ransom and we see individuals asking for money. The language they use, and the way they conduct themselves seems enormously confident, they feel infallible, they sound kind of scary. And we sort of see these three aspects, so the way they present themselves, the way the media categorizes them, and then who they actually are once they’re unveiled, and those three stories tend to actually be quite different.
Cindy Ng: And do they work with others, say, if you get an encounter with ransomware, and you go to their site and there’s tech support, customer support, are those people working independently, and are they 18 years old?
Troy Hunt: You know, the way I like to explain it, and certainly, it’s not just me, I see other people use these categories as well, is, there are sort of three particular demographics that we regularly see time and time again. And one demographic is this class of individual that I’ve just been discussing, which is sort of your hacktivist, your individual who’s out there in pursuit of a greater cause, very often just bored kids with time on their hands. And, you know, they’re dangerous because they’re bored kids. Bored kids can be pretty dangerous. But they’re not necessarily overly sophisticated, their attention spans can be a little short on the target, if there’s not something fun and easy there, and then they move on.
There is this other category of attacker which is those that are actually out there for commercial gain, so those that have an ROI. And these are the sort of the career criminals. And, you know, this is a really interesting group, and it speaks more to the ransomware-style class of attacker where they are out there to try and make money. Now, very often, your hacktivist is out there because something was there or it was fun, it was, again, for the lulz. But these guys are saying, “Look, we’ve actually got an ROI here. We’re going to invest in vulnerabilities, we’re going to invest in exploits, we’re going to invest in botnets. We’ll spend money where it makes sense to make money. We will target organizations with the expectation of getting a return.” They’re not necessarily out there to get press and media, they’re out there to make money. And something like ransomware is a really good example there. They’ll indiscriminately target anyone that can be infected. So I shouldn’t laugh, but I was actually in a dentist’s just two days ago and whilst I was there they were busily discovering that they had ransomware. And, oh, man, watching that unfold. But inevitably, there’s someone behind that who’s out there to make money. And that’s sort of the second category and I suspect we’ll spend a bit more time there, and then in this third category we speak about state actors and sort of nation state hackers, which, of course, is also becoming a very big thing these days.
Cindy Ng: Well, I’d like to tie it into a future event that you’ll be presenting. I think it’s called “Playing Nice and Staying Out of Prison.” And I want to hear more about that event because it reminds me of these investment bankers who got caught doing insider trading, and they said that everyone inside the community was doing it and making money. And then the FBI reminded these bankers that just because you disagree with a law, it does not mean you can break it. And so I feel like we’re treading in interesting territory where you’re sitting in front of a computer, you don’t necessarily have to be in a suit, but it’s still considered like a white-collar crime? Is this something that you’re presenting on, or?
Troy Hunt: No, look, you’re pretty much right there. In fact, the talk I’m doing, it’s at the AusCERT Conference in Australia. And it’s the only conference I go to that I can walk to, which I’m very happy about. Because normally I’ve got to get on airplanes. But this talk is called “The Responsibility of Disclosure: Playing Nice and Staying Out of Prison.” And it’s actually a talk that AusCERT asked me to do. So AusCERT is a national computer emergency response team, it’s an organization that provides services to companies in Australia to help them deal with things like security incidents. And in fact, I worked with them quite a bit last year when the Red Cross blood bank service inadvertently published their database backups publicly. So I’ve had quite a bit to do with them, and they really wanted me to talk about, how do we do responsible disclosure in a responsible fashion? So I talk a lot about the way individuals need to go about their responsible disclosure, and I’ve got an example here, I’ll give everyone a highlight before I talk about it. I got an email the other day from a guy, and the guy says, “I’m a fledgling IT professional that likes to delve into web development and security.” It’s like, “Oh, that’s very nice, thank you for emailing me.” And then he goes on and he says, “I recently discovered a bug in an American company’s website which reveals the names, birthdates, email addresses, physical addresses, and phone numbers of their customers.” And you’re sort of going, “Okay, well, that’s bad, but he’s discovered it,” so now he’s at this crossroads where he can do the right thing and get in touch with the organization or he can go down various shades of gray and do the wrong thing.
And the next thing he says is, he says, “This may have been dumb,” that turns out to be a very insightful comment, “But I wrote a script to grab the first 10,000 records to confirm the exploit is what I thought it was.” And this is a really good example of where the guy could have grabbed the first one record and said, “Hey, look at this, I can see someone else’s record, now I’m going to get in touch with the company and let them know.” And they would’ve gone, “Okay, well, look, he’s gone far enough to see one record.” And let’s say they did want to get all legal, and he’s gotta stand there in front of the judge and go, “Look, mate, I saw one record, I reported it, I handled it ethically.” But instead, he’s gone and grabbed 10,000 records of other people’s personal data. And as soon as you go down that road, now you’ve got a big problem. Because the entity involved is going to be accountable or certainly is going to be held accountable for contacting those 10,000 people and saying, “Hey, someone else grabbed your data.” And that’s going to invoke all sorts of other legal obligations on their behalf as well. So even though I don’t think this individual had malicious intent, obviously he went way, way, way beyond what he actually needed to. And it’s just interesting how…and, you know, look, maybe the script took him 20 minutes to write, but it’s interesting how there’s just these continual crossroads where it’s so easy to do the right thing, but it’s also so easy to put yourself in serious risk of legal action.
Cindy Ng: We’ve talked about this on our podcast a couple of times about having maybe like a technologist Hippocratic Oath in the same way that doctors might take an oath. And also, a possible problem with the law not necessarily having been caught up with how fast technology is changing. Is there something that you’ve seen that’s helpful for people? Because it’s complicated.
Troy Hunt: I think the analogies that try to compare what we do in this industry with other industries often don’t fly real well. And say if we want to sort of compare ourselves to doctors, look, when you’ve got 15-year-old kids at home sort of doing heart surgery on an ad hoc basis, well, then, you know, then we make comparisons. But it is very different because to be a doctor you’ve got, you know, years and years and years of training, you’ve got to have qualifications, there’s enormous amounts of oversight and regulation and everything else. And by the time you’re actually out there practicing as a doctor and doing things that impact people’s health, obviously, we have a huge amount of confidence that these are going to be people doing the right thing, that, you know, properly experienced.
Now, when we compare that to… I mean, let’s look broader than just security, let’s look at IT. When you compare that to what do you have to do to get involved in IT, read a book? You know, like, very, very little. And that’s kind of a…what both makes it great and makes it horrifying. Where we can have people out there building systems, leaving people at risk, or conversely, people out there that have got enough capability to find vulnerabilities, but maybe not quite enough in the sort of ethics front to handle them properly.
I don’t think anything around the sort of IT Hippocratic Oath or anything around IT certifications that everyone should have is ever going to be a feasible thing.
Cindy Ng: I don’t know, I’m thinking, too, though, that it takes years and years to figure out how to build a layered security system, for instance, and it takes a lot of manpower.
Troy Hunt: Yeah. Yeah. Yes and no. Yeah, part of the challenge here is that we operate on such a global scale as the internet. And one of the things that organizations and the industry are always concerned about is that if we’re overly burden… I mean, let’s say in the U.S. we’ve said, “Okay, anyone who’s going to produce software that runs on the internet has to go and do X, Y, Z certifications.” And then they become burdened to do that; regardless of what the upside is, there’s a time and a financial cost to do it. And then, someone goes, “Well, we could offshore it to India, and they don’t have to do that, it’d be a lot cheaper.” You know, so unless you get consensus on a global scale, because we are talking about a global resource, being the internet, it’s just not going to happen. So it is a complex problem but, you know, by the same token as well, when we look at…and we’re probably sort of talking more about the defensive side here than the offensive, but when we look at where most software goes wrong, in terms of the vulnerabilities, and certainly when I look at the data breaches that I see day in and day out, these are really low-hanging risks that one person could have secured very easily if they just didn’t write that code that was vulnerable to SQL injection or if they just didn’t put their database backup in a publicly-facing location. It’s very often very low-hanging fruit in terms of the problems that are introduced, and consequently, they’re problems that could be easily fixed.
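To make that low-hanging fruit concrete, here is a minimal sketch of the SQL injection point, using Python’s standard sqlite3 module purely as an illustration (the table and data are invented, not from anything Troy describes). The only difference between the vulnerable and safe versions is whether user input is concatenated into the query or passed as a parameter:

```python
import sqlite3

# A throwaway in-memory database with one user.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice@example.com', 'Alice')")

# A classic injection payload supplied as "user input".
email = "nobody@example.com' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query,
# so the OR clause matches every row in the table.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE email = '" + email + "'"
).fetchall()

# Safe: a parameterized query treats the whole input as data, not SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE email = ?", (email,)
).fetchall()

print(vulnerable)  # [('Alice',)] - the injection matched every row
print(safe)        # [] - no user literally has that email address
```

The fix is exactly the one-person, one-line change Troy is talking about: use the placeholder the database driver already provides.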
Cindy Ng: Is that part of your “Hacking Yourself” course that’s super-popular and remains one of your most popular courses?
Troy Hunt: No, you’re right, and the premise of hacking yourself first is that it’s very much targeted at people building systems, and it’s saying, “Hey, guys, it would be really good if you actually understood how things like SQL injection work.” So not just, you know, do you understand how T-SQL works and how you query a database, but do you understand how people break into the software that you’re writing?
So I have an online course with Pluralsight, it’s, I think, about 9 or 10 hours’ worth of content on “Hack Yourself First” and I also do these workshops around the world where I sit with developers for a couple of days and we go through all of these aspects of building software and where the vulnerabilities are. And developers get this sort of first-hand experience of breaking their own things. And it’s amazing to watch the lightbulbs go on in people’s minds as they see how their beautiful software gets abused in all sorts of ways they never expected. And by hacking themselves first, that gives them this much more sort of defensive mindset. And as well as having a lot of fun doing it, developers do actually like breaking stuff. It also means that when they go forward and they build new software, that they’re thinking with a much more defensive mindset than what they ever had before.
Cindy Ng: When you say, “lightbulbs go off,” what are some common things that they go, “Oh, I never really thought about it that way,” or, “This really changed my worldview?”
Troy Hunt: Well, a really good example is enumeration risks. So when you go and let’s say when you register on a website. You put in an email address that already exists on the site, and the site says, “You can’t use this, someone already has that email address.” Now, we see that behavior day in and day out, but the thing to think about is, well, what that means is that someone can go to your website and find out if someone has an account or not. Now, what if I take a large number of email addresses, and I keep throwing them at the registration page, and I start to build up a profile of who has accounts or not. And it suddenly starts to seem not so much fun anymore. And you say to people, “How would your business feel if they were disclosing everyone who was a member of the service?” And they sort of start to go, “Well, that wouldn’t be a really very good idea.” “So, why are you doing it?” You know? Because the defensive pattern around this is very straightforward, you know. You’ve just gotta give the same response whether the account exists or not, and then you send them an email. And you say, “Well, you’ve already got an account, go and log in,” or, “Thank you for signing up.” So there are really sort of easy ways around that, and that’s more of a sort of a logic issue than it is even a coding flaw.
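The anti-enumeration pattern Troy describes can be sketched in a few lines. Everything here (the function names, the messages, the in-memory “outbox”) is hypothetical; the point is just that the page returns an identical response whether or not the account exists, and the distinguishing information only travels over email:

```python
# Hypothetical registration handler demonstrating the pattern:
# never reveal on the page whether an email address is registered.

registered = {"existing@example.com"}  # stand-in for a user database
outbox = []                            # stand-in for an email service

def register(email):
    if email in registered:
        # The existing owner learns what happened via email only.
        outbox.append((email, "You already have an account - log in."))
    else:
        registered.add(email)
        outbox.append((email, "Thanks for signing up!"))
    # Identical response either way, so an attacker probing the
    # registration page learns nothing about who has an account.
    return "Check your email to continue."

print(register("existing@example.com"))  # Check your email to continue.
print(register("someone-new@example.com"))  # Check your email to continue.
```

As Troy says, this is a logic decision more than a coding one: the extra cost is a single email send, and the page itself stops acting as a membership oracle.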
Cindy Ng: If you were to share this talk with the business, what would they do?
Troy Hunt: Well, what it tends to do is prompt different discussions much earlier on in the design of the system. So in the case of something like enumeration, what you really want to be doing is at the point where you’re sort of collecting those business requirements and having the discussion, you need to be saying to the business owner, how important is it to protect the identities of the customers of this service? Now, depending on the nature of the business, it may be more or less important. So, for example, if it is… I mean, let’s just say it’s Facebook. Just about everybody has a Facebook account. It’s not going to be a great big sensational thing if someone goes, “Hey, I went to Facebook and I just figured out you’ve got…” Let’s subtly put this as a site for discerning adults. Would those discerning adults have an expectation that their significant other or their workmates or their boss would not be able to go to that site, enter their email address, and discover that they like that kind of content? Well, yes, I mean, that is a very good example of where privacy is much more important. So for the most part, I really don’t have a problem with either direction an organization goes, so long as it’s like an evidence-based decision and they arrive there having looked at the upsides and the downsides and gone, “Well, on balance, this is the right thing to do.”
Cindy Ng: You mentioned privacy. Even though people are sharing their information online, people are also worried about their privacy because you’ve heard 60 Minutes do a segment on data brokers selling our data, and all the data breaches that you hear almost every day, and I think technology’s held to a higher standard because we’re seen as progressive technology people who are basically reimagining how we’re interacting with the world, and we’re creating awesome wearables and apps, and what is your take on our worldwide debate on privacy? Are consumers worried enough? They’re not worried enough? Or, of course, you can’t speak for everyone in this world, but I want to hear from you.
Troy Hunt: It’s extremely nuanced and it’s nuanced for many reasons. So one of the reasons is, I first used the internet in 1995 and I was at university at the time, and for me, I’d sort of gone into adulthood without having known of an internet, and without having known of an environment where we shared this information day in and day out. And now we have situations where there are qualified adults in the workforce who have never known a time without the internet. They don’t really have a memory of a time without iPhones, or a time without YouTube, or any of these things that many of us that…and I don’t think I’m old, but, you know, many of us sort of remember a phase where we sort of gradually transitioned into this. And what it really means is that our tolerances for privacy and sharing are really different with younger generations than what they are with my generation, and certainly with older generations as well. And this makes things really interesting because those individuals are now starting to have a lot more influence, they’re getting involved in running businesses and getting into politics and all these other things that actually impact the way we as a society operate. And they are at a very different end of the spectrum to, say, my parents’ generation, who have a Facebook account so they can look at the photos that I post of the kids but would never put their own things on there. So I think that’s one of the big things with privacy. How different it is for different generations.
The other thing that’s really interesting with privacy now is the number of devices we have that are collecting very private information. So, you know, I have an Apple Watch, and that collects a lot of data and it puts it in the cloud. We have people that have things like Alexa at home, you know, or an Amazon Echo. So, smart devices that are listening to you. We have this crazy IoT state at the moment where everything from TVs, which are effectively listening to us in our lounge rooms, and we’ve seen the likes of the CIA exploiting those, all the way through to adult toys that are internet-connected and have been shown to have vulnerabilities that disclose your private usage of them. So this is the sort of interesting paradox now: we’ve got so much collection of this very, very personal, very private data, yet, on the other hand, we’re also seeing increasing regulation to try and ensure we have privacy. So we’ve got things like GDPR hitting in about a year’s time, which is very centered around putting the control of personal data back into the hands of those who own it. And stuff like that becomes really interesting, because we’re saying, “Hey, under GDPR, you might have a smart fridge, and the organization that holds the data from your smart fridge needs to recognize that it’s your data and you can have it erased and you can have access to it and do whatever you want. And there’s going to be more of it than ever because the fridge is constantly talking about, I don’t know, whatever it is a smart fridge talks about.” So it’s a really interesting set of different factors all happening at the same time.
Cindy Ng: What’s a question that you get over and over again that you get tired of explaining that, “I wish people would just get this right.”
Troy Hunt: I would say, why do you need a password manager? This week, I loaded more than a billion records into Have I Been Pwned from a couple of what they call combo lists. So, these are just big lists of email addresses and passwords built up from multiple different data breaches, we don’t even know how many. And they’re used for these credential stuffing attacks where attackers will take these lists, they’ll feed them into software which is designed to test those credentials against services like anything from your Gmail to your Spotify account, to whatever else they can figure out to do. And they go and find how many places you’ve reused your password. Because if you use the same password on LinkedIn, which got breached a few years ago, although we only saw the data come out last year, and you use that same password on Spotify, then when someone’s got your LinkedIn password and they go to Spotify, well, you know, now you’ve got a problem. Now they’re in your Spotify account. And they might sell that for some small number of dollars along with hundreds of thousands of other ones. So people sort of go, “Well, yeah, but it’s hard to have unique passwords that are strong.”
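Have I Been Pwned’s companion Pwned Passwords service exposes a k-anonymity range API (GET https://api.pwnedpasswords.com/range/<first 5 hex chars of the SHA-1 hash>) that password managers and sites use to check whether a password has appeared in a breach without ever transmitting the password itself. Here is a rough sketch of the client side, with the helper names being my own and the actual network call left out:

```python
import hashlib

def hibp_range_query(password):
    """Split a password's SHA-1 hash for HIBP's k-anonymity model.

    Only the 5-character prefix ever leaves your machine; the API
    returns every breached-hash suffix sharing that prefix, and the
    match is checked locally.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def breach_count(suffix, api_response):
    # api_response is the text body returned by
    # GET https://api.pwnedpasswords.com/range/<prefix>,
    # one "SUFFIX:COUNT" pair per line.
    for line in api_response.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0  # not found in any known breach

prefix, suffix = hibp_range_query("password")
print(prefix)  # 5BAA6 - the only thing that would go over the wire
```

The design choice is the interesting part: because many hashes share any given 5-character prefix, the service never learns which password (or even which full hash) you were checking.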
Cindy Ng: You’re no doubt extremely influential in this security space and there has been endless talk about how to bring a more diverse group into the space, and I’m wondering if you would like to provide a statement of support so that women and minorities aren’t just self-organizing?
Troy Hunt: So this is an enormously emotionally-charged subject. You know, like let’s just start there and I’m always really, really cautious because sometimes I’ll see things said on both sides of the argument, and I’ll just go, “Well, you’ve lost the plot.” But as soon as you weigh in on these things publicly, it can get very nasty.
So I guess for context, I mean, I’ve got a son and a daughter so I’ve got a foot in both camps there. I’ve got a wife who is becoming more active as a speaker who is actually on a security professionals panel in that same event I just mentioned in a couple of weeks’ time talking about diversity in security. And I’ve been involved in organizing conferences where we have to choose speakers as well. It’s a very difficult situation, particularly in that latter scenario because we all want to have diversity of people because the diversity gives you a richer experience. It gives you many different perspectives and backgrounds, rather than seeing the same cast of people over and over and over again.
On the other hand, we’re also really cautious that we don’t end up in a situation where we’re saying, “We’re going to choose someone because of their gender or their race or their political view or their sexuality or whatever it may be. Not because they have good content, but because of some other attribute which they’ve inherited.” And we’re very, very cautious with that, and interestingly, for my wife and for other women I speak to, the last thing in the world they want is to be chosen just because of their gender as opposed to their capabilities. So, it becomes a really, really difficult situation.
And what I find is that we know that in technology in general, women are massively underrepresented as a gender, and anecdotally, I would say within security, it’s even more significant than that. It’s a very, very male-dominated sector. So I think it’s a really difficult thing, and interestingly, there are parts of the world where that bias is very, very different. So apparently, Egypt has a really, really strong representation of women. I think I heard it was about half or even more. So there seems to be some cultural biases that come into play, too.
Honestly, I don’t have good answers for this other than trying as parents to give our kids equal opportunities and see what they’re drawn to. Obviously, trying to have cool, inclusive environments, we certainly see behavior at times which would be very uncomfortable for women, and that’s not cool, that’s not going to make anyone feel happy. So certainly, the conferences I’m involved in really put a lot of effort into not sort of creating that environment. And to be fair as well, we’re not saying to fundamentally change normal behaviors, we’re saying, “Like, let’s just not be dicks.” You know? “Like, let’s all be nice people.” And this is very often what it boils down to.
Ultimately, though, until this sort of pipeline of professionals coming through changes such that it’s more evenly represented, we are going to have a significant bias towards certain genders and races and nationalities, and that’s something way, way upstream that we have no immediate control over at the moment.