All posts by Cindy Ng

A Guide on the Data Lifecycle: Identifying Where Your Data is Vulnerable

Data is a company’s most valuable asset. To maintain data’s value, it’s vital to identify where that data is vulnerable. According to data and ethics expert Dr. Gemma Galdon Clavell, there are five major moments where data is most vulnerable: collection, storage, sharing, analysis, and deletion. These vulnerability points increase the risk of a data breach – and we’ve all heard about the costs of having one.

Many of these vulnerability points are part of a cycle known as the data lifecycle. The data lifecycle determines where the data lives: on premises, in the cloud, with third-party vendors, and more. What’s more, understanding where data lives within your systems is how you can consistently take steps to protect both the security and the privacy of that data.

What is the Data Lifecycle?

The data lifecycle is a high-level, general process that describes how data can flow through an organization. Because the lifecycle presented here is a generic, garden-variety model, it can be adapted to many different scenarios. The data lifecycle is an important guide for security and privacy pros to consider when protecting data.

Data Collection – Where the Data Lifecycle Begins

To understand why the lifecycle starts at collection, let’s take a closer look at what happens at this stage. Without data collection, there would be nothing to analyze, no patterns to discover, and no data-driven business plans to implement. Fortunately, the ability to collect data – and the acknowledgement of how important it is to collect data – is no longer an issue. In fact, there are overwhelming incentives to design technologies in a way that maximizes the collection of personal information for potential business use.

Data scientist and statistician Kaiser Fung warns against collecting data with no particular business problem in mind: “You have a lot of expensive data, but there’s no measurement of that thing that you want to impact. And then in order to do that, you have to actually merge in a lot of data or try to collect data from other sources. And you probably often times cannot find appropriate data so you’re kind of stuck in this loop of not having any ability to do anything.”

Regulators have also caught on that the “collect first, ask questions later” mentality incentivizes design choices that may marginalize users’ privacy interests in how their data is collected and used.

Under Article 25 of the General Data Protection Regulation, many companies are now legally required to protect data by design and by default. This means companies need to integrate data protection principles into business practices right from the start – and throughout the data lifecycle. Based on a 20-year-old concept, Privacy by Design (PbD) consists of seven core principles that ensure strong privacy and personal control over a user’s personal information, as well as a sustainable competitive advantage for organizations and a positive-sum (win/win) paradigm.

Get Consent for Active and Passive Data Collection

Before you start collecting data, best practices recommend that data collectors first obtain consent prior to both active and passive data collection. Preventative measures help avoid miscommunication and enable the user to decide whether to opt in or out.

User information is collected either actively or passively. Collection is active when the user is aware of it, and passive when the collection and analysis take place behind the scenes, where the user’s activity or movements can reveal a behavioral pattern.

A user filling out a web form is an example of active data collection. The user knows they will reveal their identity with a name, address, phone number, and other types of personal information. If the user has location services on, that location information is an example of passive data collection: the consent is implied and assumed due to active use and engagement with the platform, service, or product.
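To make the distinction concrete, here is a minimal sketch in Python of gating passive collection on an explicit opt-in. The in-memory consent registry, the user ID, and the purpose names are hypothetical; it illustrates the consent-first idea rather than any particular product’s implementation.

    from datetime import datetime, timezone

    # Hypothetical in-memory consent registry. A real system would persist
    # consent records along with their scope, timestamp, and lawful basis.
    CONSENT = {
        "user-123": {"form_submission": True, "location_tracking": False},
    }

    def has_consent(user_id, purpose):
        """Return True only if the user explicitly opted in for this purpose."""
        return CONSENT.get(user_id, {}).get(purpose, False)

    def record_location(user_id, lat, lon):
        """Passive collection: refuse to store the data point without consent."""
        if not has_consent(user_id, "location_tracking"):
            print(f"{datetime.now(timezone.utc).isoformat()} "
                  f"dropped location for {user_id}: no opt-in")
            return
        # ...persist the coordinates here...
        print(f"stored ({lat}, {lon}) for {user_id}")

    record_location("user-123", 40.74, -73.99)  # dropped: user never opted in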

Types of Data Collection

Besides getting consent, another core PbD principle is to minimize the collection of data. In other words, limit personal data collection to what’s necessary to carry out the purpose for which the user consented. This limits vulnerability and risk for both the data collector and the data subject.

Be mindful of data minimization when collecting these types of data:

  1. First-party data collection: when the user provides her personal data directly to the data collector.
  2. Surveillance: when the collector observes data produced by the user without interfering with the user’s experience.
  3. Repurposing data: when reusing previously collected data for a different purpose. Best practices recommend additional consent from the data subject. Depending on the industry, when data is collected for one purpose and then reused for a completely different purpose, it can be a violation of privacy rights and possibly illegal under some regulatory requirements.
  4. Third-party collection: when collected information is transferred to a third party for further analysis.

Making Sense of the Data: Processing, Analyzing and/or Sharing

After collecting data, there’s often a strong desire for everyone in an organization to have access to the data. With data widely available, creative connections can be made more quickly, potentially advancing business initiatives and opportunities. Being first in an untapped marketplace is everything.

That’s a fair point of view, but security and privacy advocates have another recommendation. They’re not recommending absolutely no access. After all, without access to data, it’s a challenge to process and analyze the data.

Security advocates and NIST suggest that the balance lies in limiting access to only those who need to use the data. The guiding principles are maintaining a least-privilege model and role-based access controls, where data owners grant appropriate users the access they need – and only the access they need – to do their jobs.
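As a rough illustration of that principle, the sketch below models a least-privilege, role-based access check in Python. The roles, users, and share names are made up for the example; in practice these mappings come from directory groups and access control lists rather than hard-coded dictionaries.

    # Hypothetical role and user mappings; real deployments resolve these
    # from directory groups and ACLs.
    ROLE_PERMISSIONS = {
        "hr_analyst": {"read:hr_share"},
        "finance_analyst": {"read:finance_share", "write:finance_share"},
    }

    USER_ROLES = {
        "alice": {"hr_analyst"},
        "bob": {"finance_analyst"},
    }

    def can_access(user, permission):
        """Grant access only if one of the user's roles carries the permission."""
        roles = USER_ROLES.get(user, set())
        return any(permission in ROLE_PERMISSIONS.get(role, set()) for role in roles)

    print(can_access("alice", "read:hr_share"))        # True: needed for her job
    print(can_access("alice", "write:finance_share"))  # False: least privilege holds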

PbD author Ann Cavoukian says “You’ve gotta have restricted access to those who have a right to know – meaning there is a business purpose for which they’re accessing the data.”

Using a hospital as an example, she continues, “You want to enable easy access for those who have a right to know because they’re treating patients. And then the walls should go up for those who are not treating in any manner.”

Privacy advocates add that it’s vital that data collected is only used for the original purpose and intent. If you’re planning to use that data for another purpose, seek additional consent. Also, let your data subjects know if their data will be shared with third parties. It’s a user’s right to know.

Poorly protected folders with permissions that are more generous than they need to be often lure attackers. That’s where Varonis comes in handy: DatAdvantage identifies who can access data and who does access data, shows where users have too much access, and helps safely automate proper changes to access control lists and security groups. Meanwhile, the Automation Engine helps put it all on autopilot: automatically repairing inconsistent ACLs and remediating global group access.

Data Retention Policies to Remediate Risk

Whether data is on premises, in the cloud, or with third parties, companies need to consider how long it is retained by their systems. Having data retention policies and procedures in place – what to keep, what to archive – is just IT common sense.

Data retention is covered in GDPR’s Article 5, which explains that data should only be retained for as long as is required to achieve the purpose for which the data was collected and processed.

It’s not only the EU that takes this IT procedure seriously. Data retention limits also show up in the US’s HIPAA rules for personal health data and in some financial data security regulations. Data retention limits—measured in years—define the amount of time an electronic document must be kept.

The message is clear: lower your data security risk profile. If you don’t need data, delete it. The less data you have, the less damaging a breach will be. If it’s sensitive, make sure it’s only accessible to those who need it. Old and stale files are expensive and risky, which is why we have retention policies and software solutions such as Varonis Data Transport Engine – which helps archive, quarantine, and delete stale (and regulated) data.
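As a simple illustration, the Python sketch below flags files whose last modification falls outside a retention window. The seven-year window and the share path are placeholders; the real limits come from whichever regulation or policy applies, and flagged files should go through review and an audited disposal process rather than being deleted blindly.

    import time
    from pathlib import Path

    # Placeholder retention window; substitute the limit your policy requires.
    RETENTION_YEARS = 7
    CUTOFF = time.time() - RETENTION_YEARS * 365 * 24 * 3600

    def find_stale_files(root):
        """Yield files whose last modification is older than the retention window."""
        for path in Path(root).rglob("*"):
            if path.is_file() and path.stat().st_mtime < CUTOFF:
                yield path

    # Flag candidates for archival or deletion on a hypothetical file share.
    for stale in find_stale_files("/srv/fileshare"):
        print(f"retention exceeded: {stale}")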

Secure Data Destruction

Once you’ve identified what needs to be retained, the rest of the data is guided by another PbD principle: reduce security risks by deleting or archiving unnecessary or stale sensitive data embedded in files. This makes good sense. Stale data can include consumer identifiers originally collected for a short-term marketing campaign, for example, that now reside in rarely used spreadsheets or management presentations. Your organization may no longer need it, but it’s just the kind of monetizable data that hackers would love to get their hands on.

To echo the significance of data minimization, Ann Cavoukian warns, “What idle data in identifiable form does is attract hackers. It attracts rogue employees on the inside who will make inappropriate use of the data, sell the data, do something with the data.”

If stale data is no longer needed and has reached the end of its data lifecycle, the level of destruction is determined by the sensitivity of the data. NIST has recommendations for sanitizing storage devices and destroying data that range from clearing the data by overwriting it to destroying the physical medium.
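For a sense of what the “clear” level can look like, here is a minimal Python sketch that overwrites a file with zeros before removing it. It is illustrative only: SSD wear-leveling, snapshots, and copy-on-write filesystems can leave residual copies, so regulated data should be sanitized with a vetted tool following NIST SP 800-88. The filename is hypothetical.

    import os
    from pathlib import Path

    def overwrite_and_delete(path, passes=1):
        """Overwrite a file's contents with zeros, then remove it.

        Illustrates the 'clear' level of sanitization only; it is not
        sufficient for SSDs, snapshots, or copy-on-write filesystems.
        """
        target = Path(path)
        size = target.stat().st_size
        with open(target, "r+b") as handle:
            for _ in range(passes):
                handle.seek(0)
                handle.write(b"\x00" * size)
                handle.flush()
                os.fsync(handle.fileno())
        target.unlink()

    # Hypothetical stale file left over from an old marketing campaign.
    overwrite_and_delete("old_campaign_identifiers.csv")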

Protecting the Data Where It is Most Vulnerable: From Birth to Death

If data truly is more valuable than oil, it makes sense to protect it the way you protect money. From birth to the end of the data lifecycle, it is possible to protect your organization’s data first, not last. Ann Cavoukian, who has worked with both government agencies and organizations and knows a thing or two about positive-sum paradigms, says, “When you can present it that way, it gives you a seat at the table, every time.”

 

The Difference between a Computer Virus and Computer Worm

The terms virus and worm are often used interchangeably, but there are a few key differences in how they work. Both viruses and worms are types of malware: a worm is a type of virus.

What is a Computer Virus?

Computer viruses are named after human viruses that spread from person to person. A computer virus is a program made of malicious code that can propagate itself from device to device. Like a cold that alters your well-being, a virus infection alters the way your computer operates, and it can destroy your files or prevent the machine from working altogether.

A virus typically attaches itself to a program, file, or the boot sector of the hard drive. Once the virus attaches itself to that file or program (aka, the host), the host is infected.

When the infected application or file runs in the computer, the virus activates and executes in the system. It continues to replicate and spread by attaching replicas of itself to other files and applications in the system.

How Does a Computer Virus Spread?

A virus spreads when the infected file or program migrates through networks, file collaboration apps, email attachments, and USB drives. Once a user opens the infected file or program, the vicious cycle repeats itself all over again.

Typically, the host program continues to function after the viral infection, but some viruses overwrite entire programs with copies of themselves, which corrupts and destroys the host program altogether. Viruses can also attack data: they can disrupt access, corrupt, and/or destroy your data.

What’s a Computer Worm?

Worms are a self-replicating type of malware (and a type of virus) that enters networks by exploiting vulnerabilities, moving quickly from one computer to another. Because of this, worms can propagate themselves and spread very quickly – not only locally, but with the potential to disrupt systems worldwide.

Unlike a typical virus, worms don’t attach to a file or program. Instead, they slither into computers through a vulnerability in the network, self-replicating and spreading before you’re able to remove them. By then, they’ll already have consumed the network’s bandwidth, interrupting and arresting large network and web servers.

A Modern Computer Worm Story

In 2017, the WannaCry worm attack caused damage worth hundreds of millions to billions of dollars. Also known as WannaCry ransomware, this attack was a hybrid of ransomware and a worm – specifically, a cryptoworm.

Ransomware is a type of malware that holds a user’s data hostage: it encrypts data and asks the victim to pay a ransom, betting on the user’s willingness to pay to restore the user’s data. Ransomware infections often occur through phishing campaigns.

Instead, WannaCry took advantage of a vulnerability in Microsoft’s SMB Version 1 file sharing protocol, typically used by Windows machines to communicate with file systems over a network. Those who didn’t patch SMB Version 1 learned the hard way about the perils of forgetting to patch their systems.

WannaCry leveraged EternalBlue, a Windows SMB protocol exploit, to gain access, install a backdoor, and download software – infecting the systems.

In short, WannaCry self-propagated, self-replicated, and quickly traversed entire networks, causing worldwide damage.

How to Protect Yourself from Computer Viruses and Computer Worms

Here are some simple ways to protect yourself:

  • Install anti-virus software and a firewall
  • Track potential data exfiltration at the edge and attacks at the point of entry
  • Remember to regularly install security patches
  • Monitor and analyze file and user behavior
  • Leverage security analytics to spot suspicious behavior
  • Set up alerts to notify you automatically and immediately when an anomaly occurs (a minimal detection sketch follows this list)
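To make that last point concrete, here is a toy Python sketch that raises an alert when a single account opens an unusually large number of files within a one-minute window. The threshold, event format, and account name are made up; real monitoring tools baseline each user’s behavior rather than relying on a fixed cutoff.

    from collections import defaultdict, deque
    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=1)   # sliding window
    THRESHOLD = 1000                # file opens per user within the window

    def detect_bursts(events):
        """events: (timestamp, user, path) tuples in chronological order."""
        recent = defaultdict(deque)
        for ts, user, path in events:
            bucket = recent[user]
            bucket.append(ts)
            # Drop opens that have fallen out of the sliding window.
            while bucket and ts - bucket[0] > WINDOW:
                bucket.popleft()
            if len(bucket) >= THRESHOLD:
                yield f"ALERT: {user} opened {len(bucket)} files since {bucket[0]}"

    # Simulate a burst of opens by one (made-up) service account.
    now = datetime.now()
    burst = [(now + timedelta(milliseconds=10 * i), "svc-backup", f"/share/f{i}.docx")
             for i in range(1200)]
    print(next(detect_bursts(burst), "no anomaly detected"))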

How Varonis Helps

When a virus or worm evades your anti-virus detection software or endpoint protection to exfiltrate your organization’s data, Varonis can help.

Varonis DatAdvantage monitors and analyzes file and email activity – as well as user behavior. When there’s an unusual number of lockouts or a thousand files are opened in a minute, Varonis DatAlert can detect these anomalies, automate security responses, and enable teams to investigate security incidents directly in the web UI. Varonis Edge adds context with perimeter telemetry, detecting signs of attack at the perimeter via DNS, VPN, and web proxies.

Discover how Varonis can help defend against worms and viruses – see Varonis in action with a 1:1 demo today.

[Podcast] Cyber & Tech Attorney Camille Stewart: Discerning One’s Appetite for Risk

 



We continue our conversation with cyber and tech attorney Camille Stewart on discerning one’s appetite for risk. In other words, how much information are you willing to share online in exchange for something free?

It’s a loaded question, and Camille takes us through the questions one should ask before taking a fun quiz or survey online. As always, there are no easy answers or shortcuts to achieving privacy-savvy nirvana.

It’s also risky to connect laws made for the physical world directly to cyberspace. Camille warns: if we start making comparisons because, at face value, the connection appears to be similar but in reality isn’t, we may set ourselves up to truly stifle innovation.

Choosing Convenience over Privacy

Camille Stewart

Hi, I’m Camille Stewart. I’m a cyber and technology attorney. I am currently at Deloitte working on cyber risk and innovation issues, so identifying emerging technologies for the firm to work with. Prior to that, I was a senior policy advisor at the Department of Homeland Security working on cyber infrastructure and foreign policy in the Office of Policy. I was an appointee in the Obama Administration. And then prior to that I was in-house at a cybersecurity company. So I’ve worked in both the public sector and the private sector on cyber issues.

Cindy Ng

Thanks, Camille. Can you talk a little bit about privacy conceptually? Everybody wants privacy, it seems like a good thing, but why aren’t people picking privacy over convenience? Convenience, yes, it’s easy but what about privacy is not getting through to people?

Camille Stewart

I don’t think people are looking at the long-term ramifications, right? I know very recently we had the genetic testing case that helped lead to a killer, which is wonderful in that specific instance. But I doubt that anybody who sends in their genetic information, had it tested and figured out their heritage has thought about how that data might be used otherwise, has read the disclaimer that tells you how your data will be used whether it’s for research, whether it will be used by the police, whether it will be used to create new things.

And if anybody remembers Henrietta Lacks, her data was used to create all of these things that are very wonderful but she never got any compensation for it. Not knowing how your information is used takes away all of your control, right? And a world where your data is commoditized and it has a value, you should be in control of the value of your data. And whether it’s as simple as we’re giving away our right to choose how and when we disburse our information and/or privacy that leads us to security implications, those things are important.

For example, you don’t care that there’s information pooled and aggregated from a number of different places about you because you’ve posted it freely or because you traded it for a service that’s very convenient until the moment when you realize that because you took the quiz and let this information out or because you didn’t care that your address was posted on like a Spokeo site or something else, you didn’t realize that all of the questions to your banking security information are now all easily searched on the internet and probably being aggregated by some random organization. So somebody could easily take and say, “Oh, what’s your mother’s maiden name? Okay. And what city do you live in? Okay. And what high school did you go to? Okay.”

And those are three pieces of information that maybe you didn’t post in the same place but you posted and didn’t care because you traded it for something or you posted it and you didn’t think it through and now they can aggregate it because you use those two things for everything and now someone has access to your bank account, they’ve got access to your email, they’ve got access to all of these things that are really important to you and your privacy has now translated into your security.

Cindy Ng

I was just talking to my coworkers about this, that it doesn’t come naturally to know not to answer these questions, because you can be online somewhere, and let’s say you’re a part of a community you trust and you answer these innocuous questions, and then you won’t necessarily have the foresight to know that it’s gonna come back and hurt you. How did you come up with the reasoning behind, “Oh, I probably shouldn’t answer those questions?” Because you kinda have to be a little skillful and have a bit of foresight or some knowledge to even think in the way that you do.

Camille Stewart

No, you’re right, there is a level of savvy that has to happen for you to think that way and a level of, like you said, foresight or a level of reaction, right? Most people aren’t thinking that way because they knew it before it happened but now that the information’s out there, they’re taking action. And I think there are a lot of people who are neglecting that.

So, just like organizations have to assess their appetite for risk, we as individuals have to do the same. And so if you are willing to take the risk because you think either, “They won’t look for me,” or, “I’m willing to take the hits because my bank will reimburse me,” or whatever decision you are making, I want you to be informed.

I’m not telling you what your risk calculus is but I wanna encourage people to understand how information can be used, understand what they’re putting out there and make decisions accordingly. So your answer to that might be like “Look, I don’t wanna give up taking Facebook for this or sharing information in a community that I trust on some social site but what I will do is have a set of answers that I don’t share with anyone to those normal questions that they use for password reset that are wrong but only I know the fake answers that I’m using for them.”

So instead of your actual mother’s maiden name, you’re using something else and you’ve decided that that’s one of the ways that you will protect yourself because you really wanna still use these other tools and that might be the way you protect yourself. So I challenge people not to give up the things that they love, like I mean, I would assess whether or not certain things are worth the risk, right?

Like a quiz on Facebook that makes you provide data to an external third party that you’re not really sure of how they’re using it, not likely worth it. But the quizzes where you can just kinda take them, that might be worth it. I mean, the answers you provide for those questions still are revealing about you but maybe not in a way that’s super impactful. Maybe in a way that’s likely just for marketing and if you’re okay with that, then take that or you go resilient the other way.

Artificial Intelligence and Legal Protections

Cindy Ng

I wanna talk about an article that an attorney wrote, Tiffany Li, she wrote about how AI will someday eclipse the intelligence of the human and whether or not AI will have legal protections and then she juxtaposed it with the case with the monkey and how a monkey took a photographer’s camera and took a selfie and there was a lawsuit with how we can use the monkey’s lawsuit as precedent for future cases such as AI and recently, the monkey lost the lawsuit. Not the monkey but PETA. I just wanna hear from your perspective, as a lawyer, how to think about it moving forward.

Camille Stewart

I mean, it remains to be seen how things like AI will translate, especially in terms of creative spaces. It will be hard to determine ownership if a machine creates a work. And I mean, they’ll come down to a final decision. We’ll have to decide that things that are created by a machine and solely by a machine, right, like if there are human’s input we might make one decision versus if it’s solely created by a machine, we might say that that is in the public sphere and anybody can use it and is not as anything that has any kinda attributable protection.

Versus if there is human input, we would decide that that is something that they can then own the production of, right, because they contributed to the making of whatever the end product is. It’s hard to speculate but there will have to be a line drawn and it’s likely somewhere in there, right? The sense that there is enough human interjection, whether that is from the input from whatever creative process is happening by the machine or in the creation of the process or program or software that is being used and then spit out some creation on the end, there will have to be a law or I guess at least case law that kinda dictates where that line is drawn.

But those will be the things that’s fun, right? Tiffany, and other lawyers like myself, I think those are the things that we enjoy most about the space is that stuff is unclear. And as these things roll out you get to make connections with the monkey case and AI and with other things that have already happened and new processes, new tech, new innovations and try to help draw those lines.

Cindy Ng

Is there anything we need to look out for that we’re not aware of? Or certain connections that are sorta in the legal space that people in the tech space aren’t aware of?

Camille Stewart

So I was gonna say, I don’t actually think it is safe to on a broad scale without some level of assessment, connect laws made in accordance with the physical world to cyberspace, I think it’s dangerous, because usually they’re not one for one. It is the place where most people start because it’s the easiest proposition to compare something that we’ve seen before with something in cyber. But they don’t always compare or don’t always compare in the way that we would think that they would.

And so it’s dangerous to make those comparisons without some level of assessment. And so I would tell people to challenge those assessments when you hear them and try to poke holes in them, because bad facts make for bad law. And if we take the easy route and just start making comparisons because on their face they seem similar, we may set ourselves up to truly stifle innovation, which is exactly what we’re trying to prevent.

Cindy Ng

Can you provide us with an example of why it’s dangerous, because it feels like the natural thing to do?

Camille Stewart

No, you’re right, it does feel natural. I’m trying to think of something…I’m thinking more along the lines of likening something physical to something cyber. So let’s think about borders, right? So borders in a physical sense are very clear limitations of authority and operation. You can’t cross a physical border without being able to use a passport, a Visa, things like that and they can control physical entry and exit at a border, a different country can.

That is not the same as cyber-based. And to liken the two in the way that you use rules is not smart, right? It’s your first inclination to wanna try to stop data flow at the edge of a country, at the edge of some imaginary border, but it is not realistic because the internet by its very nature is global and interconnected and, you know, traverses the world freely and you can’t really stop things on that line, which is why things like GDPR are important for organizations across the world because as a company that has a global reach because you’re on the internet, you will be affected by how laws are created in different localities.

So that’s a very big example but it happens in very discreet ways too when it comes to technology, cyberspace, and physical laws. Or the physical space and laws that are operated in that way and so I would challenge people that when you hear people make a one for one connection very easily without some level of assessment to try to question that to make sure it really is the best way to adapt some things to the given situation.

The reason for example, Tiffany’s likening of AI to this monkey case, it’s an easy connection to make because in your head you think, “Well, the monkey is not human, they made a thing, and if they can’t own the thing then when you do that online and a machine makes a thing, they can’t own a thing.” But it very well may not be the same analysis that needs to be made in setting, right? The lines may become very different because none of us could create a monkey. So if I can’t create a monkey, then it’s harder to control the output of that monkey. But I could very well create a machine that could then create an output and shouldn’t I be the owner of that output if I created the machine that then created the output?

Cindy Ng

Mm-hmm.

Camille Stewart

But that was my point is that likening things that on their face being the same, the lines therein might be different or they just might be different altogether because cyberspace and the physical space are not a one for one.

[Podcast] Cyber & Tech Attorney Camille Stewart: The Tension Between Law and Tech

 



Many want the law to keep pace with technology, but what’s taking so long?

A simple search online will turn up a multitude of reasons why the law is slow to catch up with technology – lawyers are risk averse, the legal world is intentionally slow and also a late adopter of technology. Can this all be true? Or is it simply hearsay?

I wanted to hear from an expert who has experience in the private and public sector. That’s why I sought out the expertise of Camille Stewart, a cyber and technology attorney.

In part one of our interview, we talk about the tension between law and tech. And as it turns out, laws are built in the same way a lot of technologies are built: in the form of a framework. That way, it leaves room and flexibility so that technology can continue to evolve.

Frameworks Reign in Law and Tech

Camille Stewart

Hi, I am Camille Stewart. I’m a cyber and technology attorney. I’m currently at Deloitte working on cyber risk and innovation issues, so identifying emerging technologies for the firm to work with. Prior to that, I was a Senior Policy Advisor at the Department of Homeland Security working on cyber infrastructure, and foreign policy in the office of policy. I was an appointee of the Obama administration. And then prior to that, I was in-house at a cybersecurity company. I worked in both the public sector and the private sector on cyber issues.

Cindy Ng

Today, we’re gonna be talking about the tension between law and technology, where a law takes a lot of time and inquiry to create something that makes sense and hopefully is impactful for years to come, whereas technology, it’s really about ideation and creating and bringing product and service to market as quickly as possible.

Tech people, they want law to catch up with technology. Lawyers wished tech people would understand the law a little bit more. And some have even criticized that the law doesn’t move as quickly as technology, and you have a lot of experience both as a cybersecurity attorney in Washington and in the private sector.

And I’m wondering if there’s a deeper divide between the two entities, and I’m wondering if you can share your experience with us in working with lawmakers as well as your experience in the private sector.

Camille Stewart

Yeah, so, I mean, I think one misconception is you don’t want the law to keep pace with innovation. There’s no way for you to legislate for future occurrences and for the ideation and innovation we’ve talked about.

You want the law to leave room and flexibility so that technology can continue to evolve. And so that’s kind of what has to happen. It’s frustrating that there are no legal recourses when an issue comes up, but you almost have to test those boundaries to figure out a framework to fit your bill to address issues that are coming.

So even the laws that we do build tend to be framework because we need to leave room for that innovation and ideation. And part of the tension between technology communities and lawyers and technology communities and the general public or the government is trust. So technologists don’t trust the government with the information that they have, and the government wants to build that trust desperately so that we can leverage the resources that are at the disposal of both.

You know, the government has a lot of insight and intelligence that they can layer over the tools and capabilities in the private sector, and if they came together, it’s great, but there’s this base level of trust and understanding of what each is trying to do that if we could bridge that gap, so much more could be done.

Cindy Ng

Is there a think tank or a non-profit or some kind of institution that can bridge that gap that you’ve seen develop over the past few years?

Camille Stewart

Yeah, so there are a number that are working on this, whether it’s issue-specific, right, “So let’s talk about surveillance and bringing people together around that.” “Let’s talk about a given issue and discuss that.” Also the government is trying that.

Organizations like DHS that work with the private sector quite a bit are trying to build those bridges and find ways to share information in a way that’s valuable to both the private sector and the government through things like AIS, the Automated Indicator Sharing system. And it’s gonna be a slow process.

Those trusts are bolted tight.

Private sector has coalesced together to build trust circles with their peers and people that they know doing work that they understand, and they’re sharing information that way. And those mechanisms have become pretty robust and helpful, but the government has to be able to be a part of that for us to really complete the picture, and that’s the work that’s being done, some through non-profit organizations, NGOs, but also through the government and the private sector starting to get into a room.

And then, as people move back and forth across lines, right, traditionally people were govies for life, or they were in the private sector. Now there’s more movement back and forth, and that’ll help build the trust as well.

Bridging the Gap between Law and Tech

Cindy Ng

What would you say to lawyers who need to understand technology and technologists that need to understand the law?

Camille Stewart

I would say at a base level, do the work to understand the content. Lawyers need to take the time to understand the technology, to ask the questions, understand what the end goal is, and understand what the technologist is building and for what end user. And the nice thing is that a lawyer is likely the end user of many of the products that they’re seeking to understand, so they can easily understand that perspective. And then do the work to understand how we got there, how the technologists built that.

And then technologists, on the other hand, need to be willing to have those conversations and those explanations and understand that lawyering of the past, there was the perception that lawyers were just gonna say no. Right? They’re risk averse, they aren’t gonna let you ideate and innovate, they’re just gonna shut it down. And that’s not really true.

My job as a lawyer and the jobs of lawyers at companies today, especially if they deal with technology and cyber issues, is to lay out the risk, understand the organization’s risk calculus, and to put the information in front of leadership so that they can make an informed decision, and then help to build a path forward that calculates those risks, that mitigates those risks to the best of their ability, and be ready to support the company in what they’ve done.

So, with that base level of understanding and the willingness to do the work, lawyers can be great assets to technologists because they can be translators between different communities, as well as when the company builds out and understands what its risk posture is. It’s important to have all key stakeholders as part of that discussion, and lawyers are definitely part of that group.

Cindy Ng

So you talk about trust and doing your homework having a baseline knowledge of the other’s concepts and principles. What have you seen in your work that has worked that you’ve seen others reach over the aisle, and are you able to provide an example? And also, what doesn’t work?

Camille Stewart

I think the biggest catalyst for change is that things happen, right? So, a breach occurs, and you watch this organization scramble to figure out how to right itself after this big occurrence and realizing that the stakeholders that you were encouraged to have in the room initially were essential when this thing exploded.

And had you accounted for more perspective on the front end in a proactive way, it would have mitigated some of the risk on the back end or you would have been able to right yourself more quickly.

And so I think watching that occur has started a number of organizations and built a number of frameworks to help organizations get the right people in the room and encourage people to do the work to figure out where different players fall in the conversations that they’re having as an organization about how the security is evolving and how technology will be used and integrated in the organization. But I think that outside factors in this area of law and cyberspace evolving has done a lot of the work to encourage the collaboration that’s needed.


[Podcast] I’m Sean Campbell, Systems Engineer at Varonis, and This is How I Work

 



In April of 2013, after a short stint as a professional baseball player, Sean Campbell started working at Varonis as a Corporate Systems Engineer.

Currently a Systems Engineer for New York and New Jersey, he is responsible for uncovering and understanding the business requirements of both prospective and existing customers across a wide range of verticals. This involves many introductory presentations, proof-of-concept installations, integration expansion discussions, and even the technical development of Varonis channel partners. Sean also leads a team of subject matter experts (SMEs) for our innovative DatAlert platform.

According to his manager Ben Lui:

“Sean Campbell is one of the most talented engineers on my team. He is the regional DatAlert SME and bridged valuable feedback from both customers and the field back to product management. Sean is also an excellent team player and excels at identifying critical data exposure during customer engagements. Overall, Sean is a key contributor to the Varonis organization.”

The fast-paced environment, the challenge of data security, and the fact that the sales cycle is far from “cookie cutter” are what Sean enjoys most about his role here. He also values the relationships he has been able to build over the years on both the Varonis and customer sides.

Read on to learn more about Sean – this time, in his own words.

What would people never guess you do in your role?

I’ve done a lot of interesting behind the scenes work – from creating new hire training materials to assisting with customer data breach investigations.

How has Varonis helped you in your career development?

I didn’t begin my career in sales, so my perspective on security was pretty narrow. Varonis has broadened that tremendously. I’ve developed the skills needed to tailor a conversation to different audiences whether it be a CISO, Cloud Admin, or a room full of other Sales Professionals. My technical skills have evolved as well, from basic Windows knowledge to more complex troubleshooting skills of the different platforms we support here. Pays a little better than minor league baseball!

What advice do you have for prospective candidates?

Humanize people, no matter the job title or status. Empathetic conversations begin and sustain smoothly that way. Be clear, be concise, be quick to listen and slow to speak! Something I’m always practicing.

What do you like most about the company? 

There is a maniacal yet focused approach with everyone here. We have crazy high standards for ourselves, but a culture of togetherness. You get things done and grow! I’m always excited to see what we are innovating next!

What’s the biggest data security problem your customers/prospects are faced with?

The elusive “starting point,” or where to begin, is a huge up-front challenge. Everyone has data to protect, everyone typically has gaps, but, similar to the NFL, it’s all the same league, just different playbooks. A successful playbook might resemble this one.

What certificates do you have?

Does birth count? Kidding, I have a security certification exam coming up next year. Wish me luck!

What’s your all-time favorite movie or tv show?

Movie is The Sandlot and there are too many TV Shows I like to just pick one.

If you could choose any place in the world to live, where would it be and why?

Just give me warm weather year round with easy access for family and friends to visit.

My wife and I are good with that. I also wouldn’t mind a golf course within walking distance.

What is the first thing you would buy if you won the lottery?

If it’s the big one, get me Richard Branson’s island broker.

What is your fave time hack?

My Bulldog, Banks. He hacks a lot of my time.

What’s your favorite quote?

Inky Johnson said something at this year’s SKO that resonated well – “It’s very easy to be busy and accomplish nothing.” A good reminder to do things with purpose.

Interested in becoming Sean’s colleague? Check out our open positions, here!

Transcript

Sean Campbell: Hi. My name is Sean Campbell and I’m currently a Systems Engineer for Strategic Accounts at Varonis, and this is how I work.

Cindy Ng: Thanks, Sean, for joining us today. How long have you been with Varonis?

Sean Campbell: I’ve been with Varonis, going on five years.

Cindy Ng: And what was your background prior to working at Varonis?

Sean Campbell: Before Varonis, I was actually playing professional baseball, and I was a baseball player in college while majoring in Information Technology. And after college, I’d gotten hurt my senior year so I didn’t pursue professional career right away. But I did rehab while actually working in Jersey City as a Security Analyst, and then I left that job to pursue my career in professional baseball, signed the contract. And then after I was released from the team, I had a little gap where I was deciding whether or not to continue playing or sharpen my resume and start looking for opportunities in security. And in the midst of all that, I was actually a counselor at the Boys and Girls Club. So I had gotten in touch with Varonis while debating continuing to play, and I decided to move forward with Varonis and I haven’t looked back.

Cindy Ng: Sounds great. What did you learn about yourself after working at Varonis?

Sean Campbell: Well, being personable comes pretty natural. I’m also grateful for the technical development that I’ve experienced here and the growth I’ve had in that area. But one thing I’ve noticed and I’ve learned about myself working at Varonis, was that I didn’t think I’d enjoy the challenge of a sales cycle as much as I do. My first job as a security analyst, had no selling at all, I was exposed to security tools and working with the team as far as cyber security strategy was concerned. And coming to Varonis as a vendor, it was a little bit of a learning process of what the sales cycle entails. It’s full of highs, lows, and in-betweens, but the challenge brings me back. It brings me back to my days competing on the baseball diamond and I think I’ve grown as a professional because of it.

Cindy Ng: What do you think the biggest data security problems our prospects are faced with?

Sean Campbell: I think collectively speaking, understanding what data is sensitive, who’s responsible for it and what’s a non-disruptive way to protect it.

Cindy Ng: So, take us through a customer’s operational journey from start to finish that you think might be helpful for our listeners to understand the important work you do.

Sean Campbell: So, I’ve done hundreds of demos and installations and worked alongside of our clients from a consultative perspective, but one that stands out in my mind, it was a media services agency. And some of the problems that we noticed, they had shared with us that they’d recently secured a large client from a competitor. It was sort of a big deal, really highlighting their growth as an organization. And while working through the terms of that contract, they decided in parallel to heighten their data security controls which actually included a Varonis data risk assessment. And within 30 days of that installation, we found over 200,000 social security numbers tied to client accounts, current and former employees of the firm. And this was amongst over 900,000 folders open to global access groups like everyone in domain users sitting on file servers.

Cindy Ng: Why were these problems a problem?

Sean Campbell: I mean, this is one of the things that as an engineer I really take pride in understanding. I mean, working with them to answer that question eases back to me.

So for this particular company, they manage advertising budgets and strategy for a who’s who list of some very high-profile clients. And this includes current and unreleased product marketing materials, other product strategy, very sensitive contract and financial information, among other forms of confidential data. And they are actually regularly audited by their clients to ensure that this information is handled properly. A blow to the trust of their clients because of a data breach or failed audits would mean lost business revenue.

Cindy Ng: And did they know they had problems?

Sean Campbell: They admitted to always knowing they had poor visibility into unstructured data. It was a gap they were aware of. And they knew that was our strength. So walking them through the process of identifying real areas of risk were eye openers. It helped us build credibility as a trusted advisor. They were receptive to our operational plan which actually stood out against other vendors they considered up to that point. And this project got visibility up to the CTO, so the overall response, it was something they needed to rectify moving forward.

Cindy Ng: And what were their use cases?

Sean Campbell: So a lot of the use cases really aligned to what we do, and it almost took the words out of my mouth in some instances. So some of the early things that I had jotted down and confirmed with my champion, they needed to better understand who had access to client project and HR data. And they had no efficient way to map all permissions across their file shares. They also wanted to monitor administrative changes, they wanted to be able to track both authorized and potentially malicious changes. So that meant auditing all file touches and having a way to detect or alert on something worth noting. They also mentioned that they were aware that they had sensitive data on their file servers. They knew that this data likely contained PII or other sensitive project information so they needed to understand where it live. So identifying where this data was, was another use case.

And then lastly, they were also prepared to begin the process of implementing a retention policy. So archiving data that they no longer needed to actually store on their file servers, and eliminating risk by just chopping off low hanging fruit in the form of unused or stale data archiving.

Cindy Ng: There’s a lot of competition out there. How did you convince them and how did they know that our methodology is the right one?

Sean Campbell: As I got to know my champion who had been a manager in IT at the time, he knew that their department was experiencing some growing pains. So as they kept hitting new revenue marks year over year, they weren’t really able to control from a data perspective how the data was growing, the rapid pace that…as it was being created, as the business was demanding new technologies to be rolled out, keeping up with the ability to secure those very technologies once they’re live and in production. So they had really no repeatable process, and especially as it pertains to data governance, it was a huge challenge. So they’d been relying on third-parties to do things like pen testing, network upgrades, just to keep the lights on, to keep productivity as efficient as they could.

Unstructured data, so information that’s living on file server, email, SharePoint, for example, this was an afterthought and they didn’t really have the manpower to attack it. So our methodology really handled a lot of the heavy lifting for them in a single pane of glass. So the solutions they invested in initially provided them a non-intrusive or destructive way of securing their most out-of-control asset which was the data living on file servers.

Cindy Ng: Ah, so you essentially told them where their data was in order to have proper data governance, you need to know where it’s at.

Sean Campbell: You got that right.

Cindy Ng: So which products did they buy?

Sean Campbell: I think this was truly a platform sale. They purchased DatAdvantage for Windows, Directory Services, our Data Classification Engine, our Data Transport Engine, DatAlert Suite and Automation Engine. And this was a sweet spot for our customer base as well, a thousand plus users.

Cindy Ng: And are you able to describe what fixing their problems looked like?

Sean Campbell: So I can tell you that each one of those solutions worked in tandem to correct the very complex problem. With that visibility into where the sensitive data is stored and where it’s at risk, they actually were able to now leverage our professional services to put a plan in place to fix it. So I really teed up for our services team to work with them to do things like locking down open shares, setting up reports, identify stale data for archival, and alerting on inadvertent or suspicious anomalies with products like DatAlert Suite.

Cindy Ng: And so are you able to say that you helped them achieve their data security goals? How did they know that they were progressing in a way that is helpful for them?

Sean Campbell: That’s a great question, and if promotion to Senior Vice President is any indication of my champion’s ability to present the business with a solution for securing their data, I think we gave them a lot of confidence with the solutions that they invested in. Because I didn’t have as much visibility into the account within the recent, I’d say past month or so because of the territory shift, I can’t give you actual metrics, but I’m confident that they cleaned up the open shares that we helped identify and they were able to determine where they have sensitive data, who has access to it, and who shouldn’t have access to it. I can also say that it was a very straightforward process to building out that game plan and making it a repeatable process and getting them to a more manageable state. And this began with simple reporting, showing off the ease of automation through solutions like Data Transport and Automation Engine, that Data Classification Engine, the out-the-box rules gave them less work to do, quickly identifying at-risk sensitive information. We were even able to highlight things like what’s your ransomware response readiness.

So, we even simulated ransomware to show the ability of our DatAlert Suite and how it can detect alert and arrest that form of malware. That really drove the overall aspect of the sale in my opinion. Single pane of glass, the automation, giving them less work in the midst of everything else that they have going on.

Cindy Ng: Ah, so they weren’t even really looking for ransomware prevention, but as sort of a byproduct of getting all the different products, you were able to help them with prevention and you’ve been able to help them find other ways to help their infrastructure be even more secure.

Sean Campbell: You got that right. I would say as that trusted advisor that I strive to be for a lot of my clients, preaching detect, prevent and sustain has been something that some of my customers have even adopted for their own teams, and that methodology that we helped walk them through as they’re, I would say, rounding up that uncontrolled, unstructured data environment, it really puts that process in place. So when we do things like detection, it makes it easier now to set up things like prevention. And in long-term, you’re just sustaining that. And helping them along the way makes that process, again, very manageable and it gets the state of their environment to a manageable place as well.

Cindy Ng: So after you help our customers go through the operational journey, how do you feel from a professional and personal perspective?

Sean Campbell: That actually brings me back to the first or second question you asked me in terms of what did I learn about myself. That’s one of the things that continue to drive me. I use those experiences in the next engagement that I encounter, and it’s able to really help me find my own repeatable process that I know works. Now that’s not to say it’s easy every single time, but when we get to that state where the outcome is evident that our operational plan, that our methodology works, really gratifying in that sense and it builds that confidence as an engineer that Varonis is truly the only solution out there that can help organizations really manage and protect their own structured data.

And as we start to get into areas of enrichment, I’m very confident that we can add even more value gathering things like perimeter insights and things like geolocation for example, which really it’s really gonna arm our clients to take a more secure approach to how they protect their data at the end of the day and keep that bottom line in the black.

Cindy Ng: You mentioned you played baseball prior to working at Varonis. Do you still get a chance to play for fun?

Sean Campbell: Oh, in my head. I haven’t had the time in a while, I should say that should get back out there, but I’m always looking for opportunities. There’re a few leagues that I’ve had my eyes on, I’ve been in touch, but I do keep tabs on some friends of mine that are still playing professionally and I always keep in touch with some of my teammates and just talking baseball kinda keeps me close to it.

Cindy Ng: Yeah, you even…you even used tee up earlier when you were describing your work, that baseball is still in your root. Do you have a favorite book?

Sean Campbell: There’s another interesting segue back to baseball. So my favorite was written by a guy by the name of Harvey Dorfman, it’s called “The Mental ABCs of Pitching.” It was a performance guide I’d read in college and I actually read it again in professional baseball. It was supposed to help propel me into a long career in the Major League, but ironically, a lot of the principles Harvey described, I still use in my career today. Slightly different paths but the same rules apply.

Cindy Ng: What were some of the principles that really helped guide you in your work?

Sean Campbell: So Harvey breaks the book down into different chapters, and the chapters simply have a keyword and he’ll describe that aspect of that keyword as it relates to pitching, so I was a pitcher. So for example, he’s got a chapter called Discipline, he’s got a chapter called Approach, Confidence, Self-esteem, Control. These are all things where he’ll break down the mental aspect of pitching and how it relates to that keyword for that particular chapter, as a matter of fact.

So I’ll give you example from the chapter on discipline, and he starts by just defining what discipline is and its training that’s expected to produce a specified character or pattern of behavior, especially that which is expected to produce moral or mental improvement. The interesting thing about pitching, it’s a very mental aspect of baseball. You’ve gotta have mental toughness, you gotta be precise, you have to trust what you’re doing, and how that relates back into being an engineer at Varonis, a lotta times, you know, your prospects or your customers, they can see right through simply being sold, you know, especially if you’re not confident with what you’re saying, if there is a low trust into what you’re trying to sell them on, especially if it’s just buy, buy, buy the product, figure everything else out later.

As an engineer, as I prepare, a lot of times I take the same approach that I used to have with baseball in my career as a sales engineer. That means learning up on the latest technologies, learning up on the latest threats and understanding how those threats augment the value we bring to the table and knowing when I walk into a room, I can identify a lot of the problem that they may be challenged by, listening to what they’re saying, taking back that information and formulating a plan. And that same process used to be how I would develop as a pitcher. I’d understand the team that we’re about to face, the strengths and weaknesses of those hitters, and how I would attack those hitters during that game that I was scheduled to pitch. So I knew if this particular person couldn’t hit a curveball, guess what, he was gonna see a lot of curveballs.

If an organization has poor visibility into who’s touching their file share data, well, guess what, we’re gonna augment where that file share data is, we’re gonna turn on auditing and we’re gonna break down for them how can we easily report on and alert on any anomalies with this information.

Or if we’re struggling to meet a regulation, you know, I’m gonna understand what that regulation is and put a plan of preparation in place to say, here’s how we can help you better meet that regulation, for example. So the reminders in this book a lot of times just keep me on that path of how to properly approach being a sales engineer. It’s been very interesting. I didn’t think it could relate in that way, but again, that goes back to the fact that I never expected to enjoy the sales cycle the way I do. Because it’s not simply straight B2B sales: “Hey, here’s a product, here’s the SKU number, here’s how much it costs. That’ll do it.”

There’s a process to it: understanding what the use cases are, like you asked me, understanding, you know, why these problems are problems, right? And then really walking them through our operational plan, our methodology, to show how we’re best positioned as a solution.

 

[Podcast] I’m Brian Vecci, Technical Evangelist at Varonis, and This is How I Work

[Podcast] I’m Brian Vecci, Technical Evangelist at Varonis, and This is How I Work

 

Leave a review for our podcast & we'll send you a pack of infosec cards.


If you’ve ever seen Technical Evangelist Brian Vecci present, his passion for Varonis is palpable. He makes presenting look effortless and easy, but as we all know, excellence requires a complete devotion to the craft. I recently spoke to him to gain insight into his work and to shed light on his process as a presenter.

“When I first started presenting for Varonis, I’d have the presentation open on one half of the screen and Evernote open on the other half and actually write out every word I was going to say for each slide,” said Brian.

From there, he improvises from the script.

“I’d often change things up while presenting based on people’s reactions or questions, but the process of actually writing everything out first made responding and reacting and changing the presentation a lot easier. I still do that, especially for new presentations.”

According to Varonis CMO David Gibson:

Brian’s high energy, curiosity, and multi-faceted skills – technical aptitude, communication skills, sales acumen, and organizational capabilities – make him an exceptional evangelist.

Read on to learn more about Brian – this time, in his own words.

What would people never guess you do in your role?

I’m really lucky that my role at Varonis lets me engage with people all over the company, including Marketing, Sales, Support, Engineering, and Product Management, so I’m not sure that there’s anything anyone would never guess about what I do.

When it comes to the more public aspects of what I do, like press, Connect events, and customer meetings, I spend more time drilling and practicing what I’m going to say so that when I’m on stage or in front of a camera, I can improvise off a script rather than trying to remember what I’m supposed to be talking about.

What did you learn about yourself after working at Varonis?

That I need to spend more time listening and less time talking. One of my first trips I made at Varonis was going to a few customer meetings in California and before I left David Gibson reminded me to “make the meeting about them,” meaning the people I was meeting with. It’s still something I’m working to get better at and have to consistently remind myself of.

How has Varonis helped you in your career development?

It would be hard to come up with ways that Varonis hasn’t helped me in my career.

I’ve become way more confident in front of audiences. I’ve gotten better at confidently talking about things I know well and I’ve gotten more comfortable with saying, “I don’t know.”

I was always in technical roles before coming to Varonis and sometimes it’s hard to admit that you don’t know something when it’s your job to.

What advice do you have for prospective candidates?

Varonis more than anywhere else I’ve ever worked rewards energy, enthusiasm, and hard work.

We’re much bigger than we were when I joined back in 2010, but there’s still so many things that we’re learning how to do well as a company.

The people who succeed here are the ones that do, fail, and get better.

What do you like most about the company? 

I admire how much of our leadership has been here for so long, and I think that’s reflective of everyone having the same goal.

It’s been rare in my career before coming to Varonis to feel like a part of an organization on a mission. That’s never been an issue here.

I know what it’s like to work somewhere where the leaders have no vision, let alone the ability to execute on it.

What’s the biggest data security problem your prospects are faced with?

When I first got here we were spending a lot of time just teaching our prospects that security on file systems was possible!

Making sure the right people had access to what they were supposed to was an impossible problem to solve for so many people for so long that we had to spend a lot of time just educating people that we understood the root of their problems and could actually fix them.

These days everyone seems to know it’s a problem and the biggest challenge our prospects face is knowing how to get there.

“I get what you (Varonis) do, but tell me how we can actually get there” is something I hear a lot. That’s probably because I spend a lot of time talking about our Operational Journey these days.

What certificates do you have?

I’ve got a CISSP, which is the only certification I ever put a lot of work into.

Fave book?

I love to read and have a bunch. I read The Count of Monte Cristo every few years, so that’s up there. Dune is another one that I try and read every now and then. Gateway by Frederik Pohl as well. The book that helped me most with my job is Working with Emotional Intelligence by Daniel Goleman.

What is your fave time hack?

Adding my flights and hotels to my wife’s Gmail calendar because what do you mean you didn’t know I was going to be in London this week?

What’s your favorite quote?

Decisions are made by those who show up. I’m not sure who to attribute it to, but the first person I remember saying it to me was my father.

Interested in becoming Brian’s colleague? Check out our open positions, here!

Brian Vecci

Hi, my name is Brian Vecci and I’m currently a technical evangelist at Varonis, and this is how I work.

Cindy Ng

Thanks, Brian, for joining us today. How long have you been with Varonis?

Brian Vecci

That’s an interesting question. I’ve been with Varonis since March of 2010. But as some or many people may know, I actually left for about 10 months before coming back. I’m in my second term at Varonis, and I’ve been here now for…in my second stint for about two and a half years. But when I introduce myself I say I’ve been here since 2010.

Cindy Ng

What was your background prior to joining Varonis?

Brian Vecci

I went to college and studied computer science and music. And I came out of college and immediately went to work as a web developer. So I was an engineer, and I spent time doing web and applications development. And I discovered that I’m generally better at talking about the kinds of things that I was doing and helping other people understand the technology that I was building than actually building the technology, which probably won’t surprise anybody that knows me.

So I was an engineer, an applications developer then I moved into project management. I was a project manager for a while, a systems architect. And right before I came to Varonis, I was in desktop architecture for an investment bank. And before that I had done project management at a law firm and I’d been in a publishing company. So I’d kind of been in IT and IT applications and a few different roles and hopped around a few different industries before coming to Varonis.

Cindy Ng

And how did you know that Varonis was a good fit for you?

Brian Vecci

I knew immediately that Varonis was a good fit for me because I needed a job and they offered me a job. So the fact that I got a job offer was the first big clue. But really, I connected with an old manager of mine at a law firm, Chadbourne & Parke, one of the best managers that I’d ever had up until that point. He knew I was looking for a job and introduced me to a friend of his at another law firm, who had a friend who worked for this tiny startup company called Varonis that was looking for someone to do what they were calling technical marketing, which is something that I’d never done before.

And so I interviewed with this guy, his name is David Gibson, and he was a former SE and was looking for someone technical, and I met him and we got along great. And then a couple of days later I met a guy named Mark Wilcox and we got along really well, and a couple of days later I sat in a windowless conference room in New York City with a couple of guys named Ken Spinner and Jim O’Boyle and a few others. About 30 minutes into that meeting I met a guy named Yaki Faitelson, and every single person that I met along the way was passionate and enthusiastic and super intelligent and seemed to work really hard and believed really strongly in what they were doing, and I had no idea what we were doing at that point. I didn’t really know what Varonis did. I had some kind of inkling.

So it was less the company itself and more the people that I was about to start working with that made me pretty confident that this was gonna be a good fit and it turned out to be right.

Cindy Ng

And what did you learn about yourself after working at Varonis?

Brian Vecci

That I need to spend way less time talking and way more time listening. It’s one of the first lessons that David tried to impart to me. I remember, before one of my first trips out to do some customer meetings, he said to me, “You know, Brian, you’ve got to always remember to make the meeting about them, not about you.” And anybody who knows me well will hear me say that out loud and laugh at me, because they realize that’s still something that I struggle with sometimes.

But learning how to shut up and listen, have a little bit of empathy and think about the people that you’re talking to and what they care about was one of the hardest lessons for me to learn because it’s something that I’m not naturally good at but it’s something that stuck with me for eight years and something that I continue to work on. I think about it as something that I’m hopefully a little bit better at than I used to be and that I continue to improve on. And every time I’m mindful and focused on listening to others I find that I get better at what I do and feel better about what I do.

Cindy Ng

And when you go to a meeting, when you talk to them, what is the biggest data security problem your prospects are faced with?

Brian Vecci

Well, I spend a lot of time in meetings these days talking about our operational journey. And for the prospects that I’m talking to, the biggest data security problem they face is that they know they have a big problem. They know they have a ton of data.

They may know that some of it is sensitive, they may not, they may have some ideas of where it is, they may have some sense of the scale of the problem that they’re facing trying to help the right people have access to the right data but the biggest problem they face is, “All right. We know we have these huge problems, we get it. How do we get there? How do we go from the state where everything is chaos to this vision that you’re talking about where only the right people have access to just what they’re supposed to and everything’s monitored. When something goes wrong we know about it?”

So the biggest problem these days is just how to get there. It’s less about a specific technical problem and more about, “I don’t know what I need to do first, second, third or fourth,” which is really different from even when you and I started here. Like seven, eight years ago, the biggest problem that we faced was that our prospects had no idea that they had these problems. We spent so much time just educating people that, first of all, unstructured data or data on file systems is important and it was exposed, and they had no idea how big of a problem they had, let alone what they needed to do to fix it. That’s changed. These days most people know that they have a big problem, they just don’t know how to get there.

So what I’m finding is when I am talking to a prospect it’s because they wanna learn about, you know, what our operational journey looks like. Those are words that we use, but what it really means is, “I know I have big problems. I have a sense that you can help me. How can we actually get to the state that you’re talking about?” If that makes some sense.

Cindy Ng

Yeah. Take us through an operational journey from start to finish that you think might be helpful for our listeners to understand the important work you do. Let’s start with verticals. Do verticals matter? Does this journey apply to every company?

Brian Vecci

I think the journey applies to every company because every company has data, but that doesn’t mean that verticals don’t matter. Verticals do matter because the way a bank thinks about its data – it’s so highly regulated, and it knows it’s got, for instance, customer information that could result in big fines if it was exposed or leaked improperly – the kinds of things highly regulated industries think about when it comes to their data are a little bit different than, for instance, a media company or somebody who’s not as regulated.

Everybody’s got the same problems but the vertical can really dictate sometimes how a prospect thinks about or even talks about their data. That said, the operational journey, it’s pretty much the same. We don’t have to change what our journey looks like depending on the vertical. Everybody gets a lot of data, and if they’ve never worked with Varonis before I’m pretty sure they don’t really have a handle on what kind of data they have, meaning what sensitive and what’s not. They really don’t have a handle on where it all is.

They’re probably not monitoring how it’s used. There’s a sound bite that I use often: you can’t catch what you can’t see, and you can’t manage what you don’t monitor. It sounds trite, but it’s absolutely true. It’s really difficult to make decisions about something when you know nothing about it, and so many companies know nothing about their data.

So the journey starts with, and this is gonna sound kind of sales-y because we spend a lot of time building content for Salesforce to learn, but turning on the light, just helping somebody understand, “Listen, here’s where your data is. Here’s who got access to it. Here’s what’s sensitive, here’s where it’s exposed, and look, here’s how it’s being used.” And when you do that, when you just start with that you’re often so much further ahead than you were before.

The journey then kind of moves on to not only understanding what you’ve got but fixing the biggest problems. When you turn on the lights you can start to prioritize and understand where you’re exposed and where you’re at risk.

One of the things that I talk a lot about, many of the presentations that I give is that risk is a pretty simple equation. It’s how valuable is something and how likely is it that something’s gonna go wrong with that asset or that data? So how valuable is our data? What’s the likelihood that it’s gonna get lost or stolen or misused? And our operational…a big part of our operational journey is helping our prospects to quantify that.

How many folders do you have that have sensitive data that are exposed to many people, that are exposed to global access groups? That’s easy for us to put numbers behind, very hard for someone to do without Varonis. But once you understand where you’re exposed, we call it prevent. We detect and then prevent, but preventing disaster means reducing exposure, making sure only the right people have access to what they’re supposed to, locking down sensitive data, getting rid of global access, and starting to figure out who this data belongs to so that you can get them involved in making decisions.

Finally, the last step of the journey is to automate things like entitlement reviews. Why should somebody at the helpdesk or somebody in security or somebody in IT be making regular decisions about who should and shouldn’t have access? It’s the data owners, the people who understand and have real context, who should be making those decisions.

So automating entitlement reviews, automating authorization workflows, automating quarantining and retention and disposition, these are all kind of technical ways of saying, “Once you understand your data and you lock it down, you can start to treat it like you would anything else that’s valuable,” and Varonis can help you do that in an automated way so that you’re not going through endless projects for annual clean-ups and things like that, which is what we see our prospects either are doing or have done in the past in trying to solve some of these problems.
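
To make the “risk equation” Brian describes concrete, here is a rough Python sketch that ranks folders by a toy score of value times likelihood. It is purely illustrative: the folder fields, weights and paths are invented for the example, and this is not how Varonis actually calculates risk.

    from dataclasses import dataclass

    @dataclass
    class Folder:
        path: str
        sensitive_files: int    # hits from data classification (hypothetical input)
        open_to_everyone: bool  # exposed via a global access group?

    def risk_score(folder: Folder) -> float:
        # Value is approximated by the count of sensitive files;
        # likelihood by whether the folder is open to everyone.
        value = folder.sensitive_files
        likelihood = 1.0 if folder.open_to_everyone else 0.1
        return value * likelihood

    folders = [
        Folder(r"\\fs01\finance\payroll", sensitive_files=120, open_to_everyone=True),
        Folder(r"\\fs01\marketing\assets", sensitive_files=2, open_to_everyone=True),
        Folder(r"\\fs01\legal\contracts", sensitive_files=80, open_to_everyone=False),
    ]

    # Highest-risk folders first: the ones to remediate first.
    for folder in sorted(folders, key=risk_score, reverse=True):
        print(f"{risk_score(folder):8.1f}  {folder.path}")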

Cindy Ng

So how can you turn on the lights for our customers? How do they acknowledge their problem? Do they know that they have problems? How do they respond?

Brian Vecci

Customers – or prospects, I should say – who we do risk assessments for are often completely shocked by what we find. I hear a lot of stories of sales teams being kicked out of the room when somebody says, “You know what? We had no idea that this much sensitive data was this exposed. You can’t see this, like we could all get in a lot of trouble, you have to leave the room.” So sometimes it’s really surprising.

Other times, and this is becoming more common these days, a prospect will know that they have a big problem but didn’t realize the extent of it, or they’ve never seen it presented in such a comprehensive way. Our risk assessments are so valuable, and it’s one of the reasons we talk about our evaluations or our proofs of concept as risk assessments these days, because that’s really what they are.

We can go in and give somebody a pretty clear picture of what their environment looks like without a whole lot of work. We can tell them concretely, “Here’s how much data you have, here’s how much of it is sensitive and here’s how much of it is open. Here’s literally how much risk you’re facing right now and here’s how you can kind of fix all these problems.”

So, to answer it, I think your question is, “Do they know it’s a problem?” Sometimes they do, sometimes they don’t. Oftentimes they have no idea of the real scale of the problem or even if they do know they have a big problem it’s still eye-opening for us to do a risk assessment and show them really specifically exactly where the problems are and how they can actually fix them.

Cindy Ng

So after they kick you out, and hopefully they bring you back in, and you try to convince them that our methodology is the right one to follow – when there are so many solutions to a problem, how do you convince them? Why is the Varonis way the right way?

Brian Vecci

I’m going to disagree with you that there are so many solutions to this problem, because for this particular problem, especially when we’re talking about data stores like file systems that are pretty chaotic, there aren’t a lot of solutions.

We’re very fortunate in that Varonis has technology that’s unique. Nobody else does what we do the way that we do it. And I can speak from personal experience: having spent some time at one of our competitors, nobody else does what we do the way that we do it. So we can come in and present not just, “Hey, look, we showed you you have a big problem,” but “we showed you you have a big problem, and we have the technology to help you solve it, and we have the track record and experience to show you that we’re good at actually doing this.” Our methodology, it’s not pie in the sky, it’s not in theory. We’ve got more than 6,250 customers as of our last earnings call.

That’s a lot of customers who have used Varonis to actually solve some of these problems. So our methodology is based on experience, and that carries a lot of weight. It’s not that there are lots of ways to solve this problem – really, in our experience, there are very, very few ways to solve this problem, and we’re fortunate enough that if you wanna solve it, you need not only a methodology to do it, you need an approach, and you need technology to enable that approach to actually work.

And I speak honestly: in my experience, Varonis is the only way to do it, and it’s a lot of fun to work for a place where you can not only identify a big problem but help people solve it, and you’re the only ones that can do it. We’re in a really unique situation.

Cindy Ng

What do they initially buy when they decide that Varonis is the only way?

Brian Vecci

Everybody has Windows data or CIFS data, whether it’s on NAS or on Windows file servers. So most commonly it’s DatAdvantage for Windows, because that’s what gives you the ability to not only monitor everything but map all of the identities and all of the permissions. That’s pretty critical to turning on the lights. Another big part of turning on the lights is understanding where sensitive data is, so data classification. And our Data Classification Engine is kind of a no-brainer. So that’s a pretty common piece of that initial package.

And then the great thing about DatAlert and the DatAlert Suite is that it becomes more powerful the more ingredients – what we call behavior streams or metadata streams – you give it. The more information DatAlert has to analyze and alert you on, the more valuable it is. So with DatAdvantage for Windows you’re mapping permissions, you’re monitoring Windows data and access activity for the users on that data. Data classification gives you some context on what’s sensitive and what’s not, which is really important.

And Directory Services allows you to monitor Active Directory too – everybody has Active Directory. So those, I think, are the most common, but I wanna be careful about saying what, you know, our most common package is.

Cindy Ng

And then how do you quantify the improvement so that customers know that you’ve helped them and they wanna continue the journey with you?

Brian Vecci

It’s a really excellent question. A big part of our risk assessment is to quantify what their risk is, what their risk profile is. And we quantify that by: how much data do you have? How much of that is sensitive, and how much of that is open? And if you just track those things – “All right. How many folders do I have? How many of those folders are open to everybody or, you know, open to lots of people? How many of those folders that are open also contain sensitive information?”

If you take that number and start tracking it over time, you see the number of folders that are sensitive and open going down, you see the number of folders that are stale going down because you’re deleting or archiving them, and you see the number of things like users who are enabled but not active, users that have passwords set not to expire, file system artifacts like orphaned SIDs or individuals on access control lists, or the number of issues that we find in Active Directory. There are lots of really specific metrics that only we can measure – and I say only we because we’re the only ones that have the ability to scan every single folder and subfolder and every single SharePoint site and sub-site, and we monitor every single data touch. We’re the only ones that can really do that, especially at scale.

We can start to put really specific metrics behind, “All right. Here’s what you’ve got. Here’s where you’re at risk, and here’s how you can measure the improvement over time.” And that’s what we show our prospect in a risk assessment, and hopefully, that’s what we’re tracking as they go through our operational journey.
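
As a purely illustrative sketch of what tracking those metrics over time might look like, here is a small Python example that compares periodic snapshots against a baseline. The metric names and numbers are invented for the example; a real deployment would pull them from reporting rather than hard-coded dictionaries.

    # Hypothetical quarterly snapshots of the exposure metrics described above.
    snapshots = {
        "2018-01": {"folders_sensitive_and_open": 14200, "stale_folders": 52000,
                    "enabled_inactive_users": 310, "passwords_never_expire": 95},
        "2018-04": {"folders_sensitive_and_open": 9100, "stale_folders": 38000,
                    "enabled_inactive_users": 140, "passwords_never_expire": 40},
        "2018-07": {"folders_sensitive_and_open": 2300, "stale_folders": 21000,
                    "enabled_inactive_users": 25, "passwords_never_expire": 5},
    }

    def report(snapshots: dict) -> None:
        # Print each metric and its change relative to the first snapshot,
        # so improvement (or regression) is visible over time.
        periods = sorted(snapshots)
        baseline = snapshots[periods[0]]
        for period in periods:
            print(period)
            for metric, value in snapshots[period].items():
                delta = value - baseline[metric]
                print(f"  {metric:30} {value:8d}  ({delta:+d} vs. baseline)")

    report(snapshots)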

Cindy Ng

And describe what utopia would look like in a company’s file system?

Brian Vecci

I would say, here’s what utopia looks like, and this is part of a lot of the presentations that I give these days – like, what is the Varonis vision for how you can think about your data? And it’s pretty straightforward. You know where all your sensitive data is, you can make sure that only the right people have access to it – really, users only have access to what they’re supposed to – and everything is monitored. Every time someone touches data, it’s monitored and recorded.

So just like how a bank has a pretty good idea when your credit card is being misused because they know a lot about you, right? They know who you are, they know where you live, they know what you shop for, they know in the amounts that you shop for and where you shop, and really, really critically, they watch every dollar that goes in and out of your account because that’s their business.

Well, you can start to treat data that way. If you know everything about your users and what they have access to and where sensitive data is, and, really critically, you watch every time someone opens, creates, moves, modifies and deletes data, you can start to treat your data like a bank treats your credit card, and that means you know when something goes wrong.

So not only do you know where your sensitive data is and you can make sure the right people have access to it, but you also watch everything that every user and every service account does. So you know what’s normal and then you know what’s abnormal, and if something goes wrong you can respond to it intelligently and really, really quickly. And then you can automate things like retention and disposition.

And what that means is, when you don’t need data anymore you can delete it, archive it, move it somewhere else. If somebody put something sensitive where it’s not supposed to be, you’ve got automation in place to quarantine it. Somebody drops a sensitive file in an open share, it automatically gets moved somewhere else, that’s locked down and properly protected.

You know who data belongs to and you’ve got those owners involved. So when someone needs access to data it’s your data owners that are saying yes or no, and that whole process is recorded. The data owners are reviewing access on a regular basis. They’re doing access recertification, we call them entitlement reviews.

So once a quarter your owners are looking at who has access to the data and making decisions about who should and shouldn’t have it. And then from a compliance standpoint, not only do you know what’s happening to your data and what’s sensitive, and you can make sure that it’s locked down, but when someone needs access to it you’ve got a record of who asked for it, who approved it, when they approved it, and why they approved it, because you’ve got DatAdvantage monitoring every single thing that they did while they had that data.

The vision is just to start treating data like a smart company treats anything else that’s valuable. And the biggest journey that we’ve been on as a company in the eight years since I’ve been here is helping the rest of the world understand just how valuable this data is, and that it’s possible to put the same kinds of controls and protections and processes around file systems as they do around anything else that’s really valuable in the company.

Cindy Ng

What other byproducts have you been able to help our customers find as they were looking to achieve this least privilege model? Were they able to find other solutions that they didn’t initially realize Varonis could help them with?

Brian Vecci

As for the kinds of things that companies tend to discover and the kinds of use cases that get opened up: once you start treating data this way, you can start connecting things like your SIEM to your file systems, which is really, really difficult to do unless you’ve got Varonis – by sending alerts from DatAlert off to the SIEM, for instance – or connecting identity management to your file systems.

Cindy Ng

Outside of work when you’re not presenting or traveling to another meeting, what do you like to do?

Brian Vecci

I like to read a lot and I spend a lot of time on planes so I spend a lot of time reading. I play the guitar and I’m pretty confident that’s one of the reasons that David Gibson hired me, was that I was a guitar player. I have a little home studio in my basement. I recently moved from Brooklyn out to New Jersey. And I’ve been joking with a lot of people that I bought a farm. I didn’t actually buy a farm although I looked at it, but I’m just spending a lot of time learning what it’s like to own and run a house.

Having a house and having kind of a big piece of property is something that’s new to me. So over the last year, really, the last six, eight months since I’ve done that, I’ve been learning a lot about what it means to be a homeowner, which is exciting and fun and may sound kind of pedestrian and not as exciting as some of the other stuff that I get to do, but for me, it’s been really, really interesting.

Cindy Ng

Well, thank you so much, Brian. And we wish you the best.

Brian Vecci

Thank you. It’s been great talking to you. And, Cindy, it’s been great working with you for the past eight years. And when did you join Varonis? You were the first person that was hired in our team after I joined.

Cindy Ng

It was 2010.

Brian Vecci

Yeah, 2010. So we’ve been here for a while. It’s been great working with you and I look forward to lots more in the future.

Australian Notifiable Data Breach Scheme, Explained

Australian Notifiable Data Breach Scheme, Explained

The third time is a charm, in life and in data breach notification laws. On February 13, 2017, the Australian government, on its third attempt, passed the Notifiable Data Breaches scheme, which finally came into effect on February 22nd of this year.

While we all have a conceptual idea of what a data breach notification means, when it comes to required action, we have to look at the nitty-gritty details. Let’s start with how a data breach is defined down under.

Australia’s Definition of a Data Breach

Australia defines a breach broadly enough to include unauthorized disclosure or access of personal information, which means that a ransomware attack that encrypts but does not exfiltrate data can constitute a reportable breach.

Like the GDPR, Australia broadly considers personal data to be any information about an identified individual or that can be reasonably linked to an individual.

In real-world terms, it means that if hackers get phone numbers, bank account data, or medical records, then it’s considered a breach. For more examples on the kinds of data that may increase the risk of serious harm if there is a data breach, click here.

Australia’s Data Breach Notification Rules

The rules will apply to any organisation with an annual turnover of more than $3 million, but small businesses under that threshold will still be subject to compliance if they handle sensitive health documents or government contracts.

The new Australian amendment also has a harm threshold that has to be met for the breach to be reportable. This is not unusual–we’ve seen these same harm thresholds in US states breach notification laws, and even the EU’s GDPR and the NIS Directive.

In the Australian case, the language used is that the breach is “likely to result in serious harm.” While not explicitly stated, the surrounding context in the amendment says that the breach would have to cause serious physical, psychological, emotional, economic, reputational, or financial harm, or another effect that a “reasonable” person would agree is serious.

Australian Privacy Commissioner Timothy Pilgrim describes serious harm in the following way: “Well, serious harm can manifest in a number of ways. It can be through financial harm, so, someone’s account’s been at risk in a financial institution. It can be psychological or emotional harm, for example, if someone’s health records were breached. There can be reputational harm, if the wrong information gets out, as well.”

After the Breach that Caused Serious Harm

Once an organisation is aware of a harmful breach event, it will have to notify the regulator as soon as possible after discovery. It will need to provide breach details, including the information accessed, as well as steps affected individuals should take.

If an organisation fails to report a serious data breach, or fails to report a data breach on two or more separate occasions, the Office of the Australian Information Commissioner has the ability to seek a civil penalty order against the organisation of up to AU$2.1 million, depending on the significance and harm that may result from the breach.

Organisations can submit their notification of a data breach to the Australian Commissioner through the Notifiable Data Breach form. Afterwards, they can notify individuals as soon as possible.

Exceptions to the Rule

As a side note, the Australian breach notification rule goes further with explicit remediation exceptions that give the covered entities – private sector companies, government agencies, and health care providers – a bit of wiggle room.

If the breached entity can show that they have taken actions involving the disclosure or access before it results in serious harm, then they don’t have to report it.

Particularly with health care providers, this exception is intended to avoid duplication of notices under the NDB scheme and the data breach notification requirements in the My Health Record system.

But How Will You Know When There’s Breach?

Practically speaking, before assessing whether a breach is likely to result in serious harm, organisations first need to know when a data breach is taking place.

Employees, law enforcement agencies, customers and service providers are frequently the first to detect the problem. If you encounter ransomware, you won’t be able to get past the ransom note. But the reality is, a majority of data breach victims don’t have adequate security systems that would help them self-detect data breaches.

What’s more, while most breach compromises occur in days, most discoveries do not. The longer it takes to detect a breach, the more expensive it will be. In a recent study, US companies took an average of 206 days to detect a data breach.

In the case study below, the organisation discovered the breach through an employee.

Australian Data Breach Case Study

Recently, the Australian shipping company Svitzer announced that it suffered a data breach that lasted 339 days before it was discovered and stopped. Ahem, that’s 133 days over the average.

Between May 27, 2017 and March 1, 2018, up to 60,000 emails from three accounts in finance, payroll and operations were secretly auto-forwarded to two external accounts.

What initiated the investigation? Svitzer’s IT help desk got a call from an employee about an email rejection notice from an external email account.

Svitzer’s head of communications, Nicole Holyer, said that the compromised email account owners couldn’t see that their emails were being forwarded.

“We’ve ruled out that it was someone internally,” Ms Holyer said. However, the outsider has not yet been identified.

What we know about outside hackers is that they can easily go around the perimeter and get inside. Without behavior-based anomaly detection, once an outsider is in, the attackers often appear as just another user.

How Alerting Assists in Incident Response

If you don’t want to wait for an employee to report suspicious behavior, alerting on anomalies is a key factor in discovering exfiltration and stopping it from doing even more damage.

Our friend, Australia-based security analyst Troy Hunt, said that it’s unusual to see information exfiltrated one email at a time, “One of the interesting things here is that many organisations configure their mail environment such that you cannot forward automatically to external addresses precisely because of things like this.”

For anomalies like Svitzer’s, it’s helpful to have an alerting system such as Varonis DatAlert to catch breaches like this one and more. With DatAlert, you can set up a rule to detect when automatic forwarding is enabled on mailboxes. This alert and others would likely have triggered and notified the proper individuals to stop the breach before more harm was done.

System Administrator Aaron Neilson of Nature’s Sunshine Products had this to say about how DatAlert helped bolster their security posture, “Certain alerts will trigger scripts that will disable accounts to prevent further harmful actions. This has helped minimize or eliminate the impact from ransomware attacks.”

What’s more, those with Exchange and Exchange Online can also leverage Varonis DatAdvantage and DatAlert to do the following (a rough illustrative sketch of this kind of check appears after the list):

  • Monitor and report on all email activity (message opened, message sent, message edited, email marked as read or unread)
  • Alert on abnormal email behavior, such as forwarding thousands of emails to an external email address
  • Alert when an account gains access to a mailbox other than their own
  • Alert when an IT admin accesses mailboxes in a suspicious way (e.g., reading the CEO’s inbox and marking messages as unread)
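
To illustrate the kind of check the alerts above perform, here is a minimal Python sketch that scans a hypothetical CSV export of mailbox activity for two of the behaviors listed: a forwarding rule pointed at an external address, and a large volume of messages sent to a single external recipient. The column names, event names, domain and threshold are all assumptions invented for the example; in practice this logic lives in DatAlert rules rather than a script.

    import csv
    from collections import Counter

    INTERNAL_DOMAIN = "example.com"   # assumed internal domain for the example
    BULK_THRESHOLD = 500              # messages to a single external address

    def is_external(address: str) -> bool:
        return not address.lower().endswith("@" + INTERNAL_DOMAIN)

    def scan(log_path: str) -> None:
        sent_counts = Counter()
        with open(log_path, newline="") as handle:
            for row in csv.DictReader(handle):
                # A forwarding rule pointed outside the organisation.
                if row["event"] == "ForwardingRuleEnabled" and is_external(row["target"]):
                    print(f"ALERT: {row['mailbox']} enabled forwarding to {row['target']}")
                # Count outbound messages per (mailbox, external recipient) pair.
                if row["event"] == "MessageSent" and is_external(row["recipient"]):
                    sent_counts[(row["mailbox"], row["recipient"])] += 1
        for (mailbox, recipient), count in sent_counts.items():
            if count >= BULK_THRESHOLD:
                print(f"ALERT: {mailbox} sent {count} messages to external address {recipient}")

    scan("mailbox_activity.csv")   # hypothetical export file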

Today, Australian consumers who have had their personal data inappropriately accessed and put at risk of serious harm will have the law on their side. The Australian Notifiable Data Breach scheme will effectively be their alerting system.

Meanwhile, organisations who need help with alerting in an incident response can try DatAlert and DatAdvantage for Exchange. The free risk assessment can help you decide.

 

 

[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part Two

[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part Two

This article is part of the series "[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk".

 

Leave a review for our podcast & we'll send you a pack of infosec cards.


In part two of my interview with Varonis CFO & COO Guy Melamed, we get into the specifics of data breaches, breach notification and the stock price.

What’s clear from our conversation is that you can no longer ignore the risks of a potential breach. There are many ways you can reduce risk. However, if you choose not to take action, at the very least have a conversation about it.

Also, around 5:11, I asked a question about IT pros who might need some help getting budget. There’s a story that might help.

Do Data Breaches Impact the Stock Price?

Guy Melamed

My name’s Guy Melamed, CFO and COO for Varonis. I’ve been with the company since 2011, in charge of all the financial statements and execution of strategic operational plans, in charge of the legal department, and IR as well. And kind of enjoying the ride.

Cindy Ng

There’s a discrepancy online where there’ve been studies that say that breaches don’t impact the stock price. Sure, a breach will typically lead to a one-time large expense or maybe smaller reoccurring expenses. There might be a potential decrease in revenue, but in the long term, investors tend to look past the breach, and they really just focus on the strength of the business and the value of the company. What do you think about data breaches and their impact on the stock price?

Guy Melamed

I’m not so qualified to talk about statistics on stock price and how a breach would affect a stock price in the short term or the long term. What I can say is that with so many events, so many breaches that have taken place in the last couple of years, if you go back to those companies and ask them whether they would have rather dealt with a breach or just bought software and taken measures that can help in protecting against, preventing, or minimizing the magnitude of the breach, I think the answer is pretty obvious.

So we’ve seen companies that have gone out of business because of breaches. We’ve seen companies that will have to deal with litigation for years ahead. So where’s that factored in? There’s just so many components. It’s more of a philosophy that if you can do something active to try and minimize risk, then why not do it?

I think companies, more from a philosophical perspective, should try and actively take action in order to minimize risk. And companies that are under the belief that it won’t affect them and that they’re going to be okay, I think, are acting slightly irresponsibly.

Data Breaches and Breach Notification

Cindy Ng

Let’s talk about breach notification. It’s said that the time to discovery increases the cost of a data breach, and research has said that most companies take over six months to detect data breaches. If you’re in the EU, Article 33 of the GDPR says that data controllers will need to notify authorities of a breach within 72 hours at the latest upon learning about the exposure, if it results in a risk to consumers. If you’re already protecting or in the process of protecting your data, how do you reconcile the time in figuring this element out? What do companies need to do? How much are we talking about?

Guy Melamed

So the surveys that we’ve been tracking show that 70% of breaches are discovered within months or years. And I think a great example of a breach that affected a company years later was the Yahoo deal. This was a breach – I don’t know if it was four years ago or three or five years ago – but it was discovered as part of an M&A process and had an effect, an actual quantifiable number, that impacted the transaction price.

So a company would obviously rather try and identify breaches as soon as possible, so they can take action, minimize some of the cost and be transparent with both the customers, the investors, and the shareholders.

GDPR definitely changes the reporting requirement, and if you’re breached, you have to provide that information within 72 hours. That’s a short period of time, and in order to be able to comply with that regulation, and in order to have better tracking, you really have to have systems, programs, personnel in place to try to identify this.

And the fines that come from GDPR, I’m talking about, you know, some of the requirements and some of the fines related to those requirements, are 4% of global revenue or $25 million, whichever is greater. That’s a huge number that could affect companies in so many ways, definitely something that from our perspective what we see is causing a lot of interest, causing a lot of discussion, and companies are not ignoring the regulation because of its significance.

Should You Just Pay the Fine?

Cindy Ng

So when companies have done the risk analysis of the GDPR fines, some are resigned to paying a fine because the fine isn’t that costly. So why not just pay the fine and get it over with?

Guy Melamed

My response is that it probably fits with an analogy that says, if I go through a red light, I know that the fine is probably minimal and I can live with it. But there are so many other consequences. First of all, the fine is pretty large when it comes to GDPR.

There are so many other components. Thinking that you can be okay just by paying the fine after being breached is definitely not the approach that I would like to take as the company’s CFO, and I would definitely try and act in a way that minimizes the risk long term and short term.

A Story that Might Help IT Pros Get Budget

Cindy Ng

And what are your tips for IT managers who are trying to get budget to get a data security solution they need to help prevent a breach?

Guy Melamed

So I’m not sure I’m qualified to give tips, but I will share a story that I heard from one of our customers.

And during a discussion, he was asked, “What is the best way to get budget, in order to get the Varonis product or any other product for that matter that can protect the company in the long term?”

And his response was, “Make sure the risk assessment, the evaluation and whatever you’re doing in that demo is done on the finance documents. If the finance personnel, if the CFO can see how many people have access to the financial statements or any other sensitive information within his folders or her folders and have access to information they shouldn’t have access to, you’ll find the budget, they’ll find the budget.”

So that’s definitely something that I could relate to, because if I saw risk on files that I know team members shouldn’t have access to, we could move things around within the budget to purchase something that wasn’t necessarily budgeted initially, once I could quantify the risk in my mind.

Minimally, You Should Have a Discussion

Cindy Ng

And any final thought as CFO as it relates to the cost if you don’t invest in security?

Guy Melamed

I think no one anymore can ignore the risk. I think three, four, five years ago, we would talk to companies, show them the risk assessment, show how vulnerable they are, how many sensitive files are open to everyone in the company, show them how much data is open to everyone.

And people could live with the risk. I don’t think people, after all the breaches that have taken place and the amount of risk that companies are dealing with, can ignore it anymore. I think they have to take measures, think about it, or at least have a discussion. If they decide that they want to live with the risk, that should definitely be done after discussion with the legal department, the HR department, the CEO, CFO, and CISO. If all parties agree that the risk is not worth taking any action on, then at least you had a conversation.

But if it’s decided by one person within the organization and it’s not shared between the different departments, between the different roles that would eventually be responsible, then I think that’s just not good practice.

[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part One

[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part One

This article is part of the series "[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk".

 

Leave a review for our podcast & we'll send you a pack of infosec cards.


Recently, the SEC issued guidance on cybersecurity disclosures, requesting that public companies report data security risks and incidents that have a “material impact” that reasonable investors would want to know about.

How does the latest guidance impact a CFO’s responsibility in preventing data breaches? Luckily, I was able to speak with Varonis CFO and COO Guy Melamed to get his perspective.

In part one of my interview with Guy, we discuss the role a CFO has in preventing insider threats and cyberattacks and why companies might not take action until they see how vulnerable they are with their own data.

It’s an interview well worth your time: by the end of the podcast, you’ll have a better understanding of what IT pros, finance, legal and HR have on their minds.

Data security and the CFO: Risk and Responsibility

Guy Melamed

My name is Guy Melamed, CFO and COO for Varonis. I have been with the company since 2011. In charge of all the financial statements and execution of strategic operational plans. In charge of the legal department and IR as well and I am enjoying the ride.

Cindy Ng

Sounds great. So, today we’re gonna be discussing how much it would cost if we don’t invest in data security, and let’s start with the role of a CFO.

Right now, data breaches are one of the biggest threats that all companies face, and companies are realizing this and increasingly delegating responsibilities to the CFO. According to a survey by the American Institute of CPAs, 72% of companies have asked the finance department to take on more responsibility for dealing with data breaches and attacks. Why should the CFO be involved in protecting the organization’s most sensitive data?

Guy Melamed

I think the answer is comprised of a couple of components. One of them has to do with the fact that CFOs are responsible for the financial statements, and with recent events and the amount of breaches that have taken place, there’s much more emphasis on the type of disclosure a company has to provide as part of the 10-K, as part of the risk factors, and even as part of the MD&A. Just to give you an example, in recent months the SEC has provided guidance on cybersecurity, board consideration, and the amount of disclosure that needs to be provided. And just to give you a sense, that release, as a side note, was provided by the SEC chairman after the breach that took place in the EDGAR system, which is a system where you can log in and see the financial statements of all companies. There was a breach in that system, and as a result the SEC had to address, from a disclosure perspective, what was taken, how they’re addressing that event and future events, and how they’re planning to protect against any future event.

So that kind of created the guidance that was provided to all of the Big Four accounting firms, and private, and especially public, companies have to address it. That release talks about what a company is doing from a risk management perspective and how they’re protecting against cybersecurity risk. It talks about the board’s role in overseeing management and any material cybersecurity risk. And it has a lot of discussion as to what type of disclosure needs to be provided in what event. So when we received that publication in preparation for our 10-K filing, we had to have a discussion: where to put it, what the risk is, how we’re addressing it. A conversation like that takes place with the legal department. It takes place even with the HR department, with some of the regulations around protecting data. So there are a lot of components that relate to the CFO’s role in making sure that we address it properly.

Cindy Ng

I actually wanna go back to all the different departments that are involved in addressing the need for preventing data breaches. How would an organization include that in a conversation if they didn’t have the structure for it?

Guy Melamed

Well, the organization first has to understand where the data resides and who has access to the data. In a recent survey that we published, approximately 50% of companies have more than a thousand sensitive files open to everyone in the company. That’s an unheard-of number. Think about it: if you have one sensitive file, one file that has the full payroll information for an organization, and that file gets into the wrong hands, that can destroy a company – and here you have more than a thousand sensitive files. So the risk is very significant, and approximately 20% of the data on average is open to everyone in the company. That’s a risk a company must take action against. So step number one is to realize where your risk resides; if you don’t have that visibility, and you don’t know who has access to what type of folder, who’s opening the folder, who’s deleting the folder, then you’re blindsided.

So I think that’s step number one. There are additional risks that come up day to day. To give you an example from the finance department: if an employee is on warning, goes through a PIP, and has access to sensitive information, you wanna make sure that the information he has access to stays within the company, and that the employee isn’t accessing more and more information in preparation for departure. That’s a risk that relates to the finance organization, but it relates to so many other departments as well. There’s IP that, you know, personnel within the R&D department wanna make sure is protected. There’s obviously information related to customers, and payroll information, and HR and legal, and the list just goes on and on. So the desire is, first of all, just to be able to know what you need to protect, and then who’s protecting it, who has access to it, and to be able to see any abnormal behavior that’s taking place within the organization.

Don’t We Have an Audit Trail?

Cindy Ng

So, you have deep expertise in risk and some technical knowledge. There was a survey among cybersecurity professionals, and 41% of them think that their CFOs have a major gap in their technical expertise in risk or don’t understand their risk at all. You’ve alluded to some of those risks. What is your recommendation to other CFOs or other individuals who wanna close that knowledge gap? Who should their trusted advisors be?

Guy Melamed

Well, first of all, I don’t think I have deep expertise on the technological side or in understanding risk, but I have been around enough to understand that the biggest gap between the finance department and the IT or security department has to do with misconceptions. Just to give an example of what we see many times in our selling process – our selling process, for anyone that doesn’t know, is very visual. We can talk about risk with our potential customers, but a conversation doesn’t get elevated until customers see how vulnerable they are with their own data, and I guess that’s just human nature. Everyone thinks that they’ll be okay until they see how much data is open to everyone in the company and how many sensitive files could be accessed by people that shouldn’t have access to them.

So one of the examples we see during a selling process: if we sit down to show that risk assessment, or even have an initial conversation with someone from IT or a CISO, and also with a legal department member or a finance member, and we ask one simple question – “If 10,000 files were deleted today, would you know about it?” – the answer from the CISO or the IT personnel is, “Absolutely not. We don’t have any ability to know if someone deleted 10,000 files.”

But if you ask a finance person or someone from the legal department or HR, I think the misconception, or their automatic reaction, would be that there has to be a way, and that it seems unreasonable that a company isn’t tracking whether 10,000 files got deleted today. That, I believe, is one of the gaps that has to be bridged, and the education on the finance side is making sure that you know what the company is tracking and what it’s not tracking, and, if an employee is about to leave, whether we have any type of monitoring to make sure that sensitive files aren’t taken and provided to a competitor, or even used in the future by what would later be an ex-employee.

So there are a lot of components to daily operations. There are a lot of risks that a company has to think about, and you always kind of have to go through the process of what can go wrong. Maybe it hasn’t happened, and maybe everything is good now and we trust all of our employees, but what if? When you have organizations with 1,000 employees or 20,000 employees or 50,000 employees, the notion that all of the employees are ethical is a bit scary, and you have to think about how to protect the company in the best way.
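
Guy’s “10,000 deleted files” question boils down to a simple threshold over an audit trail. As a toy illustration only, here is a Python sketch; the event list, field layout and threshold are invented, and nothing here reflects an actual product rule.

    from collections import Counter

    DELETE_THRESHOLD = 10_000   # flag any user who deletes this many files in a day

    # In practice these events would come from a file-activity audit trail.
    events = [
        ("2018-06-01", "jsmith", "delete"),
        ("2018-06-01", "jsmith", "delete"),
        ("2018-06-01", "mjones", "open"),
    ]

    deletes_per_user_day = Counter(
        (date, user) for date, user, action in events if action == "delete"
    )

    for (date, user), count in deletes_per_user_day.items():
        if count >= DELETE_THRESHOLD:
            print(f"ALERT: {user} deleted {count} files on {date}")
        else:
            print(f"{user}: {count} deletions on {date} (below threshold)")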

Cost of a Data Breach

Cindy Ng

What’s most compelling for me is that there’s a disconnect between IT and the rest of the departments, where IT thinks that, “I really wanna protect everyone’s data, but there’s no ability to do so.”

Meanwhile, finance, legal, and HR, they think, “Oh, hasn’t that problem been already solved? It’s a little unreasonable,” as you’ve said, “if we weren’t able to figure that out.”

So, let’s talk about the cost of a breach. It’s been said that the average cost of a data breach is about $4 million, and there are many organizations that have paid tens of millions of dollars. What are some direct costs and indirect costs to businesses associated with data breaches?

Guy Melamed

So a data breach, from a quantifiable perspective, depends on what was taken, when it was taken, who it was taken by, and who it was provided to. There are a lot of components, and I think it would be very hard for me to throw out a number. But what I would say is that a breach is a disruption to the business on so many levels. It’s a disruption in the sense of finding out what was taken, the risk of that information being provided to your competitors, even the risk of financial information being taken and provided to others before it was published.

What I would think about is would a CFO, or a COO for that matter, be comfortable with providing their financial statements to a competitor two weeks before they were published? Obviously the answer is, no, and there could be detrimental consequences to that type of breach.

But the breach isn’t just about financial information. There is customer information, there is payroll information. There are so many sensitive files sitting there that people within the organization have access to, and it doesn’t necessarily mean those people would break bad. It could be a situation where someone from the outside takes control of the credentials of an employee within the organization and starts using that access in the wrong way. One of the most interesting phenomena we’ve seen as a company is that some of the breaches that took place in 2014 generated a knee-jerk reaction, and there was a significant increase in IT spend during the beginning of 2015. But that spend went mostly toward perimeter security. The notion was that if you’re protecting the border, you’ll be okay. What’s been proven day in, day out is that perimeter security is absolutely important, but the notion that it’s the only type of defense you need has been thrown out the window.

And if you use the same analogy of border patrol, or protecting a country, the fact that you have protection on the border doesn’t mean you don’t need any other measures or organizations protecting you from the inside, because at some point there is going to be someone who can get past that border. And beyond that, how are you protecting your organization, or your country, from people on the inside? What we’ve seen in the last couple of years is that the number of breaches has increased significantly, the magnitude has increased significantly, and the implications for those companies have increased significantly.

And I know there was an article a couple of years ago arguing that, given the cost of a breach, you shouldn’t buy any software and can just deal with a breach when it happens. That notion has been thrown out the window. The consequences of a breach are obvious; we see them on the news and on the front pages of “The Wall Street Journal” and “The Financial Times.” It’s happening at rates we haven’t seen before, and I don’t see that going away.

 


[Podcast] Dr. Wolter Pieters on Information Ethics, Part Two


 



In part two of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we continue our discussion on transparency versus secrecy in security.

We also cover ways organizations can present themselves as trustworthy. How? Be very clear about managing expectations. Declare your principles so that end users can trust that you’ll be acting on the principles you advocate. Lastly, have a plan for what to do when something goes wrong.

And of course there’s a caveat: Wolter reminds us that there’s also a very important place in this world for ethical hackers. Why? Not all security issues can be solved during the design stage.

Transparency versus Secrecy

Wolter Pieters

My name is Wolter Pieters. I have a background in both computer science and philosophy of technology. I’m very much interested in studying cyber security from an angle that goes a bit more towards the social sciences, asking why people behave in certain ways in the cyber security space, but also towards philosophy and ethics, asking what the reasons would be for doing things differently in order to support certain values.

Privacy, but then again, I think privacy is a bit overrated. This is really about power balance, because everything we do in security will give some people access and exclude other people, and that’s a very fundamental thing. It’s basically a power balance that we embed into technology through security. And that is what fundamentally interests me in relation to security and ethics.

Cindy Ng

How do we live in a world where you just don’t know whether organizations or governments are behaving in a way that’s trustworthy?

Wolter Pieters

You know, transparency versus secrecy is a very important debate within the security space. It already starts out very fundamentally with the question, “Should methods for protecting information be publicly known, or should they be kept secret because otherwise we may be giving too much information away to hackers?” In encryption there’s the principle that encryption algorithms should be publicly known, because otherwise we can’t even tell how well our information is being protected by that encryption, and only the keys used in the encryption should be kept secret. This is called Kerckhoffs’s Principle. It’s very old in information security, and a lot of current encryption algorithms actually adhere to that principle, and we’ve also seen encryption algorithms that don’t.
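
As a small illustration of Kerckhoffs’s Principle, here is a sketch using the open-source Python cryptography library: the construction is completely public and well studied, and the only secret in the system is the key. The message is, of course, just an example.

```python
# Kerckhoffs's Principle in miniature: the algorithm (Fernet, a published
# AES-based construction from the open-source "cryptography" library) is
# completely public; the only secret in the system is the key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                 # the only secret
cipher = Fernet(key)

token = cipher.encrypt(b"example message")  # ciphertext can travel over public channels
print(cipher.decrypt(token))                # only a holder of the key can recover it
```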

So we’ve seen algorithms that were secret, trade secrets and so on, being broken the very moment the algorithm became known. In that sense, I think most researchers would agree this is good practice. On the other hand, there also seems to be a certain limit to what we want to be transparent about, both in terms of security controls, since we’re not giving away every single thing governments do in terms of security online, so there is some level of security by obscurity there, and more generally in terms of the extent to which transparency is a good thing. This again ties in with who is a threat. I mean, we have the whole WikiLeaks endeavor, and some people will say, “Well, this is great. The government shouldn’t be keeping all that stuff secret,” so it’s great for trust that this is now all out in the open. On the other hand, you could argue that all of this is actually a threat to trust in the government, so this form of transparency would be very bad for trust.

So, there’s clearly a tension there. Some level of transparency may help people trust in the protections embedded in the technology and in the actors that use those technologies online. But on the other hand, if there’s too much transparency, all the nitty-gritty details may actually decrease trust. You see this all over the place. We’ve seen it with electronic voting as well. If you provide some level of explanation of how certain technologies are being secured, that may help. If you provide too much detail, people won’t understand it and it will only increase distrust. There is a kind of golden middle there in terms of how much explanation you should give to make people trust in certain forms of security, encryption, and so on. And again, in the end people will have to rely on experts, because with physical forms of security, physical ballot boxes, it’s possible to explain how they work and how they are being secured; with digital, that becomes much more complicated, and most people will have to trust the judgment of experts that these forms of security are actually good, if the experts believe so.

What Trustworthy Organizations Do Differently

Cindy Ng

What’s something an organization can do in order to establish themselves as a trustworthy, morally-sound, ethical organization?

Wolter Pieters

I think the most important thing companies can do is be very clear in terms of managing expectations. A couple of examples there: if, as a company, you decide to provide end-to-end encryption for communications, the people who use your chat app to exchange messages get the assurance that the messages are encrypted between their device and the device of the person they’re communicating with. And that’s a clear statement: “Hey, we’re doing it this way.” That also means you shouldn’t have any backdoors or means of giving this communication away to intelligence agencies anyway, because if this is your standpoint, people need to be able to trust in it. Similarly, if you are running a social network site and you want people to trust in your policies, then you need to be crystal clear.
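
To make “end-to-end” concrete, here is a minimal sketch using the open-source PyNaCl library. Keys live only on the users’ devices and the relaying service only ever sees ciphertext; the names and message are purely illustrative, not how any particular app is built.

```python
# Minimal end-to-end encryption sketch with the open-source PyNaCl library.
# Keys are generated on the users' devices; the service relaying messages
# only ever sees ciphertext, so it has nothing to hand over.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sender_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")   # this is all the server ever sees

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_sk, alice_sk.public_key)
print(receiver_box.decrypt(ciphertext))            # b'meet at noon'
```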

Not only that it’s possible to change your privacy settings, to regulate the access that other users of the social networking service have to your data, but at the same time you need to be crystal clear about how you, as the social network operator, are using that data. Because sometimes I get the impression that the big internet companies are offering all kinds of privacy settings which give people the impression that they can do a lot in terms of their privacy, but this is only true for inter-user data access; the provider still sees everything. This seems to be a way of framing privacy in terms of inter-user data access, whereas I think the much more fundamental question is what these companies can do with all the data they gather from all their users, and what that means in terms of their power and the position they get in this whole arena of cyberspace.

So, managing expectations. I mean, there are all kinds of different standpoints, based on different ethical theories and different political points of view, that you could take in this space. If you want to behave ethically, then make sure you list your principles, you list what you do in terms of security and privacy to adhere to those principles, and you make sure that people can actually trust that this is also what you do in practice. And also make sure that you know exactly what you’re going to do in case something goes wrong anyway. We’ve seen too many breaches where the responses by the companies were not quite up to standard, delaying the announcement of the breach, for example. It’s crucial not only to do prevention in terms of security and privacy, but also to know what you’re going to do in case something goes wrong.

Doomsday Scenarios

Cindy Ng

Yeah, you’ve said that if an IoT device gets created and the makers get their product to market first, planning to fix security and privacy later, that’s too late. Is it sort of like, “We’re doomed already and we’re just managing the best way we know how”?

Wolter Pieters

In a way, it’s a good thing when we are nervous about where our society is going because in history at moments where people weren’t nervous enough about where society was going, we’ve seen things go terribly wrong. So, in a sense we need to get rid of the illusion that we can easily be in control or something like that because we can’t.

The same goes for elections: there is no neutral space from which people can cast their vote without being influenced, and we’ve seen in recent elections that technology is playing more and more of a role in how people perceive political parties and how they make decisions in terms of voting. So, it’s inevitable that technology companies have a role in those elections, and that’s also what they need to acknowledge.

And then of course, and I think this is a big question that needs to be asked, can we prevent a situation in which certain online stakeholders, whether those are companies or nation states or whatever, get so much power that they are able to influence our governments, either through elections or through other means? That’s a situation we really don’t want to be in, and I’m not pretending I have a crystal clear answer there, but this is something we should at least consider as a possible scenario.

And then there are all these doomsday scenarios, like a cyber Pearl Harbor, and I’m not sure whether doomsday scenarios are the best way to think about this, but we should also not be naive and think that all of this will blow over, because maybe we have indeed already given away too much power in a sense. So, what we should do is fundamentally rethink the way we think about security and privacy, away from “Oh damn, my photos are, I don’t know, in the hands of whoever.” That’s not the point. It’s about the scale at which certain actors either get their hands on data or are able to influence lots of individuals. So, again, scale comes in there. It’s not about our individual privacy; it’s about the power that these stakeholders get by having access to the data and by being able to influence lots and lots of people, and that’s what the debate needs to be about.

Cindy Ng

Whoever has the data has power, is what you’re getting at.

Wolter Pieters

Whoever has the data, and in a sense that data can then be used to influence people in a targeted way. If you know that somebody is interested in something, you can try to influence their behavior by referring to the thing they’re interested in.

Cindy Ng

That’s only if you have data integrity.

Wolter Pieters

Yes. Yes, of course. But on the other hand, a little bit of noise in the data doesn’t matter too much, because if you have data that’s more or less correct, you can still achieve a lot.

Ethical Hackers Have An Important Role

Cindy Ng

Anything that I didn’t touch upon that you think is important for our listeners to know?

Wolter Pieters

The one thing that I think is critically important is the role ethical hackers can have in keeping people alert, and in a way maybe even changing the rules of the game, because in the end I don’t think all security issues can be solved in the design of technology. It’s critically important that, when technologies are being deployed, people keep an eye on issues that may have been overlooked in the design stage. We need people who are paying attention and will alert us to issues that may emerge.

Cindy Ng

It’s a scary role to be in, though, if you’re an ethical hacker, because what if the government comes around and accuses you of being an unethical hacker?

Wolter Pieters

Yeah. I think that’s an issue, but if that’s going to happen, if people are afraid to play this role because legislation doesn’t protect them enough, then maybe we need to do something about that. If we don’t have people who point us to essential weaknesses in security, then those issues will be kept secret and they will be misused in ways we don’t know about, and I think that’s a much worse situation to be in.

 

What Experts Are Saying About GDPR


You did get the memo that GDPR goes into effect next month?

Good! This new EU regulation has a few nuances and uncertainties that will generate more questions than answers over the coming months. Fortunately, we’ve spoken to many attorneys with deep expertise in GDPR. To help you untangle GDPR, the IOS staff reviewed the old transcripts of our conversations, and pulled out a few nuggets that we think will help you get ready.

Does the GDPR cover US businesses? Is the 72-hour breach notification rule strict? Do you need a DPO?  We have the answers below!  If you have more time, listen to our podcasts for deeper insights.

Privacy By Design Raised the Bar

Inside Out Security: Tell us about GDPR, and its implications on Privacy by Design.

Dr. Ann Cavoukian: For the first time ever, the EU has the General Data Protection Regulation. It has the words, the actual words, “Privacy by Design” and “Privacy as the default” in the statute.

What I tell people everywhere I go to speak is that if you follow the principles of Privacy by Design, which in itself raises the bar dramatically above most legislation, you will virtually be assured of complying with your regulations, whatever jurisdiction you’re in.

Because you’re following the highest level of protection. That’s another attractive feature of Privacy by Design: it offers such a high level of protection that you’re virtually assured of regulatory compliance, whatever jurisdiction you’re in.

 



US Businesses Also Need To Prepare for GDPR

Inside Out Security: What are some of the concerns you’re hearing from your clients on GDPR?

Sue Foster: When I speak to my U.S. clients, if they’re a non-resident company that promotes goods or services in the EU, including free services like a free app, for example, they’ll be subject to the GDPR. That’s very clear.

Also, if a non-resident company is monitoring the behavior of people who are located in the EU, including tracking and profiling people based on their internet or device usage, or making automated decisions about people based on their personal data, the company is subject to the GDPR.

 



Is the 72-hour rule as strict as it sounds?

Inside Out Security: What we’re hearing from our customers is that the 72-hour breach reporting rule is a concern. Our customers are confused, and after looking at some of the fine print, we are as well! So I’m wondering if you could explain breach reporting in terms of thresholds: what needs to happen before a report is made to the DPAs and consumers?

Sue Foster: So you have to report the breach to the Data Protection Authority as soon as possible, and where feasible, no later than 72 hours after becoming aware of the breach.

How do I know if a breach is likely to ‘result in a risk to the rights and freedoms of natural persons’?

There is actually a document you can look at to tell you what these rights and freedoms are. But you can think of it basically in common sense terms. Are the person’s privacy rights affected, are their rights and the integrity of their communications affected, or is their property affected?

If you decide that you’re not going to report after you go through this full analysis and the DPA disagrees with you, now you’re running the risk of a fine of up to 2% of the group’s global turnover, or gross revenue around the world.

But for now, and I think for the foreseeable future, it’s going to be about showing your work, making sure you’ve engaged, and that you’ve documented your engagement, so that if something does go wrong, at least you can show what you did.

 



What To Do When You Discover A Breach

Inside Out Security: What are some of the most important things you would do when you discover a breach? I mean, if you could prioritize it in any way. How would you advise a customer about how to have a breach response program in a GDPR context?

Sheila FitzPatrick: Yeah. Well, first and foremost, you need to have in place, before a breach even occurs, an incident response team that’s not made up of just IT, because normally organizations have an IT focus. You need a response team that includes IT and your chief privacy officer. Normally a CPO would sit in legal; if they don’t sit in legal, you want a legal representative in there as well. You need someone from PR or communications who can be the public-facing voice for the company. And you need someone from finance and risk management who sits on that team.

So the first thing to do is to make sure you have that group in place and that it goes into action immediately. Secondly, you need to determine what data has potentially been breached, even if it hasn’t. Because under GDPR, it’s not… previously the trigger was a definite breach that could harm an individual. The definition now is whether it’s likely to affect an individual, which is totally different from whether the individual could be harmed. So you need to determine, okay, what data has been breached, and does it impact an individual?

If company-related information was breached, there’s a different process you go through. If individual employee or customer data has been breached, the question is whether it’s likely to affect the individual. That’s pretty much anything; it’s a very broad definition. If someone gets hold of their email address, yes, that could affect them. Someone who is not authorized to email them could email them.

So, you have to launch into that investigation right away and then classify the data that has had any intrusion, determining what that data is classified as.

Is it personal data?

Is it personal sensitive data?

And then rank it based on: is it likely to affect an individual?

Is it likely to impact an individual? Is it likely to harm an individual?

So there could be three levels.

Based on that, what kind of notification? So if it’s likely to affect or impact an individual, you would have to let them know. If it’s likely to harm an individual, you absolutely have to let them know and the data protection authorities know.
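
As a rough sketch of the triage Sheila describes, and emphatically not legal advice or an official GDPR workflow, the three levels might map to notification actions like this; the level names and return strings are assumptions made for the example.

```python
# Illustrative mapping from the assessed level ("affect", "impact", "harm")
# to the notifications described above. Labels are assumptions for the
# example; this is not legal guidance.
def notification_actions(level: str) -> list[str]:
    if level in ("affect", "impact"):
        # likely to affect or impact an individual: let the individual know
        return ["notify affected individuals"]
    if level == "harm":
        # likely to harm an individual: notify the individual and the regulator
        return ["notify affected individuals",
                "notify the data protection authority (within 72 hours where feasible)"]
    return ["document the assessment; no notification triggered"]

print(notification_actions("harm"))
```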

 



Do we need to hire a DPO?

Inside Out Security: An organization must appoint a data protection officer (“DPO”) if, among other things, “the core activities” of the organization require “regular and systematic monitoring of data subjects on a large scale.” Many Varonis customers are in the B2B space, where they do not directly market to consumers. Their customer lists are perhaps in the tens of thousands of recipients, up to the lower six-figure range. First, does the GDPR apply to personal data collected from individuals in a B2B context? And second, when does data processing become sufficiently “large scale” to require the appointment of a DPO?

Bret Cohen and Sian Rudgard with Hogan Lovells: Yes, the GDPR applies to personal data collected from individuals in a B2B context (e.g., business contacts).  The GDPR’s DPO requirement, however, is not invoked through the maintenance of customer databases.

The DPO requirement is triggered when the core activities of an organization involve regular and systematic monitoring of data subjects on a large scale, or the core activities consist of large scale processing of special categories of data (which includes data relating to health, sex life or sexual orientation, racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or biometric or genetic data).

“Monitoring” requires an ongoing tracking of the behaviors, personal characteristics, or movements of individuals, such that the controller can ascertain additional details about those individuals that it would not have known through the discrete collection of information.

Therefore, from what we understand of Varonis’ customers’ activities, it is unlikely that a DPO will be required, although this is another area on which we can expect to see guidance from the DPAs, particularly in the European Member States where having a DPO is an existing requirement (such as Germany).

Whether or not a company is required to appoint a DPO, if the company will be subject to the GDPR, it will still need to be able to comply with the “Accountability” record-keeping requirements of the Regulation and demonstrate how it meets the required standards. This will involve designating a responsible person or team to put in place and maintain appropriate policies and procedures, including data privacy training programs.