Category Archives: Compliance & Regulation

Interview with Attorneys Bret Cohen and Sian Rudgard, Hogan Lovells’ GDPR Experts

We are very thankful that Bret Cohen and Sian Rudgard took some time out of their busy schedules at the international law firm of Hogan Lovells to answer this humble blogger’s questions on the EU General Data Protection Regulation (GDPR). Thanks Bret and Sian!

Bret writes regularly on GDPR for HL’s Chronicle of Data Protection blog, one of our favorite resources. Sian previously worked at the ICO, the UK’s data protection authority, and helped draft the binding corporate rules (BCR) governing internal transfers of personal data within companies for the EU’s Article 29 Working Party.

So we’re in good hands with these two. Both are now part of HL’s Privacy and Cybersecurity practice. The interview (via email) is heavily based on questions our own customers have been asking us. We’re very excited to share their legal expertise on the GDPR with you.

Inside Out Security:  When exactly is a data controller required to conduct a data protection impact assessment (“DPIA”)?  Must the controller always undertake a DPIA for new uses of certain types of data (e.g., biometrics, facial images)?

Hogan Lovells: The DPIA requirement is linked to processing “likely to result in a high risk for the rights and freedoms of natural persons,” taking into account “the nature, scope, context and purposes of the processing.”  This is a fact-specific standard, and one therefore that is likely to be interpreted differently by different data protection authorities (“DPAs”), although it is generally understood to refer to significant detrimental consequences for individuals.

The GDPR requires a DPIA in three specific circumstances:

  • Where the processing involves a “systematic and extensive” evaluation of an individual in order to make an automated decision about the individual (e.g., profiling) that has a legal effect on that individual (e.g., denial of benefits).
  • The processing “on a large scale” of sensitive categories of personal data, specifically (a) personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, (b) genetic data, biometric data for the purpose of uniquely identifying an individual, data concerning health, or data concerning an individual’s sex life or sexual orientation, and (c) personal data pertaining to criminal convictions and offenses.
  • A systematic monitoring of a publicly accessible area on a large scale (e.g., through CCTV).
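
For teams building internal screening tools, the three triggers above can be sketched as a simple check. This is an illustrative sketch only, not legal advice: the parameter names are our own invention, and the GDPR’s general “likely to result in a high risk” catch-all still applies even when none of the three triggers does.

```python
# Illustrative sketch: the GDPR's three mandatory DPIA triggers
# (Article 35(3)) expressed as a screening check. Parameter names
# are hypothetical, not GDPR terminology.

def dpia_required(systematic_profiling_with_legal_effect: bool,
                  large_scale_sensitive_data: bool,
                  large_scale_public_monitoring: bool) -> bool:
    """Return True if any of the three mandatory DPIA triggers applies.

    Note: even when none applies, a DPIA may still be needed if the
    processing is otherwise "likely to result in a high risk."
    """
    return (systematic_profiling_with_legal_effect
            or large_scale_sensitive_data
            or large_scale_public_monitoring)

# Example: a CCTV network systematically monitoring a public square.
print(dpia_required(False, False, True))  # True
```

A real screening tool would, of course, need human review behind each flag; the point is only that the three statutory triggers are a disjunction, not a cumulative test.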

The DPAs have promised to provide guidance on this aspect of the Regulation before the end of 2016. Individual DPAs also have authority to publish guidance on the kinds of processing operations that do and do not require a DPIA, and these guidance documents might differ from country to country.

IOS: For what types of data and processing would data controllers be required to engage in a “prior consultation” with a DPA?

HL: Data controllers are required to consult with a DPA prior to engaging in data processing where a DPIA indicates that the processing “would result in a high risk” in the absence of measures taken by the controller to mitigate the risk. “High risk” is not defined, but it is likely to carry a similar meaning to the threshold DPIA requirement described above: that is, a significant detrimental consequence for individuals. The requirement to engage in a prior consultation will also likely be influenced by DPA guidance on the issue, and we can expect further guidance on this point before the end of this year.


IOS: What does a DPIA under the GDPR look like?

HL: A DPIA is not too complicated. A suggested approach is to review what the proposed data processing activities involve against the relevant requirements of the Regulation (transparency, legal basis for processing, data retention, data security, etc.). This can be done by creating a standard DPIA template with fields for each of these requirements, which then allows an assessor to make a final assessment of risk and, based on the analysis provided, recommend steps to address any risks identified.
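
The template approach described above could be sketched as a simple record type. Everything here is hypothetical: the field names are our own illustration of the GDPR requirements mentioned, not a mandated format.

```python
# Hypothetical sketch of a standard DPIA template: one field per
# relevant GDPR requirement, plus the assessor's final risk call.
# Field names are illustrative, not mandated by the Regulation.
from dataclasses import dataclass, field

@dataclass
class DPIARecord:
    processing_description: str
    transparency_measures: str        # how data subjects are informed
    legal_basis: str                  # e.g., consent, legitimate interest
    retention_period: str
    security_measures: str
    risks_identified: list = field(default_factory=list)
    assessor_recommendation: str = "" # steps to address identified risks

# Example assessment of a hypothetical processing activity.
dpia = DPIARecord(
    processing_description="CRM analytics on customer contact data",
    transparency_measures="Privacy notice at point of collection",
    legal_basis="Legitimate interest",
    retention_period="24 months after last contact",
    security_measures="Encryption at rest; role-based access",
)
dpia.risks_identified.append("Retention period may exceed necessity")
dpia.assessor_recommendation = "Shorten retention to 12 months"
```

Filling in each field forces the review HL describes; the assessor’s recommendation is then grounded in the analysis captured alongside it.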


IOS: A lot of Varonis customers already follow data security standards (principally ISO 27001). Articles 42 and 43 of the GDPR seem to suggest that outside, accredited certification bodies will have the power to issue a GDPR certification. Does this mean that existing accreditations will be sufficient to comply with the GDPR standard?

HL: Under Articles 42 and 43, DPAs can approve certifications issued by certain certification bodies as creating the presumption of compliance with various parts of the GDPR.  However, the relevant DPA or the EU-wide group of DPAs, the European Data Protection Board (currently known as the Article 29 Working Party), would have to approve a particular certification before it can be deemed to be sufficient to comply with the GDPR standard.

The Article 29 Working Party has indicated that it intends to provide further guidance on this topic before the end of 2016.


IOS: The GDPR requires notification of data breaches within 72 hours of discovery, including “the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its potential adverse effects.”  What types of “measures to mitigate” the breach’s “potential adverse effects” will be required?

HL: The “adverse effects” mentioned here reference potential adverse effects to the individual. There are no hard and fast rules as of yet, but this standard encourages companies to provide mitigating measures as may be appropriate in a given situation to help protect individuals. This can include information about the offering of credit monitoring services, information about the offering of identity theft insurance, or steps taken to confirm the deletion of the personal data by unauthorized third parties. It is also possible that some DPAs may request information on technical and other remedial security measures taken by the company. The specific requirements are likely to be borne out through additional regulatory guidance and practice.


IOS: When a data subject requests erasure of personal data, does that mean that data must be deleted everywhere the personal data is located (including in emails, memos, spreadsheets, etc.)?

HL: The right to erasure—also known as the “right to be forgotten”—was one of the more controversial aspects of the GDPR when it was first published, not least because the practical limits on a controller’s obligation to delete personal data were unclear. The right to erasure is not unlimited, and an organization is only required to erase the data when one of the grounds specified in the Regulation applies. These include that the personal data is no longer needed for its original purpose; the individual withdraws consent or objects to the processing; the data must be erased in order to comply with a legal obligation to which the controller is subject; the data has been collected in relation to the offering of information society services to children; or the processing is unlawful.

There are exemptions to the erasure right, including where an individual objects to the processing, but the organization can establish an overriding legitimate ground to continue the processing; or where the individual withdraws consent to the processing, and the organization has another basis on which to rely to continue the processing. Other exemptions include where processing is necessary for exercising a right of freedom of information or expression, for compliance with a legal obligation, for reasons of public interest in relation to health care, or for exercising or defending legal claims.

In theory, this means that an organization should take reasonable steps to delete personal data subject to a valid erasure request wherever it resides, although we recognize that there may be practical limitations on the ability of an organization to delete certain information. The DPAs do have the ability under the GDPR to introduce further exemptions to this provision but we do not know yet what these will look like.

Organizations do have room to put forward arguments that they have overriding legitimate grounds to continue processing personal data in certain circumstances. Where consent has been withdrawn in many cases it is also likely that there will be another basis on which organizations can continue to process at least some of the data (e.g., legitimate business interests). Organizations should document the steps they take to comply (or choose not to comply) with erasure requests, to justify the reasonableness of those steps if pressed by a DPA.

Where a data controller has made personal data public (e.g., by publishing it on a website) and receives a valid erasure request for that personal data, the GDPR requires the controller to, “taking account of available technology and the cost of implementation,” take “reasonable steps” to inform other third-party controllers who have access to the personal data of the erasure request.

This is an area on which we can expect further guidance from the DPAs, although it is not in the list of first wave guidance that we are expecting from the Article 29 Working Party this year.


IOS: An organization must appoint a data protection officer (“DPO”) if, among other things, “the core activities” of the organization require “regular and systematic monitoring of data subjects on a large scale.”  Many Varonis customers are in the B2B space, where they do not directly market to consumers. Their customer lists are perhaps in the tens of thousands of recipients up to the lower six-figure range. First, does the GDPR apply to personal data collected from individuals in a B2B context? And second, when does data processing become sufficiently “large scale” to require the appointment of a DPO?

HL: Yes, the GDPR applies to personal data collected from individuals in a B2B context (e.g., business contacts).  The GDPR’s DPO requirement, however, is not invoked through the maintenance of customer databases.  The DPO requirement is triggered when the core activities of an organization involve regular and systematic monitoring of data subjects on a large scale, or the core activities consist of large scale processing of special categories of data (which includes data relating to health, sex life or sexual orientation, racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or biometric or genetic data).

“Monitoring” requires an ongoing tracking of the behaviors, personal characteristics, or movements of individuals, such that the controller can ascertain additional details about those individuals that it would not have known through the discrete collection of information.

Therefore, from what we understand of Varonis’ customers’ activities, it is unlikely that a DPO will be required, although this is another area on which we can expect to see guidance from the DPAs, particularly in the European Member States where having a DPO is an existing requirement (such as Germany).

Whether or not a company is required to appoint a DPO, if the company will be subject to the GDPR, it will still need to be able to comply with the “Accountability” record-keeping requirements of the Regulation and demonstrate how it meets the required standards. This will involve designating a responsible person or team to put in place and maintain appropriate policies and procedures, including data privacy training programs.

[Podcast] Attorney and Data Scientist Bennett Borden: Data Analysis Techniques (Part 1)

Once we heard Bennett Borden, a partner at the Washington law firm of DrinkerBiddle, speak at the CDO Summit about data science, privacy, and metadata, we knew we had to reengage him to continue the conversation.

His bio is quite interesting: in addition to being a litigator, he’s also a data scientist. He’s a sought-after speaker on legal tech issues. Bennett has written law journal articles about the application of machine learning and document analysis to ediscovery and other legal transactions.

In this first part in a series of podcasts, Bennett discusses the discovery process and how data analysis techniques came to be used by the legal world. His unique insights on the value of the file system as a knowledge asset as well as his perspective as an attorney made for a really interesting discussion.

Subscribe Now

Add us to your favorite podcasting app:

Follow the Inside Out Security Show panel on Twitter @infosec_podcast


HHS to Investigate Smaller HIPAA Privacy Breaches

As a reader of this blog, you know all about Health and Human Services’ (HHS) wall of shame. That’s where breaches involving protected health information (PHI) affecting 500 or more records are posted for the world to see. It’s actually a requirement of HIPAA – technically the HITECH Act. But now there’s been a slight change in breach policy.

The Office of Civil Rights (OCR), which is part of HHS, investigates all large HIPAA breaches. But this month they announced they will increase efforts to look into smaller breaches that come to their attention.

Regional offices will be given discretion to prioritize which smaller breaches to look into. Some of the factors that they’ll take into account are “breaches that involve unwanted intrusions to IT systems (for example, by hacking)” and “instances where numerous breach reports from a particular covered entity or business associate raise similar issues.”

The investigations will likely take the form of offsite “desk audits”.

Attorneys in data compliance will tell you that to pass these audits you’ll need to have your HIPAA paperwork in order — documented security and privacy policies, recent risk assessments, and breach reporting procedures are at the top of the list.

This is just another indication of how HHS/OCR is stepping up its auditing and HIPAA enforcement.

Covered entities: you’ve been warned!


Data Privacy US-Style: Our National Privacy Research Strategy

While the EU has been speeding ahead with its own digital privacy laws, the US has been taking its own steps. Did you know there’s a National Privacy Research Strategy (NPRS) white paper that lays out plans for federally funded research projects into data privacy?

Sure, the Federal Trade Commission has taken up the data privacy mantle in the US, bringing actions against social media, hotels, and data brokers. But there’s still more to do.

So you can think of the NPRS as a blueprint for the research and development phase of the US’s own privacy initiatives. By the way, the US government spent about $80 million on privacy research efforts in 2014.

What’s the Plan?

I scanned through this 30+ page report looking for some blog-worthy points to share. I found a few.

First, the NPRS has interesting ideas on how to define privacy.

The authors of the paper, representing major federal agencies including the FTC, don’t offer a firm definition; instead, they view privacy as being characterized by four areas: subjects, data, actions, and context.

Essentially, consumers release data into a larger community (based on explicit or implicit rules about how the data will be used), on which certain actions are then taken – processing, analysis, sharing, and retention. The idea (see diagram) parallels our own Varonis approach of using metadata to provide context for user actions on file data. We approve of the NPRS approach!


The larger point is that privacy has a context that shapes our privacy expectations and what we consider a privacy harm.

NPRS is partially focused on understanding our expectations in different contexts and ways to incentivize us to make better choices.

Second, the plan takes up the sexier matter of privacy engineering. In other words, research into building privacy widgets that software engineers can assemble to meet certain objectives.

I for one am waiting for a Privacy by Design (PbD) toolkit. We’ll see.

The third major leg in this initiative targets the transparency of data collection, sharing, and retention. As it stands now, you click “yes” affirming you’ve read the multi-page legalese in online privacy agreements. And then you’re surprised when, at some point, you’re spammed by alternative medical therapy companies.

The good news is that some are experimenting with “just in time” disclosures that provide bite-size nuggets of information at various points in the transaction — allowing you, potentially, to opt out.

More research needs to be undertaken, and NPRS calls for developing automated tools to watch personal data information flows and report back to consumers.

And this leads to another priority: ensuring that personal information flows meet agreed upon privacy objectives.

Some of the fine print for this goal will sound familiar to Varonis-istas. NPRS suggests adding tags to personal data — essentially metadata — and processing the data so that consumer privacy preferences are then honored.

Of course, this would require privacy standardization and software technology that could quickly read the tags to see if the processing meets legal and regulatory standards. This is an important area of research in the NPRS.
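
To make the tagging idea concrete, here’s a minimal sketch of what preference-aware processing could look like. The tag names and record shape are entirely hypothetical; the NPRS describes the research goal, not any particular format.

```python
# Illustrative sketch of the NPRS tagging idea: attach consumer
# preference metadata to a personal data record, then gate each
# proposed processing action against those tags. All tag names
# here are hypothetical.

record = {
    "value": "jane@example.com",
    "tags": {
        "allow_analysis": True,    # consumer permitted analysis
        "allow_sharing": False,    # consumer opted out of sharing
        "retain_until": "2017-12-31",
    },
}

def action_permitted(record, action):
    """Check a proposed action against the record's preference tags.

    Unknown actions default to False (deny by default).
    """
    return record["tags"].get("allow_" + action, False)

print(action_permitted(record, "analysis"))  # True
print(action_permitted(record, "sharing"))   # False
```

The hard research problems the NPRS calls out sit behind this toy: standardizing the tag vocabulary across organizations, and making the checks fast enough to run inline against legal and regulatory rules.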

In the Meantime, the FTC Needs You!

You’re now fired up after reading this post and wondering whether you—white hat researcher, academic, or industry pro—can have your voice on privacy heard by the US government.

You can!

The FTC is now calling for personal privacy papers and presentations for its second annual Privacy Con to be held in Washington in January 2017. You can check out the papers from last year’s conference here.

If you do submit and plan to speak, let us know! We’d love to follow up with you.

What is the Minimum Acceptable Risk Standards for Exchanges (MARS-E)?

Under the Affordable Care Act (ACA) of 2010, there are now online marketplaces to buy health insurance. These are essentially websites that allow consumers to shop around for an insurance policy by comparing plans from different private providers.

Result: US consumers can purchase health insurance using the same technology that allows them to buy books, gadgets, and artisanal coffees on the web.

I think we can agree that health data that’s collected on these web sites deserves some extra protections.

The Origin of MARS-E

To address security issues of the exchanges, the ACA required the Department of Health and Human Services (HHS) to come up with data security standards.

Specifically, the Centers for Medicare & Medicaid Services (CMS), a part of HHS, was made responsible for providing guidance and oversight for the exchanges, including defining technical standards.

CMS then established the Minimum Acceptable Risk Standards for Exchanges (MARS-E), which defines a series of security controls. MARS-E is now in its second version, which was released in 2015.

Those familiar with NIST 800-53 — a security standard underlying other federal data laws such as FISMA — will immediately recognize the two-letter abbreviations used by MARS-E, which borrows 17 control families from NIST 800-53. For the record, they are:

Access Control (AC), Awareness and Training (AT), Audit and Accountability (AU), Security Assessment and Authorization (CA), Configuration Management (CM), Contingency Planning (CP), Identification and Authentication (IA), Incident Response (IR), Maintenance (MA), Media Protection (MP), Physical and Environmental Protection (PE), Planning (PL), Personnel Security (PS), Risk Assessment (RA), System and Services Acquisition (SA), System and Communications Protection (SC), System and Information Integrity (SI).

The complete catalog of controls can be found here.

The controls provide only guidance — they are not meant to force specific security technologies on the exchanges!

HIPAA Confusion

You may ask whether HIPAA rules on privacy and security for protected health information (PHI) also apply to the health exchanges.

Great question!

Health exchanges are not covered entities under HIPAA.  So HIPAA’s Privacy and Security rules wouldn’t seem to apply.

But  … are they Business Associates (BAs) of the covered entity?

As you may recall, under the new rules published back in 2013 (the “HIPAA Omnibus Final Rule”), third-party contractors and their subcontractors who handle or process PHI fall under HIPAA.

The short answer is that the exchanges can be BAs if they perform more than minimal data functions and have a deeper relationship with the insurer.

It’s really the same question that comes up with health wearables. HIPAA doesn’t apply to these gadgets, unless the gadget provider has a direct relationship with the insurer or health plan – for example, through a corporate wellness plan.

To get a little more insight into this confusing issue of health exchanges and HIPAA, read this article.

In the meantime, you can peruse the table below showing the mapping of relevant MARS-E controls to Varonis products.



MARS-E Control Family | Requirement | Varonis Solution
AC Access Control

AC-2 Account Management

a. Identifying account types (i.e., individual, group, system, application, guest/anonymous, and temporary);

b. Establishing conditions for group membership;

c. Identifying authorized users of the information system and specifying access privileges;

By combining user and group information taken directly from Active Directory, LDAP, NIS, or other directory services with a complete picture of the file system, Varonis DatAdvantage gives organizations full visibility into their permissions structures. Both logical and physical permissions are displayed and organized, highlighting and optionally aggregating NTFS and share permissions. Flag, tag and annotate your files and folders to track, analyze and report on users, groups and data. Varonis DatAdvantage also shows you every user and group that can access data, as well as every folder that can be accessed by any user or group.
AC-6 Least Privilege


a. Employs the concept of least privilege, allowing only authorized accesses for users (and processes acting on behalf of users) that are necessary to accomplish assigned tasks in accordance with Exchange missions and business functions

Varonis DataPrivilege helps organizations not only define the policies that govern who can access, and who can grant access to unstructured data, but it also enforces the workflow and the desired action to be taken (i.e. allow, deny, allow for a certain time period). This has a two-fold effect on the consistent and broad communication of the access policy: 1) it unites all of the parties responsible including data owners, auditors, data users and IT around the same set of information and 2) it allows organizations to continually monitor the access framework in order to make changes and optimize both for compliance and for continuous enforcement of warranted access.
AU Audit and Accountability

AU-2 Auditable Events

(a) … that the information system must be capable of auditing the list of auditable events specified in the Implementation Standards;

Implementation Standards

Generate audit records for the following events …

h. File creation,

i. File deletion

j. File modification,

m. use of administrator privileges

Varonis DatAdvantage helps organizations examine and audit the use of ordinary and privileged access accounts to detect and prevent abuse. With a continual audit record of all file, email, SharePoint, and Directory Services activity, DatAdvantage provides visibility into users’ actions. The log can be viewed interactively or via email reports. DatAdvantage can also identify when users have administrative rights they do not use or need and provides a way to safely remove excess privileges without impacting the business.

Through Varonis DataPrivilege, membership in administrative and other groups can be tightly controlled, audited and reviewed.

Varonis DatAlert can be configured to send real-time alerts on a number of actions including the granting of administrative rights to a user or group. This allows the organization to detect, in real-time, when privileged access has been granted erroneously and act before abuse occurs. Real-time alerts can also be triggered when administrative users access, modify, or delete business data.

AU-6 Audit Review, Analysis, and Reporting

a) Reviews and analyzes information system audit records regularly for indications of inappropriate or unusual activity, and reports findings to designated organizational officials …

Implementation standards

5. Use automated utilities to review audit records at least once every seven (7) days for unusual, unexpected, or suspicious behavior.


Varonis DatAlert provides real-time alerting based on file activity, Active Directory changes, permissions changes, and other events. Alert criteria and output are easily configurable so that the right people and systems can be notified about the right things, at the right times in the right ways. DatAlert improves your ability to detect possible security breaches, and misconfigurations. DatAlert can be configured to alert on changes made outside a particular time window.

Varonis DatAdvantage monitors every touch of every file on the file system, then normalizes, processes, and stores the events in a database so that they are quickly sortable and searchable. Detailed information for every file event is provided; all data can be reported on and provided to data owners. Data collection does not require native object success auditing on Windows.

IR Incident Response

IR-6.1 Incident Reporting

The organization employs automated mechanisms to assist in the reporting of security incidents


Varonis DatAlert provides real-time alerting based on file activity, Active Directory changes, permissions changes, and other events. Alert criteria and output are easily configurable so that the right people and systems can be notified about the right things, at the right times in the right ways. DatAlert improves your ability to detect possible security breaches and misconfigurations. DatAlert can be configured to alert on changes made outside a particular time window.


Hospitals (and Other Covered Entities) Will Be Randomly Selected for HIPAA Audits in 2016

With July coming to an end and the year more than half over, it’s a good time to look at where we stand breach-wise. Your intuition may be telling you that 2016 has been a bad year with hacking attacks reported daily. Your intuition is right.

The Identity Theft Resource Center is my go-to resource for current breach stats. As of July 19, the ITRC has tallied an astonishing 538 breach incidents, setting us up for the worst year ever.[1]

The mid-year mark is also a good point for IT security groups to collectively take a deep breath and re-evaluate their IT spending. And if you’re in a healthcare organization, there are additional considerations due to a new wave of HIPAA audits (more on that below).

Health Care Data Security Crisis

To get a big picture view of breach statistics, you should gaze upon ITRC’s table of stats from 2005–2015.

The ITRC breaks out their numbers based on five broad industry sectors: business, educational, government, health, and banking.  Since 2012, medical has led the other groups in all but one year.

So far in 2016, medical is putting in another solid performance: it’s in second place with 185 incidents.[2]


2016 breach incidents by industry sector (ITRC)

What’s Going on in Health Care IT Security?

Adam Tanner, a fellow at Harvard University’s Institute for Quantitative Social Science, has suggested that hackers see health care organizations as an easy target.

For a recent article in Scientific American, Tanner interviewed security experts who pointed out that the health care industry doesn’t have the “security maturity” of other industries and underinvests compared to banking.

Point taken.

Still, you can understand hospitals that look at their limited financial resources and decide to invest in new treatment centers and lifesaving equipment rather than re-engineering their IT.

This may have to change.

Randomly Selected Covered Entities and Business Associates Will Receive Audits

Earlier this month, HHS launched Phase 2 of its HIPAA audit program. Over 160 randomly selected covered entities received a notice of a “desk audit”. These chosen healthcare organizations will be required to answer questions—see this.

The second part of this program will involve audits of the business associates of these covered entities. Business associates, as you may remember, do additional processing of protected health information (PHI) for hospitals, and are now also covered by HIPAA’s rules. We’re talking about you, cloud-based service providers!

And then in the final round of audits, 50 hospitals and business associates can expect an onsite visit from HHS starting in 2017.


[1] Let’s also add the usual caveats.

The ITRC numbers are based on breaches reported in major publications. The breaches are not verified by ITRC or other security experts. And finally these are US-based incidents.

In other words, we’ll likely see different numbers from other sources, particularly Verizon’s DBIR, as they come up with their final 2016 results.  I suspect, though, that Verizon will also report record-breaking breach incident numbers.

[2] For those who want to see a more granular report of healthcare breach data, they should check out the Health and Human Services’ (HHS) “wall of shame”—healthcare providers and other covered entities who’ve reported HIPAA medical information exposures. By the way, the HIPAA breaches include hacking incidents as well as unauthorized data access, theft, and loss.

Understanding Canada: Ontario’s New Medical Breach Notification Provision (and Other Canadian Data Privacy Facts)

Understanding Canada: Ontario’s New Medical Breach Notification Provision (and Other Canadian Data Privacy Facts)

Remember Canada’s profusion of data privacy laws?

The Personal Information Protection and Electronic Documents Act (PIPEDA) is the law that covers all commercial organizations across Canada.

Canadian federal government agencies, though, are under a different law known as the Privacy Act.

But then there are overriding laws at the provincial level.

If a Canadian province adopts substantially similar data privacy legislation to PIPEDA, then a local organization would instead fall under the provincial law.

To date, Alberta and British Columbia have adopted their own laws, each known as the Personal Information Protection Act (PIPA). Québec, of course, has its own data privacy law as well.

Adding to the plenitude of provincial privacy laws, Ontario, New Brunswick, and Newfoundland have adopted similar privacy legislation with regard to health records.

Ontario’s PHIPA

So that brings us to Ontario’s Personal Health Information Protection Act (PHIPA).  Recently, PHIPA was amended to include a breach notification provision.

If personal health information is “stolen or lost or if it is used or disclosed without authority”, a healthcare organization in Ontario will have to notify the consumer “at the first reasonable opportunity”, as well as the provincial government.

Alberta, by the way, has had a breach notification requirement for all personal data since 2010.

What About Breach Notification at the Federal Level?

In June 2015, the Digital Privacy Act amended PIPEDA to include breach notification. Organizations must notify affected individuals and the Privacy Commissioner of Canada when there is a breach that creates a “real risk of significant harm” to an individual.

Notice that the federal law has a risk threshold for exposed personal information, whereas the new Ontario law for health records doesn't. Alberta's breach notification requirement, by the way, has a similar risk threshold to PIPEDA's.

Confused by all this? Get a good Canadian privacy lawyer!

Don’t be confused by how to detect and stop breaches! Learn more about Varonis DatAlert.









Is Browsing Facebook While in the Hospital a HIPAA Violation?

A recently filed federal class-action suit claims that several healthcare providers are violating HIPAA’s rules on protected health information (PHI). If the suit succeeds, privacy advocates say it has the potential to disrupt the way the ad targeting industry deals with the healthcare sector.

To really understand what’s going on, you’ll need some background on HIPAA.

HIPAA Privacy and Authorization

According to HIPAA's Privacy Rule, covered entities (healthcare providers, insurers, and clearinghouses) must obtain a patient's explicit authorization, as in a "check this box to approve PHI transfer to a third party" option in an online form, before PHI can be used outside of a few very specific areas (payment, treatment, and healthcare operations).

Using PHI for marketing purposes definitely requires the covered entity to get authorization.

Hospitals, Patients, and Facebook

Suppose you’re a hospital patient waiting (and waiting) to see your doctor, and browsing the hospital website on your laptop looking for answers to a medical question. And let’s assume the hospital website also has a Facebook plugin that supports “like”.

As an active Facebook user, you are also keeping friends informed of your medical adventure.

Unbeknownst to you, URLs are being sent back to Facebook as you browse the hospital website. The Facebook cookies on your laptop add identifier information that lets Facebook target ads at its users.

So as you’re lying in bed looking at friends’ Facebook status updates while dealing with amazing amounts of pain, you might be served up an ad about, say morphine drips, which are based on browsing the pain management section of the hospital website.

Of course, this is a huge part of the way Facebook makes its money. And this is what the suit is alleging took place with the hospitals and healthcare organizations that were named: webpages with Facebook plugins were sending browsing histories back to the FB mothership.
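For the technically curious, here's a minimal Python sketch of the mechanism being alleged. Nothing in it comes from the actual suit: the cookie ID, hospital domain, and URLs are all invented for illustration. It simply shows why a stable third-party cookie plus referring URLs is enough to profile a visitor's health interests.

```python
# Hypothetical sketch: the cookie ID, domain, and URLs are invented.
from urllib.parse import urlparse

# What a "like" plugin effectively reports on each page view: the page URL
# (sent via the Referer header) plus the visitor's third-party cookie ID.
tracking_requests = [
    {"cookie_id": "fb_abc123",
     "referer": "https://hospital.example.com/pain-management/morphine-drips"},
    {"cookie_id": "fb_abc123",
     "referer": "https://hospital.example.com/oncology/appointments"},
]

def inferred_interests(requests, cookie_id):
    """Return the site sections one cookie (i.e., one person) has browsed."""
    return sorted(
        {urlparse(r["referer"]).path.split("/")[1]
         for r in requests
         if r["cookie_id"] == cookie_id}
    )

# A stable cookie ID plus URLs is enough to profile a visitor's health
# interests, which is why bare URLs can fall into HIPAA's PHI net.
print(inferred_interests(tracking_requests, "fb_abc123"))
# prints ['oncology', 'pain-management']
```

Notice that no name or account number appears anywhere above, yet the combination is still identifiable, which is exactly the suit's point.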

So What’s the Problem?

Another crucial fact: PHI covers more than a name, address, and other obvious identifiers.

While the healthcare organizations in the suit are not sending classic identifiers, they are potentially providing URLs, IP addresses, and sub-state-level geolocation data back to FB.

According to HIPAA, these would qualify as PHI, based on the Department of Health and Human Services' 18-element safe harbor list. Using them therefore requires patient authorization, which the websites did not request from users.

We've written previously about the broad definition of identifiable data used by HIPAA. In this case, these providers seem to have been caught in PHI's very wide net.

In short: PHI is being sent from these websites to Facebook without patient permission. A big HIPAA violation.

Legal Questions

As a non-lawyer, I find this suit raises an issue or two.

If you’re not a patient of a healthcare provider but use the site anyway, are you covered by HIPAA?

One argument I read is that if a hospital is a covered entity in the context of a patient-provider relationship, it's a covered entity in all contexts, including the more typical user-website relationship.

So it doesn’t matter that you’re not a patient when browsing a hospital website: HIPAA would still apply!

The suit essentially says a hospital website can’t take online user information and send it to an ad network without violating HIPAA. If this claim is proven right, it will have enormous implications for the use of health and possibly non-health data by ad networks.

Facebook is clearly not a covered entity, so what did they do wrong?

The class-action suit says that Facebook violated state laws on health information, and — get this! — the federal Wiretap Act.

There’s a California law, for example, that requires explicit consent for health information to be sent to third parties. And if we use the broad PHI definition of identifiers, then Facebook could have violated that state’s law.

And the Wiretap Act may kick in when you collect information over the Intertoobz without authorization. To me, though, this last one seems a bit of a (ahem) legal stretch.

This lawsuit is being closely watched by privacy pros. We'll keep you posted if we hear anything new.

Confused by HIPAA? Then take our five-part email HIPAA class and soar like a legal eagle (or at least be able to answer a few legally related HIPAA questions).



We’ve been writing about the GDPR for the past few months now and with the GDPR recently passed into law, we thought it was worth bringing together a panel to discuss its implications.

In this episode of the Inside Out Security Show, we discuss how the GDPR will impact businesses, Brexit, first steps you should take in order to protect EU consumer data and much more.

Go from beginning to end, or feel free to bounce around.

Cindy: Hi and welcome to another edition of the Inside Out Security show. I’m Cindy Ng, a writer for Varonis’s Inside Out Security blog. And as always, I’m joined by security experts Mike Buckbee, Rob Sobers, and Kilian Englert. Hey, Kilian.

Kilian: Hi Cindy.

Cindy: Hey Rob.

Rob: Hey Cindy, how is it going?

Cindy: Good. And hey, Mike.

Mike: Hey Cindy, you made me go last this week. That’s all right.

Cindy: This week, we also have two special guests, also security experts: Andy Green, who is based in New York, and Dietrich Benjies, who is based in the UK. They're here to share their insights on the General Data Protection Regulation, which was just passed with the aim of protecting consumer data, and which will impact businesses not only in the EU and Britain but also in the US and the rest of the world. So hi, Andy.

Andy: Hey Cindy.

Cindy: Hey Dietrich.

Dietrich: Hi Cindy.

What is the EU General Data Protection Regulation?

Cindy: So, let’s start with the facts. First, what is GDPR and what are its goals?

Andy: In one sentence? Can I get two?

Cindy: You get two and a half.

Andy: Okay, two and a half.

So it stands for General Data Protection Regulation. It's a successor to the EU's current data security law, which is called the Data Protection Directive (DPD). And if you are under those rules now, the GDPR will not be a major change, but it does add a few key additions. One of those is stronger rules on, let's say, the right to access your data. You really have… almost like a bill of rights.

One of them is that you can see your data, which is maybe not something in the US we are experienced with.

Also, another new thing is you have a right of portability, which is something that Facebook probably hates. In other words, you can download your [personal] data. I assume this would happen in the UK or the EU: if you are a Facebook customer, you will be able to download everything that Facebook has on you in some sort of portable format.

And I guess that [if you have another] social media service, you can then upload that data to it and say goodbye to Facebook, which is not something they're very happy about.

… You have almost like consumer data rights under the new rule. I don't know if anyone has any comments on some of these things, but that, I think, is a big deal.

Dietrich: I’m sorry Mike. Were you going to go next? I chimed in so I suppose I’ll carry on-

Cindy: Go ahead, Dietrich.

Dietrich: So I think, in terms of its intent, it's the European Union recognizing that European citizens see their data as important, and that recently, and historically, there have been many cases where that data hasn't been appropriately controlled.

And since it's a commodity (the information on them is a commodity traded on the open market), there has been increasing demand for greater safeguards on their data. And those greater safeguards on European citizens' data give them greater confidence in the electronic market that the world economic market has become.

So the two pillars, or the two tenets, which we'll get to, are Privacy by Design and accountability by design… we'll get to a lot of things, but that's a synopsis of it.

Mike: I was curious to what extent this is targeting enterprises versus targeting, say, like you brought up, Facebook, which I consider an application, a web application service. Was there an intent behind this, that it's targeting more one or the other?

Andy: Yeah. It’s definitely, I would say consumers. I mean it’s really very consumer-oriented.

Dietrich: Mike, do you mean in terms of targeting consumers? Yes, it's consumer data it relates to. Or do you mean the types of businesses where it's most applicable? Is that what you mean, Mike?

Mike: Well, you know, there is a decision-making framework. Now, with the GDPR replacing the Data Protection Directive, I need to make decisions: I'm building an application, so I'm going to need new privacy features. We talked about Privacy by Design, which has its own sort of tenets. Or I'm building out the policies for my company, which has satellite offices all over the world, and some of them happen to be in the EU. I'm just trying to look at the impact and how this should change my decision-making in the business.

Dietrich: Well, to be cynical, I'd say if you want to avoid it totally and entirely, just don't sell to an EU citizen.

Rob: Yeah, I think, to answer your question, Mike, the Facebooks of the world and these global web services are going to have to worry about it if they are collecting data. And we all know Facebook not only collects the data that you give them but it also ascertains data through your actions.

And I think that's what Andy was talking about: it's not just the ability to click a button and say, give me my profile data back now so I can take it with me. I put that data in, but what I think the GDPR is aiming to do is give you back the data they've gathered on you from other sources. So: tell me everything you know about me, because I want to know what you know about me. That's, I think, a very important thing, and I really hope the US goes in that direction.

But outside of those web services, think about like any bank that serves an EU customer. So any bank, any healthcare organization, so other businesses outside of these big global web services certainly do have to worry about it, especially if you look in your customer database or any kind of…if you are a retailer, your transaction database, and you have information that belongs to EU citizens then this is something that you should at least be thinking through.

Who will be tasked to implement GDPR?

Cindy: So who needs to really pay close attention to the law so that you are executing all the requirements properly?

Dietrich: Who needs to pay attention to it in terms of the organizations in scope? It's pretty well spelled out: the organizations who deal with, transfer, or process (there's a big emphasis on processing) information associated with European citizens.

So if I backtrack a bit: we started with the portability of the data, the information that organizations have on individuals, and those subject access requests and the right to erasure. But first and foremost is the protection element: making sure that the data is protected, that organizations aren't putting us at risk by holding our data and leaving it overexposed.

Kilian: To address the question more technically: I think everybody involved in the process needs to pay attention to it, from the people designing the app on up. Mike, if you want to launch your business, you need to realize that boundaries don't really exist anymore with technology.

So right from the beginning, we’ll talk about Privacy by Design. But that needs to be the first step, all the way up to the CEO of the company or the board realizing that this is a global marketplace. So they want to get the most amount of customers, so they have to take it seriously.

Andy: Yeah, I was going to say that they do have a heart at the EU, and they do make an exception… there is some language making exceptions for smaller businesses, or businesses that are not collecting data on what they call a really large scale (whatever that means!).

What you are saying is all true, but I think they do say that they will scale some of the interpretations for smaller businesses, so the enforcement is not as rough. And there may even be an exclusion, I forget, for companies under 250 employees.

But I think you are right. Especially with the fines, this is really meant to get the attention of the C-level and higher executives.

What's the first step you need to take when implementing GDPR?

Cindy: So if you are a higher-up or someone responsible for implementing GDPR, what's the first step you need to take so that you don't miss any deadlines and so that you are planning ahead?

Andy: I actually talked about this with Dietrich the other day. Some of this is really, I'd say, common IT sense: if you are following any kind of IT best practices, and there are a bunch of standards, you are probably 60 or 70% there, I think.

I mean, let's say you are handling credit card transactions and dealing with PCI DSS, or you are following some of the (I forget what they call it) SANS Top 20… So maybe I'll say it's sort of like putting laws around some common-sense ideas. But I realize the executives don't see it that way.

Kilian: Yeah. I think the first thing you have to do is figure out if you have that data to begin with, and where it is. Common knowledge says you probably do: if you do any type of commerce or interact with anybody, really, you are going to store some information. But nailing down where it is, or where it might be, is I think the key first step.

Dietrich: And in terms of deadlines, to answer your question very directly, the deadline is May 25th, 2018; that's when it comes into full force. I wouldn't say it's fast approaching; we still have 23 months.

Dietrich: I’ve got a clock on my laptop right there. Deadline to GDPR.

Data Breach Notification

Cindy: So there is also a data breach notification requirement. What does that process entail? How do you get fined, and how do you know that personal data has been lost or breached? And what's defined as personal data? Because there is a difference between leaking, say, company IP and leaking personal data.

Andy: Actually I happen to have the definition right in front of me. So it’s any information related to a person. And in particular, it can be…so it says an “identifiable person is one who can be identified directly or indirectly in particular by reference to an identifier such as a name, an identification number, location data, or an online identifier”.

So it’s really, I guess what we would call in the US, PII [personally identifiable information], but it’s broad. It’s not just a strict list of social security number or specific account numbers. Those are examples of the types of identifiers. So it’s very broad but it has to relate back to a person and they do consider the online identifiers as “relatable to a person”.

Brexit and GDPR

Cindy: And kind of I can’t help but ask Dietrich, will Brexiters be exempt from GDPR?

Dietrich: No. Not at all.

So, first off, yes: a week ago today, we cast our votes, and a week ago tomorrow we found out that yes, in fact, we are leaving the European Union. But the reality is that we haven't invoked Article 50. Article 50 is the "yes, we are definitely doing it" step: once we invoke it, we have 24 months to get out of the European Union.

The starting of that clock isn't likely to happen for some time. For one, David Cameron, who is currently our prime minister, is stepping down, or rather has stepped down. We have to wait. He said, "I'm not going to invoke it. I'm going to let somebody else handle not only the process of invoking Article 50 but also negotiating the trade policies and all the things associated with the exit."

Among the things associated with the exit is the adoption or exclusion of a lot of the European directives, the GDPR being one. So there is the timescale that comes into play once Article 50 is invoked, and there are some questions about the legality of the referendum (which I won't go into in detail, but there is a lot of debate at the moment about whether the Leave vote is actually something that will happen).

If it happens, and let's say it will, the timescale of that activity is likely to be well after the GDPR is in effect. And even if we do leave, which is likely given that we live in a democratic country and have cast a vote to leave, we could still take on the GDPR as our own.

We have our own Data Protection Act here in the UK. We could bring it in line with the GDPR at the stroke of a pen. And that's quite likely, considering we will negotiate for, hopefully, as free a trade arrangement as we can get with the European Union, and it would make sense that that would be a dependent clause.

Andy: And I was going to say: since the UK has to trade with the EU, the EU countries are going to put in higher standards for e-commerce transactions.

Dietrich: Yeah. They are our biggest trading partner. I believe (don't quote me on this, I could be wrong) that 54% of our exports go to the EU. And likewise, we are one of the biggest trading partners for France, for Germany, etc.

Territorial Scope

Cindy: So, the US, we trade with the EU and the…

Dietrich: Do you? (sarcasm)

Cindy: I'm really talking about territorial scope. And I'm curious: if I start a business, or Mike starts a business (we talked about this earlier), what's the law in terms of me needing to protect an EU consumer's personal data? That's a little controversial. Go ahead, Dietrich.

Dietrich: Can I give you some examples on this?

In the last 48 hours, I have purchased a flight from Southwest Airlines, United Airlines, I’m a European citizen. I have purchased a backpack from some random site that’s being shipped to my father.

Look, I hope I'm not getting myself into tax trouble here, but anyway, you know what I mean. As a European citizen, I'm going to be in the States for three weeks as of next week, transacting and purchasing stuff over there. So, considering the freedom of movement that exists, the small world in which we live where European citizens regularly travel to the US and regularly buy from sites online, I can't see how the border is going to make any difference.

Most, if not the vast majority, of organizations in the US will deal with European citizens, and therefore, at least for that subset of data related to European citizens, they'll have to put in controls if they want to carry on trading with them.

Cindy: Go ahead, Mike.

Mike: Well, I was trying to think of parallels to this, and there is one I think a lot of people are aware of: the Cookie Law. There were some European directives saying that if you land on a website, you should see one of those banners at the bottom that says "this website uses cookies," with a click to accept, which came out of a similar thing. It's really only been European websites doing that, but it's sort of a half step into this. I just wonder if it shows a model for how this is going to be adopted, so that it's only the strictly EU sites.

Andy: Yeah. I think that came out of (I forget, it may have been the Data Protection Directive) the requirement that you've got to gain consent from the consumer, and they applied it to accepting cookies. So you do see that on a lot of the EU sites, that's right.

Mike: It just seems very odd, because it doesn't seem like it improves anything. It just seems like, yeah, we are setting cookies on you, so here is this giant banner that gets in the way.

Andy: Will they ever click no?

Mike: Well, what's interesting is that I don't think I've ever actually seen a "No, don't collect my cookies" option. It just says, "Hey, we are doing this, so accept it or leave. You are on my website now" (probably delivered with a French accent).

Tension between Innovation and Security

Cindy: So in terms of, we talked about the cookie law, we’re talking about the GDPR.

If you are a CEO and you know there is a potential risk of anything, really, let's say a data breach, people are often asking, "Okay, higher-ups, can we work through this? Will our company survive?"

It sounds like people don't like to be strong-armed into following certain laws. If I'm an entrepreneur, I'm going to come up with an idea, and the last thing I would want is, oh, I have to follow Privacy by Design. It's annoying.

Rob: Yeah. I mean it’s a push and pull between innovation and security. You see this with all sorts of things. You know, Snapchat is famous for its explosive growth, hundreds of millions of active users a day. And in the beginning, they didn’t pay attention to security and privacy. They kind of consciously put that on the back burner because they knew it would slow their growth.

And it wouldn’t have mattered as much if they never became a giant company like they are today. But then it came back to bite them, like they’ve had multiple situations where they’ve had data breaches that they’ve had to deal with and I’m sure devote a lot of resources to recovering from, not only on the technical side of things but also on the legal and PR side. So it is a push and pull but we see it in varying degrees everywhere.

Look at what Uber is doing as they expand into different markets: they have to deal with all the individual regulations in each state and each country they expand to. They would love to just turn a blind eye and focus on improving their technology, recruiting new drivers, and making their business a success.

But the fact of the matter is, and the EU is way out in front of everybody else on this, somebody has to look out for the customers. Because in the US, we just see it over and over again: these massive breaches where people's healthcare information is exposed on the public web, or their credit card numbers get leaked, or God knows what kind of information. And it never feels like there are enough teeth to make organizations really assess their situation.

Like every time I apply (and I don't do this very often, thank God!) for a mortgage in the US, the process scares me. You have to email sensitive information to your mortgage broker in plain text. They are asking for PDFs, scans of your bank account. And as for where that information goes, you're just not that confident that a lot of these companies are actually putting it in secure repositories and monitoring who has access to it. Without regulations like the GDPR, it would be way worse, and there would be no one looking after us.

Kilian: You actually beat me to the point I was going to make there, Rob, by a couple of sentences. But, you know, fine. Businesses don't like being strong-armed, but consumers don't like having their entire lives aired out on the Internet.

And I think you are 100% right there. It is a pain in the butt in some cases for innovation, but I keep going back to Privacy by Design. You don't have to make an either/or decision. If you start with that in mind to begin with, you can achieve both things: you can still achieve massive growth and avoid some of the problems, instead of trying to patch up the holes later on.

Dietrich: One thing in terms of the strong arm, in terms of the regulatory fatigue that organizations get: I have been dealing with organizations for some time, and it often seems that regulations are the only thing the external world has to make organizations focus.

And this is important. It's important for us. I mean, I don't just kind of like it; I quite like the intent of the regulation. It's there to protect me. It's not something esoteric; it's something quite explicit, to protect my information. And if it requires a regulation for organizations to take heed, and to realize that even if they've been ignoring data breaches in the past, doing so in the future may cost them more than it has, then that's probably a good thing.

Andy: I was just going to say that one word they use a lot in the law, and it has to do with Privacy by Design, is "minimize." I think you just have to show that you're aware of what you are collecting and that you're trying to minimize it: minimize what you collect, and put a time limit on the personal data you do collect. In other words, if you've collected it and processed it and you no longer have a need for it, then get rid of it.

It seems like common sense, and I think they want companies to be thinking along these lines: as I say, just minimize. That shouldn't be too much of a burden, I think. As Rob was saying, some of these web companies are just going crazy collecting everything, and it comes back to bite them in the end.

Mike: And this is me being cynical but I wonder if this is going to be a new attack vector. If there is like an easy way to get all your information out of Facebook, then that’s the attack vector and you just steal everyone’s information through the export feature.

I don't know if anyone else saw that there was a way to hijack someone's Facebook account by sending in a faxed version of your passport. That was a means by which they would reset your password if you couldn't do anything else and had lost access. They were like, "Well, this whole rigmarole, but fax in your passport," and so people were doing that. I think the intentions are good. I just wonder about the actual implementation, like how much of a difference it will actually make.

Rob: Yeah, and I think you are right, Mike, that the execution is everything in this. With these regulations, we see it with failing PCI audits: PCI auditors that are just checking boxes. In a previous job, I worked for a software company that did retail software and was heavily dependent on collecting credit card information from devices, terminals, keyboard swipes, all sorts of things. Having gone through a PCI audit there, knowing there were holes the auditors missed, it's all about the execution. It's all about following through on best practices for data security. The regulation itself isn't going to make you excellent at security.

Tips on Protecting Customer Data

Cindy: So if I'm trying to catch up, if I am not following PCI or the SANS Top 20, which has now been renamed the Critical Security Controls, what are some of the things I can start with in terms of protecting my customers' data? Any tips?

Rob: Well, one thing, and Andy kind of touched on this, is don't collect it if you don't have to. I think that's the number one thing. Certain services out there actually make it easy for you not to touch your customers' data. For instance, Stripe, which is a pretty popular payment provider now: if you are collecting payment information on the web from customers, you should never know their credit card number. It should never hit your servers. If you're using something like Stripe, it basically goes from the web form straight off to Stripe, and you get at most the last four digits and maybe the expiration date. But as a business, you never have to worry about that part of their profile, that sensitive data.

So to me, start with asking that question of what do we actually have to have. And if we don’t need it, get rid of it and let’s look at all of our data collection processes, whether it’s by paper form or web form or API, whatever the method is and decide what can we ax to just cut out the fat. Like we don’t want to have to hold your information if we don’t have to. Now, failing that, I know a lot of companies cannot do that, like Facebook’s business is knowing everything about everybody and the connections. And so in that situation, it’s a little bit different.

Cindy: It's hard, because what if I'm a company and I'm just a hoarder? I live in New York, my studio is tiny, and yet what if I like to hoard?

And it's kind of like you are digitally hoarding stuff. Storage is cheap, so why not get more? What would you say to a digital hoarder who says, "I might need this information later"?

Rob: I would say stop. Stop doing that! There are data retention policies you can implement that prevent it. It's an organizational culture thing, I think. Some organizations are great at data retention; others are hoarders. It's just bad data protection.

Dietrich: Data retention and hoarders. Most of the organizations we've talked to love to retain data, so it's nice having a stick that sits there and says: just get rid of it. I talk to organizations now and say, finally this is being implemented in a way that lets us go back to the business. And who doesn't want the data deleted? It's usually people in the business who say, "I may, at some time in the future, need that document I created 15 years ago." Well, not if it has anything related to an individual associated with it.

In that case, you can only keep it for as long as there is a demonstrable requirement to have it. So I think it's something, at that level, that should be welcomed by organizations. I mean, my wife's a bit of a hoarder; if she were running a business, she would definitely have many petabytes of information. But if it's related to individuals, it would give me the excuse to throw it out when she isn't looking.

Andy: Right. I was going to add that the GDPR says, yes, you can collect the data and you can keep it, but I think there is a provision that says you have to put a time stamp on it. You have to say, "This is the data I have," and whether it's five years or ten years, put some reasonable time stamp on that data and then follow through. So sure, collect it. But make sure it has a shelf life.

Final Thoughts

Cindy: Any final thoughts before we wrap up? Silence, I love it.

Mike: I was on mute, so I was talking extremely loudly while no one heard me. I was going to say my final thought is that we kind of started this with Andy saying that a lot of this is common-sense IT.

And I think that’s probably the biggest takeaway. The thing to do immediately is to, I think, just do an audit of all of your data. That’s just good practice anyway. If you don’t have that at hand, you should start doing that. Whatever the regulations are, whatever your situation, it’s very, very hard to think of a situation where that wouldn’t be to your advantage. So I think that’s the first thing and most immediate thing any company should do.
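A data audit like the one Michael recommends can start as small as an inventory of what files exist, how big they are, and when they were last touched. The sketch below is a hypothetical starting point using only the standard library; a real audit would also classify file contents to find personal data.

```python
import os
from datetime import datetime

def audit(root):
    """Walk a directory tree and record basic facts about each file:
    path, size in bytes, and last-modified time."""
    inventory = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            inventory.append({
                "path": path,
                "bytes": stat.st_size,
                "modified": datetime.fromtimestamp(stat.st_mtime),
            })
    return inventory
```

Even this minimal inventory makes stale or forgotten data visible, which is the first step toward both good practice and GDPR readiness.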

Dietrich: That’s a very good point, and it relates directly to the GDPR’s data protection impact assessments. Those are also about understanding what you have and making sure you have the appropriate controls around it. So going through that audit directly helps you with GDPR.

Upcoming Webinars: July 21st English, July 28th German and French

Cindy: Rob, you mentioned there is a webinar on GDPR. When can people tune in?

Mike: Rob told me there was a barbecue at his house for the next GDPR meeting. Just come on over, we’ll talk European regulations, smoke some brisket.

Cindy: I need some help from people de-hoarding my studio. First, I need to go home and change all my passwords because I have a password problem. Now you all know I’m a hoarder.

Mike: This is just leading up to you having your own Lifetime television series I mean.

Cindy: That will be exciting.

Mike: I’d watch it.

Cindy: It will be Tiger Mom, 2.0.

Rob: So yeah, we’re having a webinar on July 21st in English, and another one on July 28th in German. And for anybody that’s interested in the GDPR, we are also doing it on the 28th in French. So we have multiple languages for you. You can go to our website, search for GDPR in the upper right-hand corner, and you should be able to find the registration form.

Cindy: Thanks so much, Rob.

Dietrich: Whether you speak it or not. Yeah, fantastic.

Cindy: Thank you so much Mike, Rob, Kilian, Dietrich, and Andy. And thank you all our listeners and viewers for joining us today.

If you want to follow us on Twitter and see what we are up to, you can find us @varonis, V-A-R-O-N-I-S. And if you want to subscribe to this podcast, you can go to iTunes and search for the Inside Out Security show.

There is a video version of this on YouTube that you can subscribe to on the Varonis channel.

And thank you and we’ll see you next week. Bye guys.

Subscribe Now

Add us to your favorite podcasting app:

Follow the Inside Out Security Show panel on Twitter @infosec_podcast




The General Data Protection Regulation (GDPR) took years to become law as the relevant parties engaged in endless rounds of negotiations. It’s not surprising that there are some controversial elements. Time for another GDPR infographic!

We’ve boiled down the controversies into three areas: territorial scope, right to be forgotten, and steep fines.

Large US and other multinational companies, especially those in the social media and search space, will have some real issues with how the GDPR defines where the law applies, what data erasure means, and the way it bases fines on worldwide revenue, including non-EU revenue.

The infographic below will give you a quick lesson in the GDPR’s more contentious aspects.


Want to learn more about the GDPR?

Check out our free 6-part email course (and earn CPE credits!)

Sign me up

EU GDPR: Data Rights and Security Obligations [INFOGRAPHIC]


The EU General Data Protection Regulation (GDPR) isn’t light reading. But that doesn’t mean this law’s essential ideas can’t be compressed and rendered into a highly informative infographic.

We’ve been spending the last few months untangling the legalese and looking for ways to simplify the GDPR’s key requirements. One way to view the new EU law is to think of it as giving consumers certain rights over their data while also placing security obligations on companies holding their data.

And that’s the approach we took with the infographic below. With just a glance, you’ll get a very good sense of what you’ll need to do to become compliant. Of course, you should read our content to achieve full GDPR enlightenment.

But in the meantime, use the infographic as a visual “cheat sheet”.


Want to learn more about the GDPR?

Check out our free 6-part email course (and earn CPE credits!)

Sign me up