Interview With Alexandra Ross, The Privacy Guru

Michael Buckbee
7 min read
Published March 11, 2015
Last updated May 9, 2022
Alexandra Ross: Privacy Attorney and Guru

Alexandra Ross is not your ordinary attorney practicing privacy law. Her CV includes a stint as Associate General Counsel for Wal-Mart Stores, where she built this giant retailer’s privacy policies and procedures from the ground up. She is San Francisco-based and consults with many Bay Area tech companies. Her point of view on privacy is based heavily on Privacy by Design principles. And, more provocatively, as The Privacy Guru, she believes that mindfulness has important lessons for how we as consumers interact with online services.

Metadata Era: Privacy by Design is something we should all be doing. And certainly something that our regulators, at least at the FTC, have bought into in a big way. So we know from Privacy by Design that data minimization is an important principle.

In your work with companies in the Bay Area, do they have some kind of strong reaction to this? At least for consumer-oriented Internet companies, I imagine they may take more than a little convincing on this.

Alexandra Ross: It’s a really good question. But before I get into it, what’s interesting is how you operationalize Privacy by Design (PbD) principles into business initiatives or the technology that’s being developed or the code that’s being written. That’s the intersection of Privacy by Design and privacy engineering.  If you look at PbD, it’s been around for several years.

Yes, companies are receptive in general to the idea of PbD. But when the rubber hits the road, it becomes a question of how do you really convey the principles and get them to ‘operationalize’ it. And then make it part of the culture of an organization so that it’s ‘baked in’.

For example, data minimization is a really good point. If you’re speaking to the marketing team at a technology company—yeah, the default often is let’s collect everything. In other words, let’s have this very expansive user profile so that every base is covered and we have all these great data points.

But if you explain, or ask questions—what is the intent of this initiative, what are you trying to accomplish?—then you can drill down to learn what’s really necessary for the data collection.

As opposed to coming in and saying ‘you can’t do this, it’s not appropriate under Privacy by Design data minimization principles, or you’re not adhering to the Privacy by Design philosophy’.

ME: I see, so make a business case instead of appearing as the proverbial ‘bad guy’.

AR:  Exactly. You work with them and ask them ‘what is this particular marketing campaign intended to achieve?’ and then come up with actual necessary data points.

You can also point out that, as a best practice, data minimization is intended to make life easier for a company. Because the fewer data points, the less data collected and maintained over time, the less risk there is of a data breach, and the lower the cost of data storage.

If you think about it more strategically, you can point out issues they may not have considered, instead of being the bad privacy enforcer.

This is where privacy engineering can assist. If you have principles and guidelines that developers can actually use, they can implement data minimization principles when making coding decisions.
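To make that concrete, here is a minimal, hypothetical sketch of what baking data minimization into code might look like: a signup handler that only accepts fields the team has justified for a campaign and drops everything else before storage. The field names and the ALLOWED_SIGNUP_FIELDS allow-list are invented for illustration, not drawn from any product or client Ross discusses.

```python
# Hypothetical sketch: data minimization enforced at the point of collection.
# Only fields with a documented purpose (the allow-list) survive to storage.
from dataclasses import dataclass
from typing import Any, Dict

# Fields the marketing team justified against the campaign's stated purpose.
ALLOWED_SIGNUP_FIELDS = {"email", "country", "opt_in_newsletter"}


@dataclass
class SignupProfile:
    email: str
    country: str
    opt_in_newsletter: bool


def minimize(raw_form: Dict[str, Any]) -> SignupProfile:
    """Keep only allow-listed fields; everything else is discarded, not stored."""
    kept = {k: v for k, v in raw_form.items() if k in ALLOWED_SIGNUP_FIELDS}
    return SignupProfile(
        email=kept["email"],
        country=kept["country"],
        opt_in_newsletter=bool(kept.get("opt_in_newsletter", False)),
    )


if __name__ == "__main__":
    submitted = {
        "email": "user@example.com",
        "country": "US",
        "opt_in_newsletter": True,
        "birthdate": "1990-01-01",  # not needed for this campaign -> dropped
        "phone": "555-0100",        # not needed for this campaign -> dropped
    }
    print(minimize(submitted))
```

The point of the sketch is that the allow-list, not the incoming form, defines what gets stored, so adding a new data point forces an explicit conversation about its purpose.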

ME: I absolutely agree with you that having more data brings risks—more data for hackers to work with. I’m wondering if the large breaches over the last year and a half have changed how people at higher levels view their data?

AR: So you don’t always want to use these fear tactics with decision makers, but they are instructive. Last year, the Target breach and others brought the data risk factor home to the executive suite and the board of directors, and they started asking questions of legal departments and CSOs.

The Sony breach in particular really brought it home in a different way, because it wasn’t customer data, it was private emails of employees. And that made it more personal. From what I’ve heard anecdotally, it appears to have scared people.

I do think the Sony breach acted as an example that woke people up to the risks. Embarrassing or inappropriate emails are out there for the world to see. Overall, it’s a good thing that there’s more coverage of these data breaches. Though I wish we were more proactive rather than reacting to crises.

Also we are seeing more interest from policy makers—the FTC, the White House—in taking cyber security seriously.

ME: Would regulations help, say a national breach law?

AR:  We’re kind of at a tipping point with breach notification laws in particular. So many states have their own bespoke laws [don’t we know!]. It may or may not be a positive to have a federal standard. I work in California, which typically is a leader in privacy. The data breach law in California is one of the strictest in terms of how it protects data.

ME: Wish we all could be California!

AR: On the other hand, some say it would be beneficial to have a federal breach law because it would level the field and make it easier to know what the standard actually is.

We’ll see!

Companies are taking it more seriously. If nothing else, they’re now responding to questions from their customers and the board of directors. And they need to have some answers about what they’re currently doing in terms of remediation plans and data breach readiness. So I do think there’s more awareness, and it’s starting to turn into action.

ME: I like your use of the words awareness and mindfulness, which show up on your blog and website, The Privacy Guru. On your blog, you recommend that as consumers we all need to be more mindful.

AR:  It’s a team effort: it’s government legislation, it’s best practices in an industry, it’s companies doing the right thing, and it’s consumers taking responsibility for their privacy. We’re online all the time, so we have to take responsibility. It takes a bit of work, yes, but it’s possible.

And companies can do their part by being more transparent. So that we can look at a privacy policy and understand what it means. And look at privacy settings and make an informed decision.

That’s a lot of what I’m trying to do with The Privacy Guru. I want to make privacy engaging for the ordinary consumer and for them to have some curiosity about privacy issues. I focus on helping them find resources so that they can be educated and incorporate privacy practices into their daily lives.

But I do think it’s up to the individual to take some responsibility, as with anything else in adult life—like paying taxes and voting.

ME: I agree with you on all these points. Do you have any simple tips about what to look out for when you’re online?

AR:  One key thing is to stop, take a moment, and be mindful of what’s going on. What data am I being asked to submit when I sign up for a social media service?  And question why it’s being asked.

It’s worth the effort to try to read the privacy policies, or read consumer reviews of the app or online service.

Certainly if you’re, say, purchasing a smart phone or smart TV you may want to take more time to understand the data collection practices of that product. Are you comfortable with it? Reading the privacy policy may influence your purchasing decision!

ME:  At the Metadata Era, we’ve written about health wearables and their privacy policies. And it’s not unusual for the policy to say that the data can be given to third parties. They’ll ‘anonymize’ that data, but you still effectively lose control. I’m wondering how you feel about that?

AR: Great question. Third parties can mean a variety of things. It can mean a service provider that a company has a contract with—a hosting provider or data processor—where there’s some control over how it uses the data.

If it’s broader than that, say third parties for analytics or marketing purposes, those have different considerations.

You brought up anonymizing data, which is an interesting term. I prefer the term de-identified, which means you can’t identify the person from the data. But there have been some recent studies and research that question the effectiveness of de-identification practices and show how data sets can be re-identified.

One company may de-identify the data, store it, and use it in a particular way, but there’s the possibility to re-identify the data. That’s a concern for consumers.
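As a rough, hypothetical illustration of that re-identification risk (the field names, the voter-roll join, and all data values below are invented), a record stripped of direct identifiers can still be matched back to a person through quasi-identifiers such as ZIP code, birthdate, and sex when combined with another data set:

```python
# Hypothetical sketch: "de-identified" data re-identified via quasi-identifiers.
DIRECT_IDENTIFIERS = {"name", "email", "ssn"}
QUASI_IDENTIFIERS = ("zip", "birthdate", "sex")


def deidentify(record: dict) -> dict:
    """Drop direct identifiers; quasi-identifiers remain in the released data."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}


def reidentify(released: dict, public_records: list) -> list:
    """Find public records whose quasi-identifiers match the released record."""
    return [
        p for p in public_records
        if all(p.get(k) == released.get(k) for k in QUASI_IDENTIFIERS)
    ]


if __name__ == "__main__":
    medical = {"name": "Jane Doe", "zip": "02139", "birthdate": "1961-07-28",
               "sex": "F", "diagnosis": "hypertension"}
    voter_roll = [{"name": "Jane Doe", "zip": "02139",
                   "birthdate": "1961-07-28", "sex": "F"}]

    released = deidentify(medical)           # name and email removed, ZIP/DOB/sex remain
    print(reidentify(released, voter_roll))  # one match: the diagnosis links back to a person
```

If that combination of quasi-identifiers is unique enough in the joined data set, the ‘anonymized’ record is effectively tied back to an individual.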

ME: Yes, it’s an area of ongoing research, which we’ve been writing about. It’s something that consumers should be aware of. Even when a company doesn’t have very specific information—name, address—you can still be giving them lots of information that, when correlated with some other data, can be used to re-identify the consumer.

So does some of this come up when you consult with companies—de-identification?

AR: It does in my role as an in-house attorney for various technology companies. And that goes back to Privacy by Design and privacy engineering in terms of how the company looks at the data it collects from customers and employees and what it may get from data brokers or other public records.

There are definitely best practices and regulations starting to be developed about how a company should address governance of big data collection. A lot of it has to do with how the data is collected, what the consumer understands is being collected, and also how you tie those data points together.

Consumer expectation is key. By providing certain pieces of information, a user could end up with a profile that is broader or more encompassing than they had anticipated.

It’s something that I advise on. There’s a large grey area.

ME: But there are other aspects to big data that can be helpful on a societal level.

AR: There’s an upside to big data analytics and the data we can collect for positive purposes; medical research is a good example.

Another issue, of course, is the ethical implications. This comes up sometimes in medical research. When you’re providing genetic data — is it completely anonymized or de-identified, and can it be re-identified at some point?

If I submit my genetic data for the good of some study and I expect it to be de-identified and it’s not, that’s a problem!

ME:  In total agreement about the issue of expectations of privacy not being met. There are some social media companies that specialize in transient, self-deleting media and others that make more definite guarantees about privacy—say, no third-party access.

Do you think there will be more of a market for privacy?

AR: Sure, there are some companies and technologies that are trying to differentiate themselves by being more privacy protective—Wickr comes to mind.

These are great business models, and they’re starting to be adopted more by mainstream consumers. There are a whole host of privacy-protective products—sophisticated, middle of the road, or very user friendly—that are either stand-alone or work with other products or services we use every day. That’s one set.

But we’re also seeing ‘traditional’ companies that have more to say about privacy and security practices. We’ve seen Apple come out as pro-privacy as a way to differentiate itself.

ME: So we have choices.

AR: A lot of times people complain that we don’t have privacy choices, that the choice becomes either use a service and compromise your privacy or don’t use it at all. More and more we’re seeing companies adopting privacy and security practices and using that as a marketing tool and competitive differentiator.

And I think consumers will have more choices and be able to decide for themselves whether convenience trumps privacy in a particular use case.

ME:  This has been great! And a good point to close out the discussion, because the scale is tipping back from convenience toward privacy and security concerns.

Thank you Privacy Guru!

AR: Thank you!
